Another way to approach this:
For a projectile launched from x = 0, the path the projectile takes is a parabola of the form y = Ax^2 + Bx, with the constants A and B determined by the initial speed and angle of the shot. At any point along that arc, the line from the launch point to the projectile has slope y/x = Ax + B. The direction of travel of the projectile is given by dy/dx = 2Ax + B. Now, the projectile is moving away from the origin if the angle between the direction of travel and the line from the origin is less than 90 degrees; if that angle is greater than 90 degrees, the distance from the origin to the projectile is decreasing. So what we want to know is whether at any point along its path this angle is exactly 90 degrees, i.e. the two lines are perpendicular. Remember that the slopes of two perpendicular lines are negative reciprocals of each other, so the question is whether at any point dy/dx = -x/y, or
2Ax + B = -1/(Ax + B)
I haven't taken it further than this, but perhaps this will help.
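As a quick numerical sketch of where that condition leads: multiplying both sides by (Ax + B) turns it into the quadratic 2A^2 x^2 + 3ABx + (B^2 + 1) = 0, which has real roots only when the launch is steep enough. The speed and angle below (20 m/s at 80 degrees) are just an illustration I picked, not from the original problem:

```python
import math

def perpendicular_points(A, B):
    """Solve (2Ax + B)(Ax + B) = -1, i.e. 2A^2 x^2 + 3ABx + (B^2 + 1) = 0,
    for the x positions where the velocity is perpendicular to the line
    from the origin.  No real roots means the distance never decreases."""
    a, b, c = 2 * A * A, 3 * A * B, B * B + 1
    disc = b * b - 4 * a * c
    if disc < 0:
        return []
    r = math.sqrt(disc)
    return sorted([(-b - r) / (2 * a), (-b + r) / (2 * a)])

# Illustrative launch: 20 m/s at 80 degrees, so y = x*tan(theta) - g*x^2/(2*v^2*cos^2(theta))
g, v, theta = 9.8, 20.0, math.radians(80)
A = -g / (2 * v**2 * math.cos(theta)**2)
B = math.tan(theta)

xs = perpendicular_points(A, B)
print(xs)  # two x positions: distance decreases between them
```

With a shallow launch (try 45 degrees) the list comes back empty, matching the intuition that the projectile then moves away from the origin the whole flight.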
