I have a flat-earth problem of a missile that needs to return to its launch pad. The solution to this problem (obtained via convex optimization, in case you are interested) is then fed to a simulator that uses a NED spherical coordinate system, expressed in longitude, latitude, and altitude. What I need is to be able to freely convert between the two coordinate systems.

From: http://walter.bislins.ch/bloge/index.asp?page=Globe+and+Flat+Earth+Transformations+and+Mappings

The conversion from geographic flat-earth coordinates to Cartesian flat-earth coordinates is the following:

$X=\left(\frac{\pi}{2}-\phi\right)R\cos(\lambda)$

$Y=\left(\frac{\pi}{2}-\phi\right)R\sin(\lambda)$

$Z=\left(\frac{\pi}{2}-\phi\right)R\,h$

Here $\phi$ is the latitude, $\lambda$ is the longitude, and $h$ is the altitude.
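For reference, the $X$ and $Y$ lines of the quoted mapping are an azimuthal-equidistant projection centred on the north pole, and can be sketched as below (a minimal sketch, assuming angles in radians and a spherical Earth of assumed mean radius $R$; the $Z$ line is deliberately left out, since it is the term the question challenges):

```python
import math

R = 6371000.0  # assumed mean Earth radius in metres


def flat_earth_xy(lat, lon):
    """Quoted mapping, X and Y only: geographic -> flat-earth Cartesian.

    lat (phi) and lon (lambda) are in radians. The distance from the
    projection centre (the north pole) is the arc (pi/2 - lat) * R.
    """
    rho = (math.pi / 2.0 - lat) * R  # arc distance from the north pole
    return rho * math.cos(lon), rho * math.sin(lon)
```

At the pole itself ($\phi = \pi/2$) this yields the origin, and on the equator it yields a quarter of the Earth's circumference from the centre, which matches the equidistant character of the projection.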

but this doesn't seem right to me, as the altitude term ($Z$) will explode: the radius of the Earth $R$ is being multiplied by the altitude $h$.

Edit: To better explain the conversion: I begin with a stationary observer at the launch pad, which is taken as both the origin and the target for the missile. This has to be converted into a flat-earth formulation in which the distances projected onto the Earth's surface become the x and y distances of the flat-earth problem. My intuition is that these should be arc distances along the latitude and longitude circles between the observer and the point the missile makes on the ground.

## Best Answer

The solution was to use the equations above to represent the downrange and crossrange, i.e. the metric distances from the launch pad. For the algorithm, the required inputs were just the distances in the ECI frame (x, y, z).
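To go back the other way (flat-earth offsets to latitude/longitude/altitude for the simulator), the arc-length model inverts trivially. This is a hypothetical sketch under the same spherical, small-offset assumptions as above, not part of the original answer:

```python
import math

R = 6371000.0  # assumed mean Earth radius in metres


def geodetic_from_flat(north, east, alt, lat0, lon0):
    """Inverse arc-length mapping: flat-earth offsets -> lat/lon/alt.

    Divides each arc by its radius: the meridian arc by R, the parallel
    arc by R * cos(lat0). Angles are in radians; alt is unchanged.
    """
    lat = lat0 + north / R
    lon = lon0 + east / (R * math.cos(lat0))
    return lat, lon, alt
```

Round-tripping a point through the forward and inverse mappings recovers the original coordinates to floating-point precision, which is a useful check when wiring this into the simulator.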