It doesn't strike me as a particularly difficult math problem. It's probably harder to describe than it would be to solve if I could give you a diagram.

Draw a circle - this represents the earth. Then draw a line segment from the centre straight out to the height you want to measure from. (Let's say it's 1,000 miles.) Then draw a tangent to the circle that passes through the far end of the line. (A tangent is a line that just touches the circle at one point - think of laying a yardstick on top of a basketball: it touches the ball but doesn't stick into it - that's a tangent.)

Anyway, these three points - T, where the tangent touches the earth; X, where the observer is above the earth; and C, the centre of the earth - form a right triangle, with the right angle at T (a tangent is always perpendicular to the radius drawn to the point where it touches).

So let's say the observer is 1,000 miles above the earth. Then the distance between him and the centre of the earth, XC, is 1,000 + r, where r is the radius of the earth. That's the hypotenuse of the right triangle. The other two sides are CT, which is a radius of the earth and so has length r, and XT, which is the observer's straight-line view to the horizon. (Hopefully it'll make more sense if you draw a picture of it.)

The angular distance will be given by angle TCX, where:

cos(angle(TCX)) = r/(1000+r)
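If a worked example helps, here's a quick Python sketch of that step. I'm assuming r is roughly 3959 statute miles for the earth's radius and h = 1000 miles for the height - those numbers are my own assumptions, so plug in whatever values you prefer:

import math

r = 3959.0   # assumed mean radius of the earth, in statute miles
h = 1000.0   # observer's height above the surface, in statute miles

# central angle TCX, from cos(TCX) = r / (h + r)
angle_tcx = math.degrees(math.acos(r / (h + r)))
print(angle_tcx)   # about 37 degrees for h = 1000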

(Of course, the total coverage that the observer can see will be a circle on the surface of the earth.)

Multiply angle(TCX) (in degrees) by 60 to get the distance in nautical miles along the earth's surface (one nautical mile is one minute of arc, and there are 60 minutes in a degree).
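Same sketch as above, with the same assumed radius, carried one step further:

import math

r = 3959.0   # assumed mean radius of the earth, in statute miles
h = 1000.0   # observer's height above the surface, in statute miles

angle_tcx = math.degrees(math.acos(r / (h + r)))   # central angle, in degrees
surface_nm = angle_tcx * 60.0                      # nautical miles along the surface
print(surface_nm)   # about 2220 nautical miles for h = 1000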

The actual straight-line distance that the observer can see is the length of line XT, which is obtained using Pythagoras's theorem:

XT = sqrt((1000+r)^2 - r^2)
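And the straight-line leg, again with the same assumed radius as before:

import math

r = 3959.0   # assumed mean radius of the earth, in statute miles
h = 1000.0   # observer's height above the surface, in statute miles

# straight-line distance from the observer X to the horizon point T
xt = math.sqrt((h + r) ** 2 - r ** 2)
print(xt)   # about 2986 statute miles for h = 1000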

I hope this is what you were looking for, and Chris K. doesn't have to shut down this thread for "too much math".

_________________________
"The mind is not a vessel to be filled but a fire to be kindled."
-Plutarch