Dan Gleesak
I still don't understand the "logic":
If the horizon dips 8" in a mile, why not 16" in 2 and so on.
To suggest there's a logarithmic increase defies my lil' ol' brain's capacity for reason.
I'd really appreciate it if someone could explain this to me in laymonkey's terms.
I actually made the same error as you when referencing the Joe Rogan video, in terms of the total “drop” at 100 miles. I simplified it too much (though I don't believe it to be 6,000-whatever feet).
When someone solves for the drop after one mile, we use the radius of the earth and the one mile of sight distance as the two known sides of a right triangle, and we solve for the third line, the hypotenuse, which runs from the earth's center to the far end of that sight line; the drop is the hypotenuse minus the radius.
However, if you change it up and use the radius and the drop as the knowns and solve for the sight distance, it shows that it isn't exactly 8” for every additional mile. It just happens to come out to about 8” for the first mile, and from there the drop grows with roughly the square of the distance.
Sorta make sense? As for why it works out that way, couldn't really tell ya man lol, but the quick calculation below shows the pattern in the numbers.
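Since the numbers make this easier to see than the words do, here's a minimal sketch of that triangle in Python (assuming a spherical earth with a mean radius of about 3,959 miles, and ignoring observer height and refraction; the names and constants are just for illustration):

```python
import math

# One leg of the right triangle is the earth's radius R, the other leg is the
# straight "sight distance" d along a level tangent line, and the hypotenuse
# runs from the earth's center to the far end of that sight line.
# The drop below the tangent line is the hypotenuse minus the radius.

R_MILES = 3959.0            # assumed mean radius of the earth, in miles
INCHES_PER_MILE = 63360

def drop_inches(d_miles: float) -> float:
    """Geometric drop below a level sight line after d_miles."""
    hypotenuse = math.sqrt(R_MILES**2 + d_miles**2)
    return (hypotenuse - R_MILES) * INCHES_PER_MILE

for d in (1, 2, 5, 10, 100):
    print(f"{d:>4} mi: exact drop ≈ {drop_inches(d):,.0f} in, "
          f"8 x d^2 rule ≈ {8 * d**2:,.0f} in")
```

Running it gives roughly 8” at 1 mile, 32” at 2 miles, and about 80,000” (≈ 6,700 ft) at 100 miles, so the drop grows with the square of the distance rather than linearly. That's just the pure geometric drop below a level sight line, not exactly what your eye actually sees from a real vantage point.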
All this gets more confusing because line of sight depends on a lot of different environmental variables as well