Your arm is about ten times as long as the distance between your eyes. That fact, together with a bit of applied trigonometry, can be used to estimate the distance between you and any object of approximately known size.
Imagine, for example, that you’re standing on the side of a hill, trying to decide how far it is to the top of a low hill on the other side of the valley. Just below the hilltop is a barn, which you feel reasonably sure is about 100 feet wide on the side facing you.
Hold one arm straight out in front of you, elbow straight, thumb pointing up.
Close one eye, and align one edge of your thumb with one edge of the barn.
Without moving your head or arm, switch eyes, now sighting with the eye that was closed and closing the other.
Your thumb will appear to jump sideways as a result of the change in perspective.
How far did it move? (Be sure to sight the same edge of your thumb when you switch eyes.)
Let’s say it jumped about five times the width of the barn, or about 500 feet.
Now multiply that figure by the handy constant 10 (the ratio of the length of your arm to the distance between your eyes).
The result is the distance between you and the barn—5,000 feet, or about one mile. The accompanying diagram should make the whole process clear.
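The arithmetic above can be captured in a few lines of code. The sketch below is illustrative only; the function name and parameters are invented here, and the ratio of 10 is the rough rule of thumb from the text, not a measured constant.

```python
def thumb_jump_distance(apparent_jump_ft, arm_to_eye_ratio=10.0):
    """Estimate distance to an object via the thumb-jump method.

    apparent_jump_ft: how far the thumb appeared to move, in feet,
        judged against an object of roughly known width.
    arm_to_eye_ratio: arm length divided by the distance between
        the eyes; roughly 10 for most people (an assumption here).
    """
    return apparent_jump_ft * arm_to_eye_ratio

# Barn example from the text: the thumb jumps about five
# barn-widths, and the barn is about 100 feet wide.
jump = 5 * 100                        # apparent jump, in feet
print(thumb_jump_distance(jump))      # 5000.0 feet, about a mile
```

Since the jump is measured in multiples of the object's known width, the estimate is only as good as your guess of that width and of the ratio for your own arms and eyes.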
With a little practice, you’ll find that you can perform a quick thumb-jump estimate in just a few seconds, and the result will usually be more accurate than an out-and-out guess. At a minimum, it will provide some assurance that the figure is in the ballpark—which, in many cases, is as close as you need to get.