*Michael Fowler*

*U. Va. Physics*


In this lecture, we shall show how the Greeks made the first real measurements of astronomical distances---the size of the earth and the distance to the moon, both found quite accurately, and the distance to the sun, where their best estimate fell short by about a factor of two.

The first reasonably good measurement of the earth's size was made by Eratosthenes, a Greek who lived in Alexandria, Egypt, in the third century B.C. He knew that far to the south, in the town of Syene (present-day Aswan, where there is now a huge dam on the Nile), there was a deep well, and at midday on June 21 the sunlight reflected off the water far down in this well, something that happened on no other day of the year. The point was that the sun was exactly vertically overhead at that time, and at no other time in the year. Eratosthenes also knew that the sun was never vertically overhead in Alexandria; the closest it got was on June 21, when it was off by an angle he found to be about 7.2 degrees, by measuring the shadow of a vertical stick.

The distance from Alexandria to Syene was measured at 5,000 stades (a stade being 500 feet), almost exactly due south. From this, and the difference in the angle of sunlight at midday on June 21, Eratosthenes was able to figure out how far it would be to go completely around the earth.

Of course, Eratosthenes fully recognized that the Earth is spherical
in shape, and that "vertically downwards" anywhere on
the surface just means the direction towards the center from that
point. Thus two vertical sticks, one at Alexandria and one at
Syene, were not really parallel. On the other hand, the rays of
sunlight falling at the two places *were* parallel. Therefore,
if the sun's rays were parallel to a vertical stick at Syene (so
it had no shadow), the angle they made with the stick at Alexandria
measured just how far around the Earth, in degrees, Alexandria was
from Syene.

According to the Greek astronomer Cleomedes, Eratosthenes measured the angle between the sunlight and the stick at midday in midsummer in Alexandria to be 7.2 degrees, or one-fiftieth of a complete circle. Drawing a picture makes it evident that this is the same angle as that between Alexandria and Syene as seen from the center of the earth, so the distance between them, the 5,000 stades, must be one-fiftieth of the distance around the earth, which is therefore equal to 250,000 stades, about 23,700 miles. The correct answer is about 25,000 miles, and in fact Eratosthenes may have been closer than we have stated here---we're not quite sure how far a stade was, and some scholars claim it was about 520 feet, which would put him even closer.
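
The arithmetic can be sketched in a few lines of Python (this is just a check on the numbers above, not part of the original argument; both the 500-foot and the 520-foot values for the stade are tried):

```python
# Eratosthenes' estimate of the earth's circumference.
angle_deg = 7.2                  # midsummer shadow angle at Alexandria
distance_stades = 5_000          # Alexandria to Syene, measured due south

fraction_of_circle = angle_deg / 360.0                        # = 1/50
circumference_stades = distance_stades / fraction_of_circle   # = 250,000

def stades_to_miles(stades, feet_per_stade):
    # 5,280 feet per mile
    return stades * feet_per_stade / 5_280

print(stades_to_miles(circumference_stades, 500))   # ~23,700 miles
print(stades_to_miles(circumference_stades, 520))   # ~24,600 miles
```

With the longer stade the estimate comes within a couple of percent of the modern 25,000-mile figure.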

How do we begin to measure the distance from the earth to the moon? One obvious thought is to measure the angle to the moon from two cities far apart at the same time, and construct a similar triangle, like Thales measuring the distance of the ship at sea. Unfortunately, the angle difference from two points a few hundred miles apart was too small to be measurable by the techniques in use at the time, so that method wouldn't work.

Nevertheless, Greek astronomers, beginning with Aristarchus of Samos (310-230 B.C., approximately) came up with a clever method of finding the moon's distance, by careful observation of a lunar eclipse, which happens when the earth shields the moon from the sun's light.

To better visualize a lunar eclipse, think of holding up a quarter
(diameter one inch approximately) so that it just blocks out the
sun's rays from one eye. (Of course you shouldn't try this---you'll
damage your eye! You *can* try it with the moon, which happens
to be the same apparent size in the sky as the sun.) It turns
out that the right distance is about nine feet away, or 108 inches.
If it is further away than that, it cannot block out all the sunlight.
If it is closer than 108 inches, it will totally block the sunlight
from some small circular area, which gradually increases in size
as one gets closer to the quarter. Thus the part of space where
the sunlight is totally blocked is conical, like an ice-cream cone,
with the point 108 inches behind the quarter.
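
The 108 figure itself follows from the sun's apparent size in the sky, about half a degree across (the 0.53-degree value below is the modern one, assumed for this sketch): a disc just covers the sun when its own diameter subtends that same angle, which fixes the ratio of distance to diameter.

```python
import math

# A disc of diameter d just blocks the sun when d subtends the sun's
# angular diameter, i.e. at distance d / (2 * tan(angle / 2)).
angular_diameter_deg = 0.53        # apparent size of the sun (modern value)
half_angle = math.radians(angular_diameter_deg / 2)

distance_in_diameters = 1 / (2 * math.tan(half_angle))
print(round(distance_in_diameters))   # ~108 diameters: a 1-inch quarter
                                      # held ~108 inches from the eye
```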

Now imagine you're out in space, looking at the earth's shadow.
It must also be conical, just like that from the quarter. And
it must also be *similar* to the quarter's in the technical
sense---it must be 108 earth diameters long! That is because the
point of the cone is the furthest point at which the earth can
block all the sunlight, and the ratio of that distance to the
diameter is determined by the angular size of the sun being blocked.
This means the cone is 108 earth diameters long, the far point
864,000 miles from earth.


Now, during a total lunar eclipse the moon moves into this cone
of darkness. It can still be dimly seen, because of light scattered
by the earth's atmosphere. By observing the moon carefully during
the eclipse, and seeing how the earth's shadow fell on it, the
Greeks found that the diameter of the earth's conical shadow at
the distance of the moon was about two-and-a-half times the moon's
own diameter. Of course, this doesn't immediately lead to the
answer, because they didn't know the moon's diameter. But they
*did* know one crucial fact we have not so far used---*the
moon is the same apparent size in the sky as the sun*. In the
figure below, this means the angle ECD is the same as the angle
EAF. The Greeks found by observation that the ratio of FE to ED
was 2.5 to 1, so looking at the similar isosceles triangles FAE
and DCE, we deduce that AE is 2.5 times as long as EC, from which
AC is 3.5 times as long as EC. But they knew that AC must be 108
earth diameters in length, and taking the earth's diameter to
be 8,000 miles, the furthest point of the conical shadow, A, is
864,000 miles from earth. From the above argument, this is 3.5
times further away than the moon is, so the distance to the moon
is 864,000/3.5 miles, about 240,000 miles. This is within a few
percent of the right figure. The biggest source of error is likely
the estimate of the ratio of the moon's size to that of the earth's
shadow as it passes through.
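
As a rough check on the arithmetic above (a sketch, using the 8,000-mile earth diameter and the 2.5-to-1 shadow ratio quoted in the text):

```python
# Moon's distance from the lunar-eclipse geometry described above.
earth_diameter_miles = 8_000
shadow_cone_diameters = 108     # length of earth's shadow cone, in diameters
shadow_to_moon_ratio = 2.5      # shadow diameter / moon diameter at the moon

cone_length = shadow_cone_diameters * earth_diameter_miles  # AC = 864,000 mi
# AE = 2.5 * EC, and AC = AE + EC = 3.5 * EC, so EC = AC / 3.5
moon_distance = cone_length / (shadow_to_moon_ratio + 1)
print(f"{moon_distance:,.0f} miles")   # ~247,000 miles, within a few
                                       # percent of the true ~239,000
```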

Finding the distance to the sun was an even more difficult question the Greek astronomers asked themselves, and here they didn't do so well. They did come up with a very ingenious method to measure the sun's distance, but it proved too demanding, in that they could not measure the crucial angle accurately enough. Still, they did learn from this approach that the sun was much further away than the moon, and consequently, since it has the same apparent size, that it must be much bigger than either the moon or the earth.

Their idea for measuring the sun's distance was very simple in principle. They knew, of course, that the moon shone by reflecting the sun's light. Therefore, they reasoned, when the moon appears to be exactly half full, the line from the moon to the sun must be exactly perpendicular to the line from the moon to the observer (see the figure to convince yourself of this). So, if an observer on earth, on observing a half moon in daylight, measures carefully the angle between the direction of the moon and the direction of the sun, the angle a in the figure, he should be able to construct a long thin triangle, with its baseline the earth-moon line, having an angle of 90 degrees at one end and a at the other, and so find the ratio of the sun's distance to the moon's distance.

The problem with this approach is that the angle a turns out to differ from 90 degrees by about a sixth of a degree, too small to measure accurately. The best Greek attempts found the sun's distance to be about half the correct value (92 million miles). They all realized, however, that it was tens of millions of miles away.
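
To see how sensitive the long thin triangle is to that angle, here is a sketch using roughly a sixth of a degree for the deviation from 90 degrees, as quoted above (the exact figures are illustrative, not the Greeks' own measurements):

```python
import math

# Half-moon geometry: right angle at the moon, angle a at the earth
# between the directions of the moon and the sun.  Then
#   (earth-sun distance) / (earth-moon distance) = 1 / cos(a).
a_deg = 90 - 1/6                 # a differs from 90 deg by ~1/6 degree
moon_distance_miles = 240_000    # moon distance from the eclipse argument

ratio = 1 / math.cos(math.radians(a_deg))
sun_distance = ratio * moon_distance_miles
print(f"ratio ~ {ratio:.0f}")                       # a few hundred
print(f"sun ~ {sun_distance/1e6:.0f} million miles")  # tens of millions
```

With these numbers the sun comes out around 80 million miles away, in the right ballpark; a measurement of a off by even a tenth of a degree changes the answer enormously, which is why the method defeated the Greeks in practice.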

The presentation here is similar to that in Eric Rogers, *Physics
for the Inquiring Mind*, Princeton, 1960.