I started working on this problem the other day and I thought it would be a piece of cake, but now it’s driving me crazy. The question is: how bright would the Moon be if it were a flat cardboard cutout instead of a round ball? Would the full moon look just the same?

It’s a funny question because to me, the moon always did look like a cardboard cutout. I never get a strong impression of roundness, or sphericity. When I look at the moon, it just looks like a flat disk to me. In my mind, I had already assumed that this was some general property of reflectivity, emissivity, and angles: for a uniformly illuminated object, the angle of viewing does not affect the apparent brightness. That was the general principle, roughly speaking, and I pretty much assumed you could take that to the bank. Was I right?
Here’s a thought experiment. Let’s say you take a big sheet of painted drywall and hang it in outer space directly overhead. When it gets dark at night, have someone (!?) slowly rotate the drywall this way and that way. What can you conclude about the drywall? All you see is a white rectangular shape. No, that’s not even true; it won’t even necessarily be rectangular. A parallelogram at best. The question is: what can you tell me about the drywall? Can you tell how far away it is? Can you tell how big it is? Can you tell what angle it is tilted at? Can you tell its true shape? Or is it just a featureless white quadrilateral occupying a measurable angular fraction of the sky?

Like I said already, I had this general principle in my mind whereby you’d see exactly the same white color no matter which way you tilted the drywall, so neither the distance nor the angle would make any difference. That’s why the disk of the moon is uniformly white. It doesn’t matter what angle the sun hits it at or how you view it: the sand which covers the moon is all illuminated to the identical brightness, and that’s all you can see from the earth.
That’s what I thought, and I had a little math problem I wanted to solve (I’ll tell you what the problem was eventually), so I tried to apply this principle. To my great annoyance, things just wouldn’t add up. The total available radiant power surely depended on the angle at which the sunlight hit the drywall, so obviously an obliquely oriented sheet couldn’t capture as much sunlight as one facing the sun straight on. So the difference must be compensated for by the angle of viewing: the sheet which intercepts less radiant energy must at the same time present a larger profile to the viewer, and vice versa. So everything balances out.

But this was crazy! There are two angles in the problem and they are completely independent: the angle of illumination and the angle of viewing. There is something called “Lambertian emission”, or ideal scattering, and it is indeed the principle that something in uniform illumination looks the same no matter what angle you view it from. It works like this: an illuminated sheet puts out, let’s say, 100 watts per square meter per steradian (figure it out!) when viewed head on, but only 71 watts per square meter per steradian when viewed from an angle of 45 degrees. The result is that you see exactly the same brightness from whichever angle you view it: you don’t need to get 100 watts when you view it at a 45 degree angle, because the apparent size of the sheet is only 71% of its true size. On account of the cosine of 45 degrees. It all adds up.
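The cosine bookkeeping is easy to check for yourself. Here’s a minimal sketch (my own arithmetic, using the 100-watt head-on figure from above): the intensity emitted toward the viewer falls off as the cosine of the viewing angle, but so does the apparent size of the sheet, and the two cosines cancel exactly.

```python
import math

# Lambert's cosine law, using the 100 W/m^2/sr head-on figure from the text.
I0 = 100.0

def apparent_brightness(view_angle_deg):
    """Brightness per unit of *apparent* area, at a given viewing angle."""
    theta = math.radians(view_angle_deg)
    intensity = I0 * math.cos(theta)   # emission toward the viewer drops as cos(theta)
    projected = math.cos(theta)        # ...but so does the sheet's apparent size
    return intensity / projected       # the two cosines cancel

for angle in (0, 30, 45, 60, 80):
    print(f"{angle:2d} deg -> {apparent_brightness(angle):.1f}")
```

At 45 degrees the emitted intensity is 100 × cos 45° ≈ 71 watts, as in the text; divided by the 71% apparent size, the brightness comes back to 100 at every viewing angle.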
(I’m going to come back later to that business of the watts per square meter per steradian. It might be important…)

So the illuminated sheet looks the same from all angles. But not if you tilt the sheet! You can move around all you want, and you’ll see the same apparent brightness from any viewing angle. But if you tilt the sheet with respect to the sun, everything changes! You’re not intercepting the same amount of radiant energy, so you can’t expect to have the same brightness. That’s the difference. I got those two things mixed up.
But if that’s how it goes, why does the moon look uniformly bright? On a full moon, the center is illuminated directly by the sun, and the fringes only obliquely. A given square meter of lunar surface near the edges is intercepting less illumination than a square meter in the middle of the disk, so how can it look just as bright? Something doesn’t add up.

I checked Wikipedia, and they actually comment on this very question, and I quote:
“…if the moon were a Lambertian scatterer, one would expect to see its scattered brightness appreciably diminish towards the terminator due to the increased angle at which sunlight hit the surface. The fact that it does not diminish illustrates that the moon is not a Lambertian scatterer, and in fact tends to scatter more light into the oblique angles than would a Lambertian scatterer.”
So Wikipedia agrees with my perception of uniform brightness, and they agree with me that this conflicts with Lambertian scattering. But is their explanation correct? Now, I think Wikipedia is a phenomenal resource, and the quality of its science and math is generally first-rate. But this explanation is a little too easy. There is a natural way for objects to randomly scatter light, and it is called Lambertian. The moon, according to Wikipedia, deviates from this natural scattering pattern in some random arbitrary way, and as a result, appears…randomly blotchy?...no!...it appears perfectly uniform! How can this be? It’s too perfect.

Lambertian scattering predicts a global effect whereby the object appears uniformly bright no matter the viewing angle. What then is the concise principle whereby an object might appear uniformly bright regardless of illumination angle? It’s totally far-fetched to think you might get such a neat, tidy result simply on account of “deviation from Lambertian scattering”, without some bigger principle at work. In fact, I just don’t buy it.
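Just to make the conflict concrete, here’s a sketch of what a Lambertian full moon *should* look like, under the geometry I’m assuming: at full phase the sun sits essentially behind the viewer, so at a point a fraction r of the way from the center of the disk to the limb, sunlight strikes the surface at an angle whose cosine is √(1 − r²), and a Lambertian surface’s brightness is proportional to that cosine regardless of viewing angle.

```python
import math

# Predicted brightness profile of a Lambertian sphere at full phase
# (sun directly behind the viewer -- an idealization I'm assuming).
def lambertian_brightness(r):
    """Relative brightness at fractional disk radius r (0 = center, 1 = limb)."""
    return math.sqrt(1.0 - r * r)

for r in (0.0, 0.25, 0.5, 0.75, 0.9, 0.99):
    bar = "#" * int(40 * lambertian_brightness(r))
    print(f"r = {r:4.2f}  {lambertian_brightness(r):5.3f}  {bar}")
```

The predicted disk fades smoothly to zero brightness right at the limb, which is exactly the darkening the Wikipedia passage says we don’t observe.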
So I googled images of the moon, and found some beautiful shots. Check out this article by a guy named Kash Farooq at The Thought Stash. Now look at the moon. It looks round! Yes, we all know it’s round, but I mean it looks spherical! The camera doesn’t lie. Could the flatness we are all used to be the result of psycho-visual effects rather than pure physics?
I think what’s going on is that in the usual viewing conditions, the brightness of the moon saturates the eye’s receptors. Once you hit white, it’s white, and it doesn’t get brighter by adding more white. I have to admit I’m not completely comfortable with this theory, but that’s the best I can come up with.
In Grade Seven we had a poem on the curriculum called “The Highwayman” that begins with the very memorable line:
“The moon was a ghostly galleon tossed upon cloudy seas…”
I’m sorry if my moon turns out to be just a sheet of drywall, but in physics, if you want to get to the bottom of things, you sometimes have to reduce them to the lowest common denominator. We’ve got some interesting calculations coming up, and I’m not yet sure if they’re going to work out. I told you there was a problem that got me going on this topic, and it has to do with the surface area of a sphere. The question is: is there anything in the physics of emissivity that relates to the fact that the true area of an illuminated spherical surface just happens to be exactly twice the cross-sectional area? We’ll return to this question on another day.
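That factor of two is easy to verify numerically. Here’s a quick check (the standard ring integration, nothing deeper): slice the sunlit hemisphere into thin rings of polar angle, add up their areas, and compare against the πR² cross-section of the flat “cardboard cutout” disk.

```python
import math

# Area of the illuminated hemisphere, integrated in thin rings:
# a ring at polar angle t (measured from the sub-solar point) has
# circumference 2*pi*R*sin(t) and width R*dt.
R = 1.0
n = 100_000
dt = (math.pi / 2) / n
hemisphere = sum(2 * math.pi * R * math.sin((i + 0.5) * dt) * R * dt
                 for i in range(n))
cross_section = math.pi * R ** 2   # the flat disk the cutout moon would present

print(hemisphere / cross_section)  # comes out to 2, as claimed
```

The sum converges to 2πR², exactly twice the πR² cross-section, which is the coincidence (or not) the question above is asking about.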