
% light loss due to chromatic aberration?

Started by pkamm, 07/05/2002 01:23PM
Posted 07/05/2002 01:23PM Opening Post
Does anyone know of any data indicating what percentage of light one effectively loses with achromats of various sizes, focal ratios, etc.? Certain proportions of certain wavelengths get thrown out of focus (violet light suffers the most in many achromats), but is there a way to know what percentage of the total incoming white light that represents? Is it meaningful? Is it on the order of <1%, 1%, 5%?

Posted 07/29/2002 06:28PM #1
Our eye's visual response (response versus wavelength) looks similar to the colour correction plot for an achromat, which is why they work so well.

Our eye's response to unfocused blue and red light is not as good as to green light. So I would qualitatively guess that an achromat will come close to an apochromat of similar aperture.

Especially if it is of longer focus.

My experience using long-focus achromats bears this out. The differences are slight.

Kevin Barker
Posted 11/12/2002 01:09AM #2
I'd agree with Kevin that it isn't important. The visible diffraction disc consists of all the diffraction discs formed by wavelengths from 400 nm to 700 nm of the visible spectrum mixed together.

The diffraction disc formed by, say, the 486 nm wavelength (blue F-line) makes up only about 0.1% of the total light energy of a white-light source. All the light from 400 nm violet to 490 nm blue-green makes up only about 5% of the total light that can be seen with eyes adapted to bright-light conditions, and all the light from the red/orange 630 nm to 700 nm makes up another 7-8%.

But whether it is a "loss" or not depends on the object observed. For instance, a white 6th-magnitude star in a 6" aperture will show a visible diffraction disc less than half the Airy disc diameter. In other words, wavelengths to which the eye is ~50% or less sensitive will be lost to the eye anyway, chromatism or not. Average eye sensitivity to red/blue light is only about 10% of the maximum at ~550 nm green. Just to start detecting the blue F-line visually with a 6" aperture, we have to go one magnitude brighter, and to see most of this line we need a 3rd-magnitude star. But this assumes a compact, textbook Airy disc for this wavelength (at the green focus), which we don't have. Even apos have up to twice as large a blue/red blur (sometimes more than that) due to spherochromatism, and achromats have it up to several times larger still, due to both secondary spectrum and spherochromatism. This makes it even harder for the eye to detect. That's why we start seeing these colors (in medium to fast achromats, anyway) only on very bright objects, like zero-magnitude stars, bright planets, or the Moon.

Strictly speaking, that light isn't lost from images of bright extended objects: it's still in, only smeared over them, softening their contrast, and being most obvious around the outer edge, against a darker background.
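The band fractions quoted above can be sanity-checked with a quick numerical sketch. This assumes a flat-spectrum white source and a widely used Gaussian approximation to the CIE photopic luminosity function, not the exact tabulated curve; the Gaussian underestimates the red tail, so the 630-700 nm figure comes out somewhat lower than the 7-8% figure above.

```python
import math

def photopic(lam_nm):
    """Gaussian approximation to the CIE photopic luminosity function
    V(lambda), peaking near 555 nm. A common fit, not the exact table."""
    lam_um = lam_nm / 1000.0
    return 1.019 * math.exp(-285.4 * (lam_um - 0.559) ** 2)

def band_fraction(lo_nm, hi_nm, full=(400, 700), step=0.5):
    """Fraction of the eye-weighted energy of a flat-spectrum source
    falling in [lo_nm, hi_nm], relative to the full visible band."""
    def integral(a, b):
        n = int((b - a) / step)
        # midpoint rule is plenty accurate for this smooth curve
        return sum(photopic(a + (i + 0.5) * step) for i in range(n)) * step
    return integral(lo_nm, hi_nm) / integral(*full)

blue = band_fraction(400, 490)   # violet to blue-green
red = band_fraction(630, 700)    # red end
print(f"400-490 nm: {blue:.1%} of eye-weighted light")
print(f"630-700 nm: {red:.1%} of eye-weighted light")
```

The blue band comes out near the ~5% figure; the red band lands lower than 7-8% because the Gaussian fit falls off faster than the real photopic curve on the long-wavelength side.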

On the other hand, it isn't lost in any way from the images of fainter objects, because its low intensity would escape the eye's attention regardless of the degree of chromatism. And for those very bright objects where a small nominal loss exists, it mostly doesn't matter, since there is plenty of light to begin with.

The main problem of faster achromats is that chromatic defocus moves into spectral areas of greater eye sensitivity which, compounded by greater correction error and spherochromatism, causes more serious blurring of the diffraction disc and lower image quality. The more off-green colors visible around bright objects are merely the most immediate manifestation of the magnitude of the secondary spectrum.
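How much worse a faster achromat gets can be roughed out with the common rule of thumb that, for a crown/flint doublet, the C- and F-line foci fall roughly f/2000 away from the green focus. The sketch below compares the resulting geometric blur to the Airy disc diameter; the f/2000 figure and the simple geometric-blur model are assumptions for illustration, not a real design calculation (it ignores spherochromatism and residual correction error).

```python
LAMBDA_MM = 550e-6     # green light wavelength, in mm
SPREAD = 1 / 2000.0    # assumed secondary spectrum, as a fraction of f

def blur_vs_airy(aperture_mm, focal_ratio):
    """Ratio of the geometric C/F blur diameter to the Airy disc
    diameter, at the green focus, for a standard doublet achromat."""
    f = aperture_mm * focal_ratio
    defocus = f * SPREAD                  # longitudinal focus shift, mm
    blur = defocus / focal_ratio          # geometric blur diameter, mm
    airy = 2.44 * LAMBDA_MM * focal_ratio # Airy disc diameter, mm
    return blur / airy

for n in (5, 8, 15):
    print(f'6" f/{n}: C/F blur ~ {blur_vs_airy(150, n):.1f}x the Airy disc')
```

Since the linear blur works out to D/2000 regardless of focal ratio while the Airy disc grows with it, the ratio shrinks as the telescope gets slower, which matches Kevin's point that long-focus achromats show only slight differences from apos.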