Why a flat field frame is useful in astrophotography
Category: Astrophotography method
Posted by: Tom How
I have discussed before the importance of using flat fields to calibrate your images in astrophotography. In this article I present some sample images to demonstrate why a flat field is so useful, especially when dealing with images of nebulae.
When we process our raw frames from the CCD camera, the usual approach is to stack many exposures together to reduce noise, and then use image processing tools such as Photoshop to stretch the image, bringing out the areas of fainter signal without saturating the areas of high signal (e.g. stars).
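The noise benefit of stacking can be sketched with a short simulation (the numbers here are hypothetical, not from my images): averaging N frames reduces random noise by roughly a factor of the square root of N.

```python
import numpy as np

# Simulate 16 noisy exposures of the same flat 64x64 scene and mean-stack
# them. Random noise in the stack falls roughly as 1/sqrt(N).
rng = np.random.default_rng(42)
scene = np.full((64, 64), 500.0)                    # true signal level (ADU)
frames = [scene + rng.normal(0, 50, scene.shape) for _ in range(16)]

single_noise = np.std(frames[0] - scene)            # noise in one frame
stacked = np.mean(frames, axis=0)                   # mean-stack the exposures
stacked_noise = np.std(stacked - scene)             # noise in the stack

print(f"single frame noise: {single_noise:.1f} ADU")
print(f"stacked noise:      {stacked_noise:.1f} ADU")
```

With 16 frames the stacked noise comes out at roughly a quarter of the single-frame noise, which is why the faint signal survives a hard stretch.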
A pixel in an image might have any value from 1 to about 60,000 depending on the camera. The stack of raw frames might only use values between 1 and 2,000. The image appears dim and very low contrast. Stretching will take those values between 1 and 2,000 and redistribute them across the whole 60,000 possible values, increasing contrast.
At the dark end of this stretch, we need to set what is called a black point. We need to choose a value in the image which is black - this is usually part of the background that has no features. The highest values are normally in the centre of stars. We then stretch the data between these two values. These stretches are often non-linear, but that is outside the scope of this discussion.
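A minimal linear version of this stretch can be written in a few lines (real tools apply non-linear curves, as noted above; `linear_stretch` and its parameters are my own illustrative names): everything below the black point is clipped to black, and the range between the black point and the brightest value is redistributed across the full 16-bit range.

```python
import numpy as np

def linear_stretch(img, black_point, white_point, out_max=65535):
    """Clip below the black point, then rescale [black, white] -> [0, out_max]."""
    clipped = np.clip(img.astype(float), black_point, white_point)
    return (clipped - black_point) / (white_point - black_point) * out_max

# A dim stack that only uses values between ~100 and 2000
img = np.array([100.0, 500.0, 1000.0, 1500.0, 2000.0])
stretched = linear_stretch(img, black_point=100, white_point=2000)
print(stretched)   # values now span the full 0..65535 range
```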
If our background were totally uniform, we'd end up with a smooth dark background with the target object standing out with good contrast. However, to pull this off, you need a uniform background across the whole image.
Optics are not perfect: they normally illuminate the centre of the CCD chip more than the edges, so the background is not uniform. It is dark at the edges and brighter in the middle (right where the target is!). As we try to stretch the image, the background develops a glow in the middle. Below is a stack of light frames which have been aggressively stretched.
We can clearly see that the top and right hand sides of the image are very dark, whereas there is a brightening in the middle where more light has hit the sensor. The target is clearly visible, but the overall image is ruined because the background is not uniform. If the background were "flatter" we could actually stretch the image further, but we can't: we are limited by the background.
If we take a flat field frame of these optics, by hanging a white shirt over the lens and shining a lamp at it, we get an image like this.
This shows how lopsided the illumination of the sensor is. It is poorly illuminated in the bottom right. There is a dust mark in the middle, a couple of scratches on the CCD and some odd white mottling (a damaged sensor). We can clearly see the uneven background and the brightening towards the centre.
If this flat field is used to calibrate the raw frames, and then they are stacked and stretched, we get something like this.
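The calibration step itself amounts to dividing each light frame by the flat field, normalised so its average value is 1 (a sketch of the idea only; dark and bias subtraction, which a full calibration pipeline also performs, is omitted here, and the vignetting numbers are made up):

```python
import numpy as np

def apply_flat(light, flat):
    """Divide a light frame by the mean-normalised flat field."""
    flat_norm = flat / np.mean(flat)   # average gain becomes 1.0
    return light / flat_norm

# Simulate vignetting: the centre of the sensor receives more light
yy, xx = np.mgrid[0:64, 0:64]
vignette = 1.0 - 0.5 * np.hypot(xx - 32, yy - 32) / 45.0

sky = 1000.0                 # a truly uniform sky background
light = sky * vignette       # what the camera actually records
flat = 30000.0 * vignette    # the flat frame sees the same vignetting

calibrated = apply_flat(light, flat)
# The calibrated background is flat again: its peak-to-peak spread is ~0,
# whereas the raw light frame varied strongly from centre to corner.
print(np.ptp(light), np.ptp(calibrated))
```

Because the flat records the same centre-to-edge falloff as the lights, the division cancels it out, leaving the background uniform and safe to stretch hard.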
We can see that the background is now much more uniform and we can stretch the object more strongly. The glow down to the bottom left is actually nebulosity, and not an artifact. The image is vastly improved.
There are many tools for correcting non-uniform backgrounds in astronomy images, but these are always trying to correct faulty data with no knowledge of the equipment used. It is far preferable to have correct data in the first place, calibrated against the equipment used to take the image.
Fundamentally, using flat field calibration in astrophotography results in easier processing, more detail and better images!
The final processed image of this nebula, the Jellyfish Nebula (IC 443), looks like this if you are interested.