Astrophotography
1 Photography
Photography is the process by which light is collected to record an image.
• Light enters a camera system through the objective element.
• The light is focused onto a focal plane, where an image forms.
• A light-sensitive image sensor records the image.
[Figure: A simple camera system. Light enters through the objective (diameter D) and is focused onto the sensor at the focal plane, a distance f (the focal length) behind the lens. The focal ratio is f/number = f/D.]
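As a quick worked example (the numbers here are illustrative, not from the notes): a telescope with a 1000 mm focal length and a 200 mm diameter objective is an f/5 system, since
\[
f/\# = \frac{f}{D} = \frac{1000\ \text{mm}}{200\ \text{mm}} = 5 .
\]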
Camera Controls
The simplest cameras just have a button to take a picture, with the internal settings all automatic. For astrophotography, we will want more control. The important settings are as follows:
Shutter Speed Also known as exposure, this is how long you expose the sensor to light. The
longer the camera can stare at something, the more light you can collect.
Sensor Gain This controls how the camera converts light to numbers. For most consumer cameras, this is calibrated to the film speed scale, in which case it is known as the ISO setting or exposure index. If it is not calibrated (as with webcams), it is just known as gain. More sensitive settings often give more image noise.
Aperture Otherwise known as the f-stop or f-number. Consumer-level cameras have an adjustable diaphragm inside the lens that allows you to control the effective objective diameter and make it smaller to let in less light (useful in the day). For telescopes, we want as much collecting area as possible, so the f-number of a telescope is fixed.
A good interactive site for understanding how these factors play together is http://photography-mapped.com/interact.html. To see it in action without a DSLR, some phone cameras also allow manual controls (sometimes through a third-party app).
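To make the trade-off concrete, here is a minimal Python sketch (the function name and values are mine, for illustration) of how exposure time and f-number scale the light reaching the sensor:

```python
def relative_light(exposure_s, f_number):
    # Light reaching the sensor scales linearly with exposure time and
    # inversely with the square of the f-number (aperture area).
    return exposure_s / f_number ** 2

# Doubling the exposure doubles the light; opening up one full stop
# (f/4 -> f/2.8) also roughly doubles it.
print(relative_light(2.0, 4.0) / relative_light(1.0, 4.0))  # -> 2.0
print(relative_light(1.0, 2.8) / relative_light(1.0, 4.0))  # -> ~2.04
```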
2 Image Sensors
Image sensors record the image formed on the focal plane by a variety of methods. The image
sensor may be a semiconductor device, chemical layer or biological material.
• Semiconductor detectors consist of light sensitive pixel sites that convert light into an
electrical signal.
• Photographic film (or plate) is coated with an emulsion of silver halide (often silver chloride or silver bromide) particles that are light sensitive.
• The retina consists of light sensitive cells that are connected to the optic nerve.
The recorded image on a sensor can then be stored or processed.
• In a semiconductor sensor, the image is read out pixel by pixel.
• In photographic film, the image remains on the film.
• In the retina, nerve endings transmit image data to the brain.
Most astrophotography now uses semiconductor detectors.
• Photographic plates were common in the past but have now been replaced by large-format CCD detectors.
• Consumer level photographic film has been mostly replaced by digital sensors of varying
sizes.
However, while larger sensors may seem better, sometimes the telescope is not big enough to
light up a large sensor!
• The telescope may not have enough field of view for the entire sensor (the edges of the
picture will be black). This is known as vignetting.
Colour Cameras
All consumer-level cameras take pictures in colour.
• However, the pixels of CCDs and other sensors are normally sensitive to most of the visible spectrum, and some infrared, without distinguishing between colours. This is why a bare sensor is monochrome.
• To get a colour image, filters are placed over each pixel to make it sensitive to different
colours.
• The most common of these filters is the Bayer format filter, where each group of 4 pixels (two green, one red, one blue) comes together to give 3 colours.
• To actually get 3 colours per pixel, interpolation is used to fill in the missing information.
This is where software can reconstruct the missing colour information from the adjacent
pixels.
[Figure: A Bayer format colour sensor compared with a monochrome sensor.]
The use of interpolation for colour information means that the spatial resolution of the colour information is half that of the brightness (monochrome) information.
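As an illustration of the interpolation step, here is a minimal sketch of bilinear demosaicing, assuming an RGGB pixel layout (the function is hypothetical, not from any particular camera library):

```python
import numpy as np
from scipy.signal import convolve2d

def demosaic_bilinear(raw):
    """Bilinear demosaic of an RGGB Bayer mosaic (illustrative sketch).

    raw: 2D array of sensor values laid out as
        R G
        G B
    Returns an (H, W, 3) RGB image.
    """
    h, w = raw.shape
    rgb = np.zeros((h, w, 3))

    # Masks marking which colour each pixel actually measured.
    r_mask = np.zeros((h, w), bool)
    r_mask[0::2, 0::2] = True
    b_mask = np.zeros((h, w), bool)
    b_mask[1::2, 1::2] = True
    g_mask = ~(r_mask | b_mask)

    kernel = np.ones((3, 3))
    for c, mask in enumerate([r_mask, g_mask, b_mask]):
        samples = np.where(mask, raw, 0.0)
        # Average the same-colour samples in the surrounding 3x3
        # neighbourhood to estimate the missing values.
        num = convolve2d(samples, kernel, mode="same")
        den = convolve2d(mask.astype(float), kernel, mode="same")
        interp = num / np.maximum(den, 1.0)
        # Keep measured values; interpolate only the missing ones.
        rgb[..., c] = np.where(mask, raw, interp)
    return rgb
```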
• The human eye has specific sensors for red, green, blue (cones) and brightness (rods).
People will see changes in brightness better than changes in colour.
• The loss of colour resolution from interpolation is largely imperceptible to the human eye.
• Interpolating colour information is a very common practice in consumer level imaging
and is used for colour TV broadcasts: The colours you see on TV are all interpolated.
Resolution
Resolution is defined as the smallest feature an optical system can resolve.
• This is measured in angular units (most commonly arcseconds for astrophotography); the sketch after this list shows how pixel size maps to an angle on the sky.
• The “resolution” that you are familiar with (number of pixels) comes from the idea that
if you have more pixels in a picture, you can resolve finer details. Note that this is not
always so!
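To connect pixels to angular units, the usual conversion is pixel scale = 206 265 × (pixel size / focal length), with both lengths in the same units. A minimal sketch (function name mine, numbers illustrative):

```python
def pixel_scale_arcsec(pixel_um, focal_length_mm):
    # 206265 arcseconds per radian; with pixel size in microns and
    # focal length in mm, the unit factors reduce to 206.265.
    return 206.265 * pixel_um / focal_length_mm

print(pixel_scale_arcsec(5.0, 1000.0))  # -> ~1.03 arcsec per pixel
```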
Étendue
Étendue describes the illumination of the sensor-telescope system.
• This describes how much light is collected and sent to the focal plane.
– A large field of view means that light is collected over a large patch of sky.
– A large collecting area means more area for starlight to fall on.
• A larger Étendue means that more light is concentrated onto the focal plane (and expo-
sures are shorter).
Étendue for a camera-telescope system depends on the pixel scale (for the field of view) and collecting area (for the amount of light collected). This is proportional to
\[
\left( \frac{\text{pixel size}}{f/\#} \right)^{2} .
\]
This means slower optics (larger f/#), which have a longer focal length or smaller objective, will not concentrate light onto a sensor as effectively as faster optics (smaller f/#).
• Faster telescopes are often better for imaging extended objects; they also tend to have shorter focal lengths and thus a wider field of view.
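A quick sketch of the proportionality above, comparing a fast and a slow system with the same pixel size (values are illustrative):

```python
def relative_etendue(pixel_size_um, f_number):
    # Proportional to (pixel size / f-number)^2; units cancel in ratios.
    return (pixel_size_um / f_number) ** 2

fast = relative_etendue(5.0, 4.0)  # f/4 system
slow = relative_etendue(5.0, 8.0)  # f/8 system
print(fast / slow)  # -> 4.0: the f/4 system needs 1/4 the exposure time
```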
4 Photons to Pictures
There are many steps before starlight becomes a photograph. We will focus on semiconductor
detectors because that’s what most people now use. In general, the simplest way to take a
picture is as follows:
1. Open the shutter and let light onto the sensor.
2. Pixels are made of a light-sensitive semiconductor that ideally sets aside an electron (photoelectron) for each photon that falls on it.
3. Close the shutter after the preset exposure time.
4. Transfer the image (in the form of accumulated photoelectrons) out of the sensor.
5. Amplify and measure the electrical signal, and convert it into digital numbers.
6. Organise the numbers and write them into a file on the memory card.
The pixel values in each captured image represent the amount of light falling on each pixel (plus some noise). The captured image is not always perfect, and at each step of the way, imperfections and tradeoffs can affect the final image.
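The steps above can be caricatured for a single pixel in a few lines of Python. All the camera parameters here (quantum efficiency, gain, bias, read noise, full well) are assumed values for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def expose_pixel(photon_rate, t, qe=0.7, gain_e_per_adu=2.0,
                 bias_adu=500, read_noise_e=5.0, full_well_e=50_000):
    """Simulate one pixel: photons in, digital number (ADU) out."""
    photons = rng.poisson(photon_rate * t)              # shot noise
    electrons = rng.binomial(photons, qe)               # photon -> photoelectron
    electrons = min(electrons, full_well_e)             # saturation
    signal = electrons + rng.normal(0.0, read_noise_e)  # readout noise
    return bias_adu + signal / gain_e_per_adu           # amplify + digitise

print(expose_pixel(photon_rate=100.0, t=10.0))  # ~500 + 0.7*1000/2 = ~850 ADU
```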
Reading Out
Once the exposure is over, the image needs to be transferred off the sensor and digitised.
• In this step, the photoelectrons are counted and converted to a digital number. The number, which represents the pixel brightness, is then written to a file.
• To count the photoelectrons, the signal needs to be processed and amplified.
This results in a readout gain and bias level:
Readout Gain This is the amplification factor, expressed as how much light corresponds to a
digital unit. This is controlled by (and hidden behind) the camera’s ISO setting.
Bias Level In order to reliably measure and record the signal, the camera will add a number
to the pixel value. This is known as the bias level and represents black in the image.
The readout process is not completely perfect and will introduce a small amount of noise into the image.
• This noise is not noticeable in ordinary photography (lots of light), but can make a big
difference in astrophotography (not enough light).
• In practice, this means the bias level will fluctuate slightly about an average value.
To measure the bias level, we take a picture with zero (or as short as possible) exposure time and the shutter closed (or lens cap on). This is known as a bias frame (some authors call it an offset frame or zero frame). The bias frame defines what a black pixel is and sets the black point of the image.
When reading out the image, the camera adds a fixed amount to each pixel value so that it can reliably record the signal. This is known as the bias level and means that black in the image does not correspond to a pixel value of 0. We measure it by taking a picture of negligible duration with the shutter closed or lens cap on. This is known as a bias frame and tells us what pixel value represents black.
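In post-processing, many bias frames are usually combined into a master bias. A minimal sketch, assuming the frames are already loaded as NumPy arrays:

```python
import numpy as np

def master_bias(bias_frames):
    # Median-combine many zero-exposure frames; the median rejects
    # outliers (e.g. cosmic-ray hits) better than the mean.
    return np.median(np.stack(bias_frames), axis=0)
```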
Dynamic Range
During an exposure, photoelectrons accumulate in the pixels on the chip.
• The number of photoelectrons that a pixel can hold is known as the full well capacity.
• When the pixels are full (too much light → too many photoelectrons), the pixel becomes
saturated.
The full well capacity, read noise and largest value possible for a pixel give the dynamic range of the sensor. This is the ratio of the brightest object it can record (without saturating) to the faintest object it can reliably record.
• The faintest object it can reliably record depends on the read noise: Too faint (not enough electrons) and the signal gets confused with the read noise. This is also known as underexposed and the pixel will appear black.
• The brightest object it can record depends on the full well capacity: Too bright and the
pixel is full (all white or saturated). This is also known as overexposed.
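Numerically, dynamic range is simply full well capacity divided by read noise, often quoted in dB or photographic stops. A worked sketch with assumed sensor figures:

```python
import math

full_well_e = 50_000  # electrons (assumed spec)
read_noise_e = 5.0    # electrons RMS (assumed spec)

ratio = full_well_e / read_noise_e
print(f"{ratio:.0f}:1 = {20 * math.log10(ratio):.0f} dB = "
      f"{math.log2(ratio):.1f} stops")  # -> 10000:1 = 80 dB = 13.3 stops
```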
5 Imperfections
There are always imperfections in the image:
• Manufacturing processes cannot make perfectly uniform sensor chips, so each pixel may respond to light slightly differently.
• Dust on the optics can also degrade the sensitivity of specific areas.
• Optical aberrations may mean that the sensor is not evenly illuminated. In many systems,
the centre of the frame is brighter than the edges (known as vignetting).
To work around these irregularities, we have to characterise each pixel of the camera-optics
system.
• To do so, we measure the response of each pixel to a uniform light source. This involves
taking flat frames.
• Flat frames can be taken by pointing the telescope at the twilight sky and taking a picture. Stars are not out yet, and the sky (over the small field of view) is mostly evenly illuminated.
• A flat frame measures the response of the sensor-optics combination at each pixel.
• We can correct for vignetting and other uneven illumination problems with a good flat
frame.
We will save the flat field images for later post-processing. These flat field images allow us to
correct for slight differences in pixel sensitivity.
For various reasons, each pixel may respond to light slightly differently. To account for these differences, we take a picture of a uniformly illuminated patch of sky. This helps to characterise each pixel and is known as a flat frame.
A flat frame contains the light response of each pixel, dark noise and the bias level.
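A minimal sketch of applying the flat in post-processing (ignoring dark current for simplicity; the frames are assumed to be NumPy arrays):

```python
import numpy as np

def flat_correct(light, m_flat, m_bias):
    # Remove the bias level from the flat, then normalise it so that
    # dividing by it does not change the overall image brightness.
    flat = m_flat - m_bias
    flat = flat / np.mean(flat)
    return (light - m_bias) / flat
```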
6 Long Exposures
The stars are faint, so long exposures are needed before enough photoelectrons build up in the
sensor. This can range from a few seconds to tens of minutes. Most DSLRs have a maximum
exposure of 30 minutes.
During the exposure, the sensor is connected to the power supply and powered up.
• Electrons may move about because they are warm (room temperature).
• These electrons may find their way into the sensor sites and join the photoelectrons. This
is known as a dark current because it will appear even in the dark, with the lens cap on.
• Dark current will occur at random. However, we can measure the average dark current
(in electrons per hour).
The dark current depends on the temperature of the sensor chip; a rough model is sketched after this list.
• A colder chip will have less dark current: Colder electrons have less thermal energy to
move about.
• To reduce dark current, sensors can be cooled.
• A DSLR can be cooled by blowing air over it, or letting it cool down for a while after
taking a long exposure.
• Most commercially available cooled sensors are thermoelectrically cooled.
– This uses the thermoelectric effect to remove heat from a cold pad and transfer it to
a warm pad.
– A heatsink on the warm pad then dissipates the heat out into the air.
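A common rule of thumb (an assumption here, not a figure from the notes) is that dark current roughly doubles for every ~6 °C rise in sensor temperature:

```python
def dark_current(temp_c, d0=0.1, t0=0.0, doubling_c=6.0):
    # d0: dark current (e-/pixel/s) at reference temperature t0.
    # All parameter values are illustrative assumptions.
    return d0 * 2 ** ((temp_c - t0) / doubling_c)

print(dark_current(24.0) / dark_current(0.0))  # -> 16x more at +24 C
```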
Although we can try to reduce dark current by cooling the sensor, we cannot completely remove it: The sensor would have to be at absolute zero for the dark current to be zero.
• We will have to try and subtract out an averaged dark current. The random fluctuation of the dark current shows up in the image as noise known as dark noise.
• In a DSLR image, the noise shows up as bright coloured pixels.
• To get an averaged dark current, we take a number of exposures with the shutter closed
and the sensor at the same temperature. This gets us a dark frame.
• Each exposure should have the same duration as the sky exposure (or we will have to
scale the dark noise based on the exposure time).
• The dark frames are then averaged to get a master dark frame.
During the exposure, thermal movements of electrons in the system contribute to the dark
noise. We can measure and subtract the average amount of dark current by taking long
exposures with the same duration and temperature, and with the shutter closed. This is
known as a dark frame.
A dark frame contains dark noise, dark current and the bias level: The dark-current electrons need to pass through the same readout electronics as the image.
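Putting the calibration frames together, a minimal sketch of the standard subtraction (the frame arrays and the normalised flat are assumed to be prepared as in the earlier sketches):

```python
import numpy as np

def master_dark(dark_frames):
    # Average darks taken at the same exposure time and temperature as
    # the sky frames; averaging beats down the random dark noise.
    return np.mean(np.stack(dark_frames), axis=0)

def calibrate(light, m_dark, flat_norm):
    # The master dark already contains the bias level, so subtracting it
    # removes bias and the average dark current in one step.
    return (light - m_dark) / flat_norm
```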