WWW.ASTRONOMY.COM 53
For 20 years, I have been using charge-coupled device (CCD) cameras, and I currently own the top-of-the-line SBIG STX-16803. But while studying two images I recently made using the latest QHY 410C CMOS camera, I had to wonder: Is CCD dead?
For years, I lectured about the asymptotic boundary of noise in CCD images. In a basic sense, this means that no matter how many frames you take to increase your signal-to-noise ratio for a cleaner image, you will always run into a wall of noise when you stretch your image to bring out deep shadows. But with QHY’s new CMOS camera, this troublesome wall of noise is nonexistent.
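A toy Python model makes the point numerically. The shadow signal and read-noise figures below are made up, but the behavior is general: stacking improves signal-to-noise only as the square root of the frame count, which is why a camera’s per-frame noise floor matters so much when you stretch the shadows.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical numbers: a faint shadow signal and a CCD-like read noise.
signal = 5.0       # mean photons per pixel in a deep shadow
read_noise = 10.0  # electrons RMS of read noise per frame
npix = 20_000      # pixels simulated per frame

snr = {}
for n_frames in (1, 16, 256):
    # Each frame carries Poisson photon noise plus Gaussian read noise.
    frames = (rng.poisson(signal, size=(n_frames, npix)).astype(float)
              + rng.normal(0.0, read_noise, size=(n_frames, npix)))
    stack = frames.mean(axis=0)  # average the subframes
    snr[n_frames] = stack.mean() / stack.std()
    print(f"{n_frames:4d} frames: SNR = {snr[n_frames]:.2f}")
```

Going from 1 to 16 frames, and again from 16 to 256, each multiplies the frame count by 16 but the signal-to-noise ratio by only about 4.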
The QHY 410C is a one-shot color camera that utilizes the back-illuminated Sony IMX410 CMOS chip found in high-end cameras like the Nikon Z6 and the Sony A7 III. But the 410C takes that full-frame (35 millimeter) 24-megapixel chip and mounts it in a camera body with regulated cooling and zero amplifier glow, helping drive the noise to a remarkably low level.
There also aren’t any noticeable cosmic ray hits, despite high quantum efficiency. (Quantum efficiency is the percentage of photons that are converted into recorded data.) With my SBIG STX-16803 CCD, quantum efficiency is around 60 percent; with a back-illuminated CMOS, it’s about 80 percent.
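Using those round numbers, the difference in quantum efficiency translates directly into recorded signal. A back-of-the-envelope sketch (the photon count is arbitrary):

```python
# Round numbers from the article: ~60 percent QE for the CCD,
# ~80 percent for the back-illuminated CMOS.
qe_ccd, qe_cmos = 0.60, 0.80
photons = 10_000  # photons striking one pixel during an exposure

electrons_ccd = photons * qe_ccd    # electrons the CCD records
electrons_cmos = photons * qe_cmos  # electrons the CMOS records

print(electrons_ccd, electrons_cmos)
print(qe_cmos / qe_ccd)  # ~1.33: the same signal in roughly 25 percent less exposure time
```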
The commercial availability of these back-illuminated chips has been a big game-changer in photography. In a normal, front-illuminated chip, all the supporting electronics surround each light receptor (pixel). Thus, some of the area that receives light does not record it. In a back-illuminated chip, the supporting electronics are on the back of the chip, allowing 100 percent of the light receptors to absorb and record light. This results in a chip with much higher sensitivity, which is ideal for astronomy and astrophotography.
Color vs. mono
CMOS cameras come in two basic flavors: color and mono. A color camera splits the light into red, green, and blue (RGB) colors using a Bayer matrix. This typically consists of a microscopic array of colored filters — assembled in a pattern of two green, one blue, and one red — placed over the light receptors. Each of these clusters equals one color pixel. There are millions of clusters, so software must know the exact pattern used to create the colored pixels from the matrix through a process called deBayering. This is done automatically in DSLR and phone cameras. But for astrophotography, the raw Bayered image must be deBayered using computer software like AstroArt.
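A deliberately naive Python sketch shows the idea. Here each 2-by-2 RGGB cluster is simply binned into one color pixel; real deBayering software such as AstroArt instead interpolates every channel back to the full pixel grid.

```python
import numpy as np

def debayer_rggb(raw):
    """Naive demosaic of an RGGB Bayer mosaic by 2x2 binning.

    Each 2x2 cluster (R G / G B) becomes one color pixel, so the
    output is half-resolution in each axis.
    """
    r = raw[0::2, 0::2]                            # top-left of each cluster
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2.0  # average the two greens
    b = raw[1::2, 1::2]                            # bottom-right of each cluster
    return np.stack([r, g, b], axis=-1)

raw = np.arange(16, dtype=float).reshape(4, 4)  # stand-in for a raw frame
rgb = debayer_rggb(raw)
print(rgb.shape)  # prints (2, 2, 3)
```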
This leaves you with a raw 16-bit color image that you can then stretch like a CCD image. Some photographers are concerned that color cameras have less resolution than mono cameras. But in practice, I have found that the limiting factors are the seeing, the guiding, and the optics, not the type of camera.
Before deBayering, all you see on your monitor is little black and white squares, which brings us to mono CMOS cameras. You can order CMOS chips that do not have a Bayer matrix installed. These mono chips are meant for use with color filters placed in front of them. Separate red, green, and blue exposures are made with the mono chip, then these frames are combined using software to create a color image.
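The combine step itself is simple stacking. A minimal sketch, with made-up flat arrays standing in for the three filtered exposures (real frames would be loaded from FITS files and registered first):

```python
import numpy as np

# Made-up data standing in for three calibrated, aligned exposures
# taken through red, green, and blue filters on a mono chip.
red   = np.full((4, 4), 0.8)
green = np.full((4, 4), 0.6)
blue  = np.full((4, 4), 0.4)

# Every pixel on the mono chip contributed to each channel, so each
# channel arrives at the sensor's full resolution.
color = np.stack([red, green, blue], axis=-1)
print(color.shape)  # prints (4, 4, 3)
```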
So, what’s the difference? With a mono chip, you’re using all the pixels on the chip to capture each particular color of light. As a result, you will reveal more detail and possibly achieve a higher signal-to-noise ratio. But the same chip in a color model is split into four receptors per color pixel. For example, with only one red receptor per pixel,
The Iris Nebula (NGC 7023) snaps into view in this shot taken with a PlaneWave CDK 17-inch telescope and QHY 410C camera from the author’s California home last fall. Technical details: Six 15-minute subs binned 1 by 1; gain=0, offset=50; images deBayered and combined in AstroArt to a raw 16-bit TIFF file; stretched and processed in Adobe Photoshop CS6.