LENS

The Volcanologists
in lower-cost units, but maybe sending multiple rover units to the moon. They reached out to us to say that they had an application that requires a very low-weight, UV-sensitive detector: would you like to partner with us? We developed some sensors for them. With Raspberry Pi sensors, they've developed prototypes for spectrometers to go to the moon.

There's a limit to what I can reveal, but the whole unit that they've built is about 150 × 100 × 100 mm in size, and it's got to come in under 60 grams, so it needs to be really lightweight. I suspect it isn't 3D-printed, like ours. But our contribution has been to deliver the sensor module. With the project, they've got to technology readiness level 4, so it's already reasonably mature in terms of where it's got to, and it's passed a whole series of tests.

But the amazing thing about it is that NASA asked us to look at the quantum efficiency of the sensor: if you get 100 photons falling on the detector, how many electrons are generated? With these kinds of sensors, in the visible range they're really sensitive; the quantum efficiency can be 70–80%, which is incredible. You'd expect that efficiency to drop in the UV, mostly because of all these filters, but the quantum efficiency of the Raspberry Pi version 1 camera at 310 nanometres is 40%, so it's absolutely unbelievable. It's comparable to, or better than, some of the commercial systems; I've seen one quoted at 8% at 310 nm. That's why this has worked so well. There are other issues to do with noise as well. That is just absolutely amazing. It's why it's worked. It shouldn't have worked, but it did.

HS What's next?

TP We've bought in some stuff from Pimoroni. The Enviro+ has all sorts of interesting applications. There's a mini gas sensor on it, which I'm not sure will be useful for us, but the thing that will be useful is the particulate matter sensor that you can add on to it. That's looking at the size of particles in air pollution, but I don't see any reason why we can't use it to detect ash particles on a drone flying through a plume. A lot of work I've been doing recently has been on using mini cameras, flying through gas plumes at the same time with mini gas samplers.

The other massive advantage of all this, from a volcano perspective, is that things get blown up. Just this summer there was an explosion and loads of kit got wiped out. If that kit costs thousands of dollars less to replace, it just makes sense.
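Tom's quantum-efficiency comparison is easy to make concrete. A rough sketch using the figures quoted in the interview (the function name is our own, not from Tom's software):

```python
# Quantum efficiency (QE): the fraction of photons hitting the detector
# that are converted into photoelectrons. The 40% and 8% figures below
# are the ones quoted in the interview for 310 nm.

def electrons_generated(photons: float, qe: float) -> float:
    """Mean number of photoelectrons produced for a given photon count."""
    return photons * qe

pi_v1_at_310nm = electrons_generated(100, 0.40)       # ~40 electrons
commercial_at_310nm = electrons_generated(100, 0.08)  # ~8 electrons
print(pi_v1_at_310nm / commercial_at_310nm)           # roughly 5x the signal
```

On those numbers, the £25 camera module delivers around five times the UV signal of the commercial detector quoted at 8%.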
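The particulate-matter add-on Tom mentions for the Enviro+ is Pimoroni's PMS5003 sensor, which streams fixed 32-byte data frames over serial. As a hedged sketch (field layout per the Plantower PMS5003 datasheet; the function name and the synthetic readings are our own), a frame can be decoded like this:

```python
import struct

def parse_pms5003_frame(frame: bytes) -> dict:
    """Parse one 32-byte PMS5003 data frame.

    Layout (per the Plantower PMS5003 datasheet): 'BM' header, a big-endian
    16-bit frame length (28), thirteen big-endian 16-bit data words, then a
    16-bit checksum equal to the byte-sum of the first 30 bytes.
    """
    if len(frame) != 32 or frame[:2] != b"BM":
        raise ValueError("not a PMS5003 data frame")
    (checksum,) = struct.unpack(">H", frame[30:32])
    if sum(frame[:30]) != checksum:
        raise ValueError("checksum mismatch")
    d = struct.unpack(">13H", frame[4:30])
    return {
        "pm1_0": d[3],       # PM1.0 ug/m3 (atmospheric-environment values)
        "pm2_5": d[4],       # PM2.5 ug/m3
        "pm10": d[5],        # PM10 ug/m3
        "n_gt_0_3um": d[6],  # particles >0.3 um per 0.1 L of air
        "n_gt_1_0um": d[8],  # particles >1.0 um per 0.1 L of air
        "n_gt_2_5um": d[9],  # particles >2.5 um per 0.1 L of air
    }

# Synthetic frame for illustration (hypothetical readings, not real data).
data = struct.pack(">13H", 12, 35, 48, 12, 35, 48, 2100, 600, 90, 20, 4, 1, 0)
body = b"BM" + struct.pack(">H", 28) + data
frame = body + struct.pack(">H", sum(body))
print(parse_pms5003_frame(frame))
```

The per-size particle counts (the `n_gt_*` fields) are what would matter for ash detection on a drone: volcanic ash skews towards larger particles than typical urban pollution.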
Left: On the right, the Raspberry Pi Camera module in its natural state; on the left, the module with its lens and Bayer filter removed to expose the sensor
Left: Tom's written a GUI for the software, so you don't need to be an expert in Python programming or Linux filesystems to study volcanoes