artificially hearing the outside world. It's a dramatically different sound, and I've heard a lot of people say they appreciate being able to listen to audio while also having the sounds of the real world accessible.

What strikes me about Transparency, though, is that Apple seems to be adjusting the sound from the outside world very little, if at all. When I use Transparency, I don't just hear people talking or the sound of a car coming down the street; I hear a background hum from traffic on a nearby freeway.
Now, imagine a future version of AirPods Pro with a little more processing power. In addition to Transparency mode, perhaps there's a Smart Transparency mode that takes a cue from all the audio processing software out there to do things like remove unchanging background noise and even remove room echo, so that what you hear is clearer than it would be unfiltered. The algorithms are there today, measuring the reflectivity of the room on the fly and cancelling echoes; it's just a matter of building hardware powerful enough to process all the data in real time.
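To make the "remove unchanging background noise" idea concrete, here's a deliberately simplified sketch in Swift. Everything in it is hypothetical, not Apple's implementation: it tracks the level of the steady hum and ducks any block of audio that never rises above it, so constant noise is suppressed while louder, changing sounds like speech pass through.

```swift
import Foundation

// Toy sketch of one piece of a hypothetical "Smart Transparency" mode:
// estimate the level of steady background noise and attenuate audio
// blocks that sit at that level, so constant hum is ducked while
// speech passes through.
struct StationaryNoiseGate {
    private var noiseFloor: Float = 0    // running estimate of the hum level
    private let adaptRate: Float = 0.01  // how quickly the floor tracks slow changes
    private let openRatio: Float = 2.0   // a block must be 2x the floor to pass untouched

    mutating func process(block: [Float]) -> [Float] {
        // Root-mean-square level of this block of samples.
        let rms = sqrt(block.reduce(0) { $0 + $1 * $1 } / Float(block.count))

        // Only adapt the floor when the block looks like pure background,
        // so speech doesn't get absorbed into the noise estimate.
        if noiseFloor == 0 || rms < noiseFloor * openRatio {
            noiseFloor += adaptRate * (rms - noiseFloor)
        }

        // Duck blocks near the noise floor; pass everything else through.
        let gain: Float = rms > noiseFloor * openRatio ? 1.0 : 0.1
        return block.map { $0 * gain }
    }
}

// Usage: feed the gate successive blocks of microphone samples.
var gate = StationaryNoiseGate()
let cleaned = gate.process(block: [0.01, -0.02, 0.015, -0.01])
```

A real product would do this per frequency band (the spectral-subtraction family of techniques) so a steady hum can be removed even while someone talks over it, but the bookkeeping is the same: estimate the stationary part of the sound, subtract or duck it, and pass the rest.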
I recently read a story about the quest for "smart" hearing aids (go.macworld.com/smai) that suggested algorithms can do a pretty good job of filtering out background conversations, and might even be able to figure out how to emphasize the voices of specific speakers based on who a person is looking at. The challenge, once again, is processing power, and it's hard to imagine that Apple won't be able to keep increasing the processing power of the AirPods Pro.
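As a sketch of how "emphasize the speaker you're looking at" could work, here's a minimal two-microphone delay-and-sum beamformer in Swift. The geometry, names, and parameters are assumptions for illustration only: delaying one mic's signal so that sound from a chosen direction lines up in time reinforces that direction and partially cancels everything else.

```swift
import Foundation

// Hypothetical sketch: steer a two-microphone array toward a talker by
// delaying one channel so sound arriving from `angle` lines up in time,
// then averaging. Aligned (on-target) sound adds up; off-target sound
// partially cancels.
func delayAndSum(left: [Float], right: [Float],
                 micSpacing: Double,   // metres between the two mics
                 angle: Double,        // look direction in radians, 0 = straight ahead
                 sampleRate: Double) -> [Float] {
    // Extra distance sound from `angle` travels to reach the far mic,
    // converted to a whole-sample delay (speed of sound ~343 m/s).
    let delaySeconds = micSpacing * sin(angle) / 343.0
    let delaySamples = Int((delaySeconds * sampleRate).rounded())

    let n = min(left.count, right.count)
    var out = [Float](repeating: 0, count: n)
    for i in 0..<n {
        let j = i - delaySamples
        let alignedRight: Float = (j >= 0 && j < n) ? right[j] : 0
        out[i] = 0.5 * (left[i] + alignedRight)   // average the aligned channels
    }
    return out
}
```

In a real product the look direction would come from head tracking or face detection rather than a fixed angle, and the filtering would adapt continuously, but the underlying principle is the same.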
The black patch on the AirPods Pro is a microphone used for noise cancellation.