AviationWeek.com/awst AVIATION WEEK & SPACE TECHNOLOGY/NOVEMBER 3/10, 2014 43
G600 announced Oct. 14. The new business
jets, set for first deliveries in 2018 and 2019,
respectively, will have a Honeywell-inspired
Gulfstream Symmetry flight deck with 10
touchscreen controllers in the overhead
panel and on the center pedestal for sys-
tem controls, flight management, commu-
nications, checklists and other functions.
A touchscreen controller will replace the
traditional multifunction control and display
unit (MCDU) in the center pedestal, a de-
vice that provides input to the FMS via a
keypad on the bottom or bezel keys along
the sides of the display. Also available on
both aircraft either as standard kit or an
option will be another feature born in the
Advanced Technology labs—3-D airport
moving maps (see page 46).
For the tablet prototype, ground rules allowed no changes to
mechanical interfaces or wiring on the flight deck, constraints
that led to the creation of the Honeywell innovation prototyp-
ing environment, or HIPE. McCullough demonstrated it in the
G650 cockpit rig. Inside HIPE—a form-fitting replacement
for the MCDU—is an Arinc 429 converter with a USB input
for the tablet, which then becomes a high-resolution MCDU
display with touchscreen or voice-control input. The connec-
tion could also be made with wireless technology. The tablet
has an Arinc 429 protocol decoder that translates touch or
voice commands from the tablet into FMS commands. Using
a commercially available headset, McCullough demonstrated
how a verbal “change approach” command brings up the
approach page on the tablet-turned-surrogate MCDU. She says
the technology also was tested in the cockpit of an Embraer
E-Jet—a regional aircraft equipped with a Primus Epic cock-
pit—and that the voice commands worked despite the exter-
nal noise environment in the cockpit. Feedback from pilots
is also leading to other potential uses for connected tablets,
including as an aid for preventing confusion about autoflight
system modes. McCullough says the display could take on a
different “look and feel” when mode changes occur.
Another benefit of the high-resolution portable display,
other than pilots being able to place the tablet in the most
convenient location, is that the system allows for a “cross-
cultural cockpit,” with MCDU labels displayed in a selected language.
Honeywell has experimented with Mandarin and Portuguese.
Some of the HIPE technologies are slated to transition into
the product development organization next
year.
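The voice-to-FMS path described above can be sketched as a small command bridge. This is an illustrative Python sketch, not Honeywell's implementation: the vocabulary, the page codes, the Arinc 429 label value and the simplified word layout are all assumptions made for the example.

```python
# Hedged sketch of a HIPE-style bridge: a spoken command is matched
# against a small vocabulary and forwarded to the FMS as a request
# packed into an ARINC 429-style 32-bit word. The command names,
# page codes, label and bit layout are illustrative assumptions.

VOICE_VOCABULARY = {
    "change approach": "APPROACH",
    "show flight plan": "FLT_PLAN",
    "show performance": "PERF",
}

# Hypothetical page codes for the data field of the 32-bit word.
PAGE_CODES = {"APPROACH": 0x01, "FLT_PLAN": 0x02, "PERF": 0x03}

def encode_word(label: int, data: int) -> int:
    """Pack a simplified ARINC 429-style word: 8-bit label in the low
    bits, 19-bit data field, odd parity in the most significant bit."""
    word = (label & 0xFF) | ((data & 0x7FFFF) << 10)
    # Odd parity: set the top bit so the total count of 1-bits is odd.
    if bin(word).count("1") % 2 == 0:
        word |= 1 << 31
    return word

def voice_to_fms(utterance: str):
    """Translate a recognized utterance into an FMS request word."""
    page = VOICE_VOCABULARY.get(utterance.strip().lower())
    if page is None:
        return None  # unrecognized commands are ignored, not guessed
    return encode_word(label=0o205, data=PAGE_CODES[page])
```

Ignoring anything outside a tiny fixed vocabulary mirrors the design pressure in a cockpit: a missed command is safer than a misheard one.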
Prototypes of other input devices in the
FD-X come straight from the gaming world.
Engineer Steve Grothe demonstrated a mix
of voice-, eye- and gesture-control devices on
a simulated flight deck that could help pilots
with certain tasks, although the work is at a
very early stage. Rather than using a traditional
cursor, Grothe demonstrated how he could
use a commercially available Leap Motion
gesture controller to pan a 2-D navigation
map to the left or right, using his fist, or
zoom in or out using a clockwise or counter-
clockwise twirl of his index finger. The drop
of a finger caused the display to pan back
to the current location of the aircraft. Leap
Motion uses infrared cameras to sense hand
motion. Another use for a gesture controller could be to silence
certain noise sources in the cockpit. Grothe discussed how a
“halt” hand gesture, similar to what a pilot would use to signal
silence to another pilot, could be used to mute nonessential
radio transmissions, including automatic terminal information
service (ATIS) reports for a certain duration. “Rather than hav-
ing to fumble for the audio panel, you can wave your hand into
gesture space to mute the ATIS,” says Grothe.
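The gesture mappings Grothe demonstrated can be pictured as a small dispatch table. The sketch below is a minimal illustration, assuming a recognizer that emits named gesture events; the event names, class and attribute names are inventions for the example, not the prototype's actual interface.

```python
# Hedged sketch of the gesture mapping described in the article:
# fist to pan, index-finger twirl to zoom, finger drop to snap back,
# "halt" to mute the ATIS. Event names and units are assumptions.

class MapDisplay:
    def __init__(self):
        self.center_offset = 0   # lateral pan, arbitrary units
        self.zoom = 1.0
        self.atis_muted = False

    def handle_gesture(self, gesture: str) -> None:
        if gesture == "fist_left":
            self.center_offset -= 1     # pan the 2-D map left
        elif gesture == "fist_right":
            self.center_offset += 1     # pan the 2-D map right
        elif gesture == "twirl_cw":
            self.zoom *= 2.0            # clockwise twirl: zoom in
        elif gesture == "twirl_ccw":
            self.zoom /= 2.0            # counterclockwise: zoom out
        elif gesture == "finger_drop":
            self.center_offset = 0      # pan back to aircraft position
        elif gesture == "halt":
            self.atis_muted = True      # mute nonessential audio
        # Unknown gestures are deliberately ignored: a small, strict
        # vocabulary reduces false activations in the cockpit.
```

Keeping the vocabulary small and dropping unrecognized events on the floor is one way to address the false-trigger concern the engineers raise below.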
Challenges to be tackled include devising
a “very small” vocabulary of movements for the gesture
device, as well as the optimum number and placement of the
devices to catch the required movements. Another question
is whether a pilot will need to stabilize his or her arm to ac-
curately signal the device, particularly in turbulence.
Based on FD-X visits, Honeywell finds that voice control can
be beneficial when used in combination with other modalities.
For example, if a pilot’s hands are busy with a task, speech can
be a good alternative for some actions, a concept pilots must
generally experience to believe. “We asked leading customers,
‘Do you think speech in the cockpit is a good idea?’” Jha says.
The consensus was “No.” “We did some rapid prototyping...
and let pilots experience it.” One chief pilot “who was against
the whole thing” became a believer when he tested a prototype.
Grothe demonstrated one of the mixed-mode prototypes
in FD-X. Using eye-tracking, he was able to highlight one of
three screens in a mock flight deck and select that screen (the
boundary of the screen turned green once selected) using voice
control. Grothe then used a swipe of his right hand
in the appropriate direction to move the screen’s
contents to the other screens. The same could be done
with sub-windows within a display, he says. The transfer did
not always go as planned, highlighting some of the limitations
of a rapid prototype setup. He also showed that transfers or
manipulation on the screen could be accomplished via touch-
screen controls, highlighting another modality option.
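The mixed-mode sequence — gaze nominates a screen, voice confirms it, a swipe moves its contents — can be sketched as a simple state machine. All class, method and screen names below are illustrative assumptions, not the prototype's real software.

```python
# Hedged sketch of the mixed-mode demo: eye tracking nominates a
# screen, a voice command confirms the selection (the "green border"
# state), and a directional swipe transfers the selected screen's
# contents to a neighboring display. Names are assumptions.

class FlightDeck:
    def __init__(self):
        # Three displays with illustrative contents.
        self.screens = {"left": "PFD", "center": "NAV MAP", "right": "SYSTEMS"}
        self.gazed = None       # screen the pilot is currently looking at
        self.selected = None    # screen confirmed by voice command

    def on_gaze(self, screen: str) -> None:
        if screen in self.screens:
            self.gazed = screen

    def on_voice(self, command: str) -> None:
        # Voice confirms whatever the eye tracker has highlighted.
        if command == "select" and self.gazed is not None:
            self.selected = self.gazed

    def on_swipe(self, direction: str) -> None:
        # A swipe is a no-op until a screen has been selected.
        if self.selected is None:
            return
        order = ["left", "center", "right"]
        i = order.index(self.selected)
        j = i - 1 if direction == "left" else i + 1
        if 0 <= j < len(order):
            # Move the selected screen's contents to the neighbor.
            self.screens[order[j]] = self.screens[self.selected]
```

Requiring two modalities to agree before anything moves is one plausible guard against the accidental transfers seen in the rapid prototype.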
From a broader standpoint, Honeywell sees all the various
technologies coming together to create a holistic form of
automation wherein humans team with machines rather than interact
with them. “They will complement each other in transpar-
ent ways,” says Jha. “Consider the machine to be another
pilot, who is nearly human.” The future flight deck is a place
where automation lowers workload but also ensures that the
crew is never faced with a situation where they are unsure of
what’s going on and what they need to do. “That’s situational-
awareness nirvana,” says Witwer.
Honeywell is exploring ways to use various languages (Mandarin, above) for its avionics.