Autonomous driving levels and enablers Chapter | 17 193
The Autopilot ADAS automates the steering, braking, and throttle; it comprises
a Traffic-Aware Cruise Control (TACC) system and an autosteer function. The
TACC is fed with information from the forward camera and the radar sensor
and determines the distance and speed of a leading vehicle in the same lane. If
the lane is clear, the TACC maintains the driver-defined speed; if there is
a leading vehicle, the TACC controls the throttle to keep a safe distance
from it. The autosteer system uses information from the forward camera and the
radar and ultrasonic sensors to detect lane markings, other vehicles, and other
objects, and provides automated lane-centering steering control. Both systems are
recommended for use on dry, straight roads, such as highways and freeways, and
avoided on city streets.

With LiDAR and front or rear cameras now standard in autonomous driving,
new vision and sensing technologies are gaining ground, such as the rotating
LiDAR sensor, which creates 3D surround views and provides obstacle detection
and side-collision prevention. The
next big thing in computer vision for autonomous cars is the research around
laser-based (or even camera-based) systems and AI algorithms that fuse sen-
sor data in order to “see around corners”. This is done by combining informa-
tion from reflections and shadows on nearby walls (Saunders, Murray-Bruce &
Goyal 2018; O’Toole, Lindell & Wetzstein 2018). Sensor fusion is a key
component of Level 5 autonomous vehicles, necessary for a better understanding
of the vehicle’s environment. The most challenging part is data processing
and the interpretation of images and signals collected by the vehicle in real time.
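At its simplest, this kind of fusion can be illustrated with a toy example: combining a radar and a camera estimate of the distance to the leading vehicle by inverse-variance weighting, so that the more reliable sensor dominates the result. The function name, variances, and readings below are illustrative, not taken from any production system:

```python
def fuse_range(radar_range, radar_var, cam_range, cam_var):
    """Fuse two noisy range estimates (metres) by inverse-variance weighting.

    Each sensor reports a distance to the leading vehicle plus a noise
    variance; the reading with the smaller variance gets the larger weight.
    """
    w_radar = 1.0 / radar_var
    w_cam = 1.0 / cam_var
    fused = (w_radar * radar_range + w_cam * cam_range) / (w_radar + w_cam)
    fused_var = 1.0 / (w_radar + w_cam)  # fused estimate is less uncertain than either input
    return fused, fused_var

# Radar is usually the more accurate range sensor, so the fused estimate
# lands close to the radar reading (all numbers are illustrative):
distance, variance = fuse_range(radar_range=42.0, radar_var=0.25,
                                cam_range=45.0, cam_var=4.0)
```

A production system would instead run a recursive filter (e.g. a Kalman filter) over many sensors and time steps, but the weighting principle is the same.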
One of the main enablers for autonomous vehicles at the middle levels of
automation is the human-machine interface. Such an interface has to monitor
the driver engagement with the driving task, provide information to the driver
about the vehicle and road conditions as well as about the system limitations,
in order to minimize the risk of mode confusion. At the same time, it must
restrict the availability of certain automation features when operating
conditions are not met. For these purposes, the user interface continuously
provides feedback on the vehicle dynamics and warns the driver when unknown
operating conditions are encountered, restricting the automation in such cases
to prevent confusion. For example, Tesla uses a dialog box to notify the driver
to “Always keep hands on the wheel” and to “Be prepared to take over at any
time” when the autosteer system is active. Also, information about the distance
and speed of the leading car is available on the dashboard. The same car uses an
escalating series of warnings to verify the driver’s attention to the road and
alertness.
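Such an escalation policy can be sketched as a small lookup over time thresholds. The stages, messages, and timings below are hypothetical examples, not the actual values used by Tesla or any other manufacturer:

```python
# Hypothetical escalation stages for a driver-monitoring interface:
# (seconds without detected driver engagement, warning issued)
WARNING_STAGES = [
    (5.0, "visual: 'Always keep hands on the wheel'"),
    (10.0, "audible: 'Be prepared to take over at any time'"),
    (15.0, "urgent alarm: automation will disengage"),
]

def escalate(hands_off_seconds):
    """Return every warning whose time threshold has been exceeded."""
    return [message for threshold, message in WARNING_STAGES
            if hands_off_seconds >= threshold]

# After 12 s without detected engagement, the visual and audible
# warnings are both active, but not yet the urgent alarm:
active = escalate(12.0)
```

The key property is monotonicity: as inattention persists, warnings accumulate rather than replace each other, so the driver always sees the full history of missed prompts.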
Another key technology for self-driving vehicles is the connected car tech-
nology, which allows vehicles to communicate with others on the road. The term
“connected car” encompasses Internet of Things and communication technolo-
gies in tandem, and assumes that the car sensor data are transmitted to nearby
vehicles (V2V), to the road infrastructure (V2I), to the network (V2N), and
to pedestrians (V2P). The main objective is to enhance safety for vehicles,
drivers, and pedestrians, making self-driving technology far safer than