in order to provide ADAS. In a slightly different case, the 2015 BMW 7 Series
introduced an innovative solution based on deep learning technology for
recognizing the driver's voice commands without needing to communicate with a
central server or a cloud service. Many more deep learning applications in the
automotive industry are under development, including fault diagnostics, fuel
and emissions management, and intrusion detection mechanisms that protect the
vehicle's internal network (Falcini & Lami, 2017).
Regarding hardware for coordinating the multiple models and fusing data
among them, new high-performance components have emerged that offer
safety-related or driver-assistance solutions, such as Mobileye's forthcoming
EyeQ5 microchip, which will perform sensor fusion and allow holistic vision
control in support of Autonomy Level 5 (fully autonomous) driving. The power
consumption of such chips is considerable, so to meet its performance
objectives Mobileye is designing the fifth generation of the EyeQ family using
advanced VLSI process technology down to the level of 7 nm FinFET.
Regarding electronic components that are already on the market, Intel intro-
duced the Xeon Phi chip as an answer to Nvidia's Tegra chip. Nvidia's Tesla
V100 can today deliver around 120 TFLOPS using a total of 640 Tensor cores,
which are specially designed for deep learning workloads (a back-of-the-envelope
check of this figure is sketched after this paragraph). Nvidia also offers the
NVIDIA DRIVE™ PX platform for edge computing applications, which supports the
development of vehicles offering many advanced AD functions. AMD offers an
x86 server processor paired with a GPU, which can be a reliable option for HPC
and machine learning workload processing at the edge and a budget alternative
to Intel Xeon or Nvidia Tesla, which are already established in the hybrid
computing market. In parallel, Tesla is developing, in collaboration with AMD,
a new processor for AI applications in ADAS and automated vehicles
(Etherington, 2017). Finally, NXP offers the S32V234 vision processor, which
is designed for ADAS and for vehicle and industrial automation, and provides
several machine learning and sensor-fusion capabilities, such as front-camera
stream processing, object detection, and surround view. The processor design
emphasizes reliability, security, and functional safety (https://github.com/
basicmi/AI-Chip).
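As a rough illustration of where the roughly 120 TFLOPS figure for the Tesla V100 comes from, the sketch below multiplies the 640 Tensor cores cited above by an assumed per-core throughput and clock rate; the per-core operation count and the boost clock are assumptions for the sake of the estimate, not values taken from the text.

```python
# Back-of-the-envelope check of the ~120 TFLOPS figure quoted for the Tesla V100.
# Assumptions (not from the text): each Tensor core performs a 4x4x4 mixed-precision
# matrix multiply-accumulate per clock, i.e. 64 fused multiply-adds = 128 floating-point
# operations, and the GPU runs at a boost clock of roughly 1.45 GHz.

TENSOR_CORES = 640               # Tensor cores on the V100, as cited in the text
FLOPS_PER_CORE_PER_CYCLE = 128   # 64 FMAs x 2 operations each (assumed)
BOOST_CLOCK_HZ = 1.45e9          # assumed ~1.45 GHz boost clock

peak_tflops = TENSOR_CORES * FLOPS_PER_CORE_PER_CYCLE * BOOST_CLOCK_HZ / 1e12
print(f"Approximate peak mixed-precision throughput: {peak_tflops:.0f} TFLOPS")
# Prints roughly 119 TFLOPS, consistent with the ~120 TFLOPS order of magnitude above.
```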
3.3 Solution approaches for automotive eHPC platform
3.3.1 Overview
Apart from the evolution of microprocessors and automotive microelectronics,
it is important to improve the connectivity of the embedded modules and the
integrated architecture of automotive computing platforms. Multicore pro-
cessors specifically designed for real-time automotive applications, along with
fail-safe processes for data fusion among modules and decision-making models
for interpreting their results, will help guarantee automotive systems that
remain operational and fully functional under all conditions.
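As a minimal sketch of what a fail-safe fusion step between two sensing modules might look like, the example below cross-checks range estimates from a hypothetical camera module and radar module, fuses them with inverse-variance weighting when they agree, and falls back to a conservative safe state when they disagree or either module misses its deadline. The module names, thresholds, and fusion rule are illustrative assumptions, not part of any specific platform described above.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RangeEstimate:
    distance_m: float    # estimated distance to the lead object, in metres
    variance: float      # reported measurement variance (m^2)
    stale: bool = False  # True if the module missed its real-time deadline

def fuse_fail_safe(camera: Optional[RangeEstimate],
                   radar: Optional[RangeEstimate],
                   max_disagreement_m: float = 5.0) -> Optional[float]:
    """Fuse two independent range estimates with a plausibility cross-check.

    Returns the fused distance, or None to signal 'fall back to a safe state'
    (e.g. degrade the ADAS function) when the inputs cannot be trusted.
    Thresholds and policy here are illustrative only.
    """
    # Fail safe if either module is missing or violated its deadline.
    if camera is None or radar is None or camera.stale or radar.stale:
        return None

    # Plausibility check: the two independent modules must roughly agree.
    if abs(camera.distance_m - radar.distance_m) > max_disagreement_m:
        return None

    # Inverse-variance weighting: trust the less noisy sensor more.
    w_cam = 1.0 / camera.variance
    w_rad = 1.0 / radar.variance
    return (w_cam * camera.distance_m + w_rad * radar.distance_m) / (w_cam + w_rad)

# A well-behaved cycle fuses the two estimates; a large disagreement
# triggers the safe fallback instead of an unreliable fused value.
print(fuse_fail_safe(RangeEstimate(42.0, 4.0), RangeEstimate(40.5, 1.0)))  # ~40.8
print(fuse_fail_safe(RangeEstimate(42.0, 4.0), RangeEstimate(20.0, 1.0)))  # None
```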