Paper Submitted by Thomas Wong, Director of Marketing, Automotive Segment, Design IP Group, Cadence

Automotive Market Trends

In 2016, McKinsey published a report (“Automotive revolution—perspective towards 2030”, McKinsey & Company, January 2016) outlining its perspective on automotive trends through 2030. It described four distinct trends that will drive the automotive industry: connectivity, autonomy, car sharing and electrification. At that time, there was much uncertainty about how much progress could be made in the short term, given the technical and business challenges. The connectivity required to support vehicle-to-vehicle (V2V), vehicle-to-infrastructure (V2I) or vehicle-to-everything (V2X) communication was practically non-existent. While there were many ideas, there was no clear industry consensus on which wireless standards would be adopted. Very few vehicles were equipped with advanced driver-assistance systems (ADAS), and even fewer offered autopilot features. Full autonomy (driverless, with no steering wheel) seemed to be science fiction. The car-sharing phenomenon had just started, and there was talk of robotaxis, which were an aspirational goal at best. To put things into perspective, about 400,000 electric vehicles were sold worldwide in 2016.

Fast Forward to 2019

There is now more clarity on the wireless standard, with a major auto manufacturer deploying dedicated short-range communications (DSRC) in production vehicles; cellular V2X will come later. With this clarity, we can expect to see more deployment of systems using DSRC. Systems-on-chip (SoCs) that support DSRC are commercially available today. China is moving ahead with major initiatives and investment to support V2I.

ADAS features such as adaptive cruise control (ACC), automatic emergency braking (AEB), lane departure warning, blind-spot detection, automated parking and 360° surround view are readily available. Level 2 autonomy is quite common, not only in high-end vehicles but also in the $25,000 to $30,000 price range. At least one Level 3-capable car can be purchased today, and there will be several more as other manufacturers up their game in 2020.

Robotaxis are not quite here yet, but Waymo, Cruise and Uber are investing heavily and operating experimental fleets in the US. Many well-heeled companies in China are conducting similar activities.

Electric vehicles are the big story here. In China, we expect electric vehicle sales to top 2 million units in 2019. In the U.S. market, electric vehicles will top 1 million units, with the bulk of the volume coming from the Tesla Model 3. Battery cost is reaching the inflection point of $100/kWh, the price point where electric (battery-powered) vehicles can be competitive with those having an internal combustion engine (ICE). We are also seeing electric vehicles setting all kinds of speed records: winning the Pikes Peak International Hill Climb by a wide margin (covering the 12.42-mile course in under 8 minutes) <https://us.motorsport.com/hillclimb/news/dumas-shatters-pikes-peak-record-in-electric-vw-1047461/3128846/> and breaking the lap record at the Nürburgring (a 7m 32s lap around the 12.9-mile course) <https://www.extremetech.com/extreme/298587-did-tesla-just-set-an-electric-car-speed-record-at-the-nurburgring>.

From a semiconductor perspective, these trends (connectivity, autonomy, car sharing and electrification) are increasingly being addressed and supported by a new generation of SoCs. The most important applications today for advanced automotive SoCs are infotainment, ADAS and automated driving systems (ADS). Given their complexity and high-performance requirements, these applications demand the most advanced semiconductor processes, such as 16 nanometer (nm) and 7nm. The introduction of advanced electronics into automotive design is causing a massive disruption in SoC design, a field that until very recently hummed along like a finely tuned sports car. The rapid push toward autonomous driving has changed everything.

Sensors, ADAS and Autonomous Driving

The human eye is a wonderful thing. It can see color, recognize street signs, provide excellent peripheral vision (wide angle), estimate speed and distance, and handle a wide dynamic range (from bright light to low light). In developing autonomous vehicles, we are asking sensors to replicate human vision (perception) while asking the computer (or SoC) to perform the signal processing and decision-making needed to guide/navigate the vehicle to where it needs to go. This safe path planning requires gathering data from multiple sensors and combining the capabilities of different types of sensors in a sensor fusion environment. It also requires the SoC to perform fast calculations with low latency and to make massive use of AI acceleration. These are the challenges in developing ADAS and ADS SoCs.
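To make the low-latency requirement concrete, consider how far a vehicle travels while the sensing-to-decision pipeline is still processing a frame. This is a minimal sketch; the highway speed and candidate latencies below are illustrative assumptions, not figures from this paper:

    # Distance traveled while the perception/decision pipeline processes a frame.
    # All numbers are assumptions chosen for illustration.
    speed_kmh = 113.0                  # roughly 70 mph highway speed
    speed_m_per_s = speed_kmh / 3.6    # ~31.4 meters per second

    for latency_ms in (50, 100, 200):  # candidate end-to-end pipeline latencies
        distance_m = speed_m_per_s * (latency_ms / 1000.0)
        print(f"{latency_ms:3d} ms latency -> {distance_m:4.1f} m traveled")

At 100ms of end-to-end latency, the vehicle covers more than 3m before the system can begin to react, which is why low-latency computation is a hard requirement rather than a nicety.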

Figure 1. Situational awareness enables an autonomous vehicle to perform safe path planning

In autonomous driving systems, perception means the ability to see, become aware of and identify the vehicle’s surroundings through sensors. This is known as situational awareness, which is illustrated in Figure 1. Safe path planning means the ability to plan a route given the perceived information and to safely maneuver the vehicle. In every autonomous driving system, there are three major computational considerations: sensing (image and signal processing), perception (data analysis) and decision-making. All modern SoCs for autonomous driving must deal with all of these.
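As a minimal sketch of how these three stages fit together, the Python fragment below models the data flow from sensing to perception to decision-making. The types and function bodies are hypothetical stand-ins, not an actual SoC API; in a real system, perceive() would invoke neural-network inference on dedicated hardware:

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class RawFrame:                # output of the sensing stage
        pixels: bytes

    @dataclass
    class DetectedObject:          # output of the perception stage
        label: str
        distance_m: float

    def sense() -> RawFrame:
        """Sensing: acquire and pre-process raw sensor data (stubbed here)."""
        return RawFrame(pixels=b"\x00" * 64)

    def perceive(frame: RawFrame) -> List[DetectedObject]:
        """Perception: identify the surroundings (stands in for inference)."""
        return [DetectedObject(label="pedestrian", distance_m=4.2)]

    def decide(objects: List[DetectedObject]) -> str:
        """Decision-making: choose a safe maneuver from perceived objects."""
        if any(obj.distance_m < 5.0 for obj in objects):
            return "brake"
        return "maintain_speed"

    print(decide(perceive(sense())))   # -> brake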

In 2019, Level 3 autonomy will begin hitting the streets. Behind the scenes, work is underway to design SoCs for Level 4. But how these chips get built, by whom and using what intellectual property (IP) isn’t always so obvious. Level 2 autonomous driving systems rely predominantly on cameras and radar. As we move into Level 3 and Level 4 capabilities, image data will come not only from cameras and radar, but will be supplemented by data captured from lidar and ultrasonic sensors. We may even need to contend with thermal sensors (especially important in pedestrian-safety applications) in a sensor fusion environment.

Today, lidar is still at a price point where each unit costs thousands of dollars, clearly too expensive for commercial deployment. One big trend in sensors is the increasing performance and resolution of radar. The big question is whether radar is getting better faster than lidar is getting cheaper, and thus whether lidar will ever be deployed in mainstream vehicles. Let’s take a look at the pros and cons of popular sensor technologies (Figure 2).

Figure 2. Comparison of sensor capabilities

These sensors are good for distance measurement, traffic-sign recognition, lane detection, segmentation and mapping. A couple of points are critical: only cameras can “see” traffic lights, and only radar can cut through rain and fog. So you will always need cameras and radar, even if you have lidar and ultrasound (which is only short range).
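The complementarity argument can be captured in a small lookup table. The entries below simply encode the points just made (only cameras see traffic lights, only radar handles rain and fog, ultrasound is short range) and are illustrative rather than a complete capability matrix:

    # Illustrative capability map paraphrasing Figure 2 and the text above.
    CAPABILITIES = {
        "camera":     {"traffic_lights": True,  "rain_fog": False, "range": "medium"},
        "radar":      {"traffic_lights": False, "rain_fog": True,  "range": "long"},
        "lidar":      {"traffic_lights": False, "rain_fog": False, "range": "medium"},
        "ultrasound": {"traffic_lights": False, "rain_fog": False, "range": "short"},
    }

    def sensors_with(capability: str):
        """Return the sensors that provide a given boolean capability."""
        return [name for name, caps in CAPABILITIES.items() if caps.get(capability)]

    print(sensors_with("traffic_lights"))  # ['camera']  -> cameras are mandatory
    print(sensors_with("rain_fog"))        # ['radar']   -> radar is mandatory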

Not all radar is created equal, and there is a big difference between current radar technologies and the capabilities of next-generation radar. Short-range radar can replace ultrasound. Medium-range radar can detect cars alongside for lane changes and blind-spot detection. Long-range radar can detect cars ahead, along with their speed and direction, for ACC and, eventually, more autonomous driving.
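This division of labor can be expressed as a simple range-to-function mapping. The thresholds below are common rules of thumb assumed for illustration; they are not specifications from this paper:

    def radar_class(target_distance_m: float) -> str:
        """Map a required detection distance to a radar class.
        Thresholds are illustrative rules of thumb, not specifications."""
        if target_distance_m <= 30:
            return "short-range (parking; can replace ultrasound)"
        if target_distance_m <= 100:
            return "medium-range (lane changes, blind-spot detection)"
        return "long-range (ACC; speed and direction of cars ahead)"

    print(radar_class(15))    # short-range
    print(radar_class(80))    # medium-range
    print(radar_class(200))   # long-range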

Figure 3. Autonomous driving chip architectures

Another big change is the move toward more centralized sensor fusion. Instead of each sensor performing its own signal processing and functions like object recognition, the basic signal processing is done in a central unit, and the perception and decision processes are centralized as well (Figure 3). This requires much higher bandwidth in-vehicle networks. How much processing should be done at the sensor is still debatable; the tradeoff is between duplicating processing hardware at each sensor and the bandwidth and reliability demands placed on the in-vehicle network.
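To see why centralized fusion pushes up network bandwidth, consider the raw data rate of a single camera. This is a back-of-the-envelope sketch; the resolution, bit depth and frame rate are assumptions chosen for illustration:

    # Back-of-the-envelope raw camera data rate (all parameters are assumptions).
    width, height = 1920, 1080    # pixels per frame
    bits_per_pixel = 12           # e.g., a RAW12 sensor output
    fps = 30                      # frames per second

    gbps = width * height * bits_per_pixel * fps / 1e9
    print(f"One camera:  {gbps:.2f} Gb/s raw")      # ~0.75 Gb/s
    print(f"Six cameras: {6 * gbps:.2f} Gb/s raw")  # ~4.48 Gb/s

A handful of uncompressed camera streams already exceeds a gigabit-class link, which is why hauling raw sensor data to a central unit quickly drives demand for the multi-gigabit in-vehicle networks discussed later.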

ADAS SoC Requirements

Now let’s take a look at a simplified block diagram of an ADAS/ADS SoC (Figure 4). It is based on a heterogeneous architecture that uses a network-on-chip (NoC) fabric as its backbone.

Figure 4. Simplified block diagram of an ADAS/ADS SoC

The main building blocks are:

  • Camera processing
  • Radar processing
  • Lidar processing
  • GPS/GNSS
  • Wireless communications (DSRC, cellular V2X)
  • High-performance compute and AI acceleration

Camera, radar and lidar signal processing is usually performed by high-performance, low-power DSP cores, such as the Cadence® Tensilica® Vision, Fusion and ConnX processors. High-performance compute is handled by multi-core CPUs and neural network processors such as the Cadence Tensilica DNA 100 processor IP for AI inference. ADAS SoCs also require a slew of high-performance interfaces to communicate with the rest of the system. These interfaces include:

  • LPDDR4/4X
  • LPDDR5
  • DDR4/5
  • GDDR6
  • MIPI® D-PHY℠
  • MIPI A-PHY℠ (future)
  • Gigabit Ethernet (GbE) with time-sensitive networking (TSN) and audio-video bridging (AVB)
  • 2.5G, 5G and 10G automotive Ethernet
  • PCIe® 4.0/3.0
  • USB3/1
  • eMMC/SD/UFS/ONFi

All of these IP blocks are required to meet AEC-Q100 temperature specifications and to adhere to the functional safety requirements stipulated in the ISO 26262:2018 specifications.

The current generation of ADAS SoCs uses LPDDR4 at the 4266 speed grade. New designs will likely use LPDDR4X at 4266 to take advantage of its lower I/O voltage and further reduce power consumption. Next-generation systems will adopt LPDDR5, a transition that will accelerate once the LPDDR4-versus-LPDDR5 memory price crossover occurs. Due to the high-computation, low-latency demands of AI acceleration, we will see systems that utilize DDR4/5 in conjunction with GDDR6 to balance the need for high performance (speed), high density and cost effectiveness.

Camera interfaces will continue to rely on the trusty MIPI D-PHY. However, D-PHY was used heavily in smartphones, where the camera sits a couple of centimeters away from the applications processor; in an automotive environment, signals must traverse the in-vehicle harness, where a 10m to 20m distance is quite common. As sensor interfaces proliferate throughout the car, there will be demand to move a greater volume of data through the in-vehicle networks, resulting in the need for much higher speed interfaces. Could MIPI A-PHY fill this need? GbE is being adopted as we speak, and it is likely that 2.5G, 5G and even 10G Ethernet will be coming to your next new vehicle. For storage, some form of eMMC, SD, UFS or ONFi will be needed, depending on the application.
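For a rough sense of the bandwidth spread behind these memory choices, the peak-rate arithmetic below uses standard transfer rates with illustrative bus widths (the x32 LPDDR channel and x32 GDDR6 device are assumptions; actual configurations vary by design):

    # Theoretical peak bandwidth = transfer rate (MT/s) x bus width (bytes).
    def peak_gb_per_s(mt_per_s: float, bus_bits: int) -> float:
        return mt_per_s * 1e6 * (bus_bits / 8) / 1e9

    print(f"LPDDR4X-4266, x32: {peak_gb_per_s(4266, 32):.1f} GB/s")   # ~17.1 GB/s
    print(f"LPDDR5-6400,  x32: {peak_gb_per_s(6400, 32):.1f} GB/s")   # ~25.6 GB/s
    print(f"GDDR6 16Gb/s, x32: {peak_gb_per_s(16000, 32):.1f} GB/s")  # ~64.0 GB/s

The roughly 4x jump from an LPDDR4X channel to a GDDR6 device illustrates why designs mix memory types: GDDR6 feeds the bandwidth-hungry AI accelerators, while LPDDR/DDR serves the rest of the system more cost-effectively.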

Semiconductor Technology

The earliest chips designed to support automotive applications were designed more than 40 years ago, an eternity in the technology business. These chips were developed by semiconductor integrated device manufacturers (IDMs) because they could control every step of the design, manufacturing, testing and qualification. This approach was useful for electronic control units (ECUs), such as engine and transmission control, and it still makes sense today for these applications.

But as electronic components have evolved beyond the engine compartment, so too have the semiconductor processes needed to support them. Some in-vehicle networking can be implemented with acceptable performance in 40nm semiconductor technology, and mainstream infotainment SoCs need, at a minimum, the performance obtainable from 28nm technology. However, ADAS systems require a performance level only obtainable in 16nm technology, with future requirements clearly trending toward 10/7nm.

Conclusion

Looking at the evolution of the automotive industry over the past several years, we can see a clear path to Level 3 and Level 4 autonomy. It is not so clear that we will see Level 5 anytime soon, but it remains a good aspirational goal. Aside from the challenges of using sensors to replicate the perception capabilities of the human eye and using AI computation to make decisions for safe path planning, there are other important aspects that still demand attention. Unlike semiconductor chips used in consumer products, where the product life can be just two to three years, semiconductor chips used in cars are expected to last approximately 10 to 15 years, requiring them to be much more robust and reliable. All automotive SoC designs must meet higher temperature requirements, pass more stringent electromigration (EM) analysis, account for aging effects and follow AEC-Q100 specifications. On the functional safety side, they must follow the recent ISO 26262:2018 specifications. Every design organization will need to manage the 3Ps (people, process and product) of functional safety in order to achieve a successful outcome.