A Lattice Semiconductor White Paper
In the automotive market as in all industries, competition breeds innovation. Over the last 100 years, this truth has transformed the horseless carriage into one of the most useful and ubiquitous tools in the modern world. Automobiles have long been understood as mechanical devices, but recently, continuing innovation has transformed them into increasingly electronic systems. One obvious example of that is the rapid electrification of the car. But equally significant has been the evolution of entertainment from simple radio players and tape decks to powerful Automotive Infotainment Systems (AIS), and more recently the introduction of Advanced Driver Assistance Systems (ADAS) that help protect drivers with enhanced electronic safety systems.
Over the last decade, one of the primary drivers of innovation has come not from inside the automotive market, but from the consumer electronics market. The rapid rise of the smartphone has forced carmakers to adapt to a new device that has quickly become ubiquitous. Since the 1950s, the automotive entertainment system had been based primarily on radio, tapes, or CDs. While some navigation and other so-called “infotainment” systems had already been introduced by the time the iPhone was unveiled in 2007, their functionality was limited and they were very expensive.
The introduction of the smartphone upended this paradigm, bringing into the car a platform and ecosystem of applications that the car manufacturer had no control over. Key apps such as Google Maps, which could use the GPS built into the phone to provide turn-by-turn directions, had a transformative effect on the driver’s experience. Entertainment changed too. Instead of being limited to the radio, or whatever CDs they had lying around in the car, drivers could now access millions of songs on custom playlists. The challenge from smartphones accelerated trends already taking place in the automotive space, and in response the entertainment system in a car increasingly became an AIS, with a goal not only to entertain, but to inform.
At the same time, automobile manufacturers had been developing new ways to protect vehicle occupants. ADAS includes a range of solutions, from Automatic Braking and Lane Detection to Bird’s-Eye Parallel Parking, that use advanced technology and processing to enhance driver safety.
To power both their AIS and ADAS solutions, automotive manufacturers have looked to the mobile space. The processors and systems powering smartphones are good candidates to drive these systems, as they share similar constraints on size and power consumption, though automotive use adds the further demand of a wider temperature range. In addition, the volumes in which they are produced ensure that car manufacturers have an inexpensive platform to design from.
However, there are challenges on this path as well. Smartphones and cars have vastly different design cycles (9-18 months vs. 3+ years, respectively). Because of this, a smartphone processor platform can reach end of life before the automotive system designed around it even reaches the market. Video is also an issue. Smartphones generally have two video inputs, to support the front and rear cameras. An AIS or ADAS solution, by contrast, could be receiving and analyzing video from four or more cameras or other sensors.
The desire to use more powerful processors to power advanced automotive systems looks set to continue, as more cameras and more screens appear inside the car over the next decade. Given the benefits, it seems likely that the car manufacturer will continue to try to adapt existing mobile processors to meet this need. Because of this, there is a need for flexible solutions that can help adapt the mobile processor as an application processor for the automotive market.
FPGAs are particularly well suited for these sorts of bridging applications. Modern FPGAs provide the needed degree of flexibility while still maintaining competitive cost, low power consumption, and powerful features. In particular, FPGAs can help resolve the mismatch in video inputs, outputs, and connectivity in the modern automobile, reducing cost and time to market. A variety of FPGAs and ASSPs from Lattice Semiconductor are available in automotive grade, including AEC-Q100 qualification on select solutions. These solutions will be increasingly important as the AIS of today becomes the foundation of the ADAS driving the automated car of the future.
When the Tesla Model S was introduced in 2012, one of its many innovative features was the large 17-inch infotainment touchscreen in the center console. Also notable was the lack of physical controls. Outside of the core driving components, everything else (HVAC, entertainment, and information) had been pushed to this one giant touch display.
While the Tesla console screen is remarkable for its all-encompassing size, it is a natural evolution of display proliferation inside the car cabin. Video displays first appeared in the center console as part of the car’s entertainment system. From there they expanded to the headrests to provide rear-seat entertainment. Today they have spread further, to digital instrument clusters and head-up displays.
Like displays, cameras first showed up in the car for a single purpose: to help drivers see what was behind them while backing up. Since then their role has expanded. In today’s car, cameras can be used to analyze a car’s surroundings, with the resulting information overlaid as augmented reality on a Head-Up Display (HUD). In some models, the side mirrors are beginning to be replaced by cameras tucked into the car’s body, reducing wind resistance and improving fuel efficiency.
As the number of cameras and screens connected to the AIS has proliferated, it has become harder to adapt the mobile processor platforms to these systems. Most smartphones have two cameras, a primary rear-facing camera, and a front-facing camera for videoconferencing. But when applied to the car space, with its multiple cameras, the two inputs available are insufficient.
A second challenge is dealing with the video interfaces. Mobile phone processors generally have a single DSI output for display; however, screens in the automotive space largely use LVDS, which many mobile processors do not support.
Lattice Infotainment Solutions
FPGAs help automotive infotainment systems adapt to these challenges by preprocessing video signals for custom resolutions, bridging the various interfaces throughout the car, and functioning as a serializer/deserializer (SERDES), allowing multiple video screens to be driven by a single video output.
CrossLink is the world’s first programmable ASSP (pASSP), a powerful solution that allows multiple camera or sensor inputs to be aggregated into a single high-speed input to the application processor. It supports MIPI CSI-2 and DSI, SubLVDS, LVDS, HiSPi, and LVCMOS inputs and outputs. It can also serve as a deserializer, allowing video from a single source to be distributed to many screens. This fusion of pASSP and PLD capabilities allows for maximum flexibility and integration.
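Conceptually, this aggregation resembles multiplexing frames from several sensors onto one link using the CSI-2 virtual-channel mechanism (CSI-2 defines four virtual channels). The Python sketch below models that idea only; the function names and frame representation are illustrative and are not Lattice’s implementation.

```python
# Conceptual model of CSI-2 virtual-channel aggregation: frames from
# several sensors are tagged with a virtual-channel ID and interleaved
# onto a single output link. Names here are illustrative, not CrossLink
# internals.

def aggregate_frames(sensor_frames):
    """Interleave per-sensor frame lists into one tagged stream.

    sensor_frames: one list of frames per sensor (max 4, matching the
    four CSI-2 virtual channels).
    Returns a list of (virtual_channel, frame) tuples.
    """
    assert len(sensor_frames) <= 4, "CSI-2 defines only 4 virtual channels"
    stream = []
    # Round-robin across sensors so no camera starves the link.
    for frames in zip(*sensor_frames):
        for vc, frame in enumerate(frames):
            stream.append((vc, frame))
    return stream

def demultiplex(stream, vc):
    """Application-processor side: recover one sensor's frames by VC."""
    return [frame for channel, frame in stream if channel == vc]

# Two cameras, three frames each
cams = [["L0", "L1", "L2"], ["R0", "R1", "R2"]]
link = aggregate_frames(cams)
print(demultiplex(link, 1))  # frames from the second camera
```

On the real link, the virtual-channel ID travels in each CSI-2 packet header, which is what lets a single receiver port recover the individual streams.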
ECP5 FPGAs are mixed-interface bridges that can serve as a video bridge between the mobile application processor’s DSI or FPD-Link output and the LVDS or embedded DisplayPort (eDP) input of most automotive displays.
ECP5 can be used in a variety of other infotainment use cases as well, including splitting a single video output for dual rear-seat displays, and cropping and formatting video for a specific video resolution.
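As a rough illustration of the cropping step, the sketch below center-crops a frame to a display’s native resolution. The frame model and names are hypothetical and are not ECP5 IP.

```python
# Illustrative sketch of a crop-and-format step a video bridge might
# perform: reduce an incoming frame to the display's native resolution
# by center-cropping. A frame is modeled as a list of pixel rows.
# Hypothetical example, not Lattice IP.

def center_crop(frame, target_w, target_h):
    src_h, src_w = len(frame), len(frame[0])
    assert target_w <= src_w and target_h <= src_h
    x0 = (src_w - target_w) // 2
    y0 = (src_h - target_h) // 2
    return [row[x0:x0 + target_w] for row in frame[y0:y0 + target_h]]

# 4x6 test frame of (row, col) markers, cropped to 2x2
frame = [[(y, x) for x in range(6)] for y in range(4)]
cropped = center_crop(frame, 2, 2)
```

In hardware this would be a line-buffered streaming operation rather than a whole-frame transform, but the addressing arithmetic is the same.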
On the video input side, Lattice Semiconductor provides a number of ASSP solutions for the automotive market that help connect phones to the car over MHL or HDMI and distribute that content around the car.
ADAS refers to a variety of new technologies that help the driver on the road. These systems utilize an array of advanced sensors that alert the driver to dangers up ahead and, in some cases, automatically compensate for changing road conditions.
While mass-market systems today go no further than assistance, companies from Tesla to Google are already testing semi-autonomous and fully autonomous systems that one day soon will bring passengers safely from a starting point to their destination with minimal input from the driver.
ADAS depends upon sensors that help it take in data about the world around it. These can include cameras, radar, LIDAR, and others. Whereas in older cars a camera was primarily used to assist in backing up, in an ADAS system the video from an array of cameras is seamlessly transformed into a bird’s-eye view of the car and its surroundings. Cameras can also be used in conjunction with radar sensors for parking assistance and automatic braking, or for drive recording to a “black box” that allows insurance agents to investigate the aftermath of an accident.
Many of these systems depend on multiple cameras, requiring data from those cameras and other sensors to be seamlessly aggregated in order for the system to properly analyze its surroundings.
In order to more accurately analyze a car’s surroundings, ADAS cameras will move to higher resolutions, frame rates, and color depths. This will allow systems to analyze more data, but will also require significantly more bandwidth. Whereas a basic rearview camera system still relies on low-resolution analog connections, ADAS cameras require higher-speed digital interfaces, such as CSI-2.
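The bandwidth jump can be made concrete with back-of-the-envelope arithmetic (bandwidth ≈ width × height × frame rate × bits per pixel). The figures below are illustrative examples, not the requirements of any particular system.

```python
# Back-of-the-envelope uncompressed video link bandwidth:
# resolution x frame rate x bits per pixel. Example figures are
# illustrative only.

def bandwidth_mbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e6

# Legacy rear-view camera: roughly VGA at 30 fps, 16-bit YUV 4:2:2
legacy = bandwidth_mbps(640, 480, 30, 16)    # ~147 Mbps
# ADAS-class camera: 1080p at 60 fps, 12-bit raw
adas = bandwidth_mbps(1920, 1080, 60, 12)    # ~1493 Mbps
print(f"legacy ~{legacy:.0f} Mbps, ADAS ~{adas:.0f} Mbps")
```

Even this modest example is a roughly 10x increase per camera, which is why high-speed serial interfaces such as CSI-2 replace analog links in these systems.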
Here, as in the infotainment space, the limited I/Os of a mobile processor mean that bridging solutions are required to adapt it to the needs of the automotive space. FPGA solutions can aggregate data from multiple cameras and other sensors into a single camera input on the processor. More powerful FPGAs can help preprocess data and even control sensors as well.
Lattice ADAS Solutions
As in the infotainment space, CrossLink provides low-power, low-cost sensor and camera interface bridging. This allows data from multiple image sensors to be aggregated and delivered through CrossLink to the application processor over a single CSI-2 interface. The small footprint of CrossLink allows it to be placed closer to the sensor, adding flexibility to the manufacturing design.
In the future, CrossLink could be used in upcoming radar systems for ADAS, where CSI-2 is also the interface of choice. Radar sensors are deployed in multiple topologies in conjunction with imaging to enhance ADAS functions. Once again, CrossLink can help by converting the CSI-2 stream to an input that the application processor can accept.
In addition to data bridging and aggregation, the more powerful and capable ECP5 can act as an intelligent hub for controlling and aggregating multiple sensors of different types.
In addition, the ECP5 can perform image signal processing before sending the image data to the application processor, lowering the processing power needed in the application processor to perform these functions and improving overall performance in the car.
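As a simple illustration of the kind of work that can be offloaded, the sketch below performs a 2x2 box-filter downscale on a grayscale frame before it would be handed to the application processor. It is a conceptual example of one common preprocessing step, not Lattice’s ISP pipeline.

```python
# Illustrative preprocessing offload: downscale a grayscale frame by
# averaging each 2x2 block, quartering the data the application
# processor must handle. Conceptual sketch, not Lattice's ISP pipeline.

def downscale_2x2(frame):
    """Average each 2x2 block of a grayscale frame (list of rows)."""
    h, w = len(frame), len(frame[0])
    assert h % 2 == 0 and w % 2 == 0
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            s = (frame[y][x] + frame[y][x + 1]
                 + frame[y + 1][x] + frame[y + 1][x + 1])
            row.append(s // 4)
        out.append(row)
    return out

frame = [[0, 4, 8, 12],
         [0, 4, 8, 12]]
small = downscale_2x2(frame)
print(small)  # [[2, 10]]
```

Performing such steps in FPGA fabric, where each 2x2 block can be reduced in a streaming pipeline, is what frees the application processor’s cycles for higher-level analysis.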
The flexibility of FPGAs has already proven invaluable in many industries. In the automotive space, FPGAs can help bridge the gap between the needs of advanced entertainment and safety systems and the mobile processors on which they run. This business model has distinct advantages, allowing automotive manufacturers to use mass-market-proven products from the smartphone space and rapidly adapt them to the changing needs of the car landscape. FPGAs may yet prove useful in other areas of the car as well; in other industries they are beginning to show up in applications such as motor control, a role they could also fill in the automotive space.
One thing is clear: as car electronics continue to advance, and especially as ADAS and other efforts move the industry toward a driverless future, more and more sensors will need to be integrated into these systems, and the need for FPGAs to provide the flexibility to adapt cameras, sensors, video, and higher-speed connectivity to the changing needs of the market will only grow.