MEMS and sensors for autonomous automotive applications
How will MEMS and sensors enable the evolution of the automotive industry? As part of an industry webinar held in August, STMicroelectronics’ Davide Bruno explained how we’re edging closer to a future of fully autonomous vehicles — and how his company is helping to make that happen.
Bruno revealed that the 2019 total available market (TAM) for the automotive industry is in the range of $35 billion, split between traditional automotive core electronics (around 65%), and digitalisation and electrification (35%). With the current speed of new projects, new developments and new technologies, he said this will soon shift to more than 60% allocated to digitalisation and electrification and less than 40% for the traditional automotive core electronics — a shift that will happen in just 3–5 years.
“The traditional automotive model is changing, so many OEMs and Tier 1s are rethinking the way their new cars will serve the consumer,” Bruno said. “This opens up a lot of opportunities for ST in terms of products, including sensors.
“In the past, the car was a very static element consisting of wheels, an engine and steering, and it was not able to do anything smart. Today, the car can park by itself, understand whether you are crossing the lane or not, and even drive by itself. This is done with sensors. Without sensors, you cannot do these things.”
According to Bruno, ST has identified four growth drivers for the use of automotive sensors: shared mobility and access control; road noise cancellation; 5G; and driving assistance. “[But] it is the driving assistance that is really changing the game,” he said. “It is not a dream anymore — it is a reality. Everybody is working on this.”
Bruno explained that there are several ‘levels’ of autonomous driving — from no automation (level 0) up to full automation (level 5) — with the addition of sensors moving a car from one level up to the next. ST believes that 33% of the cars produced in 2021 will be at levels 2 (partial automation) and 3 (conditional automation), with level 4 viable by the year 2028 and level 5 by the year 2040.
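The levels Bruno describes correspond to the SAE J3016 driving-automation taxonomy. A minimal sketch of that classification (the helper function and its name are illustrative, not part of the standard):

```python
from enum import IntEnum

# SAE J3016 driving-automation levels, as referenced in the article.
class SAELevel(IntEnum):
    NO_AUTOMATION = 0           # the driver does everything
    DRIVER_ASSISTANCE = 1       # steering OR speed control assists the driver
    PARTIAL_AUTOMATION = 2      # steering AND speed combined; driver supervises
    CONDITIONAL_AUTOMATION = 3  # system drives; driver must take over on request
    HIGH_AUTOMATION = 4         # no driver needed within a defined operating domain
    FULL_AUTOMATION = 5         # no driver needed anywhere

def driver_must_supervise(level: SAELevel) -> bool:
    """At levels 0-2 the human driver is still responsible for monitoring
    the environment; from level 3 upward the system takes that role."""
    return level <= SAELevel.PARTIAL_AUTOMATION

print(driver_must_supervise(SAELevel.PARTIAL_AUTOMATION))      # True
print(driver_must_supervise(SAELevel.CONDITIONAL_AUTOMATION))  # False
```

The key boundary in Bruno's roadmap is between levels 2–3 (driver still in the loop) and levels 4–5 (system fully responsible), which is where the sensor-fusion demands he describes next come in.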
“To move to level 4 and level 5 … we need more fusion between different subsystems including radar, LiDAR and so on,” Bruno said. “The LiDAR system can’t work well in rain, dust or fog; it only defines objects well when the weather conditions are good. Major subsystems must coexist together.”
Furthermore, he said, “Each sensor must have the capability of post-processing, locally in the sensor system … Instead of sending information without any interpretation or any post-processing to the central unit, we need to have smart sensor processing as much as possible. We believe the future for ADAS (advanced driver-assistance systems) is smart sensors, not just more sensors.”
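The 'smart sensor' idea — interpreting data locally rather than streaming every raw sample to the central unit — can be sketched as follows. This is a hypothetical illustration of the concept, not ST's actual firmware; the class name, window size and threshold are all assumptions:

```python
from statistics import mean

class SmartAccelerometer:
    """Hypothetical sketch: smooth raw samples locally and report only
    interpreted events to the central ECU, instead of the raw stream."""

    def __init__(self, window: int = 4, threshold_g: float = 2.0):
        self.window = window            # moving-average window (samples)
        self.threshold_g = threshold_g  # acceleration that counts as an event
        self.samples = []
        self.events = []                # what actually gets sent upstream

    def ingest(self, raw_g: float) -> None:
        self.samples.append(raw_g)
        if len(self.samples) >= self.window:
            smoothed = mean(self.samples[-self.window:])
            if smoothed > self.threshold_g:
                # One interpreted event replaces many raw readings.
                self.events.append(("hard_acceleration", smoothed))

sensor = SmartAccelerometer()
for g in [0.1, 0.2, 0.1, 0.3, 2.5, 2.8, 2.6, 2.7]:
    sensor.ingest(g)
print(len(sensor.events))  # 2 — only the samples above threshold become events
```

Eight raw samples produce just two upstream events, which is the bandwidth and processing saving Bruno is pointing at.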
The use of sensors in cars is by no means a new trend, with Bruno revealing that the first wave of sensor adoption began with the airbag in 1974. Indeed, Bruno claimed that we are probably not as good at driving as we like to think, as electronic stability control (ESC) has been standard in cars for many years now.
“The ESC system … corrects our driving mistakes for avoiding accidents, yet we do not even perceive it,” he said. And with car makers now also seeking to implement sensors for rollover detection and stabilisation, the packaging of such sensors will be important for ensuring their performance.
“A ceramic package gives us much more stability and better linearity,” Bruno said.
Other sensors enable what Bruno calls ‘non-safety’ applications, including navigation and entertainment. This area is ST’s speciality, he said, due to the company’s simple, streamlined product portfolio that enables the same devices to cover multiple applications — from key access to detecting wheel vibration.
But how will the company fare in level 5, where we sit inside a vehicle but do not actually drive it? According to Bruno, for this we need a better understanding of a sensor’s accuracy, stability and linearity.
“There is not so much difference between a sensor which is an inertial measurement unit (IMU) used in a smartphone and the sensor you install in your car to achieve level 5 autonomous driving,” he said. “What is important, and what is more challenging, is the accuracy, linearity and stability ... At level 5 there is no margin of error, so there is no possibility for mistakes.”
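Linearity, one of the specifications Bruno names, is commonly quoted on sensor datasheets as the worst-case deviation of the output from a best-fit straight line, expressed as a percentage of full scale. A small illustration of that metric (the function name and the sample data are hypothetical):

```python
def nonlinearity_pct_fs(inputs, outputs):
    """Worst-case deviation from a least-squares straight-line fit,
    as a percentage of the full-scale output span."""
    n = len(inputs)
    mx, my = sum(inputs) / n, sum(outputs) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(inputs, outputs))
             / sum((x - mx) ** 2 for x in inputs))
    intercept = my - slope * mx
    worst = max(abs(y - (slope * x + intercept))
                for x, y in zip(inputs, outputs))
    full_scale = max(outputs) - min(outputs)
    return 100.0 * worst / full_scale

# Example: a slightly bowed response over a 0-10 g input range.
xs = [0, 2, 4, 6, 8, 10]
ys = [0.0, 2.1, 4.15, 6.1, 8.05, 10.0]
print(round(nonlinearity_pct_fs(xs, ys), 2))
```

A consumer-grade part might tolerate a figure like this; the point of Bruno's "no margin of error" remark is that level 5 pushes such deviations toward zero.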
How will this be achieved? For manufacturers working to level 3 today, Bruno said, the approach is to put together several devices and operate them in a way that minimises errors. But this is very expensive and very complex, because software is required to synchronise all the devices’ signals and verify all the information.
“So the future clearly is to have one single 6-axis or x-axis system, where x can be four, five or six DOF (degrees of freedom), plus what is called functional safety,” Bruno said.
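The multi-device approach Bruno describes — cross-checking several redundant sensors and discarding readings that disagree — can be sketched very simply. This is illustrative logic only, not ST's actual fusion algorithm; the function name and tolerance are assumptions:

```python
def fuse_redundant_readings(readings, max_dev=0.5):
    """Average only the readings that agree with the median,
    discarding outliers from a faulty or glitching device."""
    med = sorted(readings)[len(readings) // 2]  # median of an odd-length list
    agreeing = [r for r in readings if abs(r - med) <= max_dev]
    return sum(agreeing) / len(agreeing)

# Three redundant gyros report yaw rate in deg/s; one has glitched.
print(fuse_redundant_readings([10.1, 9.9, 42.0]))  # 10.0 — outlier rejected
```

Folding this redundancy, synchronisation and self-checking into a single functionally safe x-axis device is exactly the consolidation Bruno says is the future.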
Bruno clarified that much of the cost of a sensor lies in the time required for calibration. He gave the example of the aviation industry, with aeroplanes already sitting comfortably at level 4 or 5 of automation.
“To enable this function, you need a system that costs $150,000, which will take one week to be calibrated,” he said. “The sensor itself is not so different from a standard sensor. But the calibration means the sensor plus the ASIC, plus all the firmware and the way it is calibrated one by one.”
The cost of such a system would obviously have to come down in order to be used in an entry-level car, Bruno said. This is what ST is currently working on, in the form of a device called the ASM330CHH — whose closest competitor costs around $10,000 and requires 1–2 days of calibration.
“This is the challenge, which is to provide affordable sensors for the inertial platform,” Bruno said. “This is what we are working on now, not only with our internal team, but also with key experts and leaders on the packaging and defining methodologies for the calibration and testing.”
This isn’t the company’s only foray into autonomous vehicles. ST also has a range of high-sensitivity image sensors that, when combined with the proper processing and systems, could be used to monitor in-cabin alertness based on the driver’s facial expressions. So if the driver is noticeably drowsy or distracted, the system could enable driver assistance functions or encourage the driver to pull over and take a rest.
The company is also working in cooperation with LeddarTech on the development of a LiDAR evaluation kit, which will include ST’s MEMS mirror-based laser-beam scanning solutions. The kit is being developed to target automotive LiDAR applications for high-speed highway driving, as well as industrial and robotics LiDAR applications, with ST’s mirrors acting as actuators within the LiDAR system. These mirrors are produced using a high-volume semiconductor process, Bruno said, which means they end up being very small, reliable and cost-effective.
Bruno concluded by acknowledging that the cars of the future will need more and more sensors. But this does not necessarily mean we need to develop new sensors, as current technologies including the accelerometer, gyroscope and microphone should be sufficient. We just need new ways to use these sensors.
So how will companies such as ST help achieve levels 4 and 5 of automation? According to Bruno, “We need to work with key experts and take specific steps, like the ceramic packaging or enhance the calibration, to develop this system with very solid and robust technology at an affordable cost. The combination of these will speed up the adoption of MEMS in automotive applications.”