Changing the Game in Machine Vision for Autonomous Vehicles

Autonomous vehicles represent one of the most exciting and promising technologies to emerge in the last decade, offering the potential to disrupt the economics of transportation. There is plenty of hype, but there can be no doubt that the emergence of fully autonomous vehicles will have a huge impact on people’s lives and the global economy. Strategy Analytics, in a 2017 report published by Intel, forecasts that Level 5 autonomous vehicles will be in widespread use by 2035, creating an economic powerhouse that will generate $7 trillion in revenue by 2050.

Among the many technologies needed to build fully autonomous driving systems, sensors play a central role. They are the eyes and ears of the car, allowing it to build an accurate model of its surroundings from which driving decisions can be made. At Intrinsix, we believe that sensors will play an enabling role in achieving the ultimate goal of autonomous driving: fully driverless vehicles, as defined by SAE Level 5 operation (see the box above for a discussion of the six levels of autonomy). Through a combination of patents and technology partnerships, we are developing a new type of sensor that will change the game in machine vision for autonomous vehicles. This article discusses the limitations of current sensors and how the new sensors being developed by Intrinsix will break through those limitations.

Current Autonomous Vehicle Sensors

There are three types of sensor systems currently employed in self-driving cars, based on cameras, LiDAR, and RADAR. While these systems have enabled impressive advances in autonomous technology, no car manufacturer has yet achieved Level 4 autonomy or higher in a production vehicle. At least one manufacturer has introduced a car that claims Level 3 automation (at a high price point), but Level 5 is still years away. One of the most important barriers to full Level 5 autonomy is the lack of cost-effective sensors capable of reliably mapping objects in the environment, both animate and inanimate, under all weather conditions.

Recognizing animate objects is especially important when a collision is unavoidable. When confronted with an imminent impact, a driverless vehicle must make an instantaneous decision about which object to hit. The ability to distinguish a child from a trash can is a matter of life or death. Making this kind of instantaneous decision is difficult enough in good weather; with existing sensor technology, it is completely unreliable in bad weather.

The Limitations of Current Sensor Technology

Intrinsix, in conjunction with several technology partners, is working on sensors to break this impasse. We are addressing three key limitations of current camera- and LiDAR-based systems: (1) their inability to see through rain, snow, and fog; (2) their inability to reliably distinguish animate from inanimate objects in bad weather; and (3) their inability to update fast enough for operation at highway speeds. All three capabilities are essential for Level 5 autonomy.

To understand these limitations, consider how LiDAR systems work. They illuminate the environment with pulses of light and use silicon detectors to register the reflected radiation. The resulting signals are then processed by algorithms that construct and categorize images. Current LiDAR systems are forced to illuminate at short wavelengths (under 1.1 microns), a constraint imposed by their silicon detectors. Because water scatters these short wavelengths, rain, snow, and fog impair the system’s ability to form accurate images. This is a serious barrier to full autonomy.
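
To make the detector constraint concrete, the cutoff wavelength follows directly from the bandgap energy of the detector material (wavelength = hc/E). Below is a minimal Python sketch of that calculation; the InGaAs comparison is a generic illustration of longer-wavelength detector materials, not a statement about Intrinsix’s design.

```python
# Why silicon detectors cap LiDAR wavelengths near 1.1 micron:
# photons with energy below the material's bandgap cannot generate
# carriers, so the longest detectable wavelength is h*c / E_gap.

H = 6.626e-34    # Planck constant, J*s
C = 2.998e8      # speed of light, m/s
EV = 1.602e-19   # joules per electron-volt

def cutoff_wavelength_um(bandgap_ev: float) -> float:
    """Longest detectable wavelength, in microns, for a given bandgap."""
    return (H * C / (bandgap_ev * EV)) * 1e6

# Silicon's bandgap of about 1.12 eV gives a cutoff near 1.1 microns,
# forcing silicon-based LiDAR to illuminate at short wavelengths.
print(f"Si cutoff:     {cutoff_wavelength_um(1.12):.2f} um")   # ~1.11 um
# InGaAs (~0.75 eV), used in some 1550 nm LiDAR systems, reaches past 1.5 um.
print(f"InGaAs cutoff: {cutoff_wavelength_um(0.75):.2f} um")   # ~1.65 um
```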

Current camera and LiDAR systems also have significant limitations in their ability to recognize animate objects. Typically, they combine and process data from multiple sensors (e.g., camera and LiDAR) to build a 3D model of the environment. They do well in good weather when a pedestrian is fully visible and walking, but reliability declines when the pedestrian is obscured by other objects, is riding a bicycle, or when snow or fog scatters the LiDAR pulses.
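
As a concrete illustration of what such multi-sensor processing involves, the sketch below shows one generic fusion step: projecting LiDAR points into a camera image so that range data can be attached to camera detections. The calibration matrices here are hypothetical placeholders, not values from any Intrinsix system.

```python
import numpy as np

# A minimal, generic camera/LiDAR fusion step. K (camera intrinsics) and
# T (LiDAR-to-camera extrinsics) are hypothetical calibration values.
K = np.array([[1000.0,    0.0, 640.0],    # fx, skew, cx
              [   0.0, 1000.0, 360.0],    # fy, cy
              [   0.0,    0.0,   1.0]])
T = np.eye(4)                             # assume frames are aligned

def project_points(points_lidar: np.ndarray) -> np.ndarray:
    """Map Nx3 LiDAR points to Nx3 (u, v, depth) pixel coordinates."""
    n = points_lidar.shape[0]
    homog = np.hstack([points_lidar, np.ones((n, 1))])   # Nx4 homogeneous
    cam = (T @ homog.T).T[:, :3]                         # into camera frame
    cam = cam[cam[:, 2] > 0]                             # keep points in front
    uv = (K @ cam.T).T
    uv[:, :2] /= uv[:, 2:3]                              # perspective divide
    return np.column_stack([uv[:, 0], uv[:, 1], cam[:, 2]])

# Example: three points 10-30 m ahead of the sensor.
pts = np.array([[0.0, 0.0, 10.0], [1.0, -0.5, 20.0], [-2.0, 0.2, 30.0]])
print(project_points(pts))
```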

The Solution: Sensor Fusion

Intrinsix is leveraging a portfolio of granted and pending patents that address these limitations. Intrinsix’s new sensor solution, coupled with Artificial Intelligence (AI) in the vehicle, can recognize, categorize, and distinguish living from non-living objects, even under inclement weather conditions. The result is a more reliable, simpler, and more economical system for categorizing objects in the vehicular environment.

Unique Position

Growth in the automotive sensor market will increase dramatically as autonomous vehicles enter the mainstream. Market research by Technavio forecasts a CAGR of more than 114% through 2021, leading to a global market of $25 billion by 2030 (as estimated by Statista).
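
For context on what a triple-digit CAGR implies, compound annual growth multiplies the market by (1 + rate) each year. A minimal sketch, using a placeholder base-year figure rather than a number from either report:

```python
# A CAGR compounds multiplicatively: value_n = value_0 * (1 + rate)**years.
# The $1B base figure is a placeholder for illustration only, not a number
# from the Technavio or Statista reports.

def project(value_0: float, cagr: float, years: int) -> float:
    """Project a value forward at a constant compound annual growth rate."""
    return value_0 * (1.0 + cagr) ** years

base = 1.0  # hypothetical starting market size, in $B
for year in range(1, 5):
    print(f"year {year}: ${project(base, 1.14, year):.1f}B")
# At a 114% CAGR the market more than doubles every year,
# growing roughly 21x over four years (2.14**4 is about 21).
```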

Our portfolio of patents and technology partnerships puts Intrinsix in a unique position to take advantage of this growing market by developing and marketing thermal 3-D sensors. We believe they will be a game changer in the development of autonomous vehicle imaging systems, providing a cornerstone component of Level 5 autonomy.


References

  1. “Fatal Uber crash highlights a blind spot for self-driving cars: Night vision”, Automotive News, May 2018, http://www.autonews.com/article/20180529/MOBILITY/180529761/fatal-uber-crash-highlights-a-blind-spot-for-self-driving-cars:-night-
  2. “Accelerating the Future: The Economic Impact of the Emerging Passenger Economy”, Strategy Analytics and Intel, June 2017, https://newsroom.intel.com/newsroom/wp-content/uploads/sites/11/2017/05/passenger-economy.pdf
  3. “Global Autonomous Vehicle Sensors Market”, Technavio, 2017, https://www.technavio.com/report/global-automotive-electronics-global-autonomous-vehicle-sensors-market-2017-2021
  4. “Three Sensor Types Drive Autonomous Vehicles”, Gert Rudolph and Uwe Voelzke, Sensors Online, 10 Nov 2017, https://www.sensorsmag.com/components/three-sensor-types-drive-autonomous-vehicles
  5. “Projected global market for autonomous driving sensor components from 2015 to 2030”, Statista, https://www.statista.com/statistics/423106/projected-global-market-for-autonomous-driving-sensor-components/