Thermal Imaging in Advanced Driver Assist Systems (ADAS)

INTRODUCTION

The advent of Advanced Driver Assist Systems (ADAS) will revolutionize how we travel and transport goods by road, and it promises to do so while improving safety. ADAS has the potential to give all age groups greater freedom of movement, to improve the efficient movement of vehicles, and to reduce accident rates by removing human error and the ever-increasing threat of distracted driving.

Humans will be replaced as the decision makers behind the wheel of a car. Human senses will be substituted by a suite of sensors that not only replace, but also augment, our current capabilities, allowing driving in all types of weather and in all environments. Sensors are advancing, but no single sensor can provide safe driving on its own. A suite of complementary and orthogonal sensors must be selected to optimize driving performance across all conditions, providing critical information and redundancy to ensure safety at all times. The typical sensor suite includes radar, LIDAR, ultrasound, and visible cameras. This paper discusses the value of adding thermal cameras to that suite because of their unique strength: detecting and classifying humans and other living things in our cluttered driving environment.

WHY MULTIPLE SENSORS IN A TYPICAL ADAS ENVIRONMENT?

Driving conditions for ADAS can vary greatly, and the chosen suite of sensors must not only operate in city, rural, and highway environments, but also perform equally well in all weather conditions, day or night. Each driving environment comes with its own challenges and requirements.

To drive autonomously requires a set of sensors that not only detect reliably, but also have built-in redundancy in case of sensor failure. No one technology embodies all aspects needed for fully autonomous driving. Each sensor can be used individually for detection, or "fused" for decision making within a central electronics control unit (ECU), the vehicle's central computer. Elements of a typical sensor suite can include long-range radar, LIDAR, visible cameras, short- to medium-range radar, and ultrasound, as shown in Figure 1.
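As a loose illustration of the "fusion" idea (not any production ECU interface), a fusion step can merge detections from different sensors when they refer to the same region ahead of the vehicle. All field names, values, and the bearing-matching rule below are illustrative assumptions:

```python
# Toy fusion step: a radar return contributes range and closing speed,
# a camera detection contributes a class label; when their bearings
# roughly agree we merge them into one fused object.

def fuse(radar_hits, camera_dets, bearing_tol_deg=2.0):
    """Pair up radar and camera detections whose bearings agree."""
    fused = []
    for r in radar_hits:
        for c in camera_dets:
            if abs(r["bearing"] - c["bearing"]) <= bearing_tol_deg:
                fused.append({
                    "bearing": r["bearing"],
                    "range_m": r["range_m"],
                    "closing_mps": r["closing_mps"],
                    "label": c["label"],
                })
    return fused

radar_hits = [{"bearing": 1.0, "range_m": 42.0, "closing_mps": 3.5}]
camera_dets = [{"bearing": 0.5, "label": "pedestrian"}]
print(fuse(radar_hits, camera_dets))
# -> one fused object: a pedestrian at 42 m, closing at 3.5 m/s
```

A real ECU would track objects over time and weight each sensor by its current reliability; this sketch only shows the association step.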

Autonomous driving systems use several methods, but the core approach is to detect and subsequently classify objects in order to determine a course of action for the vehicle to take. Radar and LIDAR systems can detect objects in or around the road, but need a camera to classify and identify those objects with confidence for full situational awareness.

Figure 1: Typical sensor suite plus FLIR Thermal IR imaging

Radar and LIDAR systems generate a point-density cloud from the reflections they gather and calculate an object's range and closing speed, but they have lower resolution than a camera. Mechanical scanning LIDAR units deliver dense point clouds, but their long-term reliability in automotive conditions has not been proven. Solid-state units offer increased robustness at potentially lower cost, but place fewer points on target, especially when viewing objects from farther away. To generate the amount of data needed for object classification in a cost-effective and reliable (redundant) solution, radar or LIDAR will most likely need to be fused with the output from a visible camera for most daylight conditions, and a thermal camera for nighttime and challenging daylight conditions.

These systems work together and can detect an object by the side of the road. However, as noted, classification is also a vital part of any detection capability: the object may be merely a small tree that will not move, or a person who may indeed start moving. Classification becomes more challenging in poor lighting, at night, and in bad weather, which are precisely the conditions where thermal cameras are strongest. Table 1 shows the uses and strengths of each sensor type.

BENEFITS OF THERMAL IN ADAS

Radar and LIDAR are typically used for object detection, ranging, and mapping, while a camera is needed for object classification, speed-sign reading, and red-light detection. Adding a thermal camera to a suite of ADAS sensors improves the driving system's situational awareness.

Thermal cameras or sensors detect differences in the relative intensities of the infrared energy being emitted or reflected from objects. This emitted infrared energy is around us all the time, and has nothing to do with the amount of visible light available. Thermal imaging is far more than just another kind of night vision: it sees heat, not light, and it does so 24 hours a day. Anything that creates heat can be seen with thermal. People and animals all have individual heat signatures that a thermal camera can see.
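As a rough physical illustration of why warm bodies stand out, the Stefan-Boltzmann law gives the total power a surface radiates as a function of its temperature. The temperatures and emissivity below are illustrative assumptions, not FLIR sensor specifications:

```python
# Sketch: why a person stands out against cool pavement to a thermal
# sensor. Stefan-Boltzmann law: M = eps * sigma * T^4  (W/m^2).

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiant_exitance(temp_k, emissivity=0.95):
    """Total power radiated per unit area of a grey-body surface."""
    return emissivity * SIGMA * temp_k ** 4

skin = radiant_exitance(305.0)  # ~32 C skin surface (assumed)
road = radiant_exitance(288.0)  # ~15 C night-time pavement (assumed)

contrast = (skin - road) / road
print(f"skin: {skin:.0f} W/m^2, road: {road:.0f} W/m^2, "
      f"contrast: {contrast:.1%}")
```

Even a 17 K temperature difference yields roughly a 25% difference in radiated power, a contrast that exists regardless of how much visible light is present.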

The military and law enforcement applications of infrared imaging have been known and proven for decades. Thermal cameras see through smoke, so they are a proven tool for firefighting and search-and-rescue operations. Because their ability to see clearly is not related to the amount of light available, they can see clearly even when looking directly into the sun when it is low on the horizon. Even if there is dense smoke or smog in the air and the sun is at a blinding angle, a thermal camera will still detect objects clearly.

Recent technological advances have made thermal imagers smaller, lighter, and more affordable for commercial and consumer applications. This opens a whole new range of uses, including ADAS, since thermal imagers are perfectly suited to detecting the heat differences that give away the presence of people or animals, seeing through smoke and dust, and seeing in complete darkness.

THERMAL DETECTS PEOPLE IN A WIDE VARIETY OF LIGHTING CONDITIONS

Awareness of each vehicle’s position, direction, and speed in relation to other traffic is critical to the successful implementation of ADAS. These factors can vary greatly depending on the type of driving being done and on the specific roadway. In all driving conditions, the power of thermal sensors to see heat, combined with simple object-recognition algorithms, makes them uniquely qualified to detect and classify people and animals in a cluttered environment.

BENEFITS OF THERMAL

• Ability to clearly see people and animals, even in a cluttered environment – hot objects stand out

• See clearly even in challenging lighting conditions – not affected by darkness, smoke, or sunlight glare

• Redundancy and increased confidence in detection/classification when used with a visible camera – orthogonal detection

• Reliability – proven by over 10 years of use in the automotive industry
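The "hot objects stand out" property is what makes simple object-recognition algorithms effective on thermal imagery. As a minimal sketch (with a made-up 8-bit frame standing in for real sensor output), a single intensity threshold already isolates a warm region from a cool background:

```python
# Minimal sketch: threshold a synthetic 8-bit "thermal frame" and
# report the bounding box of the hot pixels. Real ADAS classifiers are
# far more sophisticated; values here are purely illustrative.

def hot_region_bbox(frame, threshold=200):
    """Return (row_min, row_max, col_min, col_max) of pixels at or
    above threshold, or None if nothing in the frame is that hot."""
    hot = [(r, c) for r, row in enumerate(frame)
                  for c, v in enumerate(row) if v >= threshold]
    if not hot:
        return None
    rows = [r for r, _ in hot]
    cols = [c for _, c in hot]
    return min(rows), max(rows), min(cols), max(cols)

frame = [[60] * 12 for _ in range(8)]   # cool background
for r in range(2, 6):
    for c in range(4, 7):
        frame[r][c] = 230               # warm "pedestrian" blob

print(hot_region_bbox(frame))           # -> (2, 5, 4, 6)
```

In a visible image the same pedestrian might be camouflaged by shadows or clutter; in the thermal domain the separation is a simple intensity gap.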

Highway driving can involve high speeds or stop-and-go traffic, so ADAS systems need to detect vehicles, other objects, and hazards at both short and long ranges. Rural driving can involve lower speeds, but a greater likelihood of animals and other natural hazards on the roadway. There are over one million deer strikes in the US every year, costing several billion dollars in damage. Driving at night in rural areas depends heavily on headlights, as roads are typically poorly lit. Rural roads are also not as predictably straight as highways, so maintaining wide situational awareness in poor lighting conditions is a challenge.

City driving may be the most challenging, as it requires awareness not only of other cars but also of traffic lights, pedestrians, and bicyclists. Lighting conditions can also impact the ability to detect hazards. In daytime, the oblique angle of the sun and the resulting shadows may effectively hide pedestrians, while driving at night into oncoming headlights can also hamper the detection of pedestrians and road hazards.

SYSTEM REDUNDANCY

In selecting the suite of sensors to be used in ADAS applications, redundancy is a key requirement. While each sensor type brings its own strengths and weaknesses, the way they complement each other – reinforcing strengths and compensating for weaknesses in certain conditions – helps designers create a system that is robust enough to provide the complete situational awareness required for autonomous driving.

Figure 7: Thermal sensors are not blinded by oncoming headlights and can detect and classify people and animals in total darkness.

For more information about thermal imaging cameras, visible cameras, recorders, video management systems, or about this specific application, please visit: flir.com/automotive

By their very nature, many of the sensors in an ADAS system create redundancy simply because they operate in different areas of the electromagnetic spectrum. If the LIDAR system has mud or dirt blocking the receiving unit, the radar system is available as a backup. If lighting conditions (glare, nighttime) cause problems for the visible camera, the thermal camera can be used as the backup. This is a truly redundant ADAS system.
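The fallback behaviour described here can be sketched as a priority list per sensing function, where the first healthy sensor is selected. The sensor names, function names, and health flags below are illustrative assumptions, not any real ECU interface:

```python
# Hedged sketch of sensor fallback: each sensing function has an
# ordered list of sensors, and the highest-priority healthy one wins.

FALLBACKS = {
    "ranging":        ["lidar", "radar"],            # mud on LIDAR -> radar
    "classification": ["visible_camera", "thermal_camera"],  # glare -> thermal
}

def pick_sensor(function, healthy):
    """Return the highest-priority healthy sensor for a function,
    or None if every sensor for that function is down."""
    for sensor in FALLBACKS.get(function, []):
        if sensor in healthy:
            return sensor
    return None

# Scenario from the text: LIDAR blocked by mud, visible camera blinded.
healthy = {"radar", "thermal_camera"}
print(pick_sensor("ranging", healthy))         # -> radar
print(pick_sensor("classification", healthy))  # -> thermal_camera
```

A production system would base the health set on continuous self-diagnostics rather than a static flag, but the selection logic follows the same shape.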

SYSTEM INTEGRATION – COST, RELIABILITY AND PACKAGING

Broader adoption of autonomous driving solutions will require not only a technologically reliable solution, but also one that is cost-effective and can be incorporated well into the car. Ultrasonic sensors, radar, and visible cameras are already widespread; these technologies are reliable, and economies of scale have made them affordable. In addition, they can be placed behind the windscreen or within the bumpers of cars for elegant packaging solutions.

In the case of LIDAR, current systems are typically costly, not very reliable, and difficult to incorporate elegantly into a car. Because of the terrific potential of LIDAR in ADAS applications, many efforts are currently underway to create smaller solid-state units that should lower costs and increase reliability.

Thermal cameras have been used in automotive night vision applications for pedestrian and animal detection for over 10 years. They have proven extremely reliable, but due to the size of the detector and the relatively low volume of units produced, the cost of the system has relegated them to use in luxury brands and models. Recent technological developments have driven down the cost and size of the technology. One such example is the new 12μm sensor from FLIR that is used in the new Boson camera. It is cost effective and allows for elegant packaging within the car.

CONCLUSION

Autonomous driving systems need to deploy a sensor suite that is robust, high-performance, and redundant enough to deliver safety in the widest possible range of driving environments. The primary shortfall of current systems is the ability to detect and classify people and animals in challenging lighting conditions.

Thermal imagers are proven technology, and have helped drivers see up to four times farther than their high beams for more than a decade – day or night, through smoke and haze, and past the glare of oncoming headlights. This same thermal imaging technology is the best sensor solution for detecting pedestrians, cyclists, and animals in cluttered environments, giving ADAS integrations the critical information they need to make accurate, automated decisions. Their recent reductions in size and cost make them indispensable components of future ADAS systems.

For more information about using thermal imaging for ADAS, please visit flir.com/automotive.

 
