Automotive Night Vision and Thermal Sensing: An Introduction to Advanced Technology
Seeing in complete darkness or in inclement weather with only headlights is a challenge for human vision. The ability to recognize obstacles that headlights can barely illuminate, such as a pedestrian in dense fog or heavy rain, helps the vehicle and driver identify and avoid potential collisions before they occur. Object identification is vital to the safety of the vehicle’s occupants and those around it. Night vision and thermal imaging provide a clearer picture of the environment around the vehicle in suboptimal conditions. Adding this type of technology to the sensor array of an ADAS gives the vehicle’s ECMs more information with which to make better decisions about keeping the vehicle operating safely. With the emergence of AI into the mix, the speed at which potential outcomes can be identified and predicted further enhances the vehicle’s safety systems in real time.
How Do Night and Thermal Vision Operate?
To understand night and thermal vision, you first must understand how the light spectrum is organized. From shortest to longest wavelength, the electromagnetic spectrum runs from gamma rays through X-rays, ultraviolet, visible light, infrared (including near-infrared), microwaves, and radio waves. Infrared wavelengths are measured on the micrometer (µm) scale, equal to one-millionth of a meter. In comparison, microwaves have wavelengths of roughly a centimeter, and the radio waves used for television have wavelengths greater than a meter. The human body cannot see, hear, or feel most of these wavelengths directly, so understanding what is happening in this part of the spectrum requires external equipment to translate it into a medium the human body can comprehend. Visible light occupies the range of 400-700 nm (Sundermeier et al., 2022); anything above or below that range is not visible to the naked eye.
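As a rough illustration, the band boundaries mentioned above can be written as a simple lookup. This is a sketch only: the exact boundary values are illustrative, and different sources draw the band edges slightly differently.

```python
# Approximate wavelength bands, in micrometers (um).
# Boundaries are illustrative; sources draw them slightly differently.
BANDS = [
    ("visible",             0.40, 0.70),  # 400-700 nm, seen by the naked eye
    ("near-infrared (NIR)",  0.75, 1.4),  # used by active night vision
    ("long-wave IR (LWIR)",  7.0, 14.0),  # used by passive thermal imaging
]

def classify(wavelength_um):
    """Return the band name for a wavelength given in micrometers."""
    for name, lo, hi in BANDS:
        if lo <= wavelength_um <= hi:
            return name
    return "outside the bands discussed here"

print(classify(0.55))   # green visible light -> "visible"
print(classify(10.0))   # typical thermal camera wavelength -> "long-wave IR (LWIR)"
```

The gap between 1.4 µm and 7 µm is real: mid-wave infrared exists, but it is not one of the bands this article discusses.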
Outside the visible band, the wavelength of the light is not one the human eye can comprehend on its own. When discussing night vision, the use of light amplification and thermal signatures underpins how the technology operates. As the name “thermal vision” implies, it uses the heat signature of an object to discern it from the surrounding environment in the absence of visible light. Assigning colors based on the temperature of each surface lets the receiver generate an image of the temperature differences, which the human eye can then distinguish as an object (Rivera Velázquez et al., 2022). Night vision, by contrast, uses light amplification to view the image in front of the camera: enhancing traces of ambient light across the spectrum can turn darkness into an environment the human eye can view.
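The color-differential idea can be sketched in a few lines. The palette names and temperature thresholds below are made up for illustration; real cameras offer several selectable palettes (grayscale, “ironbow,” and others) with calibrated scales.

```python
# Illustrative false-color mapping: colder pixels render blue, hotter red.
# Palette and thresholds are invented for this sketch, not from any product.

def to_color(temp_c):
    """Map a temperature in degrees C to a coarse palette name."""
    if temp_c < 10:
        return "blue"
    if temp_c < 30:
        return "yellow"
    return "red"

# A pedestrian (~35 C skin) stands out against a 5 C night scene.
scene = [5, 5, 35, 5]
print([to_color(t) for t in scene])  # ['blue', 'blue', 'red', 'blue']
```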
Types of Thermal and Night Vision
There are two main types of night vision system, drawing on thermal imaging and light-spectrum amplification. The first is a Passive System: Far-Infrared (FIR) Thermal Imaging. It builds an image from the heat, or radiation, that every object above absolute zero (0 Kelvin) naturally emits (Sundermeier et al., 2022). As the image is created, different temperatures appear as different colors depending on the heat intensity of the object. These systems operate in the long-wave infrared (LWIR) portion of the electromagnetic spectrum, at wavelengths of 7-14 micrometers (µm) (Stančić et al., 2023).
Passive systems do not require external illumination, which reduces component complexity and leaves the image unaffected by outside light sources. To create the thermal map of an object, an array of microbolometers (the sensing portion of the sensor) detects thermal events by changing electrical resistance based on the temperature of the scene near each element. This array sits behind a germanium lens, which concentrates far-infrared radiation and focuses it onto the array. As the array gathers information, it sends it to a module whose software compiles it into a thermogram, a plot of the data that generates an image the unit’s control module can decipher. All of these steps happen in about one-thirteenth of a second, so the module can pass the information along to other control units for human consumption. These systems have a sight range of up to 1,000 feet (300 meters).
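The resistance-to-thermogram step can be sketched as below. This is a minimal illustration assuming each pixel’s resistance falls linearly as temperature rises; the reference values and slope are invented, and real sensors require per-pixel calibration and non-uniformity correction.

```python
# Hypothetical sketch: turning microbolometer readings into a thermogram.
# Assumes resistance drops linearly with temperature (values are invented;
# real sensors need per-pixel calibration and non-uniformity correction).

def to_thermogram(resistances, r_ref=100_000.0, t_ref=20.0, ohms_per_deg=-200.0):
    """Map a 2D grid of resistances (ohms) to estimated temperatures (deg C)."""
    return [[t_ref + (r - r_ref) / ohms_per_deg for r in row]
            for row in resistances]

# A 2x2 frame: one warm pixel (lower resistance) among a cool background.
frame = [[100_000.0, 100_000.0],
         [ 96_600.0, 100_000.0]]
print(to_thermogram(frame))  # [[20.0, 20.0], [37.0, 20.0]]
```

The warm pixel maps to 37 °C, roughly body heat, which the control module would then render in a contrasting color.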
The other type is an Active System: Near-Infrared (NIR) Illumination. An NIR system operates more like a flashlight, providing its own infrared light source to illuminate the road ahead and reflecting it off objects in its view path. Like RADAR, it actively emits energy, operating at 0.75-1.4 micrometers (µm), much closer to the light spectrum humans can see. The system uses emitters that project infrared beams forward of the vehicle, either non-gated (constant beam) or gated (pulsed beam). Because of the spectrum it operates in, these beams do not hinder oncoming traffic. In a gated system, the camera’s shutter opens in time with the returning pulse, so it captures light reflected from a chosen distance. The incoming photons of light strike a photocathode, which converts them into electrical pulses; these pulses are then processed to generate a black-and-white image for the control unit. These systems provide a higher-quality image than a passive system but cost more because of the equipment needed to create that image.
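The timing behind a gated (pulsed) system comes down to round-trip travel time: light reflected from a target at distance d returns after 2d/c seconds, so the shutter delay picks the distance being imaged. The 150 m example below is an arbitrary illustration.

```python
# Range-gating arithmetic for a gated (pulsed) NIR system: the shutter
# is timed to the round-trip travel of the light pulse, which helps
# reject back-scatter from nearer fog or rain.
C = 299_792_458.0  # speed of light, m/s

def gate_delay_s(distance_m):
    """Round-trip delay for a pulse reflected at distance_m."""
    return 2.0 * distance_m / C

delay = gate_delay_s(150.0)       # example target 150 m ahead
print(f"{delay * 1e9:.0f} ns")    # about 1001 ns of shutter delay
```

At these distances the delays are on the order of a microsecond, which is why gating requires fast, specialized shutter electronics and drives up the cost relative to a passive system.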
Conclusion
Not being able to see in the dark or in inclement weather is becoming a thing of the past. The problem of identifying objects in the vehicle’s path in low-light conditions can be solved by one of the technologies discussed in this blog. As the technology matures, the cost to implement it will continue to fall, and OEMs will continue to include it in the sensor arrays of new vehicles. Understanding the different light wavelengths gives the technician a working knowledge of how the system operates and what might be causing an issue if it fails. This will become a larger part of the autonomy discussion: as more vehicles come with the ability to self-drive, the need for safety will undoubtedly increase, and this is just another piece of that puzzle. Technician skills must continue to grow for those who choose this career path. The information generated by late-model vehicles is increasing at a rate we have not seen before, and making sense of it all is where further training is needed throughout the industry. Continuous development is a must for today’s technicians; CDX is there to help.
The CDX MAST series provides the instructor with pointed material that exceeds the requirements of any ASE training currently on the market. Utilizing the Read-See-Do model throughout the series, students encounter multiple learning modalities across the products, allowing them to pick the way they learn best. From developing simulations on cutting-edge topics to providing a depth of automotive technical background, CDX is committed to making sure instructors and students have the relevant training material to further hone their skill sets in the mechanical, electrical, and software-driven repair industry. CDX Learning Systems offers a growing library of automotive content that brings highly technical material to the classroom to keep you and your students up to date on what is happening within the mobility industry. Check out our Light Duty Hybrid and Electric Vehicles, along with our complete catalog.
About the Author
Nicholas Goodnight, PhD is an Advanced Level Certified ASE Master Automotive and Truck Technician and an Instructor at Ivy Tech Community College. With nearly 25 years of industry experience, he brings his passion and expertise to teaching college students the workplace skills they need on the job. For the last several years, Dr. Goodnight has taught in his local community of Fort Wayne and enjoys helping others succeed in their desire to become automotive technicians. He is also the author of many CDX Learning Systems textbooks, including Light Duty Hybrid and Electric Vehicles (2023), Automotive Engine Performance (2020), Automotive Braking Systems (2019), and Automotive Engine Repair (2018).
Related Content
Automotive Applications of Advances in Silicon Anode Battery Technology
Lithium-Sulfur Batteries: A Game-Changer for Electric Vehicles
References
Gheorghe, C., Duguleana, M., Boboc, R. G., & Postelnicu, C. C. (2024). Analyzing Real-Time Object Detection with YOLO Algorithm in Automotive Applications: A Review. In CMES - Computer Modeling in Engineering and Sciences (Vol. 141, Issue 3, pp. 1939–1981). Tech Science Press. https://doi.org/10.32604/cmes.2024.054735
Rivera Velázquez, J. M., Khoudour, L., Saint Pierre, G., Duthon, P., Liandrat, S., Bernardin, F., Fiss, S., Ivanov, I., & Peleg, R. (2022). Analysis of Thermal Imaging Performance under Extreme Foggy Conditions: Applications to Autonomous Driving. Journal of Imaging, 8(11). https://doi.org/10.3390/jimaging8110306
Stančić, I., Kuzmanić Skelin, A., Musić, J., & Cecić, M. (2023). The Development of a Cost-Effective Imaging Device Based on Thermographic Technology. Sensors, 23(10). https://doi.org/10.3390/s23104582
Sundermeier, M. C., Dierend, H., Ley, P.-P., Wolf, A., & Lachmayer, R. (2022). Active NIR illumination for improved camera view in automated driving application. Light-Emitting Devices, Materials and Applications XXVI, 25. https://doi.org/10.1117/12.2608162