TÜV SÜD, a company based in Munich, Germany, has announced that NVIDIA’s DRIVE OS 5.2 software meets the International Organization for Standardization (ISO) 26262 Automotive Safety Integrity Level (ASIL) B standard. This standard focuses on the functional safety of road vehicle systems, hardware and software.
DRIVE OS is an operating system specifically designed for in-vehicle accelerated computing on the NVIDIA DRIVE platform. It serves as the foundation for NVIDIA’s DRIVE SDK, which includes developer tools such as the NVIDIA TensorRT SDK for real-time AI inference and the NvMedia library for processing sensor input.
To meet the ISO 26262 standard, NVIDIA’s software had to be able to detect failures during operation and be developed through a process that addresses potential systematic faults throughout the entire V-model. This includes everything from defining safety requirements to coding, analysis, verification, and validation. Essentially, the software must aim to avoid failures whenever possible, and detect and respond to them if necessary.
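Runtime failure detection of the kind ISO 26262 requires is commonly implemented with mechanisms such as watchdogs and heartbeat monitoring. As a loose, hypothetical illustration of the idea (not NVIDIA's actual implementation, and greatly simplified), a supervisor can flag a task as faulty when it stops reporting within its deadline:

```python
import time

class HeartbeatMonitor:
    """Flags a monitored task as failed if it stops reporting in time.

    Toy sketch of the detect-and-respond principle; real automotive
    watchdogs run in hardware or a safety-certified supervisor.
    """
    def __init__(self, timeout_s):
        self.timeout_s = timeout_s
        self.last_beat = time.monotonic()

    def beat(self):
        # Called by the supervised task on every healthy cycle.
        self.last_beat = time.monotonic()

    def healthy(self):
        # Supervisor check: has the task reported within its deadline?
        return (time.monotonic() - self.last_beat) <= self.timeout_s

monitor = HeartbeatMonitor(timeout_s=0.05)
monitor.beat()
fresh = monitor.healthy()      # True: beat just arrived
time.sleep(0.1)
stale = monitor.healthy()      # False: deadline missed, fault detected
```

When `healthy()` returns False, a real system would transition to a safe state (for example, degraded operation or a controlled handover to the driver) rather than continue as if nothing happened.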
After rigorous testing, TÜV SÜD’s team determined that DRIVE OS 5.2 meets the strict criteria for safety-related use in applications up to ASIL B. ISO 26262 identifies four ASILs, with A being the lowest level of automotive hazard and D being the highest.
TÜV SÜD is known for assessing compliance with national and international standards for safety, durability, and quality in various applications, including cars, factories, buildings, bridges, and other infrastructure.
Beyond its ISO 26262 certification, NVIDIA DRIVE is an open platform that allows engineers at leading carmakers to build on NVIDIA’s industrial-strength system. Earlier this year, NVIDIA filed a patent for a system that aims to address one of the biggest challenges in autonomous driving: how self-driving cars can identify and react to emergency vehicles.
The patent, published by the US Patent and Trademark Office in May 2022, describes a system that uses microphones mounted on autonomous or semi-autonomous cars to capture the sounds of nearby emergency-vehicle sirens. The captured audio signals are fed to a Deep Neural Network (DNN), which detects and classifies the sirens.
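The patent's DNN pipeline is not public, but the underlying task, spotting siren-like energy in an audio stream, can be illustrated with a much simpler stand-in. The sketch below (a toy frequency-domain heuristic, not NVIDIA's method; the band frequencies and threshold are made up for the example) uses the Goertzel algorithm to compare energy in a typical siren band against low-frequency background:

```python
import math

def goertzel_power(samples, sample_rate, target_freq):
    # Goertzel algorithm: power of a single frequency bin.
    n = len(samples)
    k = int(0.5 + n * target_freq / sample_rate)
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def looks_like_siren(samples, sample_rate, ratio=5.0):
    # Hypothetical heuristic: strong energy at siren-typical pitches
    # (roughly 700-1600 Hz) relative to low-frequency road noise.
    in_band = max(goertzel_power(samples, sample_rate, f)
                  for f in (700.0, 1000.0, 1300.0, 1600.0))
    reference = goertzel_power(samples, sample_rate, 100.0) + 1e-9
    return in_band / reference > ratio

# Synthetic check: a 1 kHz tone (siren-like) vs. a 100 Hz engine hum.
sr = 8000
tone = [math.sin(2 * math.pi * 1000 * t / sr) for t in range(sr // 10)]
hum  = [math.sin(2 * math.pi * 100 * t / sr)  for t in range(sr // 10)]
```

A production system, as the patent suggests, would replace this heuristic with a trained network that can also handle Doppler shift, regional siren patterns, and road noise.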