AI-Powered System Aims to Make Skies Safer as Commercial Drone Traffic Soars

A new study published in the journal Computer reveals how AI algorithms are stepping in to make the skies safer as commercial drone numbers soar.

As the skies become increasingly populated with drones, the risk of collisions rises. With nearly a million commercial drones predicted to be flying below 400 feet in U.S. airspace by 2027, there is an urgent need for improved traffic management. These drones will be undertaking tasks like delivering packages, monitoring traffic, and providing emergency support. To address this challenge, experts are turning to artificial intelligence (AI).

A research team led by the Institute for Assured Autonomy’s Lanier Watkins and Louis Whitcomb has developed a system using AI to enhance drone traffic safety. By integrating AI, the system can autonomously make decisions, reducing the need for human intervention. Their findings, which showcase the success of this system, were published in the journal Computer.

According to Watkins, an associate research professor at the Whiting School of Engineering and a member of the Johns Hopkins Applied Physics Laboratory, their primary aim was to determine if AI could safely manage the anticipated surge in drone activities. “Our simulated system incorporates autonomy algorithms that elevate the safety and efficiency of drone operations under 400 feet,” Watkins elaborated.

The researchers’ approach was to simulate a 3D airspace populated with drones. From their previous studies, they knew that collision avoidance algorithms significantly reduced mishaps. By further introducing deconfliction algorithms, which manage traffic flow and timing, they found that airspace accidents were nearly eliminated.
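The article does not spell out how the study’s deconfliction algorithms work, but the general idea is to adjust flight plans so that drones never come too close in space and time. The Python sketch below illustrates one simple form of this, resolving conflicts by delaying departures; the FlightPlan fields, the 50-meter separation threshold, and the delay-based resolution rule are illustrative assumptions, not the algorithms used in the study.

```python
import math
from dataclasses import dataclass

@dataclass
class FlightPlan:
    start: tuple        # (x, y, z) in meters
    end: tuple          # (x, y, z) in meters
    depart_time: float  # seconds
    speed: float        # meters per second

    def position(self, t: float):
        """Position at time t, or None if the drone has not departed or has arrived."""
        dx, dy, dz = (e - s for s, e in zip(self.start, self.end))
        dist = math.sqrt(dx * dx + dy * dy + dz * dz)
        if dist == 0.0 or t < self.depart_time:
            return None
        progress = (t - self.depart_time) * self.speed / dist
        if progress > 1.0:
            return None  # already landed
        return tuple(s + progress * d for s, d in zip(self.start, (dx, dy, dz)))

def conflicts(a: FlightPlan, b: FlightPlan, separation=50.0, step=1.0, horizon=600.0):
    """Return True if two plans ever come within the separation distance."""
    t = 0.0
    while t < horizon:
        pa, pb = a.position(t), b.position(t)
        if pa is not None and pb is not None and math.dist(pa, pb) < separation:
            return True
        t += step
    return False

def deconflict(plans, delay_step=10.0):
    """Greedily delay departures until no pairwise conflicts remain (assumed strategy)."""
    accepted = []
    for plan in plans:
        while any(conflicts(plan, other) for other in accepted):
            plan.depart_time += delay_step  # push the departure back and re-check
        accepted.append(plan)
    return accepted
```

Delaying departures is only one possible resolution strategy; rerouting or altitude changes are common alternatives in traffic-management systems.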

Moreover, the team added a touch of real-world unpredictability to their simulator. They incorporated “noisy sensors” to represent real-world uncertainties, making their system more robust. Additionally, a “fuzzy inference system” was used, which evaluated the risk level of each drone based on factors like its proximity to obstacles and its adherence to its planned route. These mechanisms allowed the system to make decisions autonomously and avert collisions.
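The article does not give the system’s actual rules or thresholds, but the flavor of a fuzzy-inference risk score driven by these two inputs can be shown in a few lines of Python. In the sketch below, the membership functions and their breakpoints, the max-based rule combination, and the sensor noise level are all assumptions for demonstration, not the study’s parameters.

```python
import random

def noisy_reading(true_value, sigma=2.0):
    """Simulate a noisy sensor by adding Gaussian noise to the true value (assumed sigma)."""
    return true_value + random.gauss(0.0, sigma)

def membership_close(distance_m, near=10.0, far=100.0):
    """Degree (0..1) to which an obstacle counts as 'close'."""
    if distance_m <= near:
        return 1.0
    if distance_m >= far:
        return 0.0
    return (far - distance_m) / (far - near)

def membership_off_route(deviation_m, small=5.0, large=50.0):
    """Degree (0..1) to which a drone counts as 'off its route'."""
    if deviation_m <= small:
        return 0.0
    if deviation_m >= large:
        return 1.0
    return (deviation_m - small) / (large - small)

def risk_score(obstacle_distance_m, route_deviation_m):
    """Combine fuzzy memberships into a single risk score in [0, 1].

    Illustrative rule: risk is high if the drone is close to an obstacle OR far
    off its route; the fuzzy OR is taken as the max, a common (here assumed) choice.
    """
    close = membership_close(noisy_reading(obstacle_distance_m))
    off_route = membership_off_route(noisy_reading(route_deviation_m))
    return max(close, off_route)

# Example: a drone 20 m from a building and 30 m off its planned path
print(f"risk = {risk_score(20.0, 30.0):.2f}")
```

A score like this could then feed the kind of autonomous decisions the article describes, such as rerouting or grounding a drone whose risk stays high.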

Whitcomb, a professor of mechanical engineering at the Whiting School of Engineering, commented on their study’s breadth. “We took into account numerous variables, even including ‘rogue drones’ that veer off their designated paths. The findings have been quite optimistic.”

The next steps for the research team involve refining their simulations by factoring in dynamic elements such as weather to represent a more holistic real-world environment.

Highlighting the depth of their work, Watkins mentioned that their research builds upon over 20 years of studies at the Johns Hopkins University Applied Physics Laboratory. These studies have been aimed at bolstering the safety protocols of the U.S.’s National Airspace System.

Watkins also emphasized the broader significance of their work. “Our studies help scholars grasp how AI algorithms, designed to safeguard our skies, operate amidst uncertainties in a 3D-simulated airspace. It’s crucial to keep a vigilant eye on these AI systems to ensure they don’t malfunction.”
