We recently interviewed Stefan Milz, the founder, managing director, and head of R&D at Spleenlab, to hear how Spleenlab is enabling autonomous processes like inspections and collision avoidance with their VISIONAIRY AI software on top of the Auterion software stack.
Integrated Software Stack
Software is an exciting part of the open drone ecosystem because that’s where Artificial Intelligence (AI) and Machine Learning (ML) happen. Spleenlab’s ML algorithms can be installed on Auterion’s AI Node directly on board mobile robots, enabling the deployment of safe and reliable perception software.
The perception software running on board handles object detection, tracking, and positioning: the capabilities a robot needs to move through space autonomously, for example when no GPS signal is available.
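To make the detection-and-tracking idea concrete, here is a minimal sketch of one common building block: matching new detections to existing tracks by bounding-box overlap (IoU). All names and thresholds are illustrative assumptions, not Spleenlab's actual VISIONAIRY implementation.

```python
# Hypothetical onboard perception step: associate per-frame detections
# with existing object tracks using intersection-over-union (IoU).

def iou(a, b):
    """IoU of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def update_tracks(tracks, detections, threshold=0.3):
    """Greedily match each detection to the best-overlapping track;
    unmatched detections start new tracks."""
    next_id = max(tracks, default=0) + 1
    updated = {}
    for det in detections:
        best_id, best_iou = None, threshold
        for tid, box in tracks.items():
            score = iou(box, det)
            if score > best_iou and tid not in updated:
                best_id, best_iou = tid, score
        if best_id is None:
            best_id, next_id = next_id, next_id + 1
        updated[best_id] = det
    return updated

tracks = {1: (10, 10, 50, 50)}
detections = [(12, 11, 52, 49), (200, 200, 240, 240)]
tracks = update_tracks(tracks, detections)
# The detection near (10, 10, 50, 50) keeps ID 1; the new one gets ID 2.
```

A production tracker would add motion prediction and track aging, but the core idea, turning raw pixels into persistent object identities onboard, is the same.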
“When we started integrating our technologies with the Auterion Stack, one of the first applications was flying a drone in an unknown environment and having it detect obstacles and assets.”
Data Processing Onboard
The real challenge is to detect a vehicle in real time, identify it, and then interact with it, while also delivering the information to the customer. It is all about taking data captured by sensors, processing it, and turning it into information.
“It’s great that we could partner with Auterion on the AI Node, so we can deliver our software that runs embedded in real time on the drone, with low power consumption, and enable the entire process.”
Stefan Milz, Spleenlab
Advanced ML algorithms are computation-heavy and require suitable horsepower to be deployed onboard mobile robots. With Auterion’s Skynode™ and AI Node, the VISIONAIRY® AI software can easily be deployed and run onboard drones.
Not having to transfer data to the cloud is safer, because the process doesn’t rely on a video link, and it lowers bandwidth requirements, because the sensor data is processed onboard into metadata that can be acted on and used to make decisions. All of this brings us closer to full autonomy.
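The bandwidth argument can be illustrated with a back-of-the-envelope comparison. The frame dimensions and the metadata message below are our own illustrative assumptions, not measured figures from the product.

```python
import json

# Compare the data rate of streaming raw video with that of sending
# only per-frame detection metadata (illustrative numbers).
width, height, channels, fps = 1920, 1080, 3, 30
raw_bytes_per_sec = width * height * channels * fps  # uncompressed frames

# A hypothetical metadata message for one detected object per frame.
metadata = {"frame": 1042, "class": "cell_tower",
            "bbox": [412, 230, 655, 880], "confidence": 0.97}
meta_bytes_per_sec = len(json.dumps(metadata).encode()) * fps

print(f"raw video: {raw_bytes_per_sec / 1e6:.0f} MB/s")
print(f"metadata:  {meta_bytes_per_sec} B/s")
```

Even before video compression, the gap is several orders of magnitude, which is why reducing sensor data to decisions onboard is so attractive for constrained links.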
The machine learning approach helps to automate more of the drone process, which enables superior technologies and will further commercial drone use.
Today, even if end users have automated vehicles, there’s still someone controlling the drone to inspect an asset like a tower or a bridge. But with VISIONAIRY AI and AI Node, even that part can be automated.
This was one of the first applications Spleenlab integrated with the Auterion Stack: flying in an unknown environment and detecting obstacles or assets. The technology can detect a cell tower, locate it in real time on the drone, and then execute a mission, such as an inspection. For end users, this automates inspections, saving them time and money.
The Future is AI
Everyone wants to operate BVLOS (Beyond Visual Line of Sight), but two entities govern this space, the FAA and EASA, and they set the requirements. These requirements dictate parts of the system’s design, which can make things complicated. We need to define how software should work in a safety-critical system. The near future is about figuring out how to bring AI into such environments. For example, what happens if the video link breaks down? The system must be able to react, switch to a safe-state mode, and perform an emergency landing.
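The failsafe behavior described above can be sketched as a small state machine: a watchdog that declares link loss when no heartbeat arrives within a timeout and then commands a safe landing. This is our own simplified illustration, not the actual Auterion or Spleenlab failsafe logic.

```python
# Minimal link-loss watchdog sketch (assumed behavior, not the real
# flight stack): missing heartbeats trigger a safe-state transition.

class LinkWatchdog:
    TIMEOUT = 2.0  # seconds without a heartbeat before declaring link loss

    def __init__(self):
        self.state = "NOMINAL"
        self.last_heartbeat = 0.0

    def on_heartbeat(self, t):
        self.last_heartbeat = t
        if self.state == "LINK_LOST":
            # Link restored; a real system would decide whether to
            # resume the mission or continue the landing.
            self.state = "NOMINAL"

    def tick(self, t):
        if self.state == "NOMINAL" and t - self.last_heartbeat > self.TIMEOUT:
            self.state = "LINK_LOST"
            self.command_emergency_landing()
        return self.state

    def command_emergency_landing(self):
        # Placeholder: a real system would command the flight controller.
        print("video link lost: switching to safe state, landing")

wd = LinkWatchdog()
wd.on_heartbeat(0.0)
wd.tick(1.0)   # within timeout: state stays "NOMINAL"
wd.tick(3.5)   # heartbeat overdue: state becomes "LINK_LOST"
```

Certification for BVLOS is largely about proving that transitions like this one are deterministic and always reachable, which is what makes embedding AI in such systems challenging.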
Indoor flying is also a big topic, where we absolutely need stable, redundant positioning without GPS. In the middle to long term, this is the biggest challenge.
Spleenlab is excited about the AI Node, and the company is expanding partnerships within the Auterion Ecosystem. Until now, the drone market was about flying; now it is about operating drones automatically, which is where AI comes in.