When testing advanced driver assistance systems (ADAS), autonomous vehicle prototypes, or cars in general, it makes sense to perform the most dangerous activities in simulation. An early autonomous system can be extremely unpredictable, and Tesla warns that its software could “make the wrong decision at the worst possible time.” So even before installing the system in a car, manufacturers do a great deal of work in simulators to prepare the software for reality. At the extreme end, Tesla is building a supercomputer whose capabilities go far beyond what is currently needed.
Sooner or later, however, you have to verify your work on the road. According to research, a system that makes mistakes regularly keeps the driver alert, but a system that works flawlessly and only errs once in a while can lull a human test driver into a false sense of security, with deadly results on the road. So you need a way to test driving that falls somewhere between simulation and reality.
One way companies in the automotive space bridge this gap is by building simulators that take in the entire vehicle, not just its computer. This approach helps ensure the vehicle is ready for street testing, because the whole car has been thoroughly exercised and shouldn’t do anything crazy.
It turns out that autonomous aircraft, including unmanned drones, face similar challenges. Throwing a drone into the air and watching it perform can create dangerous situations for the public, but computer simulations of the software that flies it don’t quite prepare the drone for actual testing in real airspace.
That’s why it’s exciting to see Microsoft take the same approach as the whole-vehicle car simulators above, but with drones.
Microsoft’s Drone Simulator Lab
Microsoft starts by telling the story of Josh Riedy, CEO of Airtonomy, visiting a Microsoft lab. Wearing VR goggles, he tracked a drone through a virtual wind farm. Across the Midwest, Riedy’s company is using hyper-realistic simulations to train autonomous aerial vehicles that now inspect wind farms, monitor wildlife, and detect leaks in oil tanks.
As he looked around, he felt as if he were actually there in the air with the drone, and the drone’s software was similarly tricked into thinking it was actually in the air.
“You don’t want to fly drones into wind turbines or power lines or anything,” Riedy said. “Combined with the fact that winter can literally last 7 months in North Dakota, we realized we needed something other than the physical world to design our solutions for customers.”
Data generated by Project AirSim is used to train artificial intelligence models on which actions to perform in each phase of flight, from takeoff through cruise to landing. The platform will also provide libraries of simulated 3D environments representing various urban and rural settings, as well as a suite of sophisticated pre-trained AI models to help accelerate autonomous aerial infrastructure inspection, last-mile delivery, and urban air mobility.
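To picture what “per-phase” autonomy means, here is a toy Python sketch of dispatching to a different policy for each flight phase. Everything here is invented for illustration (AirSim’s real system uses trained AI models, not hand-written rules, and none of these names come from its API):

```python
from enum import Enum, auto

class FlightPhase(Enum):
    TAKEOFF = auto()
    CRUISE = auto()
    LANDING = auto()

# Hypothetical stand-ins for trained per-phase models.
def takeoff_policy(altitude_m):
    return "climb" if altitude_m < 50.0 else "hold"

def cruise_policy(altitude_m):
    return "advance"

def landing_policy(altitude_m):
    return "descend" if altitude_m > 0.0 else "hold"

POLICIES = {
    FlightPhase.TAKEOFF: takeoff_policy,
    FlightPhase.CRUISE: cruise_policy,
    FlightPhase.LANDING: landing_policy,
}

def select_action(phase, altitude_m):
    """Dispatch to the policy for the current flight phase."""
    return POLICIES[phase](altitude_m)
```

In the real platform, each entry in the dispatch table would be a model trained on simulator-generated data rather than a rule, but the structure — a distinct learned behavior per phase of flight — is the same idea.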
Gurdeep Pall, corporate vice president of Business Incubations in Technology & Research at Microsoft, said advances in AI, computing, and sensor technology are beginning to change the way we move people and goods. And this is not just a matter of rural wind farms: as urban density grows, gridlocked roads and highways are no longer the fastest way to get from place to place, so companies are turning to drones for transportation instead.
“Autonomous systems will transform many industries and enable many aviation scenarios, from last-mile delivery of goods in congested cities to inspecting downed power lines from 1,000 miles away,” Pall said. “But first we need to safely train these systems in a realistic, virtualized world. Project AirSim is a critical tool that allows us to bridge the world of bits and the world of atoms, and demonstrates the power of the industrial metaverse — virtual worlds in which businesses will build, test, and refine solutions and then bring them to the real world.”
AirSim, an earlier open-source project from Microsoft Research that has now been retired but inspired today’s launch, aimed to provide high-fidelity simulation. It was a popular research tool, but it required significant programming and machine-learning expertise. Microsoft has now transformed that open-source technology into an end-to-end platform that lets advanced air mobility (AAM) customers simulate 3D environments with AI-powered aircraft, and test and train them more easily.
“Everyone talks about artificial intelligence, but very few companies are able to build it at scale,” said Balinder Malhi, lead engineer for the AirSim project. “We built Project AirSim with key capabilities that we believe will help democratize and accelerate aviation autonomy—namely, the ability to accurately simulate the real world, capture and process massive amounts of data, and code for autonomy without the need for deep AI expertise.”
With Project AirSim, developers can access pre-trained AI building blocks, such as high-quality models for localization, obstacle avoidance, and precision landing. These out-of-the-box capabilities eliminate the need for deep machine learning knowledge, allowing more people to train autonomous aircraft.
Project AirSim is already available on multiple platforms, including Windows and Linux. Microsoft is also working with industry partners to accurately simulate weather, physics and — most importantly — the sensors an autonomous machine uses to “see” the world. Through a partnership with Ansys, customers can use high-fidelity sensor simulations based on Ansys’ physics-based sensor models to obtain rich ground-truth data for autonomous vehicles. Meanwhile, Microsoft and MathWorks are working together on ways for customers to import their own physics models created in Simulink.
Technology alone will not be enough to bring about the age of autonomous flight. The industry must also find a way through existing aviation oversight and regulatory regimes. To accelerate the sector, the Project AirSim team is currently engaging with standards organizations, civil aviation authorities, and regulators to develop the required standards and means of compliance.
Microsoft’s Pall said the company wants to work with global civil aviation authorities on how Project AirSim could help certify safe autonomous systems, potentially providing scenarios within AirSim that an unmanned vehicle must successfully navigate. In one scenario, there is heavy rain, strong wind, and dropping GPS connectivity. If the vehicle can get from A to B every time under those conditions, Pall says, it would be a significant step toward certification.
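The pass/fail logic of such a certification scenario can be sketched with a toy Python simulation. The dynamics, parameters, and function names below are all invented for illustration — this is not AirSim’s scenario engine, just the idea of “succeed in every trial under wind gusts and GPS dropouts, or fail”:

```python
import random

def fly_trial(rng, goal=100.0, wind_gust=2.0, gps_dropout_prob=0.3, steps=300):
    """One simulated run: advance from x=0 toward `goal` under random
    wind perturbations and intermittent GPS loss."""
    x = estimate = 0.0
    for _ in range(steps):
        # Command a step toward the goal based on the last known estimate.
        step = 1.0 if estimate < goal else 0.0
        # Wind gusts perturb the actual motion.
        x += step + rng.uniform(-wind_gust, wind_gust) * 0.1
        if rng.random() > gps_dropout_prob:
            estimate = x          # fresh GPS fix
        else:
            estimate += step      # GPS out: dead reckoning
        if x >= goal:
            return True           # reached B
    return False                  # ran out of time

def certify(trials=20, seed=1):
    """Mirror the pass criterion: the vehicle must reach B in every trial."""
    rng = random.Random(seed)
    return all(fly_trial(rng) for _ in range(trials))
```

The key design choice mirrors what Pall describes: a single failed trial fails the whole certification run, which is why `certify` uses `all` rather than a success percentage.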
AirSim pioneer Ashish Kapoor is excited to have helped the simulation engine grow from a code-heavy research tool into a more robust platform that companies can use without deep technical expertise. Kapoor, an aviator himself, can’t wait to see what this development will mean for the aviation industry.
“When a plane flies through space in the AirSim project, a ton of data is generated,” said Kapoor, now general manager of Microsoft’s autonomous systems research group. “Our ability to capture this data and translate it into autonomy will dramatically change the aviation landscape. And because of this, we will see many more vehicles in the sky to help monitor farms, inspect critical infrastructure and transport goods and people to the most remote places.”
Featured image provided by Microsoft and Airtonomy.
Do you appreciate CleanTechnica’s originality and reporting? Consider becoming a CleanTechnica member, supporter, technician or ambassador – or a patron on Patreon.