How drone autonomy unlocks a new era of AI opportunities
Reese Mozer, American Robotics (@reesemozer)
July 23, 2022 1:10 PM
Image Credit: Anton Petrus/Getty
[Editor’s note: American Robotics is a commercial developer of automated drone systems.]
Drones have been talked about extensively for two decades now. In many respects, that attention has been warranted. Military drones have changed the way we fight wars. Consumer drones have changed the way we film the world. For the commercial market, however, drones have largely been a false start. In 2013, the Association for Unmanned Vehicle Systems International (AUVSI) predicted an $82 billion market by 2025. In 2016, PwC predicted a $127 billion market within the “near future.” But we aren’t anywhere close to those projections yet. Why is that?
Let’s start with the primary purpose of drones in a commercial setting: data collection and analysis. The drone itself is a means to an end, a flying camera from which to get a unique aerial perspective of assets for inspection and analysis, be it a pipeline, a gravel storage yard, or a vineyard. As a result, drones in this context fall under the umbrella of “remote sensing.”
In the world of remote sensing, drones are not the only player. There are high-orbit satellites, low-orbit satellites, airplanes, helicopters and hot air balloons. What do drones have that the other remote sensing methods do not? The first is image resolution.
What does “high resolution” really mean?
One product’s high resolution is another product’s low resolution.
Image resolution, or more aptly Ground Sample Distance (GSD) in this case, is a product of two primary factors: (1) how powerful your imaging sensor is, and (2) how close you are to the object you are imaging. Because drones are typically flying very low to the ground (50-400 feet AGL), the opportunity to collect higher image resolutions than aircraft or satellites operating at higher altitudes is significant. Eventually you run into issues with physics, optics and economics, and the only way to get a better picture is to get closer to the object. To quantify this:
- “High resolution” for a drone operating at 50ft AGL with a 60MP camera is around 1 mm/pixel.
- “High resolution” for a manned aircraft service, like the now-defunct Terravion, was 10 cm/pixel.
- “High resolution” for a low-orbit satellite service, like Planet Labs, is 50 cm/pixel.
Put another way, drones can provide upwards of 500 times the image resolution of the best satellite solutions.
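To make the altitude-versus-resolution tradeoff concrete, here is a minimal back-of-envelope sketch of a ground sample distance calculation. The camera parameters (sensor width, focal length, pixel count) are illustrative assumptions, not the specs of any particular system mentioned above.

```python
# Rough GSD estimate: GSD = (altitude * sensor width) / (focal length * image width in pixels)
# All camera figures below are assumed for illustration only.

def gsd_m_per_px(altitude_m: float, sensor_width_m: float,
                 focal_length_m: float, image_width_px: int) -> float:
    """Ground sample distance in meters per pixel for a nadir-pointing camera."""
    return (altitude_m * sensor_width_m) / (focal_length_m * image_width_px)

FT_TO_M = 0.3048

# Hypothetical 60MP full-frame camera: 9504 px wide, 35.7 mm sensor, 35 mm lens
drone_gsd = gsd_m_per_px(50 * FT_TO_M, 0.0357, 0.035, 9504)
print(f"Drone at 50 ft AGL: ~{drone_gsd * 1000:.1f} mm/pixel")   # roughly 1-2 mm/pixel
print("Low-orbit satellite (Planet Labs class): ~500 mm/pixel")  # 50 cm/pixel, per the list above
```

Because GSD scales linearly with altitude for the same optics, flying ten times lower buys roughly ten times the resolution.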
The power of high resolution
Why does this matter? It turns out there is a very direct and powerful correlation between image resolution and potential value. As the computing phrase goes: “garbage in, garbage out.” The quality and breadth of machine vision-based analytics opportunities are far greater at the resolutions a drone can provide than at those achievable by other methods.
A satellite might be able to tell you how many well pads are in Texas, but a drone can tell you exactly where and how the equipment on those pads is leaking. A manned aircraft might be able to tell you what part of your cornfield is stressed, but a drone can tell you what pest or disease is causing it. In other words, if you want to resolve a crack, bug, weed, leak or similarly small anomaly, you need the proper image resolution to do so.
Bringing artificial intelligence into the equation
Once that proper image resolution is obtained, we can begin training neural networks (NNs) and other machine learning (ML) algorithms to learn about these anomalies, detect them, alert on them and potentially even predict them. Now our software can learn how to differentiate between an oil spill and a shadow, precisely calculate the volume of a stockpile, or measure a slight skew in a rail track that could cause a derailment.
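As a rough, hypothetical sketch of what that kind of software looks like, the snippet below tiles high-resolution imagery into patches and scores each patch for anomalies. The architecture, tile size, class labels and alert threshold are all assumptions for illustration; they are not American Robotics’ actual models or pipeline.

```python
# Minimal sketch of a tile-based anomaly classifier over high-resolution imagery.
import torch
import torch.nn as nn

TILE = 256  # pixels per square tile cut from a high-resolution orthomosaic (assumed)

class AnomalyClassifier(nn.Module):
    """Tiny CNN that scores each tile for an 'anomaly' class (leak, crack, pest damage)."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

def flag_tiles(model: nn.Module, tiles: torch.Tensor, threshold: float = 0.9):
    """Return indices of tiles whose anomaly probability exceeds the alert threshold."""
    with torch.no_grad():
        probs = torch.softmax(model(tiles), dim=1)[:, 1]  # P(anomaly) per tile
    return (probs > threshold).nonzero(as_tuple=True)[0].tolist()

# Example: score a batch of 8 random tiles (untrained weights, so output is illustrative only).
model = AnomalyClassifier().eval()
batch = torch.rand(8, 3, TILE, TILE)
print(flag_tiles(model, batch))
```

In practice such a model would be trained on labeled inspection imagery captured at drone-level GSD; the same pipeline would produce little but noise at satellite resolutions, which is the point of the resolution argument above.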
American Robotics estimates that over 10 million industrial asset sites worldwide have use for automated drone-in-a-box (DIB) systems, collecting and analyzing 20GB+ per day per drone. In the United States alone, there are over 900,000 oil and gas well pads, 500,000 miles of pipeline, 60,000 electrical substations, and 140,000 miles of rail track, all of which require constant monitoring to ensure safety and productivity.
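A quick back-of-envelope calculation (assuming, purely for illustration, one drone per site) shows how those figures add up to the petabyte scale discussed next.

```python
# Rough scale of the data opportunity implied by the figures above.
sites = 10_000_000            # industrial asset sites cited in the article
gb_per_drone_per_day = 20     # 20GB+ per day per drone (article figure)

daily_pb = sites * gb_per_drone_per_day / 1_000_000   # 1 PB = 1,000,000 GB
print(f"~{daily_pb:,.0f} PB of imagery per day if every site ran a single drone")
# ~200 PB of ultra-high-resolution imagery per day
```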
As a result, the scale of this opportunity is actually hard to quantify. What does it mean to fully digitize the world’s physical assets every day, across all critical industries? What does it mean if we can start applying modern AI to petabytes of ultra-high-resolution data that has never existed before? What efficiencies are unlocked if you can detect every leak, crack and area of damage in near-real time? Whatever the answer, I’d wager the $82B and $127B numbers estimated by AUVSI and PwC are actually low.
So: if the opportunity is so large and clear, why haven’t these market predictions come true yet? Enter the second important capability unlocked by autonomy: imaging frequency.
What does “high frequency” really mean?
The imaging frequency needed to be useful is 10x or more what was originally assumed.
The biggest performance difference between autonomous drone systems and piloted ones is the frequency of data capture, processing and analysis. For 90% of commercial drone use cases, a drone must fly repetitively and continuously over the same plot of land, day after day, year after year, to have value. This is the case for agricultural fields, oil pipelines, solar panel farms, nuclear power plants, perimeter security, mines, railyards and stockpile yards. When examining the full operation loop from setup to processed, analyzed data, it is clear that operating a drone manually is much more than a full-time job. And at an average of $150/hour per drone operator, a full-time operational burden across all assets is simply not feasible for most customers, use cases and markets.
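To see why that math breaks down, here is a rough, hypothetical cost sketch for keeping a single asset site under daily manual inspection. The hours-per-day figure is an assumption for illustration; only the $150/hour rate comes from the article.

```python
# Back-of-envelope annual cost of manually piloted daily inspections at one site.
operator_rate = 150     # $/hour per drone operator (article figure)
hours_per_day = 4       # assumed: travel, setup, flight, teardown, data handling
days_per_year = 365     # continuous monitoring, per the use cases above

annual_manual_cost = operator_rate * hours_per_day * days_per_year
print(f"Manual operation: ~${annual_manual_cost:,.0f} per site per year")
# ~$219,000 per site per year before equipment and data processing; multiplied
# across hundreds of thousands of assets, the model simply does not scale.
```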
This is the central reason why all the predictions about the commercial drone industry have, thus far, been delayed. Imaging an asset with a drone once or twice a year has little to no value in most use cases. For one reason or another, this frequency requirement was overlooked, and until recently, autonomous operations that would enable high-frequency drone inspections were prohibited by most federal governments around the world.
With a fully-automated drone-in-a-box system, on-the-ground humans (both pilots and observers) have been removed from the equation, and the economics have completely changed as a result. DIB technology allows for constant operation, multiple times per day, at less than a tenth of the cost of a manually operated drone service.
With this increased frequency comes not only cost savings but, more importantly, the ability to track problems when and where they occur and to properly train AI models to do so autonomously. Since you don’t know when and where a methane leak or rail tie crack will occur, the only option is to scan every asset as frequently as possible. And if you are gathering that much data, you had better build software that surfaces the key information to end users.
Tying this to real-world applications today
Autonomous drone technology represents a revolutionary ability to digitize and analyze the physical world, improving the efficiency and sustainability of our world’s critical infrastructure.
And thankfully, we have finally moved out of the theoretical and into the operational. After 20 long years of riding drones up and down the Gartner Hype Cycle, the “plateau of productivity” is finally coming into view.
In January 2021, American Robotics became the first company approved by the FAA to operate a drone system beyond visual line-of-sight (BVLOS) with no humans on the ground, a seminal milestone unlocking the first truly autonomous operations. In May 2022, this approval was expanded to include 10 total sites across eight U.S. states, signaling a clear path to national scale.
More importantly, AI software now has a practical mechanism to flourish and grow. Companies like Stockpile Reports are using automated drone technology for daily stockpile volumetrics and inventory monitoring. The Ardenna Rail-Inspector software now has a path to scale across our nation’s rail infrastructure. AI software companies like Dynam.AI have a new market for their technology and services. And customers like Chevron and ConocoPhillips are looking toward a near future where methane emissions and oil leaks are significantly curtailed using daily inspections from autonomous drone systems.
My recommendation: Look not to the smartphone, but to the oil fields, rail yards, stockpile yards, and farms for the next data and AI revolution. It may not have the same pomp and circumstance as the “metaverse,” but the industrial metaverse might just be more impactful.
Reese Mozer is cofounder and CEO of American Robotics.