Exyn Brings Level 4 Autonomy to Drones

Fully autonomous exploration and mapping of the unknown is a cutting-edge capability for commercial drones


Exyn Drone
Photo: Exyn

Drone autonomy is getting more and more impressive, but we’re reaching the point where improving on existing capabilities is significantly harder. Companies like Skydio are selling (for cheap!) commercial drones that have no problem dynamically planning paths around obstacles at high speed while tracking you, which is pretty amazing, and that can also autonomously create 3D maps of structures. In both of these cases, there’s a human indirectly in the loop, either saying “follow me” or “map this specific thing.” In other words, the level of autonomous flight is very high, but there’s still some reliance on a human for high-level planning. Which, for what Skydio is doing, is totally fine and the right way to do it.

Exyn, a drone company with roots in the GRASP Lab at the University of Pennsylvania, has been developing drones for inspections of large unstructured spaces like mines. These are incredibly challenging environments: GPS-denied, dark, dusty, and dangerous, to name just a few of the difficulties. While Exyn’s lidar-equipped drones have been autonomous for a while now, they can now operate without any high-level planning from a human at all. At this level of autonomy, which Exyn calls Level 4A, the operator simply defines a volume for the drone to map, and then, from takeoff to landing, the drone will methodically explore the entire space and generate a high-resolution map all by itself, even if it has to fly far beyond communications range to do so.

Let’s be specific about what “Level 4A” autonomy means, because until now, there haven’t really been established autonomy levels for drones. The reason there are autonomy levels for drones all of a sudden is that Exyn just went ahead and invented some. To be fair, Exyn took inspiration from the SAE autonomy levels, so there is certainly some precedent here, but it’s still worth keeping in mind that this framework is, for the moment, just something that Exyn came up with by themselves and applied to their own system. They did put a bunch of thought into it, at least, and you can read a whitepaper on the whole thing here.

Exyn Autonomy Levels
Graphic: Exyn

A couple of things about exactly what Exyn is doing: Their drone, which carries lights, a GoPro, some huge computing power, an even huger battery, and a rotating Velodyne lidar, is able to operate completely independently of a human operator or really any kind of external input at all. No GPS, no base station, no communications, no prior understanding of the space, nothing. You tell the drone where you want it to map, and it’ll take off and then decide on its own where and how to explore the space that it’s in, building up an obscenely high-resolution lidar map as it goes and continuously expanding that map until it runs out of unexplored areas, at which point it’ll follow the map back home and land itself. “When we’re executing the exploration,” Exyn CTO Jason Derenick tells us, “what we’re doing is finding the boundary between the visible and explored space, and the unknown space. We then compute viewpoint candidates, which are locations along that boundary where we can infer how much potential information our sensors can gain, and then the system selects the one with the most opportunity for seeing as much of the environment as possible.”
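For a sense of how that kind of frontier-based, next-best-view selection works in general, here’s a minimal sketch on a toy 3D occupancy grid. This is not Exyn’s code; the grid representation, the information-gain proxy, and every name and parameter below are assumptions made purely for illustration.

```python
import numpy as np

# Generic frontier-based exploration sketch (not Exyn's implementation).
# Cells are UNKNOWN until observed, then FREE or OCCUPIED.
UNKNOWN, FREE, OCCUPIED = -1, 0, 1

def find_frontier_cells(grid: np.ndarray) -> np.ndarray:
    """Frontier cells: observed-free cells with at least one unknown neighbor.
    (np.roll wraps at the edges, which is fine for this toy example.)"""
    free = grid == FREE
    unknown_neighbor = np.zeros_like(free)
    for axis in range(grid.ndim):
        for shift in (-1, 1):
            unknown_neighbor |= np.roll(grid, shift, axis=axis) == UNKNOWN
    return np.argwhere(free & unknown_neighbor)

def information_gain(grid: np.ndarray, cell: np.ndarray, radius: int = 3) -> int:
    """Crude gain proxy: count of unknown cells in a cube around the candidate viewpoint."""
    lo = np.maximum(cell - radius, 0)
    hi = np.minimum(cell + radius + 1, grid.shape)
    window = grid[lo[0]:hi[0], lo[1]:hi[1], lo[2]:hi[2]]
    return int(np.sum(window == UNKNOWN))

def select_next_viewpoint(grid: np.ndarray):
    """Pick the frontier cell whose neighborhood promises the most new information."""
    frontiers = find_frontier_cells(grid)
    if len(frontiers) == 0:
        return None  # nothing left to explore: time to follow the map home and land
    gains = [information_gain(grid, f) for f in frontiers]
    return frontiers[int(np.argmax(gains))]

# Toy example: a 20x20x20 volume where only a small corner has been observed so far.
grid = np.full((20, 20, 20), UNKNOWN)
grid[:5, :5, :5] = FREE
print(select_next_viewpoint(grid))  # a cell on the edge of the explored corner
```

A real system would then plan a collision-free path to the chosen viewpoint, fuse the new lidar scan into the map, and repeat until no frontiers remain, at which point the drone can retrace the map back to its landing spot.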

Flying at up to 2 m/s, Exyn’s drone can explore 16 million cubic meters in a single flight (about nine football stadiums’ worth of volume), and if the area you want it to explore is larger than that, it can go back out for more rounds after a battery swap.
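As a rough back-of-envelope, here’s a trivial helper for estimating how many flights (and therefore battery swaps) a larger survey would take. The 16 million cubic meter per-flight figure is the one quoted above; the example survey volume is made up.

```python
import math

PER_FLIGHT_VOLUME_M3 = 16_000_000  # per-flight coverage quoted above

def flights_needed(survey_volume_m3: float) -> int:
    """Round up to whole flights; each additional flight means a battery swap."""
    return math.ceil(survey_volume_m3 / PER_FLIGHT_VOLUME_M3)

print(flights_needed(50_000_000))  # hypothetical 50 million m^3 mine section -> 4 flights
```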

It’s important to understand, though, what the limitations of this drone’s autonomy are. We’re told that it can sense things like power lines, although probably not something as thin as fishing wire. So far that hasn’t been a problem, because fishing wire is an example of a “pathological” obstacle: something abnormal that you’d typically only encounter if it had been placed there specifically to screw you up. Dynamic obstacles (like humans or vehicles) moving at walking speed are also fine. Dust can be tricky at times, although the drone can identify excessive amounts of dust in the air, and it’ll wait a bit for it to settle before updating its map.
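Conceptually, the dust behavior amounts to deferring map updates when a scan looks too contaminated. Here’s a hedged sketch of that idea; the dust heuristic, the threshold, and the wait time are all placeholders, since Exyn hasn’t published how its system actually detects dust.

```python
import time
from dataclasses import dataclass

@dataclass
class LidarReturn:
    intensity: float       # return strength
    neighbor_count: int    # nearby returns within some radius (precomputed elsewhere)

DUST_FRACTION_THRESHOLD = 0.2   # assumed: defer mapping if >20% of returns look like dust
SETTLE_DELAY_S = 2.0            # assumed wait before rescanning

def looks_like_dust(p: LidarReturn) -> bool:
    """Placeholder heuristic: airborne dust tends to give weak, isolated returns."""
    return p.intensity < 5.0 and p.neighbor_count < 2

def should_defer_map_update(scan: list[LidarReturn]) -> bool:
    """Return True (after a short pause) when the scan is too dusty to trust."""
    if not scan:
        return False
    dusty_fraction = sum(looks_like_dust(p) for p in scan) / len(scan)
    if dusty_fraction > DUST_FRACTION_THRESHOLD:
        time.sleep(SETTLE_DELAY_S)  # let the dust settle before the next scan
        return True
    return False

# Example: a mostly dusty scan gets deferred instead of being fused into the map.
scan = [LidarReturn(intensity=2.0, neighbor_count=0) for _ in range(80)] + \
       [LidarReturn(intensity=40.0, neighbor_count=6) for _ in range(20)]
print(should_defer_map_update(scan))  # -> True
```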

Exyn Drone
Photo: Exyn

The commercial applications of a totally hands-off system that’s able to autonomously generate detailed lidar maps of unconstrained spaces in near real time are pretty clear. But what we’re most excited about is the potential for search-and-rescue use cases, especially once Exyn starts to get multiple drones working together collaboratively. You can imagine a situation in which you need to find a lost person in a cave or a mine: you unload a handful of drones at the entrance, tell them “go explore until you find a human,” and then just let them do their thing.

To make this happen, though, Exyn will need to add an additional level of understanding to their system, which is something they’re working on now, says Derenick. That means both understanding what objects are and reasoning about them: what an object represents in a more abstract sense, and how things like dynamic obstacles are likely to move. Autonomous cars do this routinely, but for a drone with severe size and power constraints it’s a much bigger challenge, though one that I’m pretty sure Exyn will figure out.
