Cybernetics usually refers to humans enhancing themselves with robotic parts. Sometimes we hear about animal-robot or insect-robot cyborgs. It’s not all that often that we hear about plant-robot cyborgs, because what’s a plant going to do with a robot, right? But you could argue that plants have the most to gain from robotic enhancements, because otherwise (with a few totally cool exceptions) plants aren’t capable of mobility or manipulation at all.
It’s straightforward to see how mobility and manipulation could be useful for plants, but the real question is, How do you get a plant to tell its robotic parts what to do? At the MIT Media Lab, Harpreet Sareen is trying to figure this out, and Elowan the mobile cybernetic plant is just the first in “a series of plant-electronic hybrid experiments.”
Elowan is an attempt to demonstrate what augmentation of nature could mean. Its robotic base forms a new symbiotic association with the plant: the agency of movement rests with the plant itself, based on its own bio-electrochemical signals, which serve as the language interfacing it with the artificial world. Plants produce these weak electrical signals in response to environmental stimuli, and the signals accompany physiological variations such as elongation growth, respiration, and moisture absorption. In this experimental setup, electrodes are inserted into the regions of interest (stem and ground, leaf and ground); the weak signals are then amplified and sent to the robot, which moves in the corresponding direction.
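To make the idea concrete, here’s a minimal sketch of what a plant-in-the-loop control pipeline like this might look like. This is not the actual Elowan code (which hasn’t been published here); the gain, smoothing window, and threshold values are all hypothetical, and the left/right mapping is an assumption for illustration. The shape of the pipeline, though, follows the description above: amplify a weak electrode signal, filter out noise, and turn it into a drive command.

```python
# Hypothetical sketch of a plant-in-the-loop controller, NOT the actual
# Elowan implementation: electrode readings are amplified, smoothed,
# and thresholded into a left/right/stop drive command.

def amplify(samples, gain=1000.0):
    """Scale weak electrode voltages (on the order of millivolts)."""
    return [s * gain for s in samples]

def smooth(samples, window=4):
    """Simple moving average to suppress noise spikes."""
    return [
        sum(samples[max(0, i - window + 1): i + 1]) / min(window, i + 1)
        for i in range(len(samples))
    ]

def drive_command(signal, threshold=0.5):
    """Map the latest smoothed sample to a motion command.

    The left/right assignment is arbitrary here; a real system would
    calibrate which signal polarity corresponds to which direction.
    """
    latest = signal[-1]
    if latest > threshold:
        return "left"
    if latest < -threshold:
        return "right"
    return "stop"

# Simulated electrode samples (volts): a small positive deflection,
# the sort of weak signal the electrodes on the stem might pick up.
raw = [0.0005, 0.0008, 0.0012, 0.0011, 0.0009]
cmd = drive_command(smooth(amplify(raw)))  # -> "left"
```

In a real setup the sampling, amplification, and filtering would happen in analog or microcontroller hardware, but the logic is the same: the plant’s signal, not a light sensor on the robot, decides where the base goes.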
Such symbiotic interplay with the artificial could be extended further with exogenous extensions that provide nutrition, growth frameworks, and new defense mechanisms.
The difference between this plant-robot hybrid and others we’ve seen in the past is that the plant is actually in control: The robotic base moves where the plant wants it to, to the extent that (a) plants want things, (b) the plant is able to communicate that, and (c) we’re able to interpret it correctly. So it’s not just that the robot part is like, “Oh, there’s some light over there, plants like light, let’s go over to the light,” because that would be completely independent of the plant itself. Instead, the system measures signals from the plant and takes direction from those. Whether it’s the right direction isn’t necessarily clear, but at least the plant is in the loop somewhere, rather than being just a passenger.
While the intent here is to give the plant some agency of its own, the practical result is still a robot with a plant on it that chases light. That’s a pretty safe thing for the robot to do, I suppose, but are plants more nuanced than that, and if so, is it something that robots could eventually detect and respond to? My dying houseplants really, really hope so.
[ MIT Media Lab ]
Evan Ackerman is a senior editor at IEEE Spectrum. Since 2007, he has written over 6,000 articles on robotics and technology. He has a degree in Martian geology and is excellent at playing bagpipes.