Given the current state of computer science and robotics, it's hard to understand how "the singularity" meme has become lodged in the serious discourse of the technosphere. This is the idea that, as a consequence of exponentially accelerating technological innovation and continuously self-improving artificial intelligence, computer power will outstrip human brainpower, leading to the end of human culture as we know it. Not a century from now, mind you, but somewhere between 2030 and 2045, depending on whom you talk to.
The concept was framed in its most tech-savvy form by computer scientist and science-fiction writer Vernor Vinge in 1983 in Omni magazine. It has since morphed into a complicated "theory" that for some, notably prolific inventor Ray Kurzweil, includes a posthuman afterlife in which we abandon our biological selves and are uploaded into digital and possibly robotic vessels, there to spend eternity as cybernetic Methuselahs. It is also thought by its followers to be inevitable, not merely one of many possible future scenarios.
What we found is that there's a lot of hyperbole distracting us from the real work under way in nanotechnology, brain implants, and machine learning. Researchers are, with some success, making machines more intelligent and responsive to solving real-world problems. The explosion of disciplines involved in these pursuits gives you some sense of their complexity. Robotics departments have now added "developmental," "epigenetic," or "evolutionary" to their names; control and systems engineering is becoming more and more intelligent; AI is coursing through the blood of embodied cognitive science.
But we’re still a very long way from understanding how consciousness arises in the human brain, let alone figuring out how to re-create it in a machine. We’re even a long way from the much simpler goal of creating autonomous, self-organizing, and perhaps even self-replicating machines.
Simple locomotion—like walking—has only recently been conquered by roboticists. And there’s still a lot of work to be done to integrate walking with other functions, like seeing and hearing. An example that’s been causing a sensation on YouTube is Boston Dynamics’ DARPA-funded robotic dog duo, BigDog and LittleDog. They are the braindogs of IEEE member and company founder Marc Raibert, who figured out in the 1980s that robotic running could be controlled by a few decoupled control laws.
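Raibert's insight can be illustrated with a toy sketch. The following Python is not Boston Dynamics' code; it's a hypothetical illustration of the idea that a running machine can be governed by a few decoupled control laws, each regulating one quantity on its own: foot placement sets forward speed, leg thrust sets hopping height, and a hip servo keeps the body level. All function names and gains here are invented for illustration.

```python
def foot_placement(speed, target_speed, stance_time, k_speed=0.05):
    """Forward-speed law: place the foot at the 'neutral point' to hold
    speed, ahead of it to slow down, behind it to speed up."""
    neutral = speed * stance_time / 2.0
    return neutral + k_speed * (speed - target_speed)

def thrust(height, target_height, k_height=10.0):
    """Hopping-height law: add leg thrust at liftoff when the last
    hop fell short of the target apex height."""
    return max(0.0, k_height * (target_height - height))

def hip_torque(pitch, pitch_rate, kp=40.0, kd=4.0):
    """Body-attitude law: a simple PD servo on pitch during stance
    keeps the body upright, independent of the other two laws."""
    return -kp * pitch - kd * pitch_rate
```

The point is the decoupling: none of the three laws needs to know about the other two, which is what made running tractable with the modest computers of the 1980s.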
BigDog, which actually looks more like a large headless spider that's been chopped in half, is designed to help soldiers in the field carry heavy equipment and supplies across rocky terrain. It can run, walk, and climb and is able to right itself after stepping into a hole or tripping over a branch, but it can't figure out how to avoid these obstacles—at least not yet.
LittleDog, on the other hand, is learning to "see" its environment before taking a walk. Software and sensors help the robot evaluate the surface it's about to step on so it can decide where to go next. The interesting thing, as our journalism intern Sally Adee noted in her blog post about LittleDog, is just how much processing time it takes the robopup to make "walk here, not there" decisions. It's certainly not ready for Tokyo's busy crosswalks.
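The kind of foothold evaluation described above can be sketched in a few lines. This is a hypothetical illustration, not LittleDog's actual software: terrain difficulty is reduced to a cost map, and the planner scores every reachable cell before choosing where to step. The data structure and costs are invented for the example.

```python
def best_foothold(cost_map, current, reach=1):
    """Pick the reachable terrain cell with the lowest cost.

    cost_map: dict mapping (x, y) grid cells to terrain difficulty.
    current:  the (x, y) cell the foot is on now.
    """
    x0, y0 = current
    candidates = [
        (x0 + dx, y0 + dy)
        for dx in range(-reach, reach + 1)
        for dy in range(-reach, reach + 1)
        if (dx, dy) != (0, 0) and (x0 + dx, y0 + dy) in cost_map
    ]
    # Scoring every candidate against the terrain model, for every
    # step of every leg, is why each "walk here, not there" decision
    # takes so much processing time.
    return min(candidates, key=lambda c: cost_map[c])

# A tiny invented terrain: flat rock is cheap, a loose patch is costly.
terrain = {(0, 0): 0.1, (1, 0): 0.2, (0, 1): 0.9, (1, 1): 0.4}
```

Calling `best_foothold(terrain, (0, 0))` on this map steers the foot toward the flat cell at (1, 0) and away from the loose patch at (0, 1).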
This is hard scientific and technical work, and it involves the deep understanding and use of dozens of biological and electromechanical systems. The ability to remain upright while avoiding potholes is important, but it’s just an infinitesimal step along the road to anything resembling human consciousness.
Wireless communication, ubiquitous computing, nanotechnology, distributed sensing, and embedded systems are going to converge and deliver wonders. Electronic prosthetic devices and biopharmaceuticals will help us correct or expand our physical and mental capabilities. Ultimately, we may even learn enough about consciousness to re-create it in a machine and create artificial vessels for our own minds. But with all we have to do over the next 30 to 40 years, we don't expect to be hitting the "Upload to digital heaven" button anytime soon.
Do you? Let us know.