Automaton iconAutomaton

Unstoppable Robot Eats Landmines for Breakfast

The Digger D-3 is the most recent addition to my own personal list of robots not to stand in front of. It's a mine-clearing robot, and not the sort of mine-clearing robot that pokes around with a metal detector. Instead, it's the sort of mine-clearing robot that just sucks it up and tells the landmines to bring it.

At the front of the D-3 is a giant spinning metal pulverizer thing of death, which has tungsten hammers that beat down a quarter meter into the ground, turning everything they touch into mulch. This includes landmines, and although the mines do tend to blow up before getting shredded, the robot hardly seems to notice:

An operator commands this beast from a safe distance using a remote control unit. The hull of the robot is made up of hardened steel plates in a "V" shape to help limit any damage from antitank mines and unexploded shells of sizes up to 81mm, and the D-3 has been able to successfully ingest mines containing as much as 8 kilograms of explosive, which is nothing to sneeze at. The only potentially vulnerable spots are the air intakes, which are themselves protected from flying shrapnel by special grates. At full throttle, the D-3 can reliably clear a comforting 100 percent of landmines from the ground at a rate of 1,000 square meters per hour [about 10,000 square feet per hour], while also divesting the land of any unwanted shrubbery and unlucky mole colonies.

Despite all the protection, machines do break down on occasion, and Digger has taken the somewhat unusual step of making the robot as easy as possible for other people to repair. The guts of the robot are straightforward to access, the armor has been designed to be easy to weld, and Digger even provides plans so that if you have the means, you can build your own spare parts. The reason for doing this is that Digger wants the D-3 to be able to make a difference in far-flung communities crippled by the threat of landmines, and to do that, you need an extremely reliable robot.

The future for the D-3 likely lies in some form of limited autonomy, but don't worry: The people who actually end up using this thing don't like the idea of it being fully autonomous any more than you do. Expect it to eventually be able to obey pretty specific instructions like "go here," as opposed to commands like "hey, why don't you find a spot where you think there might be landmines, beat it into a pulp, and come back when you're done."

[ Digger D-3 ] via [ Robots Podcast ]

Smithsonian Snaps Up Nine Historic Robots from Sandia National Labs

While thinking about robotics as a still-emerging field, as we do, we don't often stop to consider how even the relatively recent past has significant historical relevance. Fortunately, this is the job of the Smithsonian Institution, and they seem to be very proactive about it, having just acquired nine robots from Sandia National Labs for their permanent collection.

The robots in the above picture include MARV (Miniature Autonomous Robotic Vehicle), a design from 1996 that used mostly commercial parts and measured only about one inch square [about 6.5 square centimeters]. MARV was one of the first robots to really tackle miniaturization head on, and it inspired all kinds of tiny little descendants, including Sandia's own dime-sized tank.

Also heading to the Smithsonian are SIR, a robot that could navigate through a building autonomously in 1985 (on the left), Dixie, a reconnaissance robot from 1987 (at the back), and some of those crazy hopping robots.

It's fun to think about what robots that we have around us right now are likely to find a place in the Smithsonian's collection within a decade or two... After five seconds of thought (which means I'm missing all kinds of slightly less obvious but equally worthy choices), I'd have to put my money on a Roomba, PR2, Keepon, a Predator, and Wall-E. What do you think?

Via [ Sandia ]

Stanford's 'JediBot' Tries to Kill You With a Foam Sword

If there was one bad thing about those lightsaber-wielding robots from Yaskawa that we saw at ICRA, it was that you couldn't bust out your own lightsaber and jump in the middle of the fight. A paper also presented at ICRA showed us robots swinging swords in simulation against humans, but without much in the way of physical combat. Now a student project at Stanford has put these two brilliant ideas together and come up with "JediBot," a robot arm that will actually try to kill you with a foam sword:

"The robot applies quite a bit of force." Get it? Force? Yeah!

This project was part of Stanford's three-and-a-half-week-long "Experimental Robotics" course, which, from the sound of things, is basically just an excuse for students to mess around with robots to get them to do cool stuff. Also developed as part of the course were a robot that plays golf, several robots that draw, and a robot that can make hamburgers and then drown them in ketchup for you:

[ Stanford Robotics ] via [ Stanford News ]

Robot Helps Quadriplegic Scratch an Itch for the First Time in a Decade

We love watching PR2 fold laundry, play pool, bake cookies, and bring us beer, but robots with the capability to do the same kinds of things that humans can do aren't around just to take over for us when we're feeling lazy. Robots also exist to do things that humans can't do, whether that's making fast and precise movements, defusing bombs, or lending a gripper to a person with a disability.

Henry Evans, the dude in the above video, has been a quadriplegic for the last ten years, having suffered a stroke when he was just 40 years old. He saw a PR2 on TV last year, and thought that a robot might be a handy thing to have around the house to help him live a bit more independently. Georgia Tech's Healthcare Robotics Lab and Willow Garage have been collaborating with Henry since then, and he's been able to use a PR2 to do things like shave himself and scratch itches when he has them, things for which Henry has been dependent on other people for the last decade.

Part of what makes the PR2 ideal for this sort of thing is its high-level autonomous capabilities. Using a head tracker, Henry can give the robot commands to navigate to specific locations or fetch objects, and the PR2's sensors and software handle the rest. Of course, it's not realistic to hope that every disabled person will be able to one day get a PR2 (each costs $400,000). What is realistic (I hope) is that what Willow Garage and Georgia Tech are learning here will help them to design better software and hardware for the next generation of home service and healthcare robots, which will be affordable enough for more people to have them.

This project is an important reminder that while most of us are hoping that robots will at some point step in and make our lives easier and more convenient, most of us actually don't really need robots. Some people do need them, though, and it's great to see companies and research groups with so much expertise in this area working to make robots available where they have the potential to do the most good.

[ Willow Garage ]

[ Georgia Tech Healthcare Robotics ]


Senator Calls Robot Projects Wasteful
Tue, June 14, 2011

Blog Post: A U.S. senator criticizes the NSF for squandering "millions of dollars on wasteful projects"

PR2 Robot Learning To Bake Cookies
Fri, June 10, 2011

Blog Post: Push a single button and have your PR2 bot bake you a fresh batch of cookies

PR2 Does The Impossible, Folds Towels
Wed, March 31, 2010

Blog Post: PR2 may not be the fastest at folding towels, but the fact that it can do it entirely autonomously is nuts

Willow Garage PR2 Robot Now Plays Pool
Wed, June 16, 2010

Blog Post: After learning how to open doors, plug itself into wall outlets, and fold towels, the robot now can play pool

Polish Robot Coaxes Expressiveness Out of Weird Design

We're always impressed by how much expressiveness and emotion can be squeezed out of even the simplest robot faces if they're cleverly done, and Emys (for "emotive head system"), a robot from the Wroclaw University of Technology in Poland, is a fantastic example. Just watch:

Yeah, I didn't entirely get all that either, but that "surprise" face is priceless. For a less, um, dramatic run-through of all of the expressions that Emys can make, there's another video here.

Emys is part of the LIREC Project, which is a European research project that's "exploring how to design digital and interactive companions who can develop and read emotions and act cross-platform." In short, they're trying to figure out how to make robots a little more fun to hang out with, by giving them some tools to tell how you're feeling, and giving you an expressive face (of sorts) to look at.

This disembodied head also comes with a fancy wheeled Segway-style body called FLASH, and there's even an arm. Just one arm, yeah, but that's enough to shake hands and give a thumbs-up, and who could want anything more than that?

[ Emys ] via [ Telegraph ] and [ Robot Living ]

Freaky Robot Mouth Learns to Sing

Professor Hideyuki Sawada from Kagawa University in Japan was at Robotech 2011 showing off that incredibly bizarre robot mouth of his. It's based as closely as possible on a human mouth, complete with an air pump for lungs, eight fake vocal cords, a silicone tongue, and even a nasal resonance cavity that opens and closes. Like other robot mouths, it uses a microphone to listen to itself speak (or whatever you want to call it) and analyze what it hears to try to figure out how to be more understandable and less, you know, borderline nightmarish.

I know, there wasn't a demo in that vid. But I've got one right here for you, of this robot attempting to sing a Japanese children's song called "Kagome Kagome." You can hear what it's supposed to sound like over on Wikipedia before or after you listen to the robot have a go, but either way, you're not gonna recognize much. The action starts at about 30 seconds in:

Wonderful. Don't get me wrong, on principle this is some undeniably fascinating stuff. I have to wonder, though, whether the effort it would take to get this thing into a humanoid robot would really pay off relative to a voice synthesis system based on software and speakers. I guess there might be other advantages to a bionic mouth, but I'll leave the speculation up to you.

[ Kagawa University ] via [ Akihabara News ]

Do Robots Take People's Jobs?

Last month, President Barack Obama announced the National Robotics Initiative, a major program to develop next-generation robots for manufacturing, healthcare, and other areas. The robotics community received the new initiative with enthusiasm, but some observers expressed concern about an expansion in automation, raising a perennial question in robotics: Do robots take people’s jobs?

“The real purpose of automating manufacturing is to eliminate skilled workers and replace them with low paid button pushers—preferably offshore,” commented one IEEE Spectrum reader who’s worked as a control engineer for 25 years. Said another: "As jobs at all levels, from McDonald's to college-educated knowledge-workers, are increasingly automated, there will be more unemployment." Other readers voiced similar concerns.

To hear what the pro-robots camp has to say, I spoke to John Dulchinos [photo, right], president and CEO of Adept Technology, the largest U.S. industrial robotics company. Adept, based in Pleasanton, Calif., offers a variety of robotics products, including SCARA, parallel, linear, and mobile robots. Dulchinos, a mechanical engineer by training, says he became interested in robots during college. “A publication by IEEE got me into robotics,” he says. “It talked about the personal robotics revolution, how it was going to be bigger than the computer industry, and I said, I want to go into robots.”

Dulchinos says that automation, though it might take some people’s jobs in the short term, is essential for keeping companies competitive, and thus able to expand and hire more workers. That's why more and more companies in industries as varied as food packaging and electronics manufacturing are embracing robots.

“If you look out far enough, machines are going to win," he says. "The human body is not a machine. It wears out. It was not designed to be a factory machine. It was designed to be a thinking machine."

Dulchinos believes that "robotics is going to be one of the transformative technologies of the 21st century." The entire robotics industry is only a $5 billion market today, he says, but according to some estimates it will grow to $100 billion by 2020. He envisions future domestic robots helping people at home and factory robots that are not job takers but rather robotic assistants that work alongside human workers. (Just don't let a robot borrow your iPhone.)

Read my full interview with Dulchinos below, in which he discusses how countries like Germany and China are using robots to improve manufacturing, and how a new generation of smart factory robots could do the same in the United States.

Erico Guizzo: Let’s start with a question that comes up again and again: Do robots take people’s jobs?

John Dulchinos: Robots are not the enemy. It’s low-cost labor that’s the enemy. If you want to look at where jobs are going, it’s not robots taking people’s jobs; it’s entire companies and industries moving overseas. Robots elicit an emotional response from people because they are personified as people, but really what robots do is they enhance productivity and they free people up to do other tasks. In a global economy where cost rules, the only way for Western countries to be able to compete effectively against low-cost labor markets is through productivity gains, and robots are one way to achieve that.

Let me give you some background: In the last 15 years the United States has lost somewhere between 2 and 3 million jobs in manufacturing. And in that period of time, China has grown to surpass every other country now except the United States in total manufacturing capacity; in fact, in the next year or two, China is expected to surpass even the United States. Germany has actually grown its manufacturing population. Germany and Japan have the highest density of robots [number of industrial robots per 10,000 manufacturing workers]. And Germany has used robots to grow their manufacturing employee base, because they’ve been able to be competitive and bring manufacturing plants back to Germany. With that comes not only the manufacturing jobs but all the other indirect jobs as well.

So do robots take jobs away? I’m asked that question many times. I would draw some comparisons to other industries. In 1900, about 50 percent of the U.S. population was in farming. Today it’s less than 5 percent. Yet our output is far greater than it was in 1900. And while you’d argue that tractors and cotton mills and other machines eliminated a number of farming jobs, the reality is that the mechanization enhanced the productivity of farming to a point where people went into manufacturing, into engineering, into a variety of industries that spun the industrial age and then the information age in the United States. Had that mechanization not occurred in farming, we wouldn’t have much of the advances we had in the 20th century, because we’d all be on the farms producing food.

EG: Still, some argue that companies can use robots to become more productive and even expand, but in the end they’ll need fewer workers…

JD: Let me ask you something: Do computers take jobs?

EG: That’s a good question. There was a big debate in the 1980s and ’90s about computers replacing office workers and whether or not productivity was increasing.

JD: What happened to all the secretaries in the 1970s and 1980s? Where are they today? If you follow this logic that if a specific task is automated that means a person now goes to the unemployment line, then we would have tens or hundreds of thousands of secretaries in the unemployment line. And they are not. They've upgraded their skills, learned to use computers, and taken on more complex tasks as opposed to being relegated to menial ones. Computers have been quite an enabler. And robots are just a computer with an arm connected to it.

I can put it in numbers for you. Over the last 15 years, there’s probably been about 10,000 to 15,000 industrial robots deployed a year in the United States. So that's around 100,000 robots deployed from 1995 to 2010. If each robot replaced two people, which I think would be a stretch, but let’s say it did, that would be 200,000 people, or maybe if you want to be really aggressive, 400,000 or 500,000 people. We’ve actually lost millions of jobs in manufacturing over the last 15 years. Really what happened is, manufacturing is going away because the United States isn’t competitive in the global economy anymore.

And I’ll give you a scary thought. China last year grew to become the No. 4 market in industrial robots. In the next two years it will likely pass the United States in number of robots installed every year. And so if you want to extrapolate that, you'll realize that China, which has the lowest labor costs in the industrialized world, is putting in robots at a pace that is on par with the United States and soon will be faster than the United States. To me, that’s putting the Chinese manufacturing economy on steroids.

Adept CEO John Dulchinos with a mobile robot
"Move over, Dulchinos, I want to be CEO."

EG: Government agencies and companies have invested in robotics R&D for decades, but things seem to move so slowly. Why will this new National Robotics Initiative be able to help now? What’s different?

JD: Robots were pioneered in the United States 50 years ago. What happened is the United States didn’t embrace them. Japan did. Look at the significant gains that Japan made in the 1970s and ’80s in manufacturing. Those were all fueled by the creation of a very strong and vibrant domestic robotics industry.

What’s different about now is that robotic technology has gotten to a point in the research lab where there’s the potential for a new generation of smarter, more flexible, safer robots. This new generation of machines promises to expand the applications of robotics not only in manufacturing but also in healthcare, military, and domestic applications.

That has the potential to shake up the industry and create an inflection point where the United States can compete with Japan, Korea, Europe, and China—all of which are all spending much more money on robotics R&D than the United States—and gain a leadership position in next-generation robotics to enhance global competitiveness across a host of industries.

So if you look at who has the potential to win in the 21st century robotics race, government funding will be a factor in that race. It’s not the only one, for sure; in fact, I think the United States has better foundational technology to win that race, but it can’t without some help to try to level the playing field. The Obama initiative, which is a beginning, is a way to bring some focus and value that can push the U.S. robotics industry forward, and give companies a chance to compete fairly in the global economy.

EG: What are some technology barriers that you think should be attacked?

JD: It starts at the sensing level. A robot is a programmable machine, but it’s a dumb machine if it can’t be aware of its environment and adapt to its environment. Sensing technologies like vision, which gives the robot the ability to perceive and see things around it, need to improve a lot. So does tactile sensing, so robots can feel objects as well as or better than a person does. And we need not only better sensors, we need to make them cheaper, so we can use them in applications that aren’t cost effective today.

Another area for improvement is collaborative robots: robots that can work safely alongside people. The robot of the future is not a job taker; it’s a job assistant. So think about what a robot is going to do in the future. You’ll have, in a production facility, people and robots intermixed together doing tasks. Robots are going to allow people to be more effective. You see this already in surgical robots. These robots are not taking doctors' jobs away—they’re making doctors more efficient, enabling them to perform a surgery with a higher level of precision and consistency.

Lastly, we also need new materials, new motor technologies, and safety strategies. I think that once robots reach a certain cost point, you'll have a robot in your home, doing laundry for you, helping with vacuuming and cleaning, or imagine robots at hospitals delivering supplies to the OR. These applications require robots that are smarter, more adaptable to their environment, more able to work alongside people and assist people. I think those are the technologies that could come out of a more focused effort in terms of R&D in robotics.

China is putting in robots at a pace that is on par and soon will be faster than the United States. To me, that’s putting the Chinese manufacturing economy on steroids.
—John Dulchinos

EG: Companies like yours do invest in R&D, I’m sure, and here comes the government funding universities and other businesses that may become your competitor. Aren’t you upset?

JD: Yes, but I’m more upset that Japan has been subsidizing my Japanese competitors and Korea is subsidizing my Korean competitors. I will give you an example. This is how it plays out for a company like mine. We manufacture robots all around the world. We produce some of them here, some in Japan, some in Europe. And the motors that I use, the best motors for my SCARA robots are made in Japan. I am at a competitive disadvantage in the purchase of those motors because I am a U.S. company; all of my Japanese competitors get much better pricing on those motors because they’re in the Japanese community. So I’m competing on an unlevel playing field, and there’s a lot of competition coming from places like Japan for two reasons: One, Japan was a big market for robots many years ago; and two, there was government investment.

So what I think the government can do—and I would say that the last thing that this industry needs is a handout from the government—what I’m looking for the Obama administration and the government to help with is to create a domestic robotics market here. We’re looking to build a coordinated activity that allows us to take some of the barriers down. Another barrier is regulation. To allow people to create new-generation robotics companies without having to worry about fighting a lot of regulatory red tape or being sued over early applications, we need to take down some of the regulatory barriers. One of the big areas is to define the environment for how robots and people will work together. Robots get held to a completely different standard than other machinery in terms of safety. The United States could take a leadership role in defining how robots and people can coexist in a safe manner and enable this assistive robotics environment to expand.

It goes all the way to cars. Google demonstrated earlier this year that they drove a robotic car over 100,000 miles, and that allowed them to prove that an autonomous vehicle has the potential to be on highways. There are 40,000 people a year who die in car accidents, and that could be significantly reduced if we automated the highways and had robotic cars driving. Paving the way for that kind of technology is, again, one example of how the government could help.

EG: What’s the most exciting emerging market for Adept? And how do you see manufacturing robots evolving?

JD: I think packaging is a very exciting area. The jobs are challenging, margins are very thin, throughputs are very high. A typical poultry plant might have 50 percent turnover on an annual basis. You think people like those jobs? You think people are lining up for those jobs? We see jobs where a person is picking up 10-pound cylinders of meat and putting them into a box, 30 to 40 times a minute, eight hours a day. You go pick up a 10-pound weight and put it into a box and see how long you last. The other thing that’s driving that market is the crackdown on immigration. Those jobs are all being filled by illegal aliens. And why do you think that is? Because you can’t get people to do those jobs anymore; they hate them.

Another exciting area is electronics. The robots we sell go into disk drive factories, for example. When you’re putting together a disk drive, one of the biggest enemies is contamination. The biggest source of contamination is people. People are dirty. When a disk drive manufacturer decides to automate a factory, it’s not because they want to get rid of labor per se, it’s because people can’t do the job at the level of cleanliness and yield that is required to be competitive in that space.

EG: What about more complex electronics? We hear about problems involving workers assembling iPhones in high volume. Will robots be able to fully assemble a smartphone?

JD: There are certain jobs that require dexterity at a level that robots don’t have yet. But they’re getting better and they’ll get there. The question is: What do we need to do to have high-tech electronics manufactured here in the United States? Let me tell you a story about cellphones. In the mid-1990s, one of Adept’s largest markets was actually selling robots to cellphone manufacturers. That was when the market was a couple hundred million cellphones sold a year. Fast forward 15 years. There isn’t a single cellphone built in the United States anymore. And I can point to those old factories—all those people are gone. All the production is sitting in China right now, done largely by a mix of Chinese labor and automation.

EG: All the robots you sold to cellphone manufacturers in the 1990s are not being used?

JD: No. The problem was that robots weren’t flexible enough to be able to automate those factories. The product lifecycles were so short that with the level of technology in the mid-1990s, it was difficult to automate those lines with robotics at a level that would be cost justifiable. So that’s a great example of where, had robots been capable at the time, we would have a cellphone manufacturing industry today in the United States. But because robots weren’t capable enough, the market went to low-cost labor in China. And now China is starting to use robots to automate their factories. China owns the industry, and no one is going to be able to take it back.

So how can robots help moving forward? What robots enable, and this is what’s exciting about them, is that if you can build a domestic market, robots can help equalize manufacturing costs anywhere in the world. And then what really matters is infrastructure costs, the regulatory, tax, and business environment, and where the market is. That’s what determines where you’re going to build a factory, not just where the labor costs are cheaper.

So I think we’re at a very exciting point. This is a chance over the next decade for the United States to regain leadership in robotics. And with a little bit of focus from the government to help push some technologies forward and push the regulatory environment in the right direction, I think that the United States can win this race. Success 10 years from now will be that there’s a number of U.S. robotics companies and they’re leaders in creating next-generation robots that permeate our everyday lives.

This interview has been edited and condensed.

Images: Adept

Virginia Tech's RoMeLa Rocks RoboCup 2011

CHARLI-2 and DARwIn-OP robots relax in front of the Louis Vuitton Humanoid Cup

It's been a wild week at RoboCup 2011 in Istanbul, but we've got results for you: Virginia Tech's RoMeLa has emerged victorious in both the KidSize and AdultSize leagues, and they're bringing home the coveted (and stylish) Louis Vuitton Humanoid Cup for Best Humanoid for their AdultSize robot, CHARLI-2. This is big news, since Europe and Asia have historically dominated the RoboCup competitions, and in fact this'll be the very first time that the Cup (pictured above) has made it to the United States.

Dr. Dennis Hong (center) rewards a DARwIn-OP robot for a job well done

We'll have more details for you in the coming weeks from RoMeLa, their Team DARwIn partners at UPenn, and the other RoboCup teams, but for now, they all deserve a little break. Don't worry, though: we've got a bunch of video including RoMeLa's team of home grown DARwIn-OP humanoids in the finals, CHARLI-2's final match, and footage of the non-humanoid competitions as well (definitely don't miss the Middle Size final). Once again, congrats to Dr. Dennis Hong and the entire RoMeLa team (and their robots) for an impressive performance.

KidSize Final Match: Team DARwIn vs. CIT Brains (Japan)

AdultSize Final Match: Team CHARLI vs. Singapore Polytechnic University's ROBO-ERECTUS

Middle Size Final Match: Team Tech United (Netherlands) vs. Team Water (China)

Small Size Final Match: Team Skuba (Thailand) vs. Team Immortals (Iran)

Standard Platform Final Match: Team B-Human (Germany) vs. Team Nao Devils (Germany)

TeenSize Final Match: Team NIMBRO (Germany) vs. Team KMUTT Kickers (Thailand)

For more RoboCup 2011 video, check out both the BotSportTV and DutchRobotics YouTube channels.

[ RoboCup 2011 ]

[ RoMeLa ]

Robots Learn to Take Pictures Better Than You

Humans would like to believe that artistry and creativity can't be distilled down into a set of rules that a robot can follow, but in some cases, it's possible to at least tentatively define some things that work and some things that don't. Raghudeep Gadde, a master's student in computer science at IIIT Hyderabad, in India, has been teaching his Nao robot some general photography rules that he says enable the robot to "capture professional photographs that are in accord with human professional photography."

Essentially, he's simply taught his robot to do its best to obey both the rule of thirds and the golden ratio when taking pictures with its built-in camera. The rule of thirds states that it's best to compose a picture such that if you chop the scene up into nine equal parts (i.e. into thirds both vertically and horizontally), what you're focusing on should be located at one of the intersections of the dividing lines. And the golden ratio basically says that the best place for a horizon is the line you get when you split your scene into two rectangles, with one about 1.62 times as tall as the other. This all sounds like math, right? And robots like math.
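And the math really is that simple. Here's a minimal, hypothetical Python sketch of the two composition rules for a 640 x 480 frame; the function names and framing targets are my own illustration, not anything from the actual paper:

```python
def rule_of_thirds_points(width, height):
    """The four 'power point' intersections of the thirds grid,
    where the rule of thirds says the subject should sit."""
    xs = (width / 3, 2 * width / 3)
    ys = (height / 3, 2 * height / 3)
    return [(x, y) for x in xs for y in ys]

def golden_ratio_horizons(height, phi=1.618):
    """The two candidate horizon lines: each splits the frame into two
    rectangles, one about 1.62 times as tall as the other."""
    low = height / (1 + phi)          # larger rectangle above the line
    high = height * phi / (1 + phi)   # larger rectangle below the line
    return low, high

print(rule_of_thirds_points(640, 480))
# [(213.33..., 160.0), (213.33..., 320.0), (426.66..., 160.0), (426.66..., 320.0)]
print(golden_ratio_horizons(480))
# (183.34..., 296.65...)
```

A robot (or a camera app) just has to measure how far its detected subject and horizon land from these targets and score the framing accordingly.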

These aren't hard-and-fast rules, of course, and brilliant and creative photographers will often completely ignore them. But we don't need robots to be brilliant and creative photographers (and if they were, it would be a serious blow to our egos). It would just be helpful if they were decent at it, which is what this algorithm is supposed to do, although there's certainly no substitute for a human's ability to spot interesting things to take pictures of. That said, I imagine that it would be possible to program a robot to look for scenes with high amounts of color, contrast, or patterns, which (in a very general sense) is what we look for when taking pictures.

The other part to all this is that Nao has some idea of what constitutes a "good" picture from a more abstract, human perspective. Using an online photo contest where 60,000 pictures were ranked by humans, Nao can give the images it takes a quality ranking, and if that ranking falls below a certain threshold, the robot will reposition itself and try to take a better picture.
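That capture-score-retake loop can be sketched in a few lines of Python. Everything here is illustrative: the `Camera` class is a stand-in for the robot's camera and mobile base, and `quality_score` is a placeholder for the learned ranker, not the actual system:

```python
class Camera:
    """Stand-in for the Nao's camera and mobile base (hypothetical API).
    For this sketch, an 'image' is just the quality score its framing earns."""
    def __init__(self, framings):
        self._framings = iter(framings)

    def capture(self):
        return next(self._framings)

    def reposition(self):
        pass  # a real robot would shift its stance and reframe the scene

def quality_score(image):
    # Placeholder for the ranking model trained on the 60,000 human-ranked
    # contest photos; here the "image" already is its score.
    return image

def shoot_until_good(camera, threshold=0.7, max_attempts=5):
    """Capture a shot; if its predicted quality falls below the threshold,
    reposition and retake, up to max_attempts shots in total."""
    image = camera.capture()
    for _ in range(max_attempts - 1):
        if quality_score(image) >= threshold:
            break
        camera.reposition()
        image = camera.capture()
    return image

print(shoot_until_good(Camera([0.4, 0.55, 0.8])))  # 0.8
```

The threshold and attempt limit are knobs I've invented for the sketch; the key idea is just that the robot keeps moving and reshooting until its own critic is satisfied.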

If this method works out, there are all kinds of applications (beyond just robots) that it could be adapted to. For example, Google image searches could include a new filter that returns only images that don't completely suck. Or maybe the next generation of digital cameras will be able to explain exactly why that picture you just took was absolutely terrible, and then offer suggestions on how to do better next time.

UPDATE: Raghudeep sent us some example images to show how Nao autonomously reframes a photo after analyzing an initial shot. Note that the robot has a low-resolution camera that takes only 640 x 480 pictures.

Example 1, initial shot

Example 1, reframed shot

Example 2, initial shot

Example 2, reframed shot

Example 3, initial shot

Example 3, reframed shot

[ Raghudeep Gadde ] via [ New Scientist ]

Should We Automate Intimacy?

What would happen to intimacy if an acquaintance you casually communicate with could manage to know you better than a close family member?

As my birthday rolled around this year, I was struck by the diverse levels of automation involved in the greetings received. There was the totally computerized message (with coupon!) from the auto dealer; a card with a handwritten signature from the insurance agent, or, more likely, his secretary; Facebook greetings from friends who were alerted by the social network about my birthday; an e-card with a personalized note from a brother, reminded by his calendar program; and printed cards personalized by relatives, most reminded by traditional, print calendars. From my husband, I received an iPad, with an endearingly personalized card, and from my daughter, a photograph with an original poem, based on shared memories. Neither of these last two need reminders because they have my birth date stored in their memories -- at least they'd better!

My response reflected the value ascribed to these intimations of intimacy: I trashed the auto dealer and insurance agent cards without reading them; set the personal notes on the sideboard for a week before discarding; used the iPad to respond in kind to the electronic greetings, with my husband's card stored with other memorabilia; and hung my daughter's poem-inscribed photo on my wall.

Now, what would happen if people had software at their disposal capable of knowing you well enough to be able to craft very personal messages or even a birthday poem as good as your daughter's? What if people could use a robot, more eloquent than they could ever be, to deliver birthday messages? Or maybe even marriage proposals?

Intimacy is part of the “We think, therefore I am” hard-wiring described by Philippe Rochat in "Others in Mind." Rochat elucidates how we become conscious of our own existence mostly through recognition by others, and how we constantly struggle to reconcile our internal view of ourselves with what we perceive others reflecting. So to the extent that a poem about us conforms with our internal view, that view is validated; we feel the satisfaction of intimacy with an author who understands us. The question is, do we feel validated if that "someone" is not a person but the result of smart algorithms interacting with our email trail and social profiles? Already there have been experiments to create AI tools that would automatically post Twitter and Facebook updates just like you would, or that would work as your "cognitive assistants," completely taking care of organizing and prioritizing your information and communication activities.

Perhaps the real question, then, is a more complex one: As we become more dependent on computers and automation (and more addicted to social networks), are we evolving into persons joined by brain-computer interfaces? I say that metaphorically, but only for now, as advances in neural prostheses might turn that scenario into reality. And when that happens, will we become more like conjoined twins, sharing an amygdala? Would the e-cards from the insurance agent become input data in our cortices that we need to send to a mental spam folder? Or in the case of desired inputs, could AI algorithms generate messages that validate us and make us feel warm and fuzzy? Could Asimo deliver a better happy birthday message than your spouse could?

Society must weigh the benefits and dangers of automated intimacy. When I began writing this post, I felt wary of that concept. But as I think it through, I'm reconsidering. In AI and neurology, when a process is critical, common, and/or instantaneous, it's usually hardwired, automated. Does it make sense, then, as we become a communal being, that the process of personal validation, of recognition of each self that will make up Ourselves (I use the Royal group noun form purposely) should be automated and hardwired? I might be ready to answer "maybe," with this caveat from neuroscientist David Eagleman's "Incognito: The Secret Lives of the Brain" bestseller: "Evolve solutions, [but] when you find a good one, don’t stop."

I look forward to hearing others' ideas and feelings about automated intimacy.

Jeanne Dietsch, an IEEE member, is CEO and co-founder of MobileRobots in Amherst, N.H., and vice president of emerging technologies at Adept Technology. Her last post was about automated airport security.

Image: Honda



IEEE Spectrum's award-winning robotics blog, featuring news, articles, and videos on robots, humanoids, automation, artificial intelligence, and more.