Automaton

Polish Robot Coaxes Expressiveness Out of Weird Design

We're always impressed by how much expressiveness and emotion can be squeezed out of even the simplest robot faces if they're cleverly done, and Emys (for "emotive head system"), a robot from the Wroclaw University of Technology in Poland, is a fantastic example. Just watch:

Yeah, I didn't entirely get all that either, but that "surprise" face is priceless. For a less, um, dramatic run-through of all of the expressions that Emys can make, there's another video here.

Emys is part of the LIREC Project, which is a European research project that's "exploring how to design digital and interactive companions who can develop and read emotions and act cross-platform." In short, they're trying to figure out how to make robots a little more fun to hang out with, by giving them some tools to tell how you're feeling, and giving you an expressive face (of sorts) to look at.

This disembodied head also comes with a fancy wheeled Segway-style body called FLASH, and there's even an arm. Just one arm, yeah, but that's enough to shake hands and give a thumbs-up, and who could want anything more than that?

[ Emys ] via [ Telegraph ] and [ Robot Living ]

Freaky Robot Mouth Learns to Sing

Professor Hideyuki Sawada from Kagawa University in Japan was at Robotech 2011 showing off that incredibly bizarre robot mouth of his. It's based as closely as possible on a human mouth, complete with an air pump for lungs, eight fake vocal cords, a silicon tongue, and even a nasal resonance cavity that opens and closes. Like other robot mouths, it uses a microphone to listen to itself speak (or whatever you want to call it) and analyze what it hears to try to figure out how to be more understandable and less, you know, borderline nightmarish.

I know, there wasn't a demo in that vid. But I've got one right here for you, of this robot attempting to sing a Japanese children's song called "Kagome Kagome." You can hear what it's supposed to sound like over on Wikipedia before or after you listen to the robot have a go, but either way, you're not gonna recognize much. The action starts at about 30 seconds in:

Wonderful. Don't get me wrong, on principle this is some undeniably fascinating stuff. I have to wonder, though, whether the effort it would take to get this thing into a humanoid robot would really pay off relative to a voice synthesis system based on software and speakers. I guess there might be other advantages to a bionic mouth, but I'll leave the speculation up to you.

[ Kagawa University ] via [ Akihabara News ]

Do Robots Take People's Jobs?

Last month, President Barack Obama announced the National Robotics Initiative, a major program to develop next-generation robots for manufacturing, healthcare, and other areas. The robotics community received the new initiative with enthusiasm, but some observers expressed concern about an expansion in automation, raising a perennial question in robotics: Do robots take people’s jobs?

“The real purpose of automating manufacturing is to eliminate skilled workers and replace them with low paid button pushers—preferably offshore,” commented one IEEE Spectrum reader who’s worked as a control engineer for 25 years. Said another: "As jobs at all levels, from McDonald's to college-educated knowledge-workers, are increasingly automated, there will be more unemployment." Other readers voiced similar concerns.

adept ceo john dulchinos with industrial robot

To hear what the pro-robots camp has to say, I spoke to John Dulchinos [photo, right], president and CEO of Adept Technology, the largest U.S. industrial robotics company. Adept, based in Pleasanton, Calif., offers a variety of robotics products, including SCARA, parallel, linear, and mobile robots. Dulchinos, a mechanical engineer by training, says he became interested in robots during college. “A publication by IEEE got me into robotics,” he says. “It talked about the personal robotics revolution, how it was going to be bigger than the computer industry, and I said, I want to go into robots.”

Dulchinos says that automation, though it might take some people’s jobs in the short term, is essential for keeping companies competitive, and thus able to expand and hire more workers. That's why more and more companies in industries as varied as food packaging and electronics manufacturing are embracing robots.

“If you look out far enough, machines are going to win," he says. "The human body is not a machine. It wears out. It was not designed to be a factory machine. It was designed to be a thinking machine."

Dulchinos believes that "robotics is going to be one of the transformative technologies of the 21st century." The entire robotics industry is only a $5 billion market today, he says, but according to some estimates it will grow to $100 billion by 2020. He envisions future domestic robots helping people at home and factory robots that are not job takers but rather robotic assistants that work alongside human workers. (Just don't let a robot borrow your iPhone.)

Read below my full interview with Dulchinos, in which he discusses how countries like Germany and China are using robots to improve manufacturing, and how a new generation of smart factory robots could do the same in the United States.

Erico Guizzo: Let’s start with a question that comes up again and again: Do robots take people’s jobs?

John Dulchinos: Robots are not the enemy. It’s low-cost labor that’s the enemy. If you want to look at where jobs are going, it’s not robots taking people’s jobs; it’s entire companies and industries moving overseas. Robots elicit an emotional response from people because they are personified as people, but really what robots do is they enhance productivity and they free people up to do other tasks. In a global economy where cost rules, the only way for Western countries to be able to compete effectively against low-cost labor markets is through productivity gains, and robots are one way to achieve that.

Let me give you some background: In the last 15 years the United States has lost somewhere between 2 and 3 million jobs in manufacturing. And in that period of time, China has grown to surpass every other country now except the United States in total manufacturing capacity; in fact, in the next year or two, China is expected to surpass even the United States. Germany has actually grown its manufacturing population. Germany and Japan have the highest density of robots [number of industrial robots per 10,000 manufacturing workers]. And Germany has used robots to grow their manufacturing employee base, because they’ve been able to be competitive and bring manufacturing plants back to Germany. With that comes not only the manufacturing jobs but all the other indirect jobs as well.

So do robots take jobs away? I’m asked that question many times. I would draw some comparisons to other industries. In 1900, about 50 percent of the U.S. population was in farming. Today it’s less than 5 percent. Yet our output is far greater than it was in 1900. And while you’d argue that tractors and cotton mills and other machines eliminated a number of farming jobs, the reality is that the mechanization enhanced the productivity of farming to a point where people went into manufacturing, into engineering, into a variety of industries that spurred the industrial age and then the information age in the United States. Had that mechanization not occurred in farming, we wouldn’t have much of the advances we had in the 20th century, because we’d all be on the farms producing food.

EG: Still, some argue that companies can use robots to become more productive and even expand, but in the end they’ll need fewer workers…

JD: Let me ask you something: Do computers take jobs?

EG: That’s a good question. There was a big debate in the 1980s and ’90s about computers replacing office workers and whether or not productivity was increasing.

JD: What happened to all the secretaries in the 1970s and 1980s? Where are they today? If you follow this logic that if a specific task is automated that means a person now goes to the unemployment line, then we would have tens or hundreds of thousands of secretaries in the unemployment line. And they are not. They've upgraded their skills, are using computers, and learned to do more complex tasks as opposed to being relegated to menial tasks. Computers have been quite an enabler. And robots are just a computer with an arm connected to it.

I can put it in numbers for you. Over the last 15 years, there’s probably been about 10,000 to 15,000 industrial robots deployed a year in the United States. So that's around 100,000 robots deployed from 1995 to 2010. If each robot replaced two people, which I think would be a stretch, but let’s say it did, that would be 200,000 people, or maybe if you want to be really aggressive, 400,000 or 500,000 people. We’ve actually lost millions of jobs in manufacturing over the last 15 years. Really what happened is, manufacturing is going away because the United States isn’t competitive in the global economy anymore.

And I’ll give you a scary thought. China last year grew to become the No. 4 market in industrial robots. In the next two years it will likely pass the United States in number of robots installed every year. And so if you want to extrapolate that, you'll realize that China, which has the lowest labor costs in the industrialized world, is putting in robots at a pace that is on par with the United States and soon will be faster than the United States. To me, that’s putting the Chinese manufacturing economy on steroids.

adept ceo john dulchinos with mobile robot
"Move over, Dulchinos, I want to be CEO."

EG: Government agencies and companies have invested in robotics R&D for decades, but things seem to move so slowly. Why will this new National Robotics Initiative be able to help now? What’s different?

JD: Robots were pioneered in the United States 50 years ago. What happened is the United States didn’t embrace them. Japan did. Look at the significant gains that Japan made in the 1970s and ’80s in manufacturing. Those were all fueled by the creation of a very strong and vibrant domestic robotics industry.

What’s different about now is that robotic technology has gotten to a point in the research lab where there’s the potential for a new generation of smarter, more flexible, safer robots. This new generation of machines promises to expand the applications of robotics not only in manufacturing but also in healthcare, military, and domestic applications.

That has the potential to shake up the industry and create an inflection point where the United States can compete with Japan, Korea, Europe, and China—all of which are spending much more money on robotics R&D than the United States—and gain a leadership position in next-generation robotics to enhance global competitiveness across a host of industries.

So if you look at who has the potential to win in the 21st century robotics race, government funding will be a factor in that race. It’s not the only one, for sure; in fact, I think the United States has better foundational technology to win that race, but it can’t without some help to try to level the playing field. The Obama initiative, which is a beginning, is a way to bring some focus and value that can push the U.S. robotics industry forward, and give companies a chance to compete fairly in the global economy.

EG: What are some technology barriers that you think should be attacked?

JD: It starts at the sensing level. A robot is a programmable machine, but it’s a dumb machine if it can’t be aware of its environment and adapt to its environment. Sensing technologies like vision, which gives the robot the ability to perceive and see things around it, need to improve a lot. So does tactile sensing, so robots can feel objects in the same way or better than a person. And we need not only better sensors, we need to make them cheaper, so we can use them in applications that aren’t cost effective today.

Another area for improvement is collaborative robots. Robots that can work safely alongside people. The robot of the future is not a job taker; it’s a job assistant. So think about what a robot is going to do in the future. You’ll have, in a production facility, people and robots intermixed together doing tasks. Robots are going to allow people to be more effective. You see this already in surgical robots. These robots are not taking doctors' jobs away—they’re making doctors more efficient, enabling them to perform a surgery with a higher level of precision and consistency.

Lastly, we also need new materials, new motor technologies, and safety strategies. I think that once robots reach a certain cost point, you'll have a robot in your home, doing laundry for you, helping with vacuuming and cleaning, or imagine robots at hospitals delivering supplies to the OR. These applications require robots that are smarter, more adaptable to their environment, more able to work alongside people and assist people. I think those are the technologies that could come out of a more focused effort in terms of R&D in robotics.


China is putting in robots at a pace that is on par and soon will be faster than the United States. To me, that’s putting the Chinese manufacturing economy on steroids.
—John Dulchinos

EG: Companies like yours do invest in R&D, I’m sure, and here comes the government funding universities and other businesses that may become your competitor. Aren’t you upset?

JD: Yes, but I’m more upset that Japan has been subsidizing my Japanese competitors and Korea is subsidizing my Korean competitors. I will give you an example. This is how it plays out for a company like mine. We manufacture robots all around the world. We produce some of them here, some in Japan, some in Europe. And the motors that I use, the best motors for my SCARA robots, are made in Japan. I am at a competitive disadvantage in the purchase of those motors because I am a U.S. company; all of my Japanese competitors get much better pricing on those motors because they’re in the Japanese community. So I’m competing on an unlevel playing field, and there’s a lot of competition coming from places like Japan for two reasons: One, Japan was a big market for robots many years ago, and two, there was government investment.

So what I think the government can do—and I would say that the last thing this industry needs is a handout from the government—what I’m looking for the Obama administration and the government to help with is to create a domestic robotics market here. We’re looking to build a coordinated activity that allows us to take some of the barriers down. Another barrier is regulation. To let people create a new-generation robotics company without having to fight a lot of regulatory red tape or worry about being sued over early applications, we need to take down some of the regulatory barriers. One of the big areas is to define the environment for how robots and people will work together. Robots get held to a completely different standard than other machinery in terms of safety. The United States could take a leadership role in defining how robots and people can coexist in a safe manner and enable this assistive robotics environment to expand.

It goes all the way to cars. Google demonstrated earlier this year that they drove a robotic car over 100,000 miles, and that allowed them to prove that an autonomous vehicle has the potential to be on highways. There are 40,000 people a year who die in car accidents, and that could be significantly reduced if we automated the highways and had robotic cars driving. Paving the way for that kind of technology is, again, one example of how the government could help.

EG: What’s the most exciting emerging market for Adept? And how do you see manufacturing robots evolving?

JD: I think packaging is a very exciting area. The jobs are challenging, margins are very thin, throughputs are very high. A typical poultry plant might have 50 percent turnover on an annual basis. You think people like those jobs? You think people are lining up for those jobs? We see jobs where a person is picking up 10-pound cylinders of meat and putting them into a box, 30 to 40 times a minute, eight hours a day. You go pick up a 10-pound weight and put it into a box and see how long you last. The other thing that’s driving that market is the crackdown on immigration. Those jobs are all being filled by illegal aliens. And why do you think that is? Because you can’t get people to do those jobs anymore; they hate them.

Another exciting area is electronics. The robots we sell go into disk drive factories, for example. When you’re putting together a disk drive, one of the biggest enemies is contamination. The biggest source of contamination is people. People are dirty. When a disk drive manufacturer decides to automate a factory, it’s not because they want to get rid of labor per se, it’s because people can’t do the job at the level of cleanliness and yield that is required to be competitive in that space.

EG: What about more complex electronics? We hear about problems involving workers assembling iPhones in high volume. Will robots be able to fully assemble a smartphone?

JD: There are certain jobs that require dexterity at a level that robots don’t have yet. But they’re getting better and they’ll get there. The question is: What do we need to do to have high-tech electronics manufactured here in the United States? Let me tell you a story about cellphones. In the mid-1990s, one of Adept’s largest markets was actually selling robots to cellphone manufacturers. That was when the market was a couple hundred million cellphones sold a year. Fast forward 15 years. There isn’t a single cellphone built in the United States anymore. And I can point to those old factories—all those people are gone. All the production is sitting in China right now, done largely by a mix of Chinese labor and automation.

EG: All the robots you sold to cellphone manufacturers in the 1990s are not being used?

JD: No. The problem was that robots weren’t flexible enough to be able to automate those factories. The product lifecycles were so short that with the level of technology in the mid-1990s, it was difficult to automate those lines with robotics at a level that would be cost justifiable. So that’s a great example of where, had robots been capable at the time, we would have a cellphone manufacturing industry today in the United States. But because robots weren’t capable enough, the market went to low-cost labor in China. And now China is starting to use robots to automate their factories. China owns the industry, and no one is going to be able to take it back.

So how can robots help moving forward? What robots enable, and this is what’s exciting about them, is that if you can build a domestic market, robots can help equalize manufacturing costs anywhere in the world. And then what really matters is infrastructure costs, the regulatory, tax, and business environment, and where the market is. That’s what determines where you’re going to build a factory, not just where the labor costs are cheaper.

So I think we’re at a very exciting point. This is a chance over the next decade for the United States to regain leadership in robotics. And with a little bit of focus from the government to help push some technologies forward and push the regulatory environment in the right direction, I think that the United States can win this race. Success 10 years from now will be that there’s a number of U.S. robotics companies and they’re leaders in creating next-generation robots that permeate our everyday lives.

This interview has been edited and condensed.

Images: Adept

Virginia Tech's RoMeLa Rocks RoboCup 2011

CHARLI-2 and DARwIn-OP robots relax in front of the Louis Vuitton Humanoid Cup

It's been a wild week at RoboCup 2011 in Istanbul, but we've got results for you: Virginia Tech's RoMeLa has emerged victorious in both the KidSize and AdultSize leagues, and they're bringing home the coveted (and stylish) Louis Vuitton Humanoid Cup for Best Humanoid for their AdultSize robot, CHARLI-2. This is big news, since Europe and Asia have historically dominated the RoboCup competitions, and in fact this'll be the very first time that the Cup (pictured above) has made it to the United States.

Dr. Dennis Hong (center) rewards a DARwIn-OP robot for a job well done

We'll have more details for you in the coming weeks from RoMeLa, their Team DARwIn partners at UPenn, and the other RoboCup teams, but for now, they all deserve a little break. Don't worry, though: we've got a bunch of video including RoMeLa's team of home grown DARwIn-OP humanoids in the finals, CHARLI-2's final match, and footage of the non-humanoid competitions as well (definitely don't miss the Middle Size final). Once again, congrats to Dr. Dennis Hong and the entire RoMeLa team (and their robots) for an impressive performance.


KidSize Final Match: Team DARwIn vs. CIT Brains (Japan)


AdultSize Final Match: Team CHARLI vs. Singapore Polytechnic University's ROBO-ERECTUS


Middle Size Final Match: Team Tech United (Netherlands) vs. Team Water (China)


Small Size Final Match: Team Skuba (Thailand) vs. Team Immortals (Iran)


Standard Platform Final Match: Team B-Human (Germany) vs. Team Nao Devils (Germany)


TeenSize Final Match: Team NIMBRO (Germany) vs. Team KMUTT Kickers (Thailand)

For more RoboCup 2011 video, check out both the BotSportTV and DutchRobotics YouTube channels.

[ RoboCup 2011 ]

[ RoMeLa ]

Robots Learn to Take Pictures Better Than You

Humans would like to believe that artistry and creativity can't be distilled down into a set of rules that a robot can follow, but in some cases, it's possible to at least tentatively define some things that work and some things that don't. Raghudeep Gadde, a master's student in computer science at IIIT Hyderabad, in India, has been teaching his Nao robot some general photography rules that he says enable the robot to "capture professional photographs that are in accord with human professional photography."

Essentially, they've simply taught their robot to do its best to obey both the rule of thirds and the golden ratio when taking pictures with its built-in camera. The rule of thirds states that it's best to compose a picture such that if you chop the scene up into nine equal rectangles (i.e. into thirds both vertically and horizontally), what you're focusing on should sit at one of the four intersections of the dividing lines. And the golden ratio basically says that the best place for a horizon is the line you get when you separate your scene into two rectangles, with one being 1.62 times the height of the other. This all sounds like math, right? And robots like math.
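The math in the paragraph above is easy to make concrete. Here's a minimal sketch (my own, not the researchers' code) that computes the rule-of-thirds intersections and the golden-ratio horizon lines for a frame, using the Nao's 640 x 480 camera resolution mentioned later in this post:

```python
# Sketch of the two composition rules described above.

PHI = 1.618  # the golden ratio, rounded

def thirds_points(width, height):
    """Return the four rule-of-thirds intersections as (x, y) pixels."""
    xs = (width / 3, 2 * width / 3)
    ys = (height / 3, 2 * height / 3)
    return [(round(x), round(y)) for x in xs for y in ys]

def golden_horizons(height):
    """Return the two candidate horizon rows: each splits the frame into
    two rectangles whose heights are in the ratio 1 : 1.618."""
    upper = height / (1 + PHI)        # smaller rectangle on top
    lower = height * PHI / (1 + PHI)  # smaller rectangle on bottom
    return round(upper), round(lower)

print(thirds_points(640, 480))  # [(213, 160), (213, 320), (427, 160), (427, 320)]
print(golden_horizons(480))     # (183, 297)
```

A composition scorer would then reward subjects near one of those four points and horizons near one of those two rows.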

These aren't hard-and-fast rules, of course, and brilliant and creative photographers will often completely ignore them. But we don't need robots to be brilliant and creative photographers (and if they were, it would be a serious blow to our egos). It would just be helpful if they were decent at it, which is what this algorithm is supposed to accomplish, although there's certainly no substitute for a human's ability to spot interesting things to take pictures of. That said, I imagine that it would be possible to program a robot to look for scenes with high amounts of color, contrast, or patterns, which (in a very general sense) is what we look for when taking pictures.

The other part to all this is that Nao has some idea of what constitutes a "good" picture from a more abstract, human perspective. Using an online photo contest where 60,000 pictures were ranked by humans, Nao can give the images it takes a quality ranking, and if that ranking falls below a certain threshold, the robot will reposition itself and try to take a better picture.
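That shoot-score-reposition loop can be sketched in a few lines. Everything here is hypothetical scaffolding, not code from the project: `score_image` stands in for the quality model trained on the 60,000 human-ranked contest photos, and `reposition_robot` for the Nao's motion commands.

```python
# Minimal sketch of the retake loop: shoot, score, and reposition
# until the photo clears a quality threshold or we give up.

THRESHOLD = 0.5    # assumed quality cutoff (normalized score)
MAX_ATTEMPTS = 5   # assumed cap on retries

def shoot_until_good(camera, score_image, reposition_robot):
    """Return the best (photo, score) pair found within MAX_ATTEMPTS."""
    best_shot, best_score = None, float("-inf")
    for _ in range(MAX_ATTEMPTS):
        photo = camera.capture()
        score = score_image(photo)
        if score > best_score:
            best_shot, best_score = photo, score
        if score >= THRESHOLD:
            break                # good enough; stop here
        reposition_robot()       # nudge framing and try again
    return best_shot, best_score
```

The key design point is keeping the best shot seen so far, so the robot still returns something usable if no attempt ever clears the threshold.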

If this method works out, there are all kinds of applications (beyond just robots) that it could be adapted to. For example, Google image searches could include a new filter that returns only images that don't completely suck. Or maybe the next generation of digital cameras will be able to explain exactly why that picture you just took was absolutely terrible, and then offer suggestions on how to do better next time.

UPDATE: Raghudeep sent us some example images to show how Nao autonomously reframes a photo after analyzing an initial shot. Note that the robot has a low-resolution camera that takes only 640 x 480 pictures.

Example 1, initial shot

Example 1, reframed shot

Example 2, initial shot

Example 2, reframed shot

Example 3, initial shot

Example 3, reframed shot

[ Raghudeep Gadde ] via [ New Scientist ]

Should We Automate Intimacy?

asimo happy birthday

What would happen to intimacy if an acquaintance you casually communicate with could manage to know you better than a close family member?

As my birthday rolled around this year, I was struck by the diverse levels of automation involved in the greetings received. There was the totally computerized message (with coupon!) from the auto dealer; a card with a handwritten signature from the insurance agent, or, more likely, his secretary; Facebook greetings from friends who were alerted by the social network about my birthday; an e-card with a personalized note from a brother, reminded by his calendar program; and printed cards personalized by relatives, most reminded by traditional, print calendars. From my husband, I received an iPad, with an endearingly personalized card, and from my daughter, a photograph with an original poem, based on shared memories. Neither of these last two need reminders because they have my birth date stored in their memories -- at least they'd better!

My response reflected the value ascribed to these intimations of intimacy: I trashed the auto dealer and insurance agent cards without reading them; set the personal notes on the sideboard for a week before discarding; used the iPad to respond in kind to the electronic greetings, with my husband's card stored with other memorabilia; and hung my daughter's poem-inscribed photo on my wall.

Now, what would happen if people had software at their disposal capable of knowing you well enough to be able to craft very personal messages or even a birthday poem as good as your daughter's? What if people could use a robot, more eloquent than they could ever be, to deliver birthday messages? Or maybe even marriage proposals?

Intimacy is part of the “We think, therefore I am” hard-wiring described by Philippe Rochat in "Others in Mind." Rochat elucidates how we become conscious of our own existence mostly through recognition by others and how we constantly struggle to reconcile our internal view of ourselves with what we perceive others reflecting. So to the extent that a poem about us conforms with our internal view, that view is validated; we feel the satisfaction of intimacy with an author who understands us. The question is, do we feel validated if that "someone" is not a person but the result of smart algorithms interacting with our email trail and social profiles? Already there've been experiments to create AI tools that'd automatically post Twitter and Facebook updates just like you would, or that would work as your "cognitive assistants," completely taking care of organizing and prioritizing your information and communication activities.

Perhaps the real question, then, is a more complex one: As we become more dependent on computers and automation (and more addicted to social networks), are we evolving into persons joined by brain-computer interfaces? I say that metaphorically, but only for now, as advances in neural prostheses might turn that scenario into reality. And when that happens, will we become more like conjoined twins, sharing an amygdala? Would the e-cards from the insurance agent become input data in our cortices that we need to send to a mental spam folder? Or in the case of desired inputs, could AI algorithms generate messages that validate us and make us feel warm-and-fuzzy? Could Asimo deliver you a better happy birthday message than your spouse's?

Society must weigh the benefits and dangers of automated intimacy. When I began writing this post, I felt wary of that concept. But as I think it through, I'm reconsidering. In AI and neurology, when a process is critical, common, and/or instantaneous, it's usually hardwired, automated. Does it make sense, then, as we become a communal being, that the process of personal validation, of recognition of each self that will make up Ourselves (I use the Royal group noun form purposely) should be automated and hardwired? I might be ready to answer "maybe," with this caveat from neuroscientist David Eagleman's "Incognito: The Secret Lives of the Brain" bestseller: "Evolve solutions, [but] when you find a good one, don’t stop."

I look forward to hearing others' ideas and feelings about automated intimacy.

Jeanne Dietsch, an IEEE member, is CEO and co-founder of MobileRobots in Amherst, N.H., and vice president of emerging technologies at Adept Technology. Her last post was about automated airport security.

Image: Honda

Robot Soccer Players Learning Fancy Human Skills

In the past, most humanoid robot soccer competitions have consisted of repeated kicking of the ball towards the goal and (for all practical purposes) not too much else. Ambitious algorithms and programming have fallen victim to sensors and hardware that can't always keep up, as well as opponents who tend to interfere in carefully planned strategies. However, we're starting to see some exceptionally clever robot maneuvers leading up to RoboCup 2011 in Istanbul, which had its first round of matches just yesterday.

These two videos come from the Darmstadt Dribblers, whom you may remember as the victors in the KidSize bracket at RoboCup 2010. They show the robots practicing both human-style throw-ins, and a skilled passing game that avoids obstacles, all completely autonomously:

Impressive. Most impressive. Personally, I think we humans are doomed, especially considering that it was two years ago now (i.e. forever ago in robot years) that a team of non-humanoid robots actually managed to score on a team of humans in a friendly game.

[ Darmstadt Dribblers ]

[ RoboCup 2011 ]

Robot Vacuum Sucks Up Radiation at Fukushima Plant

Special Report: Fukushima and the Future of Nuclear Power

Editor's Note: This is part of IEEE Spectrum's ongoing coverage of Japan's earthquake and nuclear emergency.

fukushima irobot warrior vacuum cleaner

After Japan's devastating earthquake and tsunami in March, U.S. firm iRobot sent four of its rugged, tank-like robots to help with recovery operations at the crippled Fukushima nuclear power plant.

It seems iRobot should have sent some Roomba vacuuming bots as well.

Last week, Tokyo Electric Power Co. (TEPCO), the plant's operator, said it was improvising a robotic vacuum cleaner to remove radioactive dirt from the reactors.

TEPCO built the system by taking an industrial-grade vacuum and attaching its business end to the manipulator arm of a Warrior, iRobot's strongest mobile robot. By remote controlling the bot from a safe distance, workers planned to clean up radioactive debris and sand, which collected on the floor when the tsunami flooded the plant.

Watch the Warrior entering Reactor No. 3 and doing some vacuuming:

In April, TEPCO sent two PackBot robots, also made by iRobot, into some of the reactors. The robots measured high levels of radiation and captured dramatic footage of the damaged facilities. The company has also relied on robotic drones and remote-controlled construction machines. But this is the first time TEPCO has used robots to assist with the removal of radioactive debris inside the reactors.

The goal of the cleanup, TEPCO said, was to "reduce the radiation exposure" of workers, who might have to go near or into the reactors to perform repairs and other work. Did it work? I haven't seen any details, but will report back if I find out whether the work helped to reduce radiation levels.

See details of the operation below.

[Image: diagram of the iRobot vacuum operation to clean the reactors]

Images and video: TEPCO

Updated July 8, 2011 10:07 a.m.

READ ALSO:

Drone Reveals Fukushima Destruction
Wed, April 20, 2011

Blog Post: Video taken by a Honeywell T-Hawk micro air vehicle shows damage with unprecedented detail

Videos of PackBot Robots Inside Reactors
Wed, April 20, 2011

Blog Post: Videos show two iRobot PackBots navigating inside the highly radioactive buildings

Robots Enter Fukushima Reactors
Mon, April 18, 2011

Blog Post: Two robots have entered Reactors No. 1 and No. 3 and performed radioactivity measurements

Can Robots Fix Fukushima Reactors?
Tue, March 22, 2011

Blog Post: It's too dangerous for humans to enter the Fukushima nuclear plant. Why not send in robots?

You've Never Seen a Robot Drive System Like This Before

What you're looking at here is a hemispherical omnidirectional gimbaled wheel, or HOG wheel. It's nothing more than a black rubber hemisphere that rotates like a spinning top, with servos that can tilt it left and right and forwards and backwards. Powered by this simple drive system, the robot that it's attached to can scoot around the floor in ways that I would have to characterize as "alarmingly fast."

Before I go on about the design, have a look at just what this thing is capable of. Its creator, Curtis Boirum, a grad student at Bradley University, in Peoria, Ill., demoed it at the 2011 RoboGames symposium:

Just to reiterate, a HOG wheel is simply a rubber hemisphere that spins on its axis very, very fast. When the hemisphere is vertical, it's just like a spinning top, but by tilting the hemisphere so that one of its sides makes contact with the ground, you can vector torque in any direction near-instantaneously, depending on which side of the hemisphere you use.

So for example, if the hemisphere is spinning clockwise, tilting it so that the right side contacts the ground will "pull" the robot forward, with the amount of torque directly proportional to the tilt of the hemisphere, like an infinite gear ratio without any gears. It's very simple, very efficient, and as you can see from the video, the drive system is capable of delivering more torque than any of the poor robots that it's attached to can reliably handle.
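The geometry above can be captured in a few lines. This is a toy sketch of the kinematics only, not Boirum's actual control code; the function name, sign conventions, and the assumption that drive speed scales with the sine of the tilt angle (since the contact point sits at radius × sin(tilt) from the spin axis) are mine.

```python
import math

def hog_ground_velocity(omega, radius, tilt, tilt_heading):
    """Estimate the drive velocity imparted by a tilted, spinning hemisphere.

    omega:        spin rate in rad/s (positive = clockwise seen from above)
    radius:       hemisphere radius in meters
    tilt:         lean angle from vertical in radians (0 = upright, no drive)
    tilt_heading: direction the hemisphere leans toward, in radians (0 = +x)

    Returns an (vx, vy) ground-velocity vector.
    """
    # The contact point sits radius * sin(tilt) from the spin axis, so its
    # surface speed -- and hence the robot's drive speed -- grows smoothly
    # from zero as the hemisphere tilts: the "infinite gear ratio" effect.
    speed = abs(omega) * radius * math.sin(tilt)
    # The contact patch sweeps perpendicular to the lean direction; the
    # sign of omega picks which of the two perpendiculars you get.
    direction = tilt_heading + math.copysign(math.pi / 2, omega)
    return (speed * math.cos(direction), speed * math.sin(direction))
```

With the hemisphere upright the output is zero no matter how fast it spins, which is exactly the clutch-like behavior that makes the tilt servos the only "transmission" the robot needs.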

This idea has actually been around for decades: a concept illustration of a car with a HOG drive graced the cover of the 1938 edition of Mechanics and Handicraft Magazine. Nothing much has really been done with it since, but Curtis (who actually re-invented the system from scratch) is hoping to create a cheap, powerful, and agile omnidirectional drive system that can be adapted for use by both researchers and hobbyists. We hope he'll build a car-sized version too.

Update: This is now being called a "Singularity Drive System," in reference to the zero gear ratio transition point, which is a mathematical singularity.

[ Bradley University ]

READ ALSO:

Robot Moves Like a Galloping Snail
Tue, July 05, 2011

Blog Post: This robot may not look much like a snail, but it does the "snail-wave" just like one

Omniwheels Gain Popularity in Robotics
Mon, October 04, 2010

Blog Post: Omniwheels are an ingenious invention that allows a platform to move in any direction while facing any direction

Ball Balancing Robot With Style
Tue, June 08, 2010

Blog Post: Swiss researchers create stylish robot that balances on a ball, with help from Disney

A Robot That Balances on a Ball
Thu, April 29, 2010

Blog Post: After building wheeled robots and legged robots, a researcher created a robot that rides on a ball

LG's New RoboKing Vacuum Can Now Explain Its Failures

LG's RoboKing series of robot vacuums may or may not be variations on the Roomba theme, to the extent that they're not allowed to be sold here in the United States, where Roomba is the undisputed king (queen?) and reigns with a tight fist and lots of patents. But we have to give credit to LG for thinking outside the box (er, disc) when it comes to introducing nifty features. For example, unlike the Roomba, Mint, or Neato XV-11, the RoboKing navigates (and maps its environment) using a pair of cameras that scan the ceiling and the floor, which is a pretty neat trick:

The latest version of the RoboKing, announced just yesterday, adds a self-diagnostic mode where the robot actually checks itself out and tells you what's up. Push the diagnostic button, and the robot will give itself a 30-second shakedown cruise and then report back (in a sultry female voice, no less) with the status of 14 different components. No word yet on exactly what it'll tell you, but I imagine something like, "that awful noise I'm making is because I just tried to eat one of your socks; please remove it before I explode."
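For flavor, here's a toy sketch of what a "run checks, then report" routine like this might look like. LG hasn't published which 14 components get tested, so the component list and everything else here is purely hypothetical.

```python
# Hypothetical component list -- LG hasn't said what the 14 checks cover.
COMPONENTS = [
    "left wheel motor", "right wheel motor", "brush motor", "suction fan",
    "ceiling camera", "floor camera", "bumper", "cliff sensors",
    "dustbin", "battery", "charger contacts", "gyro",
    "wheel encoders", "speaker",
]

def run_self_test(check):
    """Run each component check and collect spoken-style status lines.

    `check` is a callable mapping a component name to True (OK) or
    False (fault); in a real robot it would exercise the hardware.
    """
    report = []
    for name in COMPONENTS:
        status = "OK" if check(name) else "needs attention"
        report.append(f"{name}: {status}")
    return report
```

The real value of a mode like this is that the robot exercises each subsystem itself instead of waiting for the owner to notice a symptom.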

We don't have much else to go on at this point, but for those of you fortunate enough to live somewhere with less stringent patent enforcement, the LG RoboKing VR6172LM will be available soon for the equivalent of about $730.

Via [ Akihabara News ]


Automaton

IEEE Spectrum's award-winning robotics blog, featuring news, articles, and videos on robots, humanoids, automation, artificial intelligence, and more.
Contact us:  e.guizzo@ieee.org

Editor
Erico Guizzo
New York, N.Y.
Senior Writer
Evan Ackerman
Berkeley, Calif.
 
Contributor
Jason Falconer
Canada
Contributor
Angelica Lim
Tokyo, Japan
 

