Automaton

Virginia Tech's RoMeLa Rocks RoboCup 2011

CHARLI-2 and DARwIn-OP robots relax in front of the Louis Vuitton Humanoid Cup

It's been a wild week at RoboCup 2011 in Istanbul, but we've got results for you: Virginia Tech's RoMeLa has emerged victorious in both the KidSize and AdultSize leagues, and they're bringing home the coveted (and stylish) Louis Vuitton Humanoid Cup for Best Humanoid for their AdultSize robot, CHARLI-2. This is big news, since Europe and Asia have historically dominated the RoboCup competitions, and in fact this'll be the very first time that the Cup (pictured above) has made it to the United States.

Dr. Dennis Hong (center) rewards a DARwIn-OP robot for a job well done

We'll have more details for you in the coming weeks from RoMeLa, their Team DARwIn partners at UPenn, and the other RoboCup teams, but for now, they all deserve a little break. Don't worry, though: we've got a bunch of video, including RoMeLa's team of homegrown DARwIn-OP humanoids in the finals, CHARLI-2's final match, and footage of the non-humanoid competitions as well (definitely don't miss the Middle Size final). Once again, congrats to Dr. Dennis Hong and the entire RoMeLa team (and their robots) for an impressive performance.


KidSize Final Match: Team DARwIn vs. CIT Brains (Japan)


AdultSize Final Match: Team CHARLI vs. Singapore Polytechnic University's ROBO-ERECTUS


Middle Size Final Match: Team Tech United (Netherlands) vs. Team Water (China)


Small Size Final Match: Team Skuba (Thailand) vs. Team Immortals (Iran)


Standard Platform Final Match: Team B-Human (Germany) vs. Team Nao Devils (Germany)


TeenSize Final Match: Team NIMBRO (Germany) vs. Team KMUTT Kickers (Thailand)

For more RoboCup 2011 video, check out both the BotSportTV and DutchRobotics YouTube channels.

[ RoboCup 2011 ]

[ RoMeLa ]

Robots Learn to Take Pictures Better Than You

Humans would like to believe that artistry and creativity can't be distilled down into a set of rules that a robot can follow, but in some cases, it's possible to at least tentatively define some things that work and some things that don't. Raghudeep Gadde, a master's student in computer science at IIIT Hyderabad, in India, has been teaching his Nao robot some general photography rules that he says enable the robot to "capture professional photographs that are in accord with human professional photography."

Essentially, Gadde has simply taught the robot to do its best to obey both the rule of thirds and the golden ratio when taking pictures with its built-in camera. The rule of thirds says it's best to compose a picture so that if you chop the scene into nine equal rectangles (i.e., into thirds both vertically and horizontally), your subject sits at one of the intersections of those dividing lines. And the golden ratio basically says that the best place for a horizon is the line you get when you split the scene into two rectangles, one 1.62 times the height of the other. This all sounds like math, right? And robots like math.
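To make the geometry concrete, here's a little back-of-envelope Python (my own sketch, not anything from Gadde's system) showing where those composition targets fall on Nao's 640 x 480 frame:

```python
# Back-of-envelope sketch of the composition targets described above,
# for a 640 x 480 image like the ones Nao's camera produces.

WIDTH, HEIGHT = 640, 480   # Nao's camera resolution (see update below)
PHI = 1.618                # the golden ratio

def rule_of_thirds_points(w, h):
    """The four intersections of the imaginary 3 x 3 grid."""
    return [(x, y) for x in (w / 3, 2 * w / 3) for y in (h / 3, 2 * h / 3)]

def golden_horizon_lines(h):
    """The two y-coordinates that split the frame into rectangles whose
    heights are in the golden ratio (one ~1.62x the other)."""
    return [h / (1 + PHI), h * PHI / (1 + PHI)]

print(rule_of_thirds_points(WIDTH, HEIGHT))
# [(213.3, 160.0), (213.3, 320.0), (426.7, 160.0), (426.7, 320.0)]
print(golden_horizon_lines(HEIGHT))
# [~183.3, ~296.7] -- put the horizon near one of these rows
```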

These aren't hard-and-fast rules, of course, and brilliant and creative photographers will often completely ignore them. But we don't need robots to be brilliant and creative photographers (and if they were, it would be a serious blow to our egos). It would just be helpful if they were decent at it, which is what this algorithm is supposed to do, although there's certainly no substitute for a human's ability to spot interesting things to take pictures of. That said, I imagine that it would be possible to program a robot to look for scenes with high amounts of color, contrast, or patterns, which (in a very general sense) is what we look for when taking pictures.

The other part to all this is that Nao has some idea of what constitutes a "good" picture from a more abstract, human perspective. Using an online photo contest where 60,000 pictures were ranked by humans, Nao can give the images it takes a quality ranking, and if that ranking falls below a certain threshold, the robot will reposition itself and try to take a better picture.
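The shoot-evaluate-reposition loop itself is simple enough to sketch. What follows is a hypothetical outline, not Gadde's actual code: score_image stands in for the aesthetic model trained on those 60,000 human-ranked photos, take_photo and reposition are made-up robot-side helpers, and the 0.6 threshold is an assumed value.

```python
# Hypothetical sketch of the shoot-evaluate-reposition behavior described
# above; none of these helpers are real Nao SDK calls.

QUALITY_THRESHOLD = 0.6   # assumed cutoff; the real value isn't published
MAX_ATTEMPTS = 5          # assumed limit so the robot eventually gives up

def capture_acceptable_photo(take_photo, score_image, reposition):
    best_photo, best_score = None, float("-inf")
    for _ in range(MAX_ATTEMPTS):
        photo = take_photo()
        score = score_image(photo)          # learned aesthetic quality score
        if score > best_score:
            best_photo, best_score = photo, score
        if score >= QUALITY_THRESHOLD:
            break                           # good enough -- stop shooting
        reposition()                        # shift stance / pan head, retry
    return best_photo, best_score

# Toy demo with stand-in functions:
import random
photo, score = capture_acceptable_photo(
    take_photo=lambda: "frame",
    score_image=lambda frame: random.random(),
    reposition=lambda: None,
)
```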

If this method works out, there are all kinds of applications (beyond just robots) that it could be adapted to. For example, Google image searches could include a new filter that returns only images that don't completely suck. Or maybe the next generation of digital cameras will be able to explain exactly why that picture you just took was absolutely terrible, and then offer suggestions on how to do better next time.

UPDATE: Raghudeep sent us some example images to show how Nao autonomously reframes a photo after analyzing an initial shot. Note that the robot has a low-resolution camera that takes only 640 x 480 pictures.

Example 1, initial shot

Example 1, reframed shot

Example 2, initial shot

Example 2, reframed shot

Example 3, initial shot

Example 3, reframed shot

[ Raghudeep Gadde ] via [ New Scientist ]

Should We Automate Intimacy?

What would happen to intimacy if an acquaintance you casually communicate with could manage to know you better than a close family member?

As my birthday rolled around this year, I was struck by the diverse levels of automation involved in the greetings received. There was the totally computerized message (with coupon!) from the auto dealer; a card with a handwritten signature from the insurance agent, or, more likely, his secretary; Facebook greetings from friends who were alerted by the social network about my birthday; an e-card with a personalized note from a brother, reminded by his calendar program; and printed cards personalized by relatives, most reminded by traditional, print calendars. From my husband, I received an iPad, with an endearingly personalized card, and from my daughter, a photograph with an original poem, based on shared memories. Neither of these last two need reminders because they have my birth date stored in their memories -- at least they'd better!

My response reflected the value ascribed to these intimations of intimacy: I trashed the auto dealer and insurance agent cards without reading them; set the personal notes on the sideboard for a week before discarding; used the iPad to respond in kind to the electronic greetings, with my husband's card stored with other memorabilia; and hung my daughter's poem-inscribed photo on my wall.

Now, what would happen if people had software at their disposal capable of knowing you well enough to be able to craft very personal messages or even a birthday poem as good as your daughter's? What if people could use a robot, more eloquent than they could ever be, to deliver birthday messages? Or maybe even marriage proposals?

Intimacy is part of the “We think, therefore I am” hard-wiring described by Philippe Rochat in "Others in Mind." Rochat elucidates how we become conscious of our own existence mostly through recognition by others and how we constantly struggle to reconcile our internal view of ourselves with what we perceive others reflecting. So to the extent that a poem about us conforms with our internal view, that view is validated; we feel the satisfaction of intimacy with an author who understands us. The question is, do we feel validated if that "someone" is not a person but the result of smart algorithms interacting with our email trail and social profiles? Already there've been experiments to create AI tools that'd automatically post Twitter and Facebook updates just like you would, or that would work as your "cognitive assistants," completely taking care of organizing and prioritizing your information and communication activities.

Perhaps the real question, then, is a more complex one: As we become more dependent on computers and automation (and more addicted to social networks), are we evolving into persons joined by brain-computer interfaces? I say that metaphorically, but only for now, as advances in neural prostheses might turn that scenario into reality. And when that happens, will we become more like conjoined twins, sharing an amygdala? Would the e-cards from the insurance agent become input data in our cortices that we need to send to a mental spam folder? Or in the case of desired inputs, could AI algorithms generate messages that validate us and make us feel warm-and-fuzzy? Could Asimo deliver you a better happy birthday message than your spouse's?

Society must weigh the benefits and dangers of automated intimacy. When I began writing this post, I felt wary of that concept. But as I think it through, I'm reconsidering. In AI and neurology, when a process is critical, common, and/or instantaneous, it's usually hardwired, automated. Does it make sense, then, as we become a communal being, that the process of personal validation, of recognition of each self that will make up Ourselves (I use the Royal group noun form purposely) should be automated and hardwired? I might be ready to answer "maybe," with this caveat from neuroscientist David Eagleman's "Incognito: The Secret Lives of the Brain" bestseller: "Evolve solutions, [but] when you find a good one, don’t stop."

I look forward to hearing others' ideas and feelings about automated intimacy.

Jeanne Dietsch, an IEEE member, is CEO and co-founder of MobileRobots in Amherst, N.H., and vice president of emerging technologies at Adept Technology. Her last post was about automated airport security.

Image: Honda

Robot Soccer Players Learning Fancy Human Skills

In the past, most humanoid robot soccer competitions have consisted of repeated kicking of the ball towards the goal and (for all practical purposes) not too much else. Ambitious algorithms and programming have fallen victim to sensors and hardware that can't always keep up, as well as opponents who tend to interfere in carefully planned strategies. However, we're starting to see some exceptionally clever robot maneuvers leading up to RoboCup 2011 in Istanbul, which had its first round of matches just yesterday.

These two videos come from the Darmstadt Dribblers, whom you may remember as the victors in the KidSize bracket at RoboCup 2010. They show the robots practicing both human-style throw-ins, and a skilled passing game that avoids obstacles, all completely autonomously:

Impressive. Most impressive. Personally, I think we humans are doomed, especially considering that it was two years ago now (i.e. forever ago in robot years) that a team of non-humanoid robots actually managed to score on a team of humans in a friendly game.

[ Darmstadt Dribblers ]

[ RoboCup 2011 ]

Robot Vacuum Sucks Up Radiation at Fukushima Plant

Special Report: Fukushima and the Future of Nuclear Power

Editor's Note: This is part of IEEE Spectrum's ongoing coverage of Japan's earthquake and nuclear emergency.


After Japan's devastating earthquake and tsunami in March, U.S. firm iRobot sent four of its rugged, tank-like robots to help with recovery operations at the crippled Fukushima nuclear power plant.

It seems iRobot should have sent some Roomba vacuuming bots as well.

Last week, Tokyo Electric Power Co. (TEPCO), the plant's operator, said it was improvising a robotic vacuum cleaner to remove radioactive dirt from the reactors.

TEPCO built the system by taking an industrial-grade vacuum and attaching its business end to the manipulator arm of a Warrior, iRobot's strongest mobile robot. By remotely controlling the bot from a safe distance, workers planned to clean up radioactive debris and sand that had collected on the floor when the tsunami flooded the plant.

Watch the Warrior entering Reactor No. 3 and doing some vacuuming:

In April, TEPCO sent two PackBot robots, also made by iRobot, into some of the reactors. The robots measured high levels of radiation and captured dramatic footage of the damaged facilities. The company has also relied on robotic drones and remote-controlled construction machines. But this is the first time TEPCO has used robots to assist with the removal of radioactive debris inside the reactors.

The goal of the cleanup, TEPCO said, was to "reduce the radiation exposure" of workers, who might have to go near or into the reactors to perform repairs and other work. Did it work? I haven't seen any details, but will report back if I find out whether the work helped to reduce radiation levels.

See below for details of the operation [click image to enlarge].

Diagram detailing the Warrior robot vacuum operation at the reactors

Images and video: TEPCO

Updated July 8, 2011 10:07 a.m.

READ ALSO:

Drone Reveals Fukushima Destruction
Wed, April 20, 2011

Blog Post: Video taken by a Honeywell T-Hawk micro air vehicle show damage with unprecedented detail

Videos of PackBot Robots Inside Reactors
Wed, April 20, 2011

Blog Post: Videos show two iRobot PackBots navigating inside the highly radioactive buildings

Robots Enter Fukushima Reactors
Mon, April 18, 2011

Blog Post: Two robots have entered Reactors No. 1 and No. 3 and performed radioactivity measurements

Can Robots Fix Fukushima Reactors?
Tue, March 22, 2011

Blog Post: It's too dangerous for humans to enter the Fukushima nuclear plant. Why not send in robots?

You've Never Seen a Robot Drive System Like This Before

What you're looking at here is a hemispherical omnidirectional gimbaled wheel, or HOG wheel. It's nothing more than a black rubber hemisphere that rotates like a spinning top, with servos that can tilt it left and right and forwards and backwards. Powered by this simple drive system, the robot that it's attached to can scoot around the floor in ways that I would have to characterize as "alarmingly fast."

Before I go on about the design, have a look at just what this thing is capable of. Its creator, Curtis Boirum, a grad student at Bradley University, in Peoria, Ill., demoed it at the 2011 RoboGames symposium:

Just to reiterate, a HOG wheel is simply a rubber hemisphere that spins on its axis very, very fast. When the hemisphere is vertical, it's just like a spinning top, but by tilting the hemisphere so that one of its sides makes contact with the ground, you can vector torque in any direction near-instantaneously, depending on which side of the hemisphere you use.

So for example, if the hemisphere is spinning clockwise, tilting it so that the right side contacts the ground will "pull" the robot forward, with the amount of torque directly proportional to the tilt of the hemisphere, like an infinite gear ratio without any gears. It's very simple, very efficient, and as you can see from the video, the drive system is capable of delivering more torque than any of the poor robots that it's attached to can reliably handle.
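If you want a feel for the numbers, here's a quick back-of-envelope model (my own assumptions about spin rate and wheel size, not Boirum's specs). The speed seen at the contact patch scales with the sine of the tilt angle, which is why a tiny tilt acts like an enormous gear reduction and a perfectly vertical wheel delivers no motion at all:

```python
# Toy model of the HOG wheel's "gearless transmission": the contact point
# sits R * sin(tilt) from the spin axis, so ground speed is omega * R * sin(tilt).
# The 3000 rpm and 5 cm radius below are assumed, illustrative numbers.

import math

def ground_speed(spin_rpm, radius_m, tilt_deg):
    """Surface speed (m/s) at the contact point for a hemisphere of the
    given radius spinning at spin_rpm, tilted tilt_deg from vertical."""
    omega = spin_rpm * 2 * math.pi / 60                    # rad/s
    return omega * radius_m * math.sin(math.radians(tilt_deg))

for tilt in (0, 2, 10, 30):
    print(f"{tilt:2d} deg tilt -> {ground_speed(3000, 0.05, tilt):.2f} m/s")
# 0 deg -> 0.00 m/s (the singularity), 2 -> 0.55, 10 -> 2.73, 30 -> 7.85
```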

This idea has actually been around for decades: a concept illustration of a car with a HOG drive graced the cover of the 1938 edition of Mechanics and Handicraft Magazine. Nothing much has really been done with it since, but Curtis (who actually re-invented the system from scratch) is hoping to create a cheap, powerful, and agile omnidirectional drive system that can be adapted for use by both researchers and hobbyists. We hope he'll build a car-sized version too.

Update: This is now being called a "Singularity Drive System," in reference to the zero-gear-ratio transition point, which is a mathematical singularity.

[ Bradley University ]

READ ALSO:

Robot Moves Like a Galloping Snail
Tue, July 05, 2011

Blog Post: This robot may not look much like a snail, but it does the "snail-wave" just like one

Omniwheels Gain Popularity in Robotics
Mon, October 04, 2010

Blog Post: Omniwheels are an ingenious invention that allows a platform to move in any direction while facing any direction

Ball Balancing Robot With Style
Tue, June 08, 2010

Blog Post: Swiss researchers create stylish robot that balances on a ball, with help from Disney

A Robot That Balances on a Ball
Thu, April 29, 2010

Blog Post: After building wheeled robots and legged robots, a researcher created a robot that rides on a ball

LG's New RoboKing Vacuum Can Now Explain Its Failures

LG's RoboKing series of robot vacuums may or may not be variations on the Roomba theme to the extent that they're not allowed to be sold here in the United States, where Roomba is the undisputed king (queen?) and reigns with a tight fist and lots of patents. But we have to give credit to LG for thinking outside the box (er, disc) when it comes to introducing nifty features. For example, unlike the Roomba, Mint, or Neato XV-11, the RoboKing navigates (and maps its environment) using a pair of cameras that scan the ceiling and the floor, which is a pretty neat trick:

The latest version of the RoboKing, announced just yesterday, adds a self-diagnostic mode where the robot actually checks itself out and tells you what's up. Push the diagnostic button, and the robot will give itself a 30-second shakedown cruise and then report back (in a sultry female voice, no less) with the status of 14 different components. No word on just exactly what it'll tell you, but I imagine something like, "That awful noise I'm making is because I just tried to eat one of your socks; please remove it before I explode."

We don't have much else to go on at this point, but for those of you fortunate enough to live somewhere with less stringent patent enforcement, the LG RoboKing VR6172LM will be available soon for the equivalent of about $730.

Via [ Akihabara News ]

Omnidirectional Robot Moves Like a Galloping Snail

A snail might not necessarily be your first choice when it comes to mobile robot design, but our gastropodal friends have a few tricks up their non-sleeves when it comes to moving themselves around. Most snails rely on two techniques to move: undulating, which uses fluidic mucus pressure, and galloping, which is apparently when "like an inchworm, the animal sticks the front of its foot to a surface (thanks to suction and friction from the mucus), and then draws the rest of its body up behind it.”

This galloping technique has been adapted (and expanded) for robots by the Biomechatronics Lab at Chuo University in Japan. Their "Snail-Wave Omnidirectional Mobile Robot," Toro II, may not look a whole lot like a snail (and it's completely mucus free), but check out its moves:

The advantage of this robot with its sexy wave action is stable omnidirectionality: no matter what direction it's moving in (and even if it's not moving and/or completely powered off), the robot boasts a large and grippy area that's always in contact with the ground. You can compare this to other omnidirectional robots, such as the slug bots, squishy creatures that transform from soft to hard, or the Mecanum wheel system, which offers varying resistance depending on what direction it's moving and requires active braking if it needs to stay in one place. The snail robot, by contrast, is much more resilient to things like unintended shoves, and its designers suggest that this inherent stability and freedom of movement might make robots like these ideal for hospitals and factories.

[ Nakamura Lab ]

Next Big Thing in Silicon Valley: Robotics?

Silicon Valley is known for its software, semiconductor, and Internet companies. Can it become a high-tech nexus for robotics too?

"Yes," says Rich Mahoney, director of robotics at SRI International, in Menlo Park, Calif., in the heart of Silicon Valley. And to make sure that people know it, he and his colleagues at SRI, along with local robotics companies such as Adept Technology and Willow Garage, have formed a group called Silicon Valley Robotics (SVR). The goal of SVR is to "nurture the robotics industry in this area and help create an environment where other companies would want to come here and start up," he says.

There are other robotics centers in the United States, most notably the Boston area surrounding the Massachusetts Institute of Technology and also Pittsburgh, where Carnegie Mellon University resides. The greater Silicon Valley area has Stanford University and U.C. Berkeley, and quite a heritage of robotics accomplishments too, but compared to these other regions, the area has been "overlooked in some ways as being a center for robotics," says Mahoney [photo, right]. The reason may be that "there was so much other activity going on here and that robotics was lost relative to all the other things."

Mahoney had already been in robotics for over 20 years before he came to work in Silicon Valley in September 2008. Once he arrived, he was surprised to find there was a real cluster of robotics companies and research groups in this area, and yet unlike Boston and Pittsburgh, there was no organization representing that industry. So he started talking about the idea of forming a group where people in the robotics industry can get together to network and discuss important issues. With Philip von Guggenberg and Regis Vincent at SRI, Mahoney started having weekly meetings to talk about ways to make it happen and put together a mailing list. The group grew organically with volunteers organizing meetings, but it was not until this year's National Robotics Week, when Silicon Valley Robotics endorsed and managed the Robot Block Party at Stanford, that they decided to get exposure.

The group consists of about 40 organizations and is still in an informal, grassroots stage. They get together at members' facilities for networking events. Right now the plan is to form a "leadership council" by the end of this year that will define the structure of the organization so that it can move on to the next stage. SRI, Adept, Willow Garage, and German electronics company Robert Bosch, which conducts robotics research at its facility in Palo Alto, are interested in participating in this council, according to Mahoney. Currently there is no membership fee, and "any organization in the greater Silicon Valley region interested in the robotics industry can be a part of it," he notes.

PR2 demo during an SVR meeting at Bosch's Palo Alto research center.

As robots jump from factory floors into homes and communities, the robotics industry promises to grow dramatically, and Silicon Valley will be competing with other areas for talent and investment. Recently, French robotics company Aldebaran Robotics decided to set up its U.S. operation in Boston. Mahoney says that Aldebaran had been looking at San Francisco as a potential location. "I am absolutely convinced that if there was a Silicon Valley Robotics fully organized that I could have referred them to, to promote and attract them, that they would be in San Francisco," Mahoney points out.

On the other hand, there is also a need to cooperate with the other robotics regions to get their message heard in Washington with regard to regulations, immigration, and liability issues, which need to be made clear for the market to grow. And from that standpoint, a group like SVR will play an important role as the region's "single voice" so that it can "cooperate to elevate the resources and attention of the whole country."

SVR is also planning on organizing an "investor forum" to get local venture capitalists interested in the robotics field. Much of the funding for robotics research in the U.S. has so far come from the military budget, and for the robotics industry to bloom there is a need for investment from the private sector, just as the Internet started with military funding and then blossomed into an industry. When Mahoney gave a talk on the state of robotics at a local industry event, he got "blank stares." "There's a whole industry here that's starting to emerge, and if you are in the investment community, you have to pay attention," he emphasizes.

"As an outsider coming in, I find Silicon Valley a remarkable place with an aura, a concentration of technical know-how combined with an innovative spirit. I have no doubt that once the dots get connected, that things will happen quickly."

This article appeared originally at GetRobo.

Norri Kageki is a journalist who writes about robots. She is originally from Tokyo and currently lives in the San Francisco Bay Area. She is the publisher of GetRobo and also writes for various publications in the U.S. and Japan.

Controlling a Quadrotor Using Kinect

My colleagues working on the Flying Machine Arena (or FMA) at the ETH Zurich have just posted a video of their latest feat: A natural human-machine interface for controlling their quadrocopters.

The magic wand used for controlling quadrocopters at ETH Zurich's Flying Machine Arena

Until now, visitors to the FMA could use a magic wand like the one pictured at right to send quadrotors racing through the 10 x 10 x 10 meter space. As shown in the video, the addition of a Kinect now allows a far more natural and intuitive interaction.
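For a sense of how simple such an interface can be, here's a hypothetical sketch of the kind of mapping involved; the FMA team hasn't published its controller, so the gain, the clamping, and the stand-in values below are all my own assumptions. The idea is just to scale a tracked hand offset from the Kinect into a position setpoint somewhere inside the 10 x 10 x 10 meter arena:

```python
# Hypothetical hand-to-setpoint mapping; not the FMA's actual controller.

ARENA = 10.0          # m, each side of the flying space
HAND_REACH = 1.0      # m, assumed comfortable reach of the operator's hand

def hand_to_setpoint(hand_xyz, gain=ARENA / (2 * HAND_REACH)):
    """Map a hand offset (meters, relative to the shoulder) to a position
    setpoint centered in the arena and clamped to stay inside it."""
    return tuple(
        max(0.0, min(ARENA, ARENA / 2 + gain * c)) for c in hand_xyz
    )

# Hand 0.3 m to the right and 0.2 m up -> setpoint (6.5, 5.0, 6.0)
print(hand_to_setpoint((0.3, 0.0, 0.2)))
```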

What's next? I vote for using the new interface to have Asimo directing the FMA's dancing quadrocopters to the Quadrocopter Opera!

[ ETH - IDSC ]

