Automaton

Robot Bus Moves People, No Driver Needed


French company Robosoft has unveiled what it calls a "cybernetic transport system." The robuRIDE carries 30 passengers and reaches 24 kilometers per hour, driving autonomously using differential GPS and onboard sensors.

The vehicle weighs 3 metric tons, or 5 metric tons fully loaded. A 380-volt pack of lead-acid batteries gives it 8 hours of autonomy. In automatic mode, it can follow a pre-recorded path. To drive it manually, you use a game controller.
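That automatic mode amounts to waypoint following along the recorded path. Here's a minimal sketch of the idea -- all names, numbers, and logic are illustrative, not Robosoft's actual system:

```python
import math

def next_waypoint(pos, path, lookahead=5.0):
    """Return the first recorded waypoint at least `lookahead` meters ahead."""
    for wp in path:
        if math.dist(pos, wp) >= lookahead:
            return wp
    return path[-1]  # end of the recorded path

def steer_toward(pos, heading, target):
    """Heading error (radians) toward the target waypoint, wrapped to [-pi, pi]."""
    desired = math.atan2(target[1] - pos[1], target[0] - pos[0])
    err = desired - heading
    return math.atan2(math.sin(err), math.cos(err))

# Follow a straight pre-recorded path heading east, starting 3 m off the line
path = [(float(x), 0.0) for x in range(0, 100, 2)]
err = steer_toward((0.0, 3.0), 0.0, next_waypoint((0.0, 3.0), path))
```

In a real vehicle, `pos` and `heading` would come from fusing the GPS, inertial, and odometry sensors; the lookahead distance trades ride smoothness against how tightly the vehicle hugs the recorded line.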

A safety system relies on a laser scanner to avoid collisions. In the video below, the researcher demonstrates it by putting his body on the line. The vehicle detects him and slowly decelerates, stopping about 2 meters from him. Even if this system fails, a soft foam bumper stops the vehicle if it hits something, or someone.
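The stop-short behavior in the demo is consistent with a simple braking rule: cap the speed so the vehicle can always come to rest a fixed standoff before the nearest laser return. A toy sketch -- the deceleration and standoff figures here are assumptions, not Robosoft's specs:

```python
def safe_speed(range_m, v_max=6.7, decel=1.2, standoff=2.0):
    """Max speed (m/s) that lets the vehicle brake to rest `standoff` meters
    short of the nearest obstacle, assuming constant deceleration.
    v_max of 6.7 m/s is roughly the robuRIDE's 24 km/h top speed."""
    braking_room = max(0.0, range_m - standoff)
    # v^2 = 2*a*d: the largest speed we can fully shed within braking_room
    return min(v_max, (2 * decel * braking_room) ** 0.5)
```

With an obstacle 2 meters away or closer, the commanded speed drops to zero, matching the stop the researcher demonstrates on video.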

Now my favorite part: According to the video, you can regulate the vehicle's cabin temperature by "opening windows and [using] fans." Elegant engineering.

This is the system's second generation, and the video shows a test conducted in France last month. The robuRIDE is part of Rome's Cybernetic Transport System, a program to implement a high-tech transportation infrastructure at the city's new convention center. Is this the future of mass transit?

Vincent Dupourque, CEO of Robosoft, says:

We have completed the 2nd generation of our robuRIDE, used to implement Cybernetic Transport Systems. It can reach 24 km/h with 30 passengers, and has a dynamic accuracy of a few centimeters, thanks to the hybrid navigation system based on GPS, inertial and odometry.

This video has been done during Factory Acceptance Tests of the robuRIDE for ROME Cybernetic Transport System, in May 2010, in Dax, Biarritz and Bidart. We explain here the vehicle and how it works, show how it can be transported from one site to another, and also some users give their first impressions.

Photos and video: Robosoft

Monkey Controls Advanced Robot Using Its Mind


In a remarkable demonstration of brain-machine interface technology, researchers at the University of Pittsburgh have taught a monkey to use just its thoughts to control an advanced robotic arm and perform elaborate maneuvers with it.

It's not the first time a monkey with sensors implanted in its brain has controlled machines with its mind. But this seven-degree-of-freedom robot arm is probably the most complex system a monkey has ever mastered with its thoughts alone.

Researchers have long been working to put the brain in direct communication with machines. The hope is one day brain-machine interfaces will allow paralyzed people to operate advanced prosthetics in a natural way. Recent demonstrations have seen animals and humans controlling ever more complex devices.

But the experiments at the University of Pittsburgh, led by Dr. Andrew Schwartz, a professor of neurobiology, appear to involve an unprecedented degree of complexity in terms of the robotic arm, the level of control, and the difficulty of the manipulations demonstrated. Watch:

The video above shows the experiment. Note the monkey on the right side of the screen. It uses its right arm to tap a button. (Its left arm is gently restrained inside a tube.) This triggers the robotic manipulator labeled DENSO [left side of the screen] to position a black knob at an arbitrary point in space. The monkey then controls its articulated robotic arm to grasp the knob.

After gently touching the knob, the monkey places its mouth on a straw: it then gets a drink reward. (Actually, the animal places its mouth on the straw before even touching the knob; that's because it has learned, from repetition, that it's about to get the reward.) After that, both robotic arms reset. Again, the monkey taps the button, waits for the knob's new position, and readily and precisely moves its robotic arm to get a drink.

In this experiment, the monkey received two brain implants: one in the hand area and another in the arm area of its motor cortex. The implants monitor the firing of motor neurons and send this data to a computer, which translates the patterns into commands for the robotic arm.
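The article doesn't say which decoding algorithm the Pittsburgh group uses, but a classic approach in this field is population-vector decoding over cosine-tuned neurons. This toy version is purely illustrative:

```python
import math
import random

random.seed(0)

def unit(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

# Hypothetical model: each recorded neuron has a random "preferred direction"
# in 3-D, and fires faster when the intended movement aligns with it.
preferred = [unit((random.gauss(0, 1), random.gauss(0, 1), random.gauss(0, 1)))
             for _ in range(96)]

def firing_rates(intended, baseline=10.0, gain=8.0):
    """Cosine-tuned rates (spikes/s) for an intended movement direction."""
    return [baseline + gain * sum(p * v for p, v in zip(pd, intended))
            for pd in preferred]

def decode(rates, baseline=10.0):
    """Weight each preferred direction by the rate above baseline,
    sum the weighted directions, and normalize."""
    v = [sum((r - baseline) * pd[i] for r, pd in zip(rates, preferred))
         for i in range(3)]
    return unit(v)

decoded = decode(firing_rates((1.0, 0.0, 0.0)))
```

With enough neurons, the decoded direction closely tracks the intended one; real decoders must also cope with noisy spike counts, drifting electrodes, and neurons whose tuning changes as the animal learns.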

In a previous study two years ago, Dr. Schwartz and his team taught a macaque to control a simpler mechanical arm to feed itself. That was a four-degree-of-freedom arm with shoulder joints, an elbow, and a simple gripper.

Now the researchers have added three more degrees of freedom by adding an articulated wrist, which can perform roll, pitch and yaw movements. And the arm itself was replaced by a larger and nimbler manipulator. Although the video doesn't show it, the monkey can not only touch the knob but also precisely turn it by rotating the mechanical wrist.

Dr. Schwartz was kind enough to share this much with us, but he says more details will have to wait until he and his colleagues publish their results. We'll be waiting.

Photo and video: Dr. Andrew Schwartz/University of Pittsburgh

Read also:

Build Your Cyborg Body and Live Forever, Maybe
Fri, June 06, 2008

Blog Post: Check out our Bionic Body Shop to see some of the most advanced prosthetic devices

Monkey's Brain Can 'Plug and Play' to Control Computer With Thought
July 2009

Article: Researchers show brain can learn to operate prosthetic device effortlessly

The Rocket-Powered Prosthetic Arm
Mon, December 17, 2007

Video: Vanderbilt engineers trade in batteries for rocket fuel in the robotic arm they are developing for amputees.

The Truth About Bender's Brain
May 2009

Article: David X. Cohen, co-creator of "Futurama," reveals how MOS Technology's 6502 processor ended up in the robot's head

Vgo Telepresence Robot

In what may be (but probably isn’t) just a coincidence, a third telepresence robot has made a (pre) commercial appearance in as many weeks. This robot is called Vgo, and… Well, it does telepresence. Stop me if you’ve heard this before, but you get on your computer on one end, connect to the robot, and then drive it around while looking through its cameras. Sensors keep you from running into stuff or falling down stairs, and it’ll run all day on one battery charge. The biggest news, at this point, is that the Vgo is only supposed to cost $5000. Plus a mandatory support contract of $1200 a year. So, $6200 for the first year.

The Boston Globe has a nice piece on Vgo… There aren’t many more technical details, but I did find this interesting:

Two analysts I spoke with differed on the potential for robotic videoconferencing. Rob Enderle, a technology analyst at the Enderle Group who has written about the slow spread of traditional videoconferencing systems, said that “the closer we get to simulating being there, the better an alternative to travel it will become.’’

But Dan Kara, president of the publishing company Robotics Trends in Framingham, said, “I’m not quite sold on mobile telepresence. How is it that much better than having someone at the remote site carry around a netbook computer with a free copy of Skype on it?’’

The whole minion+laptop+Skype thing is exactly the point we made back when Anybots’ QA was introduced at CES for $30k. Obviously, a telepresence robot is much better than minion+laptop+Skype, but the question is, is it really that much better in terms of cost effectiveness? At the $6k price point, perhaps. Or maybe that’s not the question… Maybe the question should be, how much hardware is required to simulate being somewhere else to the extent that is necessary to make paying for a robotic telepresence solution a practical idea? I don’t have the answer, but hopefully the consumer market will, now that there are (or soon will be) three different telepresence robots available for people to purchase.

[ Vgo ] via [ Boston Globe ] via [ Texasalpha ]

iRobot Demonstrates New Weaponized Robot

UPDATE: Some readers argued that the APOBS, or Anti-Personnel Obstacle Breaching System, developed in a joint program of the U.S. Army and Navy, is not, technically, a weapon, because it's not an anti-personnel system but rather a system used against obstacles. Perry Villanueva, the project engineer for the APOBS program on the Army side, says the APOBS "is not a weapon in the traditional sense, but it is a weapon." Other readers wondered how the rocket compensates for things like wind. Villanueva says that is more of an operational issue. "With high winds it is up to the soldier to position it so it will have a high probability of landing on its target."

iRobot today released new video of its Warrior robot, a beefed-up version of its better-known PackBot, demonstrating the APOBS, or Anti-Personnel Obstacle Breaching System: an explosive line charge towed by a rocket, with a small parachute holding back the end of the line. The APOBS, iRobot says, is designed for "deliberate breaching of anti-personnel minefields and multi-strand wire obstacles." It can clear a path 45 meters long and 0.6 meters wide.

iRobot worked with the Naval Air Warfare Center Weapons Division (NAWCWD), the U.S. Army Tank Automotive Research, Development, and Engineering Center (TARDEC), and the U.S. Marine Corps Forces Pacific Experimentation Center (MEC) for this demonstration. It took place in November 2009 at the Naval Air Weapons Station China Lake in California's Mojave Desert.

Although it may concern those who don't like the arming of robots, it makes great eye candy for those who like robots, rockets, and explosions.

Now, let me say this: I am neither condoning nor condemning the weaponization of robots, just stating the facts that I am aware of.

In early 2009 a handful of defense-related companies came to Thailand to demonstrate their latest war toys to the local generals. One of those companies was iRobot, and since I have many friends who work for iRobot and was living in Bangkok at the time, I got to meet up with them to see one of the toys they brought: a Warrior.

At the time, the Warrior hardware was complete, designed to carry 150 pounds, but I've seen it lift people standing on it. Unfortunately, and understandably, many of my questions about it were answered with "we aren't sure we are allowed to answer that." I couldn't get an answer as to how much it would cost, but I was given the impression that it's more than $100,000 per unit.

Back in the day, the founders of iRobot had been against the weaponization of robots. Perhaps business and financial pressures are pushing the boundaries. Indeed, the military market is becoming ever more important, according to the company's first quarter results. Finances were very tight in 2009, so iRobot probably sees military systems as a market they'll have to explore and expand.

Updated by Erico Guizzo, June 1, 2010 : Added details on APOBS; edited comments on iRobot financials and weaponized robots. June 2: Added details on demonstration participants, date, and place. June 3: Added more details on APOBS.


iRobot's Shape-Shifting Robot
Tue, October 13, 2009

Blog Post: iRobot shows off a soft, morphing robot prototype that will be able to squeeze through wall cracks and under doors.

Robots With Knives
Thu, May 06, 2010

Blog Post: What would happen if a knife-wielding robot struck a person?

How I Became a Robot and Went Partying
Thu, May 27, 2010

Blog Post: I couldn't make it to Willow Garage's PR2 launch party last night, so I went as a robot

Boston Dynamics' Robot Mule
Mon, February 01, 2010

Blog Post: This robot mule will be able to navigate rough terrain, carrying 180 kilograms of soldier gear -- no driver required

Aggressive Quadrotor Maneuvers Are Totally Nuts


Every once in a while, we get to see a video of a robot doing something that makes us think "OMG WTF THAT’S WICKED CRAZY IMPOSSIBLE!!!" And then, we remember that crazy stuff is entirely possible, because we’re talking about robots, and we have to stop thinking about what is and is not possible in terms of human capabilities.

This is one of those videos:

I don’t have much more info for you than what’s in the video, unfortunately, but it does look like these maneuvers (while obviously autonomous) are currently restricted to an area with a whoooole bunch of sensors that can tell the robot where it is with an accuracy (and frequency) that’s probably pretty impressive.

If you remember, we’ve seen both autonomous acrobatics and autonomous landing on slopes by UAVs, but nothing like this… The precision of these maneuvers is just totally completely nuts.

[ GRASP Lab ] via [ DIY Drones ]

Willow Garage's PR2 Robots Graduate

It's graduation season, and yesterday Willow Garage, a start-up dedicated to accelerating the development of personal robots, sent its first graduation class of PR2s off into the world. These 11 robots are heading out to universities and labs in Germany, Japan, Belgium, and the United States, where they will help researchers figure out how robots can assist the elderly and the autistic, navigate buildings and open doors, and help people do house chores, to name just a few of the many projects in the works. At the graduation party in Menlo Park, Calif., some of the researchers told IEEE Spectrum about their plans for these robots. And then it was time to celebrate.

And here's Willow Garage showing off the PR2 at a pre-party press conference. The video was recorded by Spectrum's Erico Guizzo, who was embodied as a Texai, a telepresence robot also created by Willow:

How I Became a Texai Robot and Went Partying

I couldn't make it to Willow Garage's PR2 robot launch party last night, so I went as a robot.

While some 400 people dragged their physical bodies to the event in Menlo Park, Calif., I sat in my living room in Brooklyn, N.Y., and uploaded myself into a robot surrogate.

Using this telepresence robot, called Texai, I was able to move around, see and talk to Willowites and guests, and sip WD-40 cocktails. Just kiddin'. No drinks for robots.

Two Willow engineers built the first Texai prototype just for fun, using spare parts they found in the office. The robot proved so useful it became an official project at Willow, which has built 25 of them.

This was the second time this month I've embodied a robot, which convinces me humanlike androids and robotic surrogacy will become a reality sooner than we think.

But the star of the night, and the reason for the party, was another robot, Willow's PR2. Or more precisely, the 11 PR2s that Willow is giving away to institutions all over the world to speed up research in personal robotics.

The PR2 is a mobile robot with advanced vision and manipulation capabilities. Each costs several hundred thousand dollars. But what makes the robot stand out is its software: the Robot Operating System, or ROS, a powerful, open source robotics platform that Willow is building.

Eric Berger (left) and Keenan Wyrobek of Willow Garage show off the PR2 robot. I attended the event via a telepresence robot (that's my face on the bottom left corner).

My colleague Tekla Perry, who was also at the event (in her physical body), interviewed several PR2 recipients and will be posting videos.

For me the most interesting part was being a Texai for a night.

The robot's head is a standard flat-screen monitor, fixed atop a long metal pole. The Texai uses a wheel system similar to the PR2's. And it also runs ROS, which handles the motor controllers and teleoperation functions.

People I talked to via the robot really wanted to know how the driving works. The Texai uses Skype to establish a two-way video link, and a Web page shows a simple, intuitive control interface [below].


You just use the mouse to hold and drag a little red ball and the robot moves. You can also make the head camera point in different directions, or switch to an auxiliary camera that shows the robot's wheels, to help while navigating through furniture and feet.
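A control page like that typically maps the ball's drag offset to forward and turning speed. Here's a hypothetical sketch -- the limits, dead zone, and response curve are all made up, not Willow's actual interface:

```python
def drag_to_twist(dx, dy, max_speed=1.0, max_turn=1.5, dead_zone=0.05):
    """Map a mouse drag (normalized -1..1 offsets from the red ball's rest
    position) to a (forward m/s, turn rad/s) command. Dragging up drives
    forward; dragging left turns left. All constants are invented."""
    def shape(u, limit):
        if abs(u) < dead_zone:     # ignore tiny jitters near the center
            return 0.0
        u = max(-1.0, min(1.0, u))
        return limit * u * abs(u)  # squared response: gentle near center
    return shape(-dy, max_speed), shape(-dx, max_turn)
```

The squared response curve is what gives you a "grandma mode" near the center of the ball while still allowing quick moves at full drag.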

Learning how to drive is easy. But safety first! Willow makes new Texai users watch a video showing all the things you should not do with the robot -- drive down a stairway, let children ride on it, stick a screwdriver into its body.

At first my driving was in grandma mode. But after a few minutes I felt comfortable enough to drive faster and more fearlessly. You can move in any direction, slowly or rapidly, as well as rotate on your vertical axis.

The robot has a plastic bumper, so it won't damage walls, furniture, or a person's leg, for that matter. I did manage, though, to get myself stuck against a wall.

"May I help you?" said Sanford Dickert, my driving instructor and escort at Willow [see photo at the beginning of this post]. Yes, please! He nudged me -- well, the robot -- and off I went.

I headed out to the EE lab to talk to Dallas Goecker, a Willow electrical engineer living in Indiana who, along with Curt Meyers, built the first Texai prototype. Goecker, as he does every workday, was present as a Texai.

So there we were: Two Texai talking to each other screen to screen.

Dallas Goecker, Texai co-creator, and I (inset, bottom left) meet face to face -- or screen to screen.

Goecker told me that being a robot became so natural for him that he sometimes can't recall whether he did something -- a discussion with a coworker, say -- as a person or as a robot.

So what is Willow going to do with their 25 Texai? They're not selling them. So far they're doing some field tests at undisclosed sites and collecting feedback.

For a company focused on open source projects, I find they're a bit secretive about the Texai. My guess is the robot has commercial possibilities that they want to explore. Especially when you have a Google CEO showing it off at parties.

Or maybe they'll just give the robots away for free.

After the press conference, the party was to continue at a tent outside the building. I was told there was just one robot for reporters. John Markoff was in it and he was not getting out. Damn you New York Times!

Oh, well. I hung out inside with other guests and my Texai brothers. It was fun. The highlight was meeting people I'd spoken with via e-mail or on Twitter but had never met in person: BotJunkie's Evan Ackerman, GetRobot's Noriko Kageki, and Hizook's Travis Deyle -- some of the world's top robotics bloggers!

I even met some celebrities. As a huge MythBusters fan, it was great to chat with Tory Belleci, who for some reason wouldn't stop laughing at me, or the robot, or both [below].

My robotic existence wasn't perfect. More than once, people stood with their backs to me, unintentionally, thinking I was just a piece of high-tech furniture. Others were clearly uncomfortable with a talking monitor and pretended I wasn't there.

Sometimes I couldn't hear people and vice versa. Twice I lost connection and had to log on again. And once my video feed froze and my face, I was later told, became a Francis Bacon portrait.

But overall it was a great experience. In the future, I see no reason why people wouldn't rely heavily on telepresence robots to attend meetings, interact with coworkers, and -- why not -- go partying.


Robots: The Nao Humanoid

Aldebaran's Nao robot

In its latest episode, Robots, the podcast for news and views on robotics, takes a closer look at French robotics company Aldebaran and its humanoid Nao. Aldebaran's Vice President of Engineering, Luc Degaudenzi, and his colleague Cédric Vaudel, the company's Sales Manager for North America, discuss the Nao's success in the RoboCup Standard Platform League, share details on the robot, and outline how they see the future of the market for humanoids. Read on or tune in!

Read also:

The Robots Podcast: 50 Years of Robotics (Part 1)
Fri, April 23, 2010

Blog Post: The Robots podcast is celebrating its 50th episode.

Robots: 50 Years of Robotics (Part 2)
Sat, May 15, 2010

Blog Post: The Robots podcast celebrates 50 episodes.

Aldebaran Robotics seeking beta-testers for its Nao humanoid robot
Thu, June 04, 2009

Blog Post: The French robotics company is inviting robot enthusiasts in France and the U.K. to try Nao -- for a fee

Microsoft Shifts Robotics Strategy, Makes Robotics Studio Available Free

Updated May 20, 4:23 p.m.: Added National Instruments comments; 5:49 p.m.: Added Willow Garage comments; May 21, 11:21 a.m.: Added details on competing robotics software platforms; 1:50 p.m. Added Herman Bruyninckx comments.

Microsoft's new and now free release of its Robotics Developer Studio includes new 3-D simulation environments like this multi-level house.

Over the past year or so, Microsoft's robotics group has been working quietly, very quietly. That's because, among other things, they were busy planning a significant strategy shift.

Microsoft is upping the ante on its robotics ambitions by announcing today that its Robotics Developer Studio, or RDS, a big package of programming and simulation tools, is now available to anyone for free.

Previously, RDS had multiple releases: one free but with limited features, a full commercial version that users could purchase, and an academic version distributed only to partners.

By releasing a single version with full capabilities and at no cost, Microsoft wants to expand its RDS user base, hoping to amass a legion of hobbyists, researchers, entrepreneurs, and other robot enthusiasts who will come up with the next big things in consumer robotics.

I spoke about the new plan with Stathis Papaefstathiou, who leads the robotics group and is responsible for Microsoft’s robotics strategy and business model.

"We decided to take out all of the barriers that today our users might have in order to help them build these new [robotic] technologies," he told me.

Papaefstathiou (pronounced papa-ef-sta-THI-u) says that price is a big limitation for mass-produced robots. "That means that in the consumer space it's not about sophisticated hardware, it's about the software stack."

He says RDS has been downloaded half a million times since it launched in 2007. The company estimates it has about 60,000 active users.

Those are respectable numbers but they didn't help Microsoft fulfill its goal of kick-starting a vast and lucrative robotics development ecosystem -- like MS-DOS and Windows did for the PC.

So over the past two years the robotics group, part of an elite software division called the Startup Business Group (led by Amit Mital, who reports to Craig Mundie), set about devising a plan to expand Microsoft's stake in robotics.

Not everyone is convinced the new plan makes sense.

"This is all just a non-event," says Herman Bruyninckx, a robotics professor at K.U. Leuven in Belgium and coordinator of EURON, the European Robotics Research Network.

Bruyninckx, an advocate of free and open source software who started OROCOS, or Open Robot Control Software, a framework for robot control, says that making RDS free is not a change in strategy and nobody he knows in the robotics community is "talking about RDS, let alone using or planning to use it."

Papaefstathiou says that in addition to creating a single RDS release, Microsoft is also making the source code of selected program samples and other modules available online, hoping to improve collaboration among users. In particular, the group wants to entice the growing community of hobbyists, do-it-yourselfers, and weekend robot builders.

He also says there will be closer collaboration with other projects at Microsoft. He mentions Project Natal, a motion tracking user interface that Microsoft is creating for the Xbox 360. He says Natal's ability to track gestures could "be available also in solutions where human-robot interaction becomes important."

Will the new strategy work?

A factory model is now part of Microsoft Robotics Developer Studio's simulation environments.

RDS is not a robot operating system -- it's a comprehensive set of development tools, samples, and tutorials. It includes a visual programming interface, a popular 3-D simulator, and also Microsoft's CCR and DSS runtime toolkit.

But despite its broad range of tools, RDS works best with the specific robot platforms it supports, including iRobot's Create, LEGO Mindstorms, CoroWare, Parallax, and others.

These are great robot platforms but by no means the only ones. In fact, many budding roboticists today are using Arduinos and programming ATmega microcontrollers to build innovative robots. Why would they need RDS?

Microsoft has plenty of competition as well. Other robotics software platforms include Urbi by French firm Gostai, ERSP by Evolution Robotics, and the Player/Stage Project.

One platform that is rapidly gaining adoption and has shown impressive results is the Robot Operating System, or ROS, a broad set of open source tools by Silicon Valley robotics firm Willow Garage.

Other users, including a growing number of high school students participating in the popular FIRST robotics competition, use National Instruments' LabVIEW tools and controllers to program their robots.

Papaefstathiou acknowledges that there are alternative software packages that can do some of the things -- visual programming and simulation, for example -- that RDS does, but he insists that "there's no single competitor for the overall toolset that we have."

As for Willow Garage, Papaefstathiou says they're "targeting different platforms and different capabilities," adding that some of the robots they're using are half-million-dollar systems.

"People are doing a great job in developing robotics technology there, but this is not something that goes into scale," he says. "And we here in Microsoft we are about scale."

Not surprisingly, Willow Garage disagrees.

"We designed ROS to be flexible and open, because researchers and application developers alike need to be able to inspect, improve, and extend the system," says Brian Gerkey, Willow Garage's director of open source development. "As a result, ROS is now used on a wide variety of robots, from inexpensive iRobot Creates to sophisticated humanoids and even autonomous cars. It's only through open source that we can reach this level of adoption and community involvement."

National Instruments, for its part, welcomes Microsoft's move.

"I'm glad to see that National Instruments, Microsoft, Willow Garage and other major players are aligned on a critical missing element to the robotics industry crossing the chasm and really taking off," says Shelley Gretlein, senior group manager of NI's real-time and embedded software. "The key is in the development software. Lowering the software barriers will make it easy to get into robotics."

Microsoft established the robotics group in 2007 under the leadership of Tandy Trower, a software veteran who'd headed some of Microsoft's largest and most successful businesses, eventually becoming a minister-without-portfolio reporting directly to Bill Gates.

Trower and Gates believed the consumer market was the right place for the next big innovation in robotics, finding parallels with the beginnings of the PC industry, a view Gates described in a now-famous Scientific American article, "A Robot in Every Home."

But things changed late last year when Trower left Microsoft to start a healthcare robotics company. The company chose Papaefstathiou, an unashamed Trekkie -- "Data is very inspirational" -- with a background in high-performance computing, as the robotics group's new leader. It's up to him now to turn Gates' a-robot-in-every-home vision into reality.

I do see potential for a big expansion of RDS. But my impression is that it will be strongest among schools and universities. Now any engineering school in, say, Brazil, Russia, India or China, could use it and have students programming robots, or at least simulating them.

The question is, Will promising, cool robotics products for the consumer market emerge from a larger RDS community? I asked Papaefstathiou what kinds of commercial robots he envisions would be around.

He wouldn't give me specific examples, preferring to say it was up to "the community to think broader about the scenarios."

"Consumer robotics is a new product category and building [applications] there requires leveraging the capabilities and inspiration of a broader community," he says. "This is exactly what we want to do.”

Microsoft's robotics squad. Left to right: Stathis Papaefstathiou (general manager), Russ Sanchez (creative director), Branch Hendrix (business development), Stewart MacLeod (development), Hunter Hudson (quality), Mukunda Murthy (program management), George Chrysanthakopoulos (distinguished engineer), and Chris Tham (engineering).

Images: Microsoft Robotics Group

Read also:

The Conference Room That Re-Arranges Itself
Wed, May 19, 2010

Blog Post: Just pick how you want it set up and the tables move themselves into position

Autonomous Car Learns To Powerslide Into Parking Spot
Mon, May 10, 2010

Blog Post: Stanford's "Junior" combines two control schemes to stop right on target

Robots With Knives: A Study of Soft-Tissue Injury in Robotics
Thu, May 06, 2010

Blog Post: What would happen if a knife-wielding robot struck a person?

Willow Garage Giving Away 11 PR2 Robots Worth Over $4 Million
Tue, May 04, 2010

Blog Post: The robotics company has announced the 11 institutions in the U.S., Europe, and Japan that will receive its advanced PR2 robot to develop new applications

The Conference Room That Re-Arranges Itself

You can add a new entry to the long list of problems that can be solved by robots: arranging tables in a conference room. On my personal workplace hassle scale, I'm not sure that moving conference room furniture ranks much above "occasional nuisance." But Yukiko Sawada and Takashi Tsubouchi at the University of Tsukuba, Japan, evidently find shoving tables to be an unappealing task for humans. So they built a room that could re-arrange itself.

In this case, the tables are the robots. Select the arrangement you want from a graphical interface, and the tables will move to their new locations. The movement is monitored by an overhead camera with a fish-eye lens, and the software uses a trial-and-error approach to determine the best sequence of motion. But it's best to see the room in action for yourself. Check out the video the researchers presented at ICRA earlier this month.
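The paper's trial-and-error sequencing can be pictured as searching move orders until one is found where no table's path is blocked by another. A toy one-dimensional sketch -- the real system plans 2-D motions from camera feedback, so everything here is illustrative:

```python
from itertools import permutations

def paths_clear(order, start, goal, blocking):
    """Check one move sequence: a move fails if another table still sits
    on the mover's path at the moment it moves."""
    placed = dict(start)  # current position of every table
    for t in order:
        for other, pos in placed.items():
            if other != t and blocking(placed[t], goal[t], pos):
                return False
        placed[t] = goal[t]  # table t has reached its goal
    return True

def find_order(start, goal, blocking):
    """Trial and error: test move sequences until one is collision-free."""
    for order in permutations(start):
        if paths_clear(order, start, goal, blocking):
            return order
    return None

# Two tables on a line: A must slide 0 -> 2, but B is parked at 2,
# so B has to move to 4 first.
start = {"A": 0, "B": 2}
goal = {"A": 2, "B": 4}
def blocking(a, b, p):  # a table at p blocks a straight slide from a to b
    return min(a, b) <= p <= max(a, b)
order = find_order(start, goal, blocking)
```

Brute-force permutation search only scales to a handful of tables, which is roughly the size of the problem a single conference room poses.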

In the paper, the authors explained the rationale for the project:

In these days, at conference rooms or event sites, people arrange tables to desired positions suitable for the event. If this work could be performed autonomously, it would cut down the man power and time needed. Furthermore, if it is linked to the Internet reservation system of the conference room, it would be able to arrange the tables to an arbitrary configuration by the desired time.

I'm not sure the cost and complexity of such a system could ever be low enough to be practical, but there's definitely something fun about watching the tables reconfigure themselves. And if you already have autonomous tables, why not go all the way and add a reconfigurable wall?



IEEE Spectrum's award-winning robotics blog, featuring news, articles, and videos on robots, humanoids, automation, artificial intelligence, and more.