Outside New York City’s Eyebeam studio, an artists' hub dedicated to the convergence of art and technology, two women pause to watch a pen doodling across a canvas behind a window. When they touch little circles on the glass, the pen changes direction.
“What’s this?” they ask. Then they read the description. This is a SADbot.
SADbot, or Seasonally Affected Drawing Robot, is a solar-powered, interactive drawing machine created by Eyebeam artists Dustyn Roberts and Ben Leduc-Mills. The contraption was on display this month at Eyebeam's Window Gallery in Chelsea.
"People are only happy when it's sunny," says Roberts. "Just like our robot."
When the sky is dark, SADbot stops doodling and "goes to sleep." But when the sun is out, SADbot lets people interact with it and doodles across a large canvas.
SADbot uses an Arduino microcontroller, four photocell sensors, a battery, and two stepper motors to control two cables attached to a pen. The electronics get their power from solar panels on the building's roof. But light not only powers the installation -- it also affects SADbot's behavior.
The interaction happens when a person stands in front of SADbot and covers one of its photocell sensors; SADbot registers the change and alters its drawing direction. By covering the sensors in a deliberate sequence, a person can make his or her own drawings.
But after checking the gallery's window where SADbot was to be installed, Roberts and Leduc-Mills noticed a problem. The window doesn't get much sunlight -- which would make SADbot, well, sad.
No problem. The artists built a rooftop mirror array to direct sunlight to a fixed mirror hanging off the ledge, which reflects light down to the gallery window.
If none of the photocells are covered, SADbot draws according to the programmed algorithm -- in the current case, small movements in random directions.
"At the moment its aesthetic is very small, random movements, or doodles," says Leduc-Mills. Since the project went up, SADbot has been filling one canvas with doodles per day -- a sign that it's getting plenty of interaction.
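The sensor-driven behavior described above can be sketched in a few lines. This is a hedged illustration in Python rather than the artists' actual Arduino code; the direction names, the threshold value, and the step sizes are all assumptions made for the example.

```python
import random

# Illustrative sketch of SADbot-style control logic (names and thresholds
# are assumptions, not the artists' code). Four photocells map to four
# drawing directions; a covered sensor reads a low light value.
DIRECTIONS = {"up": (0, 1), "down": (0, -1), "left": (-1, 0), "right": (1, 0)}
COVERED_THRESHOLD = 200  # a raw reading below this means "covered"

def next_move(sensor_readings):
    """Pick the pen's next step from four photocell readings.

    sensor_readings: dict mapping direction name -> raw light reading.
    If any sensor is covered, move that way; otherwise make a small
    random doodle step.
    """
    for name, reading in sensor_readings.items():
        if reading < COVERED_THRESHOLD:
            return DIRECTIONS[name]
    # No sensor covered: default doodle, a small step in a random direction.
    return random.choice(list(DIRECTIONS.values()))

# A viewer covering the "left" sensor steers the pen left:
print(next_move({"up": 800, "down": 790, "left": 50, "right": 810}))  # (-1, 0)
```

Covering sensors in sequence would chain these single steps into a drawing, matching the interaction described above.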
Leduc-Mills wanted to create an interactive project that people could influence from the sidewalk so he took his ideas to Roberts, a mechanical engineer, and SADbot was born.
To build SADbot, the duo raised over US $1,000 in funding on Kickstarter.com, an innovative project-funding site, which paid for all of the bot's components. Depending on the size of the donation, backers of the SADbot project received SADbot drawings, mini SADbot DIY kits, and fully built miniSADbots.
SADbot uses open source platforms like Arduino, Processing, and EasyDriver motor boards, so it's easy for you to build your own SADbot!
What if your desk lamp could not only shine light but also project online content onto your workspace? LuminAR is an augmented reality project from MIT's Media Lab that combines robotics and gestural interfaces in an everyday household item.
Developed by Natan Linder and Pattie Maes from the Fluid Interfaces Group, the device consists of two parts: a bulb and a lamp. The LuminAR Bulb can be screwed into a standard incandescent light fixture and contains a pico projector, camera, and a compact computer with wireless access to the Net. The lamp fixture, meanwhile, is a rotating base with a multi-jointed robot arm that moves to different positions by following user gestures.
The bulb's camera tracks hand positions while the projector streams online content to different areas of the desktop. The two turn a desk into an interactive surface. The robot can also be taught to remember preferred areas to project content or digital tools such as an email application or a virtual keyboard, as seen in the video below.
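One small piece of such a system is mapping a hand position detected in camera coordinates into projector coordinates so content lands where the user points. Here is a minimal sketch of that idea, assuming a simple affine calibration between the two; the function names and the calibration numbers are illustrative assumptions, not LuminAR's actual code.

```python
# Hypothetical camera-to-projector mapping via an affine transform,
# the kind of calibration a projector-camera system needs so that
# projected content lands under a tracked hand.

def make_affine(scale_x, scale_y, offset_x, offset_y):
    """Return a function mapping camera (x, y) to projector (x, y)."""
    def transform(x, y):
        return (scale_x * x + offset_x, scale_y * y + offset_y)
    return transform

# Suppose calibration found the projector image is twice the camera's
# resolution and shifted by (100, 50) pixels:
cam_to_proj = make_affine(2.0, 2.0, 100, 50)

hand_cam = (320, 240)            # tracked hand position in the camera frame
hand_proj = cam_to_proj(*hand_cam)
print(hand_proj)                 # (740.0, 530.0)
```

A real system would use a full homography and continuous re-calibration as the arm moves, but the principle -- translating tracked gestures into projector space -- is the same.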
The project is similar to the Sixth Sense by Pranav Mistry, also of the Fluid group, and other gestural interfaces that combine hand tracking with content projection. The difference is the form factor. The LuminAR Bulb could have wider appeal because it can be used with any ordinary desk lamp, though it would then lack robotic functions.
Still, it's an innovative way to free computing from the mouse-and-keyboard box and embed it in the environment. I wonder whether the projector is powerful enough to work well on a brightly lit desktop, and whether the robotic arm might misinterpret an involuntary gesture like sneezing and do something undesirable. Or it might hand you a tissue.
Justin has different configurations, including one with wheels. The space version has a head, torso, and arms, but no wheels or legs, because it will be mounted on a spacecraft or satellite.
The goal is to use Justin to repair or refuel satellites that need to be serviced. Its creators say that ideally the robot would work autonomously. To replace a module or refuel, for example, you'd just press a button and the robot would do the rest.
But that's a long-term goal. For now, the researchers are relying on another approach: robotic telepresence. A human operator controls the robot from Earth, using a head-mounted display and a kind of arm exoskeleton. That way the operator can see what the robot sees and also feel the forces the robot is experiencing.
Justin's head has two cameras, used for stereoscopic vision, so the operator gets a sense of depth when manipulating the arms. And the arms and fingers have force and torque sensors that provide feedback to the operator, who can tell if, say, a screw is tight.
Watch the video to see a reporter operating the robot, which, he quips, probably "costs much more than what I can earn my entire life."
Humans aren't the only ones playing soccer right now. In just two days, robots from world-renowned universities will compete in Singapore for RoboCup 2010. This is the other World Cup, where players range from 15-centimeter-tall Wall-E-like bots to adult-size advanced humanoids.
The RoboCup, now in its 14th edition, is the world’s largest robotics and artificial intelligence competition with more than 400 teams from dozens of countries. The idea is to use the soccer bots to advance research in machine vision, multi-agent collaboration, real-time reasoning, sensor-fusion, and other areas of robotics and AI.
But its participants also aim to develop autonomous soccer playing robots that will one day be able to play against humans. The RoboCup's mission statement:
By 2050, a team of fully autonomous humanoid robot soccer players shall win the game, complying with the official rule of the FIFA, against the winner of the most recent World Cup.
It may seem far-fetched that robots will ever be able to compete with the likes of Messi or Kaká, but 40 years is a long time in terms of technology. And what's wrong with dreaming big? Just think of the days when people said a computer would never beat humans in chess -- until IBM's Deep Blue did just that in 1997. For now, researchers are exploring fundamental questions in robot development: How well can robots move and think on their feet? And how well can they score goals? But maybe soon they'll be building PeléBot.
So check out some of this year's top players below, and let the games begin!
1) Led by Professor Manuela Veloso, the Carnegie Mellon University team (known to some as the Brazil of robot soccer) has developed a physics-based motion-planning AI for its dribbling bots. Using a dual-camera overhead vision system, the roboticists have programmed their robots to take into account the physics of ball movement, so the control algorithms can better predict where the ball is going to be and position the robots accordingly.
2) RoboErectus was developed at the Robotics Center of Singapore Polytechnic. This 12-degrees-of-freedom robot has three processors for vision, control, and AI, as well as three sensors: a USB camera to capture images, a tilt sensor to detect falls, and a compass to track direction. The vision processor recognizes and tracks objects like the ball, the goal posts, field lines, teammates, and opponents. And it's got some pretty sweet moves.
3) Dutch Robotics, an initiative from TU Delft, TU Eindhoven, and the University of Twente, presents TUlip, a humanoid soccer robot competing in the adult-size league. Standing 1.2 meters tall with 14 degrees of freedom, TUlip looks a bit unsteady, but this video shows it has skills. And the music is very inspirational.
4) Led by Professor Dennis Hong, Virginia Tech's Team DARwIn is always a strong contender. DARwIn, if you're wondering, stands for Dynamic Anthropomorphic Robot with Intelligence. Their robot, DARwIn IV, stands 56 cm tall and has 20 degrees of freedom. It's powered by a CompuLab fit-PC2 and multiple Robotis Dynamixel servos.
5) The UPennalizers compete in the standard platform league, in which all teams use the same robot platform, Aldebaran's Nao humanoid, so the challenge becomes programming the best players and team strategies. Teammate Aylin Caliskan says that robots beating humans by 2050 "should be 100 percent possible."
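As a rough illustration of the physics-based prediction idea behind the CMU approach (item 1 above): a rolling ball decelerates approximately uniformly due to friction, so its future position can be predicted from its current velocity. The following Python sketch is a hedged toy model under that assumption, not the team's actual code, and the friction value is invented for the example.

```python
# Toy model of ball prediction under uniform rolling deceleration.
# The friction value and interface are illustrative assumptions.

FRICTION_DECEL = 0.5  # m/s^2, assumed rolling deceleration

def predict_position(x, y, vx, vy, t, decel=FRICTION_DECEL):
    """Predict ball position after t seconds under uniform deceleration."""
    speed = (vx**2 + vy**2) ** 0.5
    if speed == 0:
        return (x, y)
    t_stop = speed / decel                 # time until the ball stops
    t = min(t, t_stop)                     # ball doesn't move after stopping
    dist = speed * t - 0.5 * decel * t**2  # distance along its path
    return (x + dist * vx / speed, y + dist * vy / speed)

# A ball at the origin moving at 1 m/s along x stops after 2 s, having
# rolled 1 m, so querying far in the future returns the stopping point:
print(predict_position(0.0, 0.0, 1.0, 0.0, 10.0))  # (1.0, 0.0)
```

With predictions like this, a controller can send a robot to where the ball will be rather than where it is.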
Without all the swearing, pushing, and penalties, RoboCup might not be as exciting as the World Cup in South Africa, but it's hard not to be inspired by this bunch of ball-kicking bots.
With only a small team of developers and a week's worth of development, the PR2 can now play pool! The "Poolshark" team started last Monday and began making shots on Friday. The PR2 won't be hustling you in pool halls anytime soon, but it pocketed five shots on Friday before the team decided it was time to celebrate.
The Poolshark team dealt with numerous technical challenges throughout the week: engineering a special grip and bridge so the PR2 could hold the cue, building a ball detector, localizing the table, creating visualizations and input tools, writing a shot selector, and more.
A big thanks goes to Alon Altman for his open-source FastFiz billiards library. FastFiz is a physics and rules engine for billiards that the Poolshark team used to select which shots the PR2 should take. The Poolshark team has released its own code in the billiards stack.
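FastFiz handles the full physics and rules, but the geometry underneath any shot selector starts with the classic "ghost ball" calculation: to send an object ball toward a pocket, the cue ball must arrive as if occupying a phantom ball position one ball diameter behind the object ball along the ball-to-pocket line. Here is a hedged sketch of that calculation; the function names are illustrative, not FastFiz's API.

```python
import math

# Ghost-ball aiming geometry (illustrative, not the Poolshark code).
BALL_DIAMETER = 0.05715  # meters, a standard pool ball

def ghost_ball(object_ball, pocket, diameter=BALL_DIAMETER):
    """Return the point the cue ball's center must reach at impact."""
    ox, oy = object_ball
    px, py = pocket
    dx, dy = px - ox, py - oy
    dist = math.hypot(dx, dy)
    # One diameter behind the object ball, opposite the pocket direction:
    return (ox - diameter * dx / dist, oy - diameter * dy / dist)

# Object ball at (1, 0), pocket at (2, 0): the aim point is straight behind.
print(ghost_ball((1.0, 0.0), (2.0, 0.0)))  # approximately (0.94285, 0.0)
```

A shot selector then scores candidate aim points by cut angle, distance, and obstruction, which is where a physics engine like FastFiz earns its keep.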
June is "Hackathon" month, so there are two more one-week hackathons to come: pushing a cart and fetching a drink from a refrigerator. It's one down, two to go!
These look like promising applications. Or as Evan Ackerman at BotJunkie aptly puts it:
A robot that can play pool and fetch me beer? Hellooooo new best friend.
When Canadian filmmaker Rob Spence was a kid, he would peer through the bionic eye of his Six Million Dollar Man action figure. After a shooting accident left him partially blind, he decided to create his own electronic eye. Now he calls himself Eyeborg.
Spence's bionic eye contains a battery-powered, wireless video camera. Not only can he record everything he sees just by looking around, but soon people will be able to log on to his video feed and view the world through his right eye.
Spence and his collaborators -- Kosta Grammatis, John Polanski, Martin Ling, Phil Bowen, and camera firm OmniVision -- managed to get a prototype working last year. Time magazine named it one of the best inventions of 2009. Now the group is developing a version that offers a clearer picture.
I recently met with Spence in Toronto. In unreleased footage (see screenshot below) that Spence gave me, he demos the prosthetic while a colleague films. The feed from the eye-cam is picked up by a wire antenna that Spence holds to his cheek and relayed to a flatscreen TV in the background.
The prototype in the video provides low-res images, but an authentic experience of literally seeing through someone else's perspective. The image is somewhat jerky and overhung by huge eyelashes; a blink throws everything out of whack for a half-second.
Rob Spence demonstrates his bionic camera eye. A wireless camera in a specially designed prosthetic relays a video feed to an antenna that he presses to his cheek. The feed is shown on the screen behind him as a colleague films. Image: Rob Spence
The bionic eye is simply designed, and its components are constantly changing. It basically contains a 1.5-mm-square low-res video camera, a small round printed circuit board, a video transmitter, and a 3-volt rechargeable Varta microbattery. The components are contained in the kind of resealable clear acrylic used in false eyes, though the casing has two holes for wires to recharge the battery.
"I can recharge my eye via USB off my laptop," says Spence.
The Eyeborg prototype in the video, the third, can only work for an hour and a half on a fully charged battery. Its transmitter is quite weak, so Spence has to hold a receiving antenna to his cheek to get a clear signal. He muses that he should build a Seven of Nine-style eyepiece to house it. He's experimenting with a new prototype that has a stronger transmitter, operates on other frequencies, and adds a booster on the receiver.
"Unlike you humans, I can continue to upgrade," Spence quips. "Yes, I'm a cyborg. But I think that any technology -- even clothing -- makes people cyborgs."
Spence loves to ham it up as Eyeborg, installing a red, laser-like LED light in one version of the prototype and pulling on a 1970s track suit to become Steve Austin (see the video below). But he's serious about using his camera eye to get Internet users to view the world through his eye, and is developing an Eyeborg app that may feature augmented reality functions.
"In today's world, you have Facebook and camera eyes," he says. "Tomorrow, we'll have collective consciousness and the Borg. It's a collective robot consciousness. I believe that's a genuine modern concern."
Eyeborg seems content to lead us into our robot future.
In a unique step for an industrial robot manufacturer, robotics giant KUKA has released a mobile platform (wow) with open interfaces (wow) that include several open source (wow) software modules for education and research (wow). Yes, that's "wow" four times!
The robot is clearly a big move for KUKA, known for more traditional industrial systems, but perhaps even more significant, it's also a departure from the proprietary software, closed interfaces, and stationary robots typically used by all major industrial robotics companies, including ABB, Fanuc, and KUKA itself.
It is too early to say whether the youBot is a forerunner of a bigger trend, but it makes for some amusing Friday afternoon musing: Industrial manipulators have been mostly stationary systems. Will they now start moving around?
Is this the beginning of a new class of robots, bridging the gap between today's industrial and service robots? And is industrial robotics moving towards more open standards, or are we moving towards a split into specialized hardware and software companies, similar to what happened in the PC industry?
Either way, KUKA's youBot looks promising:
youBot arm: serial kinematics with 5 degrees of freedom (655 mm height, 0.513 m³ work envelope, 6 kg weight, 0.5 kg payload, 0.1 mm repeatability, 80 W power limit for safety). Fixed-axis serial structure with a two-finger gripper at the flange. Links are cast from very lightweight but stiff magnesium. Custom, lightweight motor-gear combinations.
youBot gripper: detachable two-finger gripper attached to the arm (10 mm stroke per finger = 20 mm opening stroke; different mountings allow it to grip objects up to 70 mm in diameter)
youBot platform: omnidirectional mobile platform with 4 KUKA omniWheels (530x360x106mm, 15 mm clearance, 20kg total weight, 20kg payload, 0.8m/s speed, EtherCAT communication, 24V)
Energy supply: one set of two batteries (Two 12 V, 5 Ah, maintenance-free lead acid rechargeable batteries; approximate runtime: 90 minutes) embedded into mobile platform
On-board PC: mini ITX board form factor (embedded CPU, passively cooled, 512 MB RAM, 4 GB Compact Flash, WLAN, USB), embedded into the mobile platform
Open source robot controller with open interfaces (position, velocity, and current control)
BRIDE (BRICS Integrated Development Environment), based on Eclipse, to simplify application development
BROCRE (BRICS Open Code Repository), offering interoperable interfaces and best-practice algorithm libraries:
- BRICS_MM for mobile manipulation
- BRICS_3D for 3D perception and modelling
- BRICS_RN for robust navigation
Blender simulation model
Sample applications
The video below features the current prototype. The final version will have magnesium casings for the arm to make it lighter, re-designed covers for the body that allow for easier battery access and - obviously - omniwheels in KUKA-orange.
KUKA will start deliveries on 1 November 2010 in Germany (1 December 2010 in Europe; 1 March 2011 in the USA and Asia). The complete robot costs EUR 19,990 (about USD 24,200). Arms are sold separately for EUR 12,990 (USD 15,700) and the omnidirectional base for EUR 8,990 (USD 10,900), with significant discounts (20 to 25 percent) for early birds and universities.
The Focus-Project team Ballbot consists of eight future mechanical engineers studying at ETH Zurich, two electrical engineers studying at ZHAW, and industrial designers educated at ZHdK. Through the combination of our skills and ideas, we aim to complete an unprecedented project that develops a new concept of movement. Our team is supervised by Prof. Dr. Roland Siegwart, director of the ASL at ETH Zurich.
Unlike Kumagai's ballbot, one focus of the Rezero is design:
Rezero is meant to entertain and impress. It is supposed to create emotions. It will be able to interact with a small group of people, react on attractions and in doing so create a hands-on experience with the Ballbot technology. The Ballbot will be an ambassador of its own movement skills. Its dynamic hull even allows Rezero to show and create emotions. Imagine Rezero breathing, being curious or frightened. And even waking up or going to sleep by revealing or retracting its sphere.
Another focus is improved dynamics: To push the boundaries of current ballbots, the team uses a custom-made motor controller in combination with high-performance motors and a specially coated ball. This allows Rezero to move fast -- at speeds up to 3.5 m/s and at inclinations up to 17 degrees -- and to perform unique movements, such as moving at a high inclination while simultaneously rotating around its vertical axis.
Now that we’ve reviewed both the iRobot Roomba 560 and the Neato XV-11, you’re probably wondering which one you should get. There’s no easy answer, but in this post we’ll highlight the features of each robot and the differences between them, so that you can decide which one is right for you.
If you haven’t read our individual reviews of each robot, you can get lots more detail at the following links:
Both robots are approximately the same size, with two drive wheels underneath and a touch-sensing bumper at the front. The Roomba is round, allowing it to turn all the way around in place, while the XV-11 has a square front to help it get into corners more effectively.
The XV-11 is slightly taller:
This means that if you have furniture that’s within that height difference, the Roomba will clean underneath it but the XV-11 won’t. The XV-11 is just under 4 inches tall, while the Roomba is a bit over 3.
Both robots have a built-in carrying handle. The XV-11 is a little bit heavier. They both seem very solid and robust (although you probably want to avoid dropping them), and both come with a one year warranty.
Both robots include one button cleaning, meaning that whatever else they can do, at the very least you can just push the big “clean” button on them and they’ll go vacuum. The Roomba 560 has additional dedicated displays for scheduling cleaning times, while the XV-11 has a small multipurpose LCD display.
Both robots come with charging docks that they can return to autonomously. The Roomba’s dock is drive-on, which means that the robot charges by driving onto a little platform. The XV-11’s dock is drive-up, which means that the robot presses against the dock. The XV-11’s dock includes a storage compartment for the power adapter, which is a useful feature, since you can store the adapter inside the dock if you don’t need the extra power cord length. Both robots will attempt to ’snug’ back up to their charging contacts if they get accidentally moved.
The Roomba 560 and the Neato XV-11 both allow for on-board scheduling. You can set different times for each day of the week, and the robot will undock, clean, and redock to recharge itself. It’s relatively easy to program this on both robots, although the XV-11’s LCD makes it a bit easier.
Due to its LCD, the XV-11 has a distinct advantage when it comes to user communication. The screen tells you if you need to perform maintenance tasks, or what to do if the robot isn’t doing what it’s supposed to be doing. The Roomba will sometimes speak in a female voice when it needs assistance, but for more obscure technical issues it just beeps, and you need to keep track of the number of beeps and look up what they mean online, which is far less convenient.
Both robots come with accessories that you can use to keep them away from certain areas. The XV-11 uses a magnetic strip (you get 15 feet of it and can buy more for $30) that you place on the floor, and the robot will clean up to it but not go over. You can cut the strip up, and it bends enough to make rough curves. The Roomba uses Virtual Walls, which are little towers about the size of a coffee cup that project infrared beams the robot won't cross, so you can leave them up around doorways and such even when the robot isn't vacuuming. The beams reach out to about 8 feet, and the Virtual Walls run on batteries. The 560 comes with two; buying another one will cost you $40.
The Roomba and the Neato XV-11 use significantly different techniques to vacuum areas. The Roomba uses a variety of cleaning behaviors to cover a room, using input from its sensors to decide where to go next. It doesn’t know where it has or has not been in the absolute sense, but on average, it will cover each area of a room 3-4 times, which helps it to clean more thoroughly.
The XV-11, on the other hand, has a laser sensor that creates a map of walls, doorways, and obstacles. The robot then plans a route to cover the entire area efficiently, generally with a single pass over most points.
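A toy illustration of why the two strategies differ (this is not either company's algorithm, just a sketch of the principle): on a small grid, a planned boustrophedon sweep visits every cell exactly once, while a random walk of the same length revisits some cells and misses others.

```python
import random

def planned_sweep(width, height):
    """Visit every cell once, snaking row by row (XV-11-style planning)."""
    for y in range(height):
        row = range(width) if y % 2 == 0 else reversed(range(width))
        for x in row:
            yield (x, y)

visits = list(planned_sweep(4, 3))
print(len(visits), len(set(visits)))  # 12 12 -> 12 moves, full coverage

def random_walk(width, height, steps, seed=42):
    """Wander randomly for the same number of steps (Roomba-style)."""
    rng = random.Random(seed)
    x, y = 0, 0
    seen = {(0, 0)}
    for _ in range(steps):
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        x = min(max(x + dx, 0), width - 1)   # stay inside the "room"
        y = min(max(y + dy, 0), height - 1)
        seen.add((x, y))
    return seen

covered = random_walk(4, 3, 12)
print(len(covered))  # typically fewer than 12 cells covered in 12 steps
```

The random walk eventually covers everything if you run it long enough, which is why the Roomba averages several passes per area; the planned sweep gets full coverage in the minimum number of moves.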
We should point out that neither the XV-11 nor the Roomba is a total replacement for a human wielding an upright vacuum with a hose attachment. Rather, they’re maintenance tools, designed to minimize the amount of vacuuming that you have to do. That said, we found both robots to clean very effectively on hardwood, comparable to a conventional upright vacuum over most of the floor. Because of their shapes, however, the robots aren’t quite as good close to obstacles, along walls, and in corners.
The XV-11 is better at cleaning along walls and corners in most cases, since its square front allows it to get in closer, although it doesn’t always get into corners in the ideal orientation. Because the Roomba is round, it relies on a spinning brush to sweep into corners, which is less effective than getting the entire vacuum in there. It’s worth noting, though, that this spinning brush extends beyond the reach of the vacuum, outside the body of the robot, which means that the Roomba can (sort of) clean beyond its own chassis, while the XV-11 can’t. The effectiveness of the spinning brush is mediocre at best, however, since it often just kicks dirt somewhere else where the Roomba may or may not get later. Basically, neither robot can make up for the hose attachment on a conventional upright vacuum when it comes to tight areas.
On carpet, both the XV-11 and the Roomba did fairly well, though not as well as an upright. The Roomba cleaned slightly better in general, and significantly better when it came to pet hair, probably because of its bristle brush. The rubber brush on the XV-11 tended to leave streaks of pet hair behind it. Neither robot got pet hair completely cleaned up, though, and both did especially poorly around table and chair legs. Also, iRobot has pointed out that crossing over carpet from multiple angles changes the nap of the carpet and is better for getting dirt out, which I tend to believe; the XV-11 cleans in a single pass.
The XV-11 is significantly faster than the Roomba, about four times faster, cleaning my living room in 12 minutes as opposed to the Roomba's 45. This difference will increase as the robots are asked to clean larger rooms or more rooms. The XV-11 doesn't move faster, but since it doesn't cover most areas more than once, it finishes much sooner. It also knows exactly where its dock is and doesn't have to spend time searching for it after it's finished. Of course, if you're taking advantage of the scheduling feature, these vacuums are running by themselves when you're not home, in which case speed (and noise) may not matter nearly as much. In that case, the question changes from "Is it faster?" to "How much area can each robot cover per charge, how long does it take to recharge, and how effectively can it resume coverage of multiple rooms?" The XV-11 has a pronounced advantage here, because it cleans more efficiently: It spends significantly less time on each room, is better at finding its way from room to room (since it can see doorways), can more reliably find its way back to its charging dock if it needs to (since it creates a map), and can then return to exactly where it left off and finish cleaning without any redundancy in coverage. Some Roomba models include Lighthouse technology, which helps them clean multiple rooms more efficiently, but the 560 does not.
The XV-11 seems significantly louder than the Roomba; both are significantly quieter than an upright vacuum. We’re waiting for exact decibel numbers.
Both robots have minimal issues cleaning entirely autonomously, meaning that in general, you really can just let them do their thing from start to finish without having to worry about them getting lost or stuck.
Both robots require you to empty their dustbins on a regular basis. Depending on how many rooms you have them clean, and how dirty your floors get, this could be anywhere from every cleaning to every three cleanings or so. Both robots will inform you when their dustbins need to be emptied, so it's not something you really have to worry about, although it's better to empty them before they fill completely, especially if you have the robots clean autonomously.
The dustbin on the XV-11 is marginally easier to access than the one on the Roomba, since it lifts out of the top of the robot instead of out of the back. Also, the XV-11’s air filter keeps the dust in when you lift the bin out; you remove the filter to empty the bin. The Roomba’s bin doesn’t have a cover like that, so there’s the potential to make a huge mess unless you pull the bin out carefully and keep it in the correct orientation. The XV-11 also has a larger dustbin, but I wouldn’t call it significantly larger.
The air filters on both robots are easy to access and replace, being integrated into the dust bins themselves. Replacement filters for the XV-11 cost $19 for 6, and for the Roomba it’s $19 for 3.
The XV-11 is much better at keeping itself clean as it cleans, especially when it comes to hair (pet and otherwise). I have a couple cats, and while the Roomba was significantly better at picking up cat hair, it also got a lot of cat hair wrapped around its bristle brushes, as well around the bearings holding the brushes in place. After just a few vacuumings, you’ll need to take the brushes and bearings out and clean them by hand, which is a dirty and annoying process. iRobot includes a tool to help with this, but I’ve often had to resort to scissors and brute strength to get the hair out of the bristle brush. The XV-11, on the other hand, while not as good at picking up pet hair, remains very clean, on both its brush and bearings. After 3 rounds of my living room, the Roomba was very dirty and tangled underneath, while the XV-11 looked brand new.
Lastly, there are maintenance tasks that you shouldn’t have to do very often, or (ideally) at all, like replacing brushes, bearings, and batteries. We didn’t get a chance to test the XV-11 to this point, but my guess is that the XV-11 would be more resistant to bearing damage (something I’ve experienced with my own personal Roomba), simply because not as much stuff gets caught up in its cleaning system.
Both iRobot and Neato offer replacement components for their robots. iRobot’s website has nearly every component for the robot available, while Neato mostly focuses on accessories. I didn’t try to take either robot apart, so I can’t comment on how easy it is to replace major components, but I like the fact that iRobot gives you the option to try to fix things yourself.
The Neato XV-11 is currently on pre-order for $400, to be available “this summer.” The iRobot Roomba 560 is available now for $350. However, the Roomba 560 does not include the Lighthouse multi-room technology. To get that, you’d need to upgrade to the Roomba 570 for $450, which might be a more realistic robot to compare the XV-11 to in terms of multi-room cleaning capability. And even then, the XV-11 is still likely to be significantly better at cleaning multiple rooms due to its mapping technology.
So, to summarize:
-Both robots clean hardwood equally well, about as well as a traditional upright vacuum.
-The Roomba cleans carpet noticeably better than the XV-11, and is significantly better at picking up pet hair. Neither robot is as good at these tasks as a traditional upright vacuum.
-The Roomba requires significantly more maintenance than the XV-11, especially if it picks up hair of any kind.
-The XV-11 cleans rooms about four times as fast as the Roomba.
-The XV-11 is significantly better at cleaning multiple rooms than the Roomba.
-The XV-11 seems louder than the Roomba.
-The fact that the Roomba uses cleaning behaviors derived from foraging insects is very cool.
-The fact that the XV-11 uses a laser to map rooms is very cool.
There are a few other things to potentially consider… If cost is an issue, iRobot sells Roomba models that are less expensive than the 560. If you only need to clean one or two rooms, and don’t need the scheduling feature, you could get a Roomba 530 for $300.
Also, iRobot has been selling Roombas for a long time, while Neato is introducing a new product. The fifth generation of Roombas embodies many years of improvements and refinements while the XV-11 has yet to prove itself as a commercial product. That said, the mapping technology in the XV-11 is very impressive, and I feel like irrespective of which robot makes a better vacuum, there’s a lot of potential there.
What it comes down to, though, is that both the iRobot Roomba 560 and the Neato XV-11 are solid autonomous robot vacuums that use different techniques and technologies to get your floor clean and keep it that way without you having to lift a finger.
And once again, I’d encourage you to read our individual reviews of each robot, since there are lots more details (plus more pictures and video):