

IAM Robotics Takes on Automated Warehouse Picking

A Pittsburgh startup wants to build a robot that can pick items off of shelves, the holy grail of warehouse fulfillment


IAM Robotics robot picking object at warehouse
Photo: IAM Robotics

There’s a small but growing handful of robotics companies trying to make it in the warehouse market with systems that work with humans on order fulfillment. Generally, we’re talking about clever wheeled platforms that can autonomously deliver goods from one place to another, while humans continue to do the most challenging part: picking items off of shelves. There’s a lot of value here, since using robots to move stuff frees up humans to spend more of their time picking.

Ideally, however, you’d have the robot doing the picking as well, but this is a very difficult problem in terms of sensing, motion planning, and manipulation. And getting a robot to pick reliably at a speed that could make it a viable human replacement is more difficult still.

IAM Robotics, a startup based in Pittsburgh, Pa., is one of the first companies to take on the picking problem on a commercial level. Founded in 2012, the company has developed an autonomous mobile picking robot called Swift that consists of a wheeled base and a Fanuc arm with a 15-lb payload and a suction gripper that can reach 18 inches back into shelves. A height-adjustable carriage can access shelves between 3 and 85 inches, and an autonomously swappable tote carries up to 50 pounds of stuff. According to the company, the robot can autonomously navigate around warehouses and is “capable of picking at human-level speeds” of 200 picks per hour.

We spoke with IAM Robotics founder and CEO Tom Galluzzo to find out how they’re making this happen.

Prior to IAM Robotics, Galluzzo worked on developing autonomous systems for Carnegie Mellon, Harris Corporation, the Air Force Research Laboratory, Boeing, and other organizations. He told us that his experience building robust real-world robotic systems had a big influence on the development of Swift. As an example, he said that the robot, instead of dynamically calculating the best way to pick every item that it runs across, queries a database of items that have already been scanned in 3D, modeled, and analyzed to figure out the best possible grasping strategies.
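To make that idea concrete, here is a rough sketch, in Python, of what a precomputed grasp lookup could look like. The SKUs, fields, and strategy names are hypothetical illustrations, not IAM Robotics’ actual data model.

```python
# Hypothetical sketch of a precomputed grasp database lookup.
# SKUs, fields, and strategy names are illustrative only.
from dataclasses import dataclass


@dataclass
class GraspPlan:
    approach: str          # e.g., "front-face suction"
    suction_point: tuple   # (x, y, z) offset on the item's 3D model, in meters
    max_accel: float       # arm acceleration limit suited to the item's weight


# Populated offline, when each product is scanned, modeled, and analyzed.
GRASP_DB = {
    "sku-0001": GraspPlan("front-face suction", (0.0, 0.02, 0.05), 4.0),
    "sku-0002": GraspPlan("top-face suction", (0.0, 0.00, 0.10), 2.5),
}


def plan_pick(sku: str) -> GraspPlan:
    """Look up the precomputed grasp instead of planning from scratch at pick time."""
    try:
        return GRASP_DB[sku]
    except KeyError:
        # Items that haven't been scanned and modeled can't be picked this way.
        raise LookupError(f"{sku} has not been scanned and modeled yet")
```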

IAM Robotics is currently conducting a pilot project with Rochester Drug Cooperative, one of the largest healthcare distributors in the United States. RDC is testing the Swift robots along with inventory tracking technology and fleet management software also developed by IAM Robotics. Here’s the rest of our interview with Galluzzo.

IEEE Spectrum: IAM Robotics is working on autonomously picking items off of shelves, which is something that most warehouse fulfillment companies aren’t trying to do yet, because it’s a really hard problem. How’d you decide to start there?

Tom Galluzzo: I had been working at Carnegie Mellon’s National Robotics Engineering Center as a robotics researcher, and we were working on a project for DARPA called ARM-S, which was a predecessor to the DARPA Robotics Challenge. Basically, what DARPA wanted us to do was have an autonomous manipulation robot manipulate everyday objects. So we were doing things autonomously like finding objects on a table, picking them up, moving them around…we tried to do really challenging things like changing a tire, assembling parts, all kinds of stuff.

We got quite good at just general object “find it, pick it up, and move it.” When we did that, we started looking for low-hanging fruit in industry, and we were led on a natural trajectory to picking in warehouses. This is a pretty challenging problem, but we felt confident that we could bring value and that manipulation was really the key differentiator there: being able to see and manipulate those products.

Can you put into context for us why other companies aren’t doing this yet? Why is it so challenging?

The thing you have to realize is that with technology being in the state it’s in today, no one is going to solve all of picking right now. We can’t get a general machine to do it all; you have to pick and choose which problems you’re going to solve. We started focusing on some lower-hanging fruit, which includes consumer products, things in boxes, things in bottles, pharmaceuticals. Initially we chose the suction cup for speed, because the number one thing in this industry is that we have to be as productive as a person. Everything that we had, even all the academic stuff at CMU, was way too slow, and we kind of had to start from scratch. So that’s what we did.

When we started doing this, we wrote our whole perception pipeline from scratch, and tried to get the robots to just pick things as fast as they could, in a semi-realistic scenario. When we did that, we surprised ourselves with what we could pick with the suction cup: it was more than what we expected. And we could also pick it really, really fast. With a robot, it’s really the perception speed and the speed of the arm. And what we solved was the perception speed: we were able to see these products really fast, and the arm speed became the bottleneck.

A [human] pick worker, on a sustained basis, really can’t pick more than 600 products per hour, and that’s without traveling much. I think we’ve done a demo of 1,100 products per hour with the robot sitting next to a shelf. We were very encouraged by that. Our robot has certain applications that work, and we can’t do it all, but we’re fine with that. If we can knock down a couple of key applications, we have time to expand our breadth in terms of what we can pick.

Rather than dynamically calculating the best way to pick every item that it runs across, Swift instead queries a database of items that have already been scanned in 3D, modeled, and analyzed to figure out the best possible grasping strategies.
Photo: IAM Robotics

What kind of constraints are there on your picking system?

It has to be fairly organized. The products have to back one another up, because the way we’re picking them, we’re pushing on them from the front, and if you have light products that don’t have anything behind them, they’re going to tip over. The majority of the work is done manually on the put-away, where people are going to make things a little bit more organized, but by automating the picking side you’re saving five times what you spend in addition to your current workforce.

The economics of picking are such that, when you’re putting stuff away, you walk to a location one time with a case of product. But when you pick it, you have to walk back ten or twenty times to pick all that product. There’s a lot more labor on the picking side. So we figure, let’s automate that. The benefit to the customer is that they get very efficient picking for both humans and robots, people get an easier job, and you get a very very well-organized warehouse.
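As a rough illustration of that math, here is a tiny back-of-the-envelope sketch in Python. The trip counts and times are made-up numbers chosen only to mirror the one-trip-to-put-away versus many-trips-to-pick framing above, not figures from IAM Robotics.

```python
# Back-of-the-envelope illustration of why picking carries more labor than
# put-away. All numbers are made up for illustration.
putaway_trips_per_case = 1     # one trip to shelve a full case
pick_trips_per_case = 15       # roughly the "ten or twenty" return trips
minutes_per_trip = 2.0

putaway_minutes = putaway_trips_per_case * minutes_per_trip
pick_minutes = pick_trips_per_case * minutes_per_trip

print(f"Put-away labor per case: {putaway_minutes:.0f} min")
print(f"Picking labor per case:  {pick_minutes:.0f} min")
print(f"Picking is roughly {pick_minutes / putaway_minutes:.0f}x the labor,")
print("which is why it's the side worth automating first.")
```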

Can you talk about what kind of hardware and software you’re using for perception, since that seems like a big part of the challenge here?

Yeah, it is. I can’t disclose too much, since that’s kind of like our secret sauce: the perception pipeline that we’ve built on top of off-the-shelf color cameras and depth sensors. What we’ve done is we’ve layered on what we call our rapid vision algorithms, which process the data extremely fast, extract the information that we need to know about the pick, and match that against what has already been visually trained for the system. [The visual training comes from] taking our rapid vision system and packaging it into a standalone little photo studio scanner that we call Flash. It’s just a standalone miniature photo booth, and we walk up and down each aisle one time, barcode scan each product, and put a sample of the product in the scanner. Then we collect images, we weigh the product and see how heavy it is, we collect dimensions, surface features, those kinds of things.
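For a sense of what a per-product record from a scanning station like Flash might contain, here is a hypothetical sketch in Python. The field names, units, and values are assumptions based on the description above, not IAM Robotics’ actual data format.

```python
# Hypothetical sketch of the kind of per-product record a scanning station
# like Flash might produce. Field names, units, and values are assumptions.
from dataclasses import dataclass, field


@dataclass
class ProductRecord:
    barcode: str
    weight_kg: float
    dimensions_mm: tuple                                   # (width, height, depth)
    image_paths: list = field(default_factory=list)        # color images from the photo booth
    surface_features: list = field(default_factory=list)   # e.g., visual descriptors used for matching


record = ProductRecord(
    barcode="0123456789012",
    weight_kg=0.45,
    dimensions_mm=(60, 120, 40),
    image_paths=["flash/0123456789012/front.png"],
    surface_features=["descriptor-placeholder"],
)
```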

Okay, so you do all of this data collection in advance, so that the robot is already trained on how to pick each item?

Yes, absolutely. We hook up to the customer’s warehouse management system, and we marry that up with the data we have from Flash to send down a visual pick list to the robot.
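Here is a minimal sketch of what joining a warehouse-management-system order against the Flash product data into a “visual pick list” could look like. The WMS fields, the flash_db lookup, and the output format are all assumptions for illustration, not IAM Robotics’ actual interface.

```python
# Hypothetical sketch: join a WMS order against Flash product data to build a
# "visual pick list" for the robot. Field names and structures are assumptions.
def build_visual_pick_list(wms_order, flash_db):
    """wms_order: list of dicts like {"sku": ..., "qty": ..., "location": ...};
    flash_db: dict mapping SKU to the product data captured by the scanner."""
    pick_list = []
    for line in wms_order:
        product = flash_db.get(line["sku"])
        if product is None:
            # Item was never scanned, so route this line to a human picker instead.
            continue
        pick_list.append({
            "location": line["location"],        # where the robot should drive
            "qty": line["qty"],                  # how many units to pick
            "visual_model": product["images"],   # what the item should look like
            "weight_kg": product["weight_kg"],   # sanity check for the suction grip
        })
    return pick_list


# Example usage with made-up data:
order = [{"sku": "sku-0001", "qty": 3, "location": "aisle-4-bin-12"}]
flash = {"sku-0001": {"images": ["front.png"], "weight_kg": 0.45}}
print(build_visual_pick_list(order, flash))
```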

It sounds like picking relatively small, boxy things is your target market. Is there potential to go into more general e-commerce? I imagine that might be a little more difficult for picking.

Absolutely. We have two models that allow us to go into those general applications. One is doing things like bin retrieval, almost like a light version of a Kiva system. Instead of bringing over a whole shelf, we just retrieve a box of whatever the product is. I also think there’s going to be lots of amazing applications in grocery e-commerce. 

And how would you have to change your system in order to address that market? 

We wouldn’t be able to pick produce or anything like that on day 1. I would say 75 percent of the stuff in the grocery store is pickable with a suction cup, but we just need more breadth in terms of size and shape and weight capability. So not just one suction cup, but an array, enabling us to pick lots of different sizes. Beyond that, it’s all good. 

It sounds like there are plenty of good reasons to use suction grippers, but over the last couple of years there’s been a lot of innovation in that space. Have you thought about trying other gripper designs? 

There’s Grabit, there’s Empire Robotics... I think they all have their niches. Robotics is a very physics-constrained world, and things have to fit very precisely in size, weight, and power. That’s just the reality of what one type of machine is going to be able to grab and pick up as opposed to another. So to a certain extent I think we’re going to see this granularity and differentiation in grippers and arms, and we’re going to need to be able to support all of those to fill all the niches.

Humans do well with just one kind of gripper. Do you think that a more anthropomorphic design might be a realistic approach for generalized picking, or do you think it’s going to be one of these other designs that are less capable but more forgiving? 

I don’t think in the short term it is a realistic approach. We used anthropomorphic hands on ARM-S, and I’m not sold on what their exact niche is just yet. There are lots of good reasons why suction cups and two-finger grippers have been the prevalent gripping technology for the automation industry for decades. Until we see a significant step-up in software and even some hardware design... Some of this, to a certain extent, I think we want to do at IAM Robotics, but it’s just not quite there yet, and we just want to get some killer applications under our belt before we expand.

IAM Robotics was at Modex this year, where Tom says they met with something like 150 different organizations who were interested in their technology. Not bad for a small robotics company’s first trade show ever. Over the next six months or so, IAM Robotics will be focusing on building up their successful initial deployment with Rochester Drug, and starting to schedule additional pilot projects with other customers. If you want in, they’re currently taking orders for late 2016.

[ IAM Robotics ]
