
Interview: iRobot's Colin Angle on Robotics Industry, Remote Presence Robots

[Photo: Colin Angle, CEO of iRobot]

At the InnoRobo conference in Lyon, France, last month, I got a chance to speak with Colin Angle, CEO of iRobot, in a candid interview about his views on the robotics industry, his vision for AVA, a new robot platform his company is developing, and how he sees things shaping up in the coming years.

Cool over function

In keeping with a presentation Colin gave earlier in the day, he started off our conversation by noting that hundreds of millions of dollars have been spent on making cool demos -- but relatively little on solving high-value business needs.

To illustrate his point, he mentioned the incredible effort that has gone into the development of humanoid robots. He calls this an exercise in "cool over utility." As he explained it, building a system that supports bipedal legs and actually executes walking and balancing has been a costly adventure. Even the most exciting systems often have a team of scientists walking behind them, a mean time to failure of about 45 minutes, and limited performance -- all at a cost of millions of dollars.

Compare that to the iRobot Warrior, which Colin feels is the first practical human-sized robot ever designed. It can handle drops of up to 6 meters [20 feet], carry payloads of over 90 kilograms [200 pounds], and navigate rough terrain to go where human-sized systems should go -- in other words, the Warrior shows you don't need a bipedal system to solve a high-value mobility problem.

Thoughts on remote presence

So, in keeping with my focus on remote presence systems, I steered the conversation to remote presence and how he saw their AVA prototype [photo right] potentially accomplishing this. Colin quite nicely broke down the problem and how AVA is an attempt to resolve the puzzle.

First and foremost, he wants to deliver an experience better than being there yourself -- regardless of the travel time. He wants to mimic "presence" in such a way that the experience for you (the pilot) is rich, deep, and intuitive. And in keeping with many people in this space, he does not feel that remote-controlled webcams or Cisco's telepresence solutions are solving this.

To achieve ubiquitous remote presence, a remote-controlled webcam is not effective, since the pilot has limited ability to truly understand the environment. While a person could learn the environment over time (e.g., where the offices are, where the conference rooms are), wouldn't it be better to have the remote presence system know the entire layout, let you request a destination, and simply take you there? Cisco's telepresence solutions are not effective in other cases simply due to the very nature of the systems themselves -- limited in freedom, tied down to a single location, and very limited in representing you outside of the magic screen.
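To make that idea concrete, here is a minimal sketch -- purely illustrative, with hypothetical room names, coordinates, and a stand-in navigate_to hook, none of it iRobot code -- of what "ask for a place and have the robot take you there" might look like in software:

```python
# Purely illustrative: hypothetical waypoints and a stand-in navigation hook.
WAYPOINTS = {                      # named locations, (x, y) in meters
    "lobby":           (0.0, 0.0),
    "conference room": (12.5, 4.0),
    "corner office":   (20.0, -3.5),
}

def go_to(place_name, navigate_to):
    """Look up a named destination and hand it to a navigation routine.

    navigate_to: callable taking an (x, y) goal; in a real system this would
    be the robot's path planner, which is not shown here.
    """
    goal = WAYPOINTS.get(place_name.lower())
    if goal is None:
        raise ValueError(f"Unknown destination: {place_name!r}")
    navigate_to(goal)

# Example: the pilot clicks "Conference Room" in the piloting interface.
go_to("Conference Room", lambda xy: print(f"navigating to {xy}"))
```

In a real system the navigate_to hook would be the platform's own planner; the point is only that a robot with a known layout turns piloting into a single request.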

Colin's vision is the ability to have a surrogate "you" -- one that could, in any location, be present and do the things you normally would do: go to the room you wish to visit, carry on a conversation outside a room, be aware of who is around and where they are spatially, and go to them with minimal effort.

How AVA fits in

Colin was amused that people thought AVA was iRobot's effort at robotic telepresence -- he sees AVA as a "generic platform" that supports all of the robotic functions necessary for enabling remote presence, the things iRobot is known for ("Do what iRobot is great at"). For instance, as we discussed the functional components of AVA, he pointed out the various features the robot platform supports (a toy sketch of how a couple of these safety behaviors might combine follows the list):

  • Downward facing IR for cliff detection

  • Braking systems to ensure the system is not going to fall

  • Small physical footprint (on the order of a human's) for a tight turning radius and strong stability

  • Bumpers and upward-facing sonar to detect objects that could potentially collide with the head of the device

  • Two PrimeSense sensors to enable a better understanding of the world through 3D mapping both of the navigation environment (downward facing) and the environment in front (on the camera assembly)

  • A LIDAR component that he wants to reinvent to bring the cost down (most expensive piece of the system)

  • Control surfaces (the bumper pads on the neck) that let participants move the system without physically pushing it

  • Telescoping neck (via lead-screw) to ensure a lower center of gravity for movement/motion while affording a variable height for engagement with participants either standing or seated

  • Positioning control for the neck/head component

  • A rail mechanism on the back of AVA's neck that supports adding manipulators to the system
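As a rough illustration of how a couple of the safety behaviors above might combine, here is a toy Python sketch; the sensor readings, units, and threshold are invented for the example and have nothing to do with iRobot's actual AWARE2 logic:

```python
# Toy sketch only -- invented sensor readings, units, and threshold; not AWARE2 code.
def safety_override(cliff_ir_readings, bumper_pressed, requested_velocity):
    """Return a velocity command that respects the safety sensors.

    cliff_ir_readings: downward IR distances to the floor, in meters.
    bumper_pressed:    True if any bumper switch is closed.
    requested_velocity: (linear_mps, angular_radps) from the pilot or planner.
    """
    CLIFF_THRESHOLD_M = 0.10   # floor should be about this close; farther means a drop-off

    if bumper_pressed:
        return (-0.05, 0.0)    # gentle reverse away from whatever was hit
    if any(d > CLIFF_THRESHOLD_M for d in cliff_ir_readings):
        return (0.0, 0.0)      # brake: one of the cliff sensors sees a drop
    return requested_velocity  # nothing alarming; pass the command through

# Example: pilot asks for 0.3 m/s forward, but one cliff sensor sees a drop.
print(safety_override([0.08, 0.25, 0.09], False, (0.3, 0.0)))   # -> (0.0, 0.0)
```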

A platform for application development

Colin said that iRobot's primary focus is on the "robotic functions" for a "generic platform" -- to help others overcome the liability issues. iRobot has done a lot of work -- through its previous designs and its own operating system (AWARE2) -- to make platforms as safe and reliable as possible. Rather than trying to make a specific platform for remote presence, Colin said it is iRobot's intent to build the platform and let developers and designers create a solid system on top of it.

I got somewhat confused here -- it sounded like he was suggesting that iRobot would not compete in application development and would not build systems for specific purposes, like remote presence. When I pressed, he clarified that iRobot would not get in the way of something like Pad-to-Internet-to-Pad communications (e.g., FaceTime, qik, Skype), but for a navigation interface (e.g., a web front-end for piloting the system) through which the pilot interacts with AVA, iRobot might offer a solution. Like Apple, iRobot could offer its own applications alongside third-party solutions -- enabling outside developers to build better interfaces and applications that also work with AWARE2 and control the AVA platform. Here's how he put it:

Yes, it is our intention to develop apps for AVA alongside other developers, as we need to, as you say, “prime the pump”. As we look at the way things are likely to play out, iRobot is committed to being best in the world at autonomy/navigation software, platforms, manipulation, and the integration of 3rd party hardware – while we aspire to be one of many application developers.

But for remote presence, the idea of having a tablet with a camera and a large screen (like the Motorola Xoom or the iPad 2) connect to the AWARE2 API would easily support the creation of a remote presence system and allow developers to rapidly iterate on versions. And with an extendable head and telescoping neck, placing the pilot's face at a natural height would be easy, allowing remote presence to potentially become real.
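As a sketch of the kind of thin pilot front end a third-party developer might build on top of such a platform, here is a minimal HTTP handler that accepts drive commands as JSON. It is purely illustrative: AWARE2's actual API is not public, so the drive() function below is a hypothetical stand-in rather than an iRobot call:

```python
# Purely illustrative pilot front end; drive() is a hypothetical stand-in,
# not the AWARE2 API (which is not public and is not shown here).
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def drive(linear, angular):
    # Stand-in for whatever command the real platform exposes.
    print(f"drive: linear={linear} m/s, angular={angular} rad/s")

class PilotHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Expect JSON like {"linear": 0.2, "angular": 0.0} from the pilot's browser.
        length = int(self.headers.get("Content-Length", 0))
        cmd = json.loads(self.rfile.read(length) or b"{}")
        drive(cmd.get("linear", 0.0), cmd.get("angular", 0.0))
        self.send_response(200)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8000), PilotHandler).serve_forever()
```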

Other juicy bits

From other conversations, I learned that a number of AVA prototypes are already out in the market -- in the midst of prototype development for various problems. I could see how Colin's vision allows for an augmented reality for the pilot -- being able to have click-and-response actions within the view of the remote presence system (e.g., open doors, tag people in a meeting, set vision points to track where people are and respond to them rapidly by turning the head). How this comes about will be an interesting exercise in the coming years.

This article appeared originally at Pilot Presence.

Sanford Dickert is a technologist and product manager focusing on the intersection of engineering, collaboration, and team dynamics. He's held numerous senior positions in engineering, product development, and digital marketing. He writes about remote presence systems at Pilot Presence.

Robots Play Soccer, Make Cereal at RoboCup German Open

The RoboCup German Open 2011 wrapped up last weekend, and we've got a couple video highlights to share from the event.

This first clip is from the RoboCup@Home competition, which aims to develop service and assistive robot technology that will eventually make its way into your home. Here, Dynamaid and Cosero, two robots from Team NimbRo at the University of Bonn, team up autonomously to make breakfast (of a sort):

RoboCup is perhaps best known for soccer, and the Darmstadt Dribblers (we've been big fans for years) took first place in the KidSize soccer competition, defending their 2010 title. The 3-on-3 fully autonomous matches feature thrills, spills, violence, dives, and unprecedented speed and skill... Those robots are as good as or better than most humans I know at aiming for the corners. In the first half of the match, stick around until the very end to see some tricky ball-handling skills:

And in the second half, check out one of the bots switching from left-footed to right-footed to score, and make sure to hang on until minute nine to witness the first ever successful goalkeeper save and throw in a regulation robot soccer match:

Remember, the goal of RoboCup is to field a team of humanoid robots capable of defeating a world-class team of humans at full-field soccer. Obviously, we're not there yet, but the magnitude of the improvements we've seen over just the last two or three years has me convinced that the 2050 target is, if anything, pessimistic.

[ RoboCup German Open ]

[ Team NimbRo ]

[ Darmstadt Dribblers ]

Willow Garage's TurtleBot Brings Mobile 3D Mapping and ROS to Your Budget

Just a year or two ago, if you'd wanted to buy yourself a mobile robot base with an on-board computer and 3D vision system, you'd probably have been looking at mid-four to five figures. But today is the future, baby, and Willow Garage is introducing TurtleBot, an eminently hackable pre-configured platform designed to give mobility to a Kinect sensor on the cheap.

TurtleBot consists of an already sensored iRobot Create base, a 3000 mAh battery pack, a gyro, a Kinect sensor, an Asus 1215N laptop with a dual core processor to run everything, and a mounting structure for you to get creative with. TurtleBot runs ROS, of course, and will come with everything preconfigured so that the robot can make maps, navigate, and follow you around straight out of the box.
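To give a flavor of what "runs ROS, preconfigured" means in practice, here is a minimal sketch of a rospy node that nudges such a base forward by publishing velocity commands. The "cmd_vel" topic is the usual ROS convention, though the exact name can differ between setups, and this is illustrative code rather than anything shipped by Willow Garage:

```python
#!/usr/bin/env python
# Illustrative rospy sketch, not official Willow Garage code. Topic name may
# differ on a given TurtleBot setup; "cmd_vel" is the usual convention.
import rospy
from geometry_msgs.msg import Twist

def nudge_forward():
    rospy.init_node("turtlebot_nudge")
    pub = rospy.Publisher("cmd_vel", Twist)
    rate = rospy.Rate(10)              # send commands at 10 Hz
    cmd = Twist()
    cmd.linear.x = 0.1                 # creep forward at 0.1 m/s
    start = rospy.Time.now()
    while not rospy.is_shutdown() and rospy.Time.now() - start < rospy.Duration(3.0):
        pub.publish(cmd)
        rate.sleep()
    pub.publish(Twist())               # zero velocity: stop

if __name__ == "__main__":
    try:
        nudge_forward()
    except rospy.ROSInterruptException:
        pass
```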

I know I've beaten this to death with respect to Willow Garage and ROS before, but remember that the whole point (or much of the point) of this kind of open source hardware and software is to keep hard working roboticists like you from having to start from scratch every time you want to invent something. Like, why waste your time and money designing and constructing a mobile robot with a 3D sensor and then waste more time teaching it to navigate, when it's all already been done a thousand times before? Where's the progress, man? See, now you have time to move on to more interesting things, like getting your robot to do cool stuff, which is the whole point of robots in the first place.

So, what can you do with TurtleBot? Well, it can bring you food, explore your house on its own, bring you food, build 3D pictures, bring you food, take panoramas, bring you drinks, bring you food, and more. Check it out:

If this platform looks vaguely familiar, that's because it is... Bilibot is the same basic idea: a cheap, effective platform for developing applications for Kinect using ROS. The two platforms were developed in parallel, though, and they're both after the same thing (more accessibility), so don't worry, there's nothing shady going on here. Willow Garage expects the two projects to be able to collaborate on hardware and software while still maintaining their individuality.

TurtleBot will be available for pre-order later this week. The core kit is $500, which includes:

  • USB Communications Cable
  • TurtleBot Power and Sensor Board
  • TurtleBot Hardware
  • Microsoft Kinect
  • TurtleBot to Kinect Power Cable
  • USB Stick TurtleBot Installer

The complete TurtleBot (which is what's in the pictures and video) is $1200, and adds the following to the core kit:

  • iRobot Create Robot
  • 3000 mAh Ni-MH Battery
  • Fast Charger
  • Asus EeePc 1215N

The reason that they're doing it this way is to make it as cheap as possible for you to put this kit together yourself, if (say) you have your own laptop already, or even your own iRobot Create. For reference, an iRobot Create with a battery is about $200 from iRobot, and an Asus 1215N is about $500.

Turtle costume sold separately.

[ TurtleBot ]

Geminoid Robots and Human Originals Get Together

[Photo: the Geminoid androids with their human originals]

The Geminoid family has gathered together for the first time.

The ultrarealistic androids, each a copy of a real person, met on March 30 at Japan's ATR laboratory, near Kyoto.

Attending were Geminoid F, Geminoid HI-1, and Geminoid DK, as well as their respective originals: a twentysomething woman (whose identity remains a secret), Prof. Hiroshi Ishiguro of Osaka University, and Prof. Henrik Scharfe of Aalborg University, in Denmark [photo above].

The Geminoid robots, conceived by Prof. Ishiguro and a team at ATR, are manufactured by Japanese firm Kokoro. The robots work as a person's telepresence avatar: A computer captures the person's voice, facial expressions, and upper-body movements and transmits this data to the android.

Anyone can teleoperate the androids, but the experience is certainly unique for those individuals who served as templates.

"We wanted to get together and share our experience of having robot copies," Scharfe told me. "The three of us has a lot of fun doing this."

Watch what happened:

But the meeting was also an opportunity to conduct experiments. With the three robots sitting around a table, the human originals teleoperated their own copies and tried to have a conversation. Then they took turns operating each other's Geminoids.

"Returning to your own Geminoid felt like coming home," Scharfe said.

The researchers also tried other configurations, for example by having the human originals sitting with their androids on the table while other people teleoperated the robots.

According to Prof. Scharfe, whose Geminoid cost some US $200,000 and will be shipped to Denmark soon, some situations felt more natural than others, but generally he could accept the different conditions as "real conversations."

He will now take time to interpret the material from these experiments and hopes to publish his findings at some point.

As for the next Geminoid reunion -- have the researchers scheduled it yet?

"It's very costly to ship [the androids] around," Scharfe says. "So it might not happen again!"

[More photos: the Geminoid androids and their human originals]

Images and video: Geminoid DK

[ Geminoid DK ] via [ CNET ]

Robot Videos: Festo's SmartBird, Social Robots, and Autonomous Cars

There's no better way to start off the week than with a trio of fascinating robot videos, each of which is easily educational enough that you should be able to convince yourself (and anyone else) that watching them is definitely not procrastinating. 

This first video is a follow-up to Festo's SmartBird robotic seagull that we posted about last month. Creating a heavier-than-air, fully functional robotic bird is no small feat, and this 17-minute video takes you through the development process, including lots of juicy details and behind-the-scenes test footage:


Cynthia Breazeal gave a seminar at CMU's Robotics Institute on "The Social Side of Personal Robotics." As you may have noticed, robots tend to be pretty lousy at interacting socially with humans, largely because robots have a hard time understanding what's going on inside our heads. I can totally relate to this because I have a hard time understanding what's going on inside other people's heads too, and if it's difficult for me, it's practically impossible for a robot.

Dr. Breazeal talks about new capabilities that her lab is developing to allow robots to employ a higher degree of insight (if you want to call it that) into how humans think, to enable robots to interact with us more naturally and more successfully. For example, the seminar includes video of experiments with Leonardo, where the robot demonstrates how it can understand not just what a human wants, but also what a human believes, which allows the robot to be much more... Well, I'm not sure what else to say but "insightful." Other experiments show how Leonardo can successfully pick up on unknown rules based on behavioral feedback, which is a skill that could hypothetically be extended to abstract social situations.

This talk is just over an hour long, but it's definitely worth watching in its entirety:


Lastly, we've got a (rather brief) TED Talk from Sebastian Thrun, who's been developing autonomous cars at Stanford and, more recently, Google. I never get tired of hearing his vision for the future where we all ride around in safe and efficient robotic vehicles, but it's somewhat ironic that no matter how much safer autonomous cars are over human drivers, it's the risk of accidents that's keeping them out of the hands of consumers. We have a ways to go both socially and legally before sharing the road with robots is going to be acceptable, but there are ways to ease us into it that may help to make the transition both smoother and quicker.

Why We Should Build Humanlike Robots

Robokind Zeno, a small walking humanoid with an expressive face created by Hanson Robotics.

People often ask me why I build humanlike robots. Why make robots that look and act like people? Why can't robots be more like ... appliances?

In fact, some people argue that it's pointless for robotics researchers to build robots in our image; existing humanoids fall short of what science-fiction authors have dreamed up, so we should just give up. Others even say we'll never have humanoid androids around us, because when you try to make robots look more human, you end up making them look grotesque.

I disagree. I believe robotic researchers should aspire as grandly and broadly as possible. Robots can be useful in many shapes and forms, and the field is young—with so much room left for innovation and diversification in design. Let a thousand robot flowers bloom.

On the tree of robotic life, humanlike robots play a particularly valuable role. It makes sense. Humans are brilliant, beautiful, compassionate, loveable, and capable of love, so why shouldn’t we aspire to make robots humanlike in these ways? Don’t we want robots to have such marvelous capabilities as love, compassion, and genius?

Certainly robots don’t have these capacities yet, but only by striving towards such goals do we stand a chance of achieving them. In designing human-inspired robotics, we hold our machines to the highest standards we know—humanlike robots being the apex of bio-inspired engineering.

In the process, humanoid robots produce good science. They push the boundaries of biology, cognitive science, and engineering, generating a mountain of scientific publications in many related fields, including computational neuroscience, A.I., speech recognition, compliant grasping and manipulation, cognitive robotics, navigation, perception, and the integration of these technologies within complete humanoids. This integrative approach mirrors recent progress in systems biology, and in this way humanoid robotics can be considered a kind of meta-biology. Humanoid robots cross-pollinate among the sciences, and they represent a subject of scientific inquiry in themselves.

Some of Hanson Robotics' creations [from left]: Alice, Zeno, and Albert Hubo.

In addition, humanlike robots prove genuinely useful in real applications. Numerous studies, including those with the humanoids Nao, Bandit, Kaspar, and RoboKind Zeno, show that autistic children respond favorably to such robots, which holds promise for treatment and social training. Additionally, consider a humanoid robot like NASA's Robonaut (just to name one): its capabilities for use in space and in factory automation promise safer, more efficient work environments for people. And then there is the simple wonder and psychological power of humanoid robots. Just as human-inspired depictions have brought joy and insight throughout history -- the sculptures of Michelangelo, great works of literature, the film animation of Disney, Miyazaki, and others -- there is no reason robots can't inspire similarly. Humanlike robots already bring us wonder and joy. Why can't robots communicate just as much wisdom, knowledge, and ardor as the other figurative arts? In addition to the known uses for humanlike robots, new uses will certainly emerge, expand, and surprise us as the capabilities of robots evolve.

It is true that humanlike robots are not nearly human-level in their abilities today. Yes, humanlike robots fail. They fall, they lose the thread of a conversation, they misunderstand us, and they disappoint as much as they exhilarate us. At times these failures frustrate the public and robotics researchers alike. But we can't give up. Humanoid robots are still in their infancy. Though they falter, their abilities continue to grow and improve. Babies can't walk, talk, or do much of anything as well as adults do, but that doesn't mean babies deserve our contempt. Let's not give up on our robotic children. They need nurturing. And as a researcher in humanoid robotics, I can attest that it's a pleasure to raise these robots. They are a lot of fun to develop.

Looking forward, we can find an additional moral imperative in building robots in our image. Simply put: if we do not humanize our intelligent machines, they may eventually be dangerous. To be safe when they "awaken" (by which I mean gain creative, free, adaptive general intelligence), machines must attain a deep understanding of, and compassion toward, people. They must appreciate our values, be our friends, and express their feelings in ways that we can understand. Only if they have humanlike character can there be cooperation and peace with such machines. It is not too early to prepare for this eventuality. By the time machines become truly smart, it will be too late to ask them to suddenly adopt our values. Now is the time to start raising robots to be kind, loving, and giving members of our human family.

So I can see no legitimate reason not to make humanlike robots, and many good reasons to make them. Humanlike robots result in good science and practical applications; they push robots to a higher standard, and they may eventually prevent a war with our intelligent machines. What's not to love about all of that?

David Hanson, Ph.D. [photo right] is the founder and CTO of Hanson Robotics, in Richardson, Texas, a maker of humanlike robots and AI software. His most recent creation is Robokind, a small walking humanoid with an expressive face designed for research.

Da Vinci Surgical Bot Folds, Throws Tiny Paper Airplane

[Photo: a da Vinci surgical robot]

Everybody already thinks that robot surgery is way cool, but I suppose there's no harm in taking a few minutes to show off the precision that tiny little robot grippers are capable of. On the other end of these steely claws is an even steelier-eyed surgeon with a questionable amount of aeronautical experience, and in between the two is a da Vinci surgical system. This particular robot hails from Swedish Hospital in, you guessed it, Seattle.

The da Vinci system, if you recall, provides surgeons with an interface that allows them to control little robotic hands with their own (much larger) hands, enabling much finer control in a much tighter space. For patients, this means smaller incisions that heal faster, and for surgeons, it means no more going elbow deep into someone else's guts.
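To get a feel for what that master/slave mapping involves, here is a toy sketch of two of the ideas behind it, motion scaling and simple tremor filtering. The 5:1 scale factor and filter constant are chosen for illustration only; this is not Intuitive Surgical's implementation:

```python
# Toy sketch of motion scaling plus tremor filtering; constants are invented
# for illustration and this is not Intuitive Surgical's implementation.
SCALE = 1.0 / 5.0   # 5:1 scaling: a 5 mm hand motion becomes a 1 mm tip motion
ALPHA = 0.2         # low-pass filter weight; smaller means more smoothing

def teleop_step(master_delta_mm, previous_output_mm):
    """Map one increment of the surgeon's hand motion to an instrument motion."""
    scaled = tuple(SCALE * d for d in master_delta_mm)
    # An exponential moving average knocks down high-frequency hand tremor.
    return tuple(ALPHA * s + (1.0 - ALPHA) * p
                 for s, p in zip(scaled, previous_output_mm))

# Example: jittery ~10 mm hand increments are scaled to ~2 mm and smoothed;
# the filtered output ramps toward ~2 mm per step.
output = (0.0, 0.0, 0.0)
for hand_motion in [(10.2, 0.3, -0.1), (9.8, -0.4, 0.2), (10.1, 0.1, 0.0)]:
    output = teleop_step(hand_motion, output)
    print(output)
```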

I do feel obligated to point out that depending on your definition of robot, the da Vinci system may not qualify as one, in that it doesn't have much of an autonomous component: all of those motions are controlled directly by the surgeon using a master/slave system. However, robots with actual autonomous surgical capabilities aren't that far off, and now that we've seen demos of robots autonomously sucking your blood out and autonomously taking biopsies on simulated turkey prostates, it's just a matter of time before you start having to choose your surgeon based on whether it's running Windows or Linux.

[ Intuitive Surgical ] via [ Nerdist ]

TED Talk: Berkeley Bionics

We were on hand when Berkeley Bionics introduced their eLEGS exoskeleton last October, and there's no doubt that it's a pretty amazing piece of hardware. The same company is also responsible for the HULC exoskeleton, which they've licensed to Lockheed Martin. If you're already familiar with Berkeley Bionics' stuff, there isn't too much new in the presentation, but it's always great to see these incredible exoskeletons in action:

Incidentally, media coverage of the eLEGS launch focused extensively on how the exoskeleton had the potential to "free" people with disabilities from what they seemed to assume is some kind of lousy and pitiable quality of life, which is certainly not the case. I'd encourage you to read this wonderful article by Gary Karp on the subject, and also consider how sometimes, people with "disabilities" can actually be super human in some ways.

[ Berkeley Bionics ] via [ TED ]

Weird French Robot Reeti Wants To Be Your Home Theater

This curious really kinda weird looking robot is Reeti, who's apparently what you get when a robot and a media center PC have offspring. Reeti is designed to provide an interface between your TV and your computer, offering a variety of additional capabilities, or something... I'm honestly not quite sure what it, um, does.

Setting practical uses aside, Reeti is very emotionally expressive, considering its relative simplicity. It has cheeks that glow to communicate mood, and there are touch sensors in its face to enable it to react when you prod it. Each of Reeti's eyes has its own HD camera, and its 3D perceptual view lets it recognize people and objects and track motion. Reeti can understand (and localize) spoken commands, and its speech synthesis allows it to read emails and RSS feeds to you. Oh hey, something it can do!

If you're still wondering what other things Reeti can do for you besides reading aloud, maybe this will answer your question:

Or, uh, maybe not.

I guess what I still don't really understand is why I'd want a Reeti in my house. I mean, I want one, because it's a robot, and it's expressive and funny looking, but at this point I'm not quite sure what Reeti plans to do for me that I couldn't do more efficiently with a mouse and keyboard, you know? It looks like Reeti is designed to be more of an open platform where people can write their own apps to extend the capabilities of the robot, which is fine, but if you look at what makes an app store successful, it's usually a device with enough inherent capability to establish a large, happy consumer base before any apps exist at all, creating its own market. So that goes back to my original question: what can Reeti do for me?

Setting practical uses aside (for the second time), I do appreciate Reeti's overall aesthetic, if you can call it that. Reeti likely looks as strange as it does in order to distance itself from any sort of anthropomorphic impressions. It's got eyes and a mouth to help it communicate, but it's so far from looking human that we don't get caught up in how it doesn't look human, if that makes sense.

Reeti is made by the French company Robopec, and apparently there will be some way of pre-ordering one at some point for about $7,000 (!). Until we get a little more information on all of the spectacular and amazing things that Reeti may or may not be able to do, though, I'd hold off adopting one of these little guys, unless you're so smitten that it's already too late.

[ Reeti ] via [ Robots Dreams ] and [ CNET ]

The Global Robotics Brain Project

[Photo: Wolfgang Heller, creator of the Global Robotics Brain]

Why is this man smiling?

Because in his brain resides a database of more than 36,000 robotics companies, labs, projects, researchers, and publications, all categorized, tagged, and linked.

No, not in the brain inside his head. We're talking about the Global Robotics Brain, a project that the man, Wolfgang Heller, started to keep track of the robotsphere.

Inspired by Google's PageRank, Heller, a business intelligence consultant from Sweden, asked himself: Could he use a similar approach to draw a map of the interactions between the different robotics players, identify who is doing the most relevant work, and spot what trends are emerging?

In 2005, after a visit to the World Robotics Exhibition in Aichi, Japan, he started to systematically feed his database with anything related to robotics he came across. He then created tools to automate the process. Six years later, the result is a "gigantic mindmap of a broad range of robotics resources," he tells me.
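For a sense of how a PageRank-style ranking over such a database might work, here is a toy sketch that runs the standard power-iteration recipe over a tiny, made-up graph of robotics "players." The entity names are hypothetical, and this is not Heller's actual tooling:

```python
# Toy sketch: standard PageRank power iteration over a tiny, made-up graph of
# robotics "players." The names are hypothetical, not from Heller's database.
links = {
    "LabA":     ["CompanyX", "ProjectQ"],
    "CompanyX": ["ProjectQ"],
    "ProjectQ": ["LabA"],
    "LabB":     ["LabA", "CompanyX"],
}

nodes = list(links)
rank = {n: 1.0 / len(nodes) for n in nodes}   # start with a uniform score
damping = 0.85                                # standard PageRank damping factor

for _ in range(50):                           # iterate until the scores settle
    new_rank = {n: (1.0 - damping) / len(nodes) for n in nodes}
    for source, targets in links.items():
        share = damping * rank[source] / len(targets)
        for target in targets:
            new_rank[target] += share         # each link passes on a share of rank
    rank = new_rank

for name, score in sorted(rank.items(), key=lambda item: -item[1]):
    print(f"{name}: {score:.3f}")
```

The entities with the most (and best-connected) incoming links float to the top, which is the intuition behind asking "who is doing the most relevant work."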

Heller isn't building this brain for fun. His hope is that companies and labs will pay him to access it. A free version is available for students and researchers for personal use; an expanded version with more detailed information is available for organizations on an annual subscription basis.

In the expanded version, you'll find insights on robotics trends that Heller generates periodically (using, we should note, both his brain and the database brain). Here's his latest list of robotics trends:

1. Industrial robotics renaissance. Soft mobile robots start working alongside human workers. Examples: Toyota safe human-robot factory assembly, Festo Bionic Handling Assistant, pi4 Workerbot, Robonaut 2.

2. Urban service robotics renaissance. Smart mobile robots enter public space for safe and green city living. Examples: Dustbot, Google autonomous car, ubiquitous robotics, Cyber-Physical-Systems.

3. Civil robotics renaissance. Transfer of military robotics into civilian robot applications. Examples: telepresence robots, civil UAVs and UGVs, telesurgery, rescue, Ambient Assisted Living.

4. Robotics toy-to-tool renaissance. New generation reinvents and remixes robotics technology, artificial intelligence, information and communication technologies, nano and biotechnology into new toy-to-tool robot platforms. Examples: Nao, PR2, Kinect, ROS.

5. Robotics promotion renaissance. Governments have recognized robotics as a strategic technology that requires R&D investment and public awareness. Examples: national robotics roadmaps, flagship research programs, establishment of centers of excellence, robotics science and amusement parks, national robotics weeks, robotics challenges.

Check out the Global Robotics Brain to see if you envisage other trends. Look at where the investment is coming from, where the research is taking place, where technology gets commercialized, and so forth. Soon you’ll start feeling like you also have a robotics brain.

Samuel Bouchard is a co-founder of Robotiq.
