As often happens, it's Thursday night and I've got a carefully selected assortment of robot videos that have shown up this week, and I figured it would be a good idea to carefully present them in a post for Friday that you can enjoy without having to listen to me drone on about, you know, drones. Or whatever.
In the realm of dancing robots, researchers seem to pick the weirdest tunes to show off their latest programming tricks. I could delve into why I think that might be, but out of respect for the personal lives of all of these people who have clearly not been out of the lab since the mid-1970s, I'll keep quiet and just let you enjoy HDT Robotics' HDM robot dancing to some disco:
If that thing had a pelvis, it would be going crazy.
An entirely different (and more mature) style altogether is demonstrated by Azusa Amino's Toko Toko Maru robot, which took first place in the Robo Japan 2 Dance Contest on Sunday. Robot-Dreams brings the video:
And finally, we come to the rampant stupidity, and boy is it rampant. But that's the way we want it, because this October brings another Bacarobo competition, where the stupidest, most useless robot wins a huge wad of cash. It's sort of like the Antimov Competition, except with more uselessness and less flaming death. Here's a video of some highlights from last year, which you'll enjoy slightly more if you can speak Hungarian:
This is a new vacuuming robot from Toshiba, called the Smarbo. Few of the specs will surprise you: it's got 38 sensors, obstacle detection, edge detection, and a mapping camera (kinda like LG's RoboKing) that lets it recognize where it's been and clean rooms in an efficient, single-pass pattern. What caught our eye was the "double brain function," which (if Google Translate is to be relied upon, never a good idea) seems to suggest that this vacuum is smarter than normal.
But do we really care?
Progress is always good, and a faster and more efficient robot is usually a better robot. But when we're talking about an autonomous vacuum, there's a question as to whether or not a marginal improvement in efficiency derived from a more computationally intensive algorithm will really make a difference in your life. As I see it, a robot vacuum can operate in one of two ways: pseudo-randomly, like a Roomba, or using a mapping pattern, like a Neato. iRobot's method involves multiple cleaning passes to clean better (maybe) at the expense of efficiency, while Neato's method covers most areas of your floor approximately once. Obviously, the Neato is much faster, so if speed is what you want, go with a vacuum that makes a map.
But there's a limit to the amount of cleverness you can use to improve that "approximately once" coverage method. Or perhaps I should say, you can throw as much CPU cleverness as you want at the problem, but returns diminish rapidly unless the speed or single-pass coverage area of your robot also increases. Again, I'm not trying to harsh on progress, but on some level a "smarter" robot vacuum is sort of like a camera with 12 megapixels instead of 10: looks good on paper, but will you ever really notice and are you willing to pay a premium for it? Perhaps not.
What I do like is Toshiba's "careful" cleaning method, which is sort of like a compromise between efficiency and multiple passes. If you set the Smarbo to clean carefully, it'll do a single pass while mapping the room, and then go back and do a second, orthogonal pass, like so (bottom left):
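To make the tradeoff concrete, here's a toy sketch of my own (not Toshiba's code, and the grid model is purely an illustrative assumption): one boustrophedon pass along rows followed by a second, orthogonal pass along columns covers every open cell about twice. That's roughly double the runtime of a single mapped pass, but you get the multiple-coverage benefit of a pseudo-random cleaner without its wild variance in per-cell visits.

```python
# Toy model of a "careful" two-pass cleaning mode: NOT Toshiba's actual
# algorithm, just an illustration of orthogonal boustrophedon coverage
# on an idealized obstacle-free grid.

def boustrophedon(rows, cols, by_rows=True):
    """Yield grid cells (row, col) in a back-and-forth lawnmower order."""
    outer, inner = (rows, cols) if by_rows else (cols, rows)
    for i in range(outer):
        # Reverse direction on alternate sweeps, like a lawnmower turning.
        rng = range(inner) if i % 2 == 0 else range(inner - 1, -1, -1)
        for j in rng:
            yield (i, j) if by_rows else (j, i)

def careful_clean(rows, cols):
    """Run two orthogonal passes; return visit counts per cell."""
    visits = {}
    for pass_by_rows in (True, False):
        for cell in boustrophedon(rows, cols, by_rows=pass_by_rows):
            visits[cell] = visits.get(cell, 0) + 1
    return visits

visits = careful_clean(4, 6)
assert all(v == 2 for v in visits.values())  # every cell covered exactly twice
print(f"{len(visits)} cells, {sum(visits.values())} total visits")
```

The point of the second, perpendicular pass is that brush strokes cross at right angles, which is also why it helps actual vacuums: fibers flattened in one direction get hit from the other.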
Pretty cool. Maybe something Neato could implement in their next tasty little software upgrade? Eh? Yeah? Anyway, unless you live somewhere besides the U.S., much of this discussion was to some extent pointless, since the Smarbo is not a vacuum that you'll ever be able to purchase, likely because some of its major components look like they were more or less lifted straight from the Roomba. If you're in Japan, though, you can pick up the Smarbo for about $1,175 starting October 1.
So this Real Steel movie thing (aka what happens when you mix Rock 'Em Sock 'Em Robots with eighty million dollars and Hugh Jackman) is apparently still on track for an October release. Far be it from me to suggest that a movie which mixes robots with violence might ever end up becoming popular, but this featurette should give you enough of a taste of the action to decide whether or not you'd like to invest a couple hours in it:
I'd like to reiterate that the most interesting part about this entire production (so far) is that they actually went out and built nineteen eight-foot-tall boxing robots for the humans to interact with during close-ups and whatnot. The rest of it's CGI, but happily there are some real robots in there somewhere, presumably with parts made out of real steel. Yeah, I went there.
The film is due out on October 7, and DreamWorks has already started on the sequel, which may or may not be called Real Steel 2: Stainless Is Painless.
An anonymous worker at Japan’s Fukushima Dai-ichi nuclear power plant has written dozens of blog posts describing the ups and downs of his experience as one of the lead robot operators at the crippled facility.
His blog provides a window into the complex and dangerous work environment faced by the operators, a small group of young technicians who, like other front-line personnel, must approach areas of high radiation, deploying remote-controlled robots to assist with efforts to further stabilize and shut down the plant’s four troubled reactors.
The blog posts, which have recently been deleted, depict the operators’ extensive robot training exercises, as well as actual missions, including surveying damage and contamination in and around the reactors and improvising a robotic vacuum to suck up radioactive dust. The author, who goes by the initials S.H., also used the blog to vent his frustrations with inept supervisors and unreasonable schedules, though he maintains a sense of humor, describing in one post how he punched a hole in a wall while driving a robot and in another entry how a drunken worker slept in his room by mistake.
The material also raises questions about whether Tokyo Electric Power Co. (TEPCO), the plant’s owner, is acting with adequate speed and providing enough robots and supporting resources for the robot teams. It's ironic that, although the robots are remote-controlled, the operators still have to work close to the highly damaged and radioactive reactors. There is no communications infrastructure, combining wired and wireless capabilities, that would allow the operators to do their work from a safer location.
Other problems, described by S.H. in some entries, include a lack of coordination and, on at least one occasion, neglect for the workers’ safety. In one incident, a technician who was not part of the robot team recklessly put a robotic mission in jeopardy, driving a truck over a tether and nearly severing the connection between the robot and the operators. S.H. also reports that one day his personal dosimeter began sounding an alarm and wouldn’t stop; when he asked radiation personnel in charge about it, he was told to ignore it and continue working.
But what is perhaps most significant about the blog is its technical content. S.H. is part of a team assigned to operate robots provided by U.S. company iRobot. The robots, two PackBots and two Warriors, known for their explosive-disposal work in Iraq and Afghanistan, have performed remarkably well at Fukushima, even after repeated jobs in high-radiation environments, which damage electronics.
By explaining what works and what doesn’t, S.H. made his blog must-read material for companies and researchers developing robots for emergency situations. One Japanese roboticist I spoke to, who asked not to be named because he’s working on a competing robot, called the operators “heroes” for their work and said the blog provides details “crucial for making a good machine.”
The posts show that, although the robots have to be strong and reliable, they also have to be nimble and compact, to be able to maneuver on stair landings and in other tight spots. S.H. also describes challenges that many robot developers may take for granted, such as the difficulty of handling the controls while wearing five pairs of gloves or seeing the user interface from behind a bulky mask, which means that the controls and interfaces need to be made even easier to operate than they already are.
Another big lesson for roboticists, based on the Fukushima operators' experience, is that emergency robots shouldn’t be stand-alone machines: They work best in pairs or teams, so that one robot can work as a wireless base station to allow another unit to travel farther, or help the other if it gets stuck. And where radio signals don’t propagate well, using a combination of wireless and tethered robots is essential.
After all the training, S.H. became confident in his team's skills. "I think there are but very few people in the world who have come to operate robots so agilely," he writes in one post. "I don’t think there are that many even among [the companies] that manufacture them."
WHERE ARE THE ROBOTS?
Since the earthquake and ensuing tsunami struck on 11 March, there have been lots of questions about Japan's lack of robots to assist with the recovery operations. Japan, a robotics-friendly nation with the world's highest levels of automation, had to count on foreign assistance. Less than a week after the earthquake, iRobot donated to Japan two PackBot 510 robots with hazmat kits and two Warrior 710 robots with manipulator arms. iRobot engineers trained Japanese operators the following week.
But it still took three more weeks for TEPCO to actually use the robots, which quickly proved to be an essential tool. In mid-April, two PackBots provided the first video and radiation recordings from the interior of Reactors No. 1, 2, and 3. Later that month, they inspected the cooling water system and piping of Reactor No. 1, to confirm the equipment could be turned on. They’ve also helped detect “hot spots”—areas so high in radiation that a person would receive a lethal dose in seconds—and measured the radiation level of the water that flooded the basements, one of the plant's biggest contamination problems.
The S.H. posts show that, like other parts of the recovery efforts, the robot operations are assigned to contractors, which have only a handful of robots at their disposal and seem to rely on a great deal of improvisation. TEPCO’s robot fleet includes aerial drones, remote-controlled construction machinery, and tank-like exploration robots like QinetiQ’s Talon. But currently only two robots—the PackBot duo used by S.H. and his coworkers—are capable of entering and navigating deep inside the reactors (the Warriors are too big). It seems to me that two robots is a small number, given that the machines might break down or need to undergo maintenance, and there are four reactors with multiple floors that need to be inspected. iRobot said in a statement that it continues to “provide spare parts and technical support” and is “working through our distributor in Japan to explore requirements for additional systems.”
Given the stakes involved, why hasn’t TEPCO acquired more robots and stepped up its robotic efforts? At this pace, will it be able to achieve the plant's “cold shutdown” scheduled for next January? Only time will tell.
A TEPCO spokesman denied that there were delays in deploying the robots at the plant or that they lack robotics resources now. He told me that at this point "we don't have actual plans for adding PackBots. We will use more PackBots if necessary." TEPCO, he said, is planning to improve the safety of robot operators by building a "communication facility in the future."
THE BLOG VANISHES
After the earthquake hit, S.H. wrote one or more posts on a daily basis. Early last month, however, after word of the blog (hosted at http://sh-blog.at.webry.info) began circulating among Japanese Twitter users and bloggers, all posts related to the robot work were deleted (the blog included posts on other topics as well). Not long after, the entire blog disappeared. It’s unclear whether TEPCO or S.H.'s supervisors demanded that the material be removed. Efforts to reach S.H. were unsuccessful.
Before the blog was removed, I used software to make a copy of it. IEEE Spectrum has decided to translate and publish portions of the posts because we consider the information to be in the public interest. The material offers important lessons about the Fukushima disaster—lessons that roboticists and others should heed if we want to be better prepared for tomorrow’s calamities. TEPCO has also been criticized for not being transparent, and these posts provide more information for Japanese citizens to decide whether the company and their government are doing a proper job.
(The contents of the blog have also been available on Google's cache, and recently a Japanese researcher republished some of the posts on his site.)
S.H. also published half a dozen YouTube videos, which have now been made private. The videos—nearly an hour of footage in total—show training exercises with the PackBot and Warrior robots. iRobot confirmed that these were the robots it donated to Japan.
While the videos were still available to the public, I used a program to capture snippets, which I used to put together my own video. We believe we’re making fair use of the snippets, using them as documentation of the training process, which is a newsworthy event. Note that our video shows only brief moments of what might have been many hours of training, and it probably doesn't reflect the operators' current skill levels. The video also includes some candid moments, such as when a worker takes a ride on a Warrior robot. Watch:
The blog posts and videos S.H. published are relevant not only to Japan. Other nations should take notice. Government officials, roboticists, and the public should demand that the nuclear industry in their countries be better prepared and equipped to handle disasters, and robots should be part of these preparations. Currently, it seems that only France and Germany have plans requiring that an emergency robotics force be available.
Perhaps nuclear power companies should invite the Fukushima robot operators to give talks about their experience and help train robot teams in other countries. “This kind of natural disaster could happen anywhere,” S.H. writes. “If there is a call for this same kind of work, I'll go anywhere in the world!” Let’s not wait for another disaster to make the call.
Below are portions excerpted from nearly 50 robot-related posts that S.H. published on his blog, titled "Say Whatever I Want * Do Whatever I Want," covering a period from late April to early July 2011 [right, screenshot of a post]. This translation attempts to remain as close to the original text as possible as well as preserve the author’s style and tone. The translated version, however, may have inadvertently introduced inaccuracies or altered the author’s views. Also note that we tried to preserve the formatting of the text; S.H. typically writes one sentence per line, grouping them together when they're related to the same topic. Some sections (marked with [...]) were omitted for clarity or space. Please report any errors to firstname.lastname@example.org. And leave a comment below telling us what you think about the material.
Say Whatever I Want * Do Whatever I Want
POSTED: 26 APRIL 2011 Robot Team
Our new task is the operation of the exploration robots.
At some point we’re supposed to have domestic [Japanese] robots too [in addition to the robots from U.S. company iRobot].
For now there are three of us [robot operators], including myself.
The tsunami destruction inside the turbine building.
Mud all over, beyond recognition, a total mess.
To be honest, I feel as though we are fortunate that the building sustained only this much damage.
The nuclear reactor building is not as bad.
A program [to operate the robots] is needed, and we are going to use a Toughbook PC. The controller is the same as those for video game systems, like the PlayStation.
As we expected, that young employee is good [at driving the robot]!
Tomorrow the robots are off duty, so it’s my turn for training while their batteries are getting charged.
Apparently, they not only want the robots to be able to climb over rubble but also go up and down stairs, so we are just going to have to try hard and get the hang of it.
If you see anything on TV, it will probably be me behind the controls.
I hope to be in charge of the robot that is in the lead.
POSTED: 27 APRIL 2011 Robot Training
Today, along with charging the batteries of the robots, it was my turn for training. I did operate them a little yesterday, though.
The training takes place in a low-radiation outdoor area, but we wear full radiation protection clothing and gear, which diminish my thinking abilities.
Ordinarily, I’m not good at thinking (LOL).
The training is designed to replicate, as close as possible, the actual task.
First I tried to open a heavy steel door (with a common round knob).
They said that I was the first person to succeed in opening a door with such a knob, in fact.
As for the rubble, if it were like the angular objects we see at a university laboratory, it would be no problem. But actually the rubble is not so evenly shaped, so things didn’t work so great.
If there are just a few steps/stairs, we can climb them, but the reality is that they are not small stairs. In fact, most of the stairs are destroyed or twisted, so you pretty much can’t climb them.
What's more, the robots have the arm etc. attached, so at the start and end of the climb, you have to move the arm to adjust the [robot’s] center of gravity, or it will flip over.
It’s the same as a backhoe going up and down. But for me, a backhoe that has lots of levers that you can operate simultaneously is easier.
Since it’s a game controller, it’s a pain to have to switch between the driving mode and the arm-control mode every time (on top of which, this has to be done in the middle of unstable tasks, such as climbing stairs).
Also, it’s not like you are looking at something with the naked eye, like in a university laboratory. It is difficult because you are looking at an image from the mounted camera that has a very limited field of view.
I plan to be the No. 1 robot again tomorrow.
POSTED: 30 APRIL 2011 The Temporary Office
We did thermal imaging today.
We will be going to the front line wearing a glass badge that measures the cumulative dose and carrying two personal dosimeters.
The front-line situation is being broadcast on TV, but if you see it in reality on-site, it is even more gruesome . . .
One of my dosimeter’s alarms began to go off and would not stop right before we began working. When I asked one of the radiation management personnel who was with us about my dosimeter, he said that I was given one for which the settings had been incorrectly configured.
He said, "There is nothing wrong with it, so please continue your work.”
So I did!
POSTED: 1 MAY 2011 iRobot
The [robots] we are using right now are the ones provided for free from the U.S. company iRobot.
Most of the information [about the robots] has already been on the Net, so I don’t think it is a problem if I describe them here.
What is being provided for free are just the robots and the controllers.
We got the robots customized (revamped) according to our needs.
The customization parts are not free.
By the way, just the tip of the gripper is approximately 100,000 yen [approximately US $1,300].
Two robots are being used cooperatively by two companies [contracted by TEPCO].
The idea is to always use a pair [of robots], so that if there’s a problem with one of them, then the other is there for support.
I previously wrote that there are two mounted cameras, but one unit has five cameras, and the other one has four.
The mounted devices on each of them are different as well.
The one that has five cameras has four dosimeters mounted.
The one that has four cameras has one dosimeter, a particle sampler (it measures contamination levels), an infrared thermal camera, and a gas detector (it detects the concentration of oxygen, organic gases, and combustible gases).
The actual body is approximately 40 kilograms [88 pounds] per unit.
It can be carried by one person, but generally it is carried by two people.
Roughly, if the battery is fully charged, we can operate for 4 hours easily.
Since it is waterproof and explosion-proof, apparently, if it is connected by cable, then it can be submerged down to 3 meters [10 feet], but we have not had to submerge it yet.
The internal components and drive system are explosion-proof, but the arms and metal chassis are not completely explosion-proof. If combustible gas (hydrogen) is detected, the task is halted.
[The robot] has a mounted GPS and a gyro system, and you can see its cardinal directions (actually this is NSEW, since it is in English) and posture on the PC screen.
We turn on its power while we are outside where the GPS signals can reach it and then drive it indoors, but while you are inside, the directions get messed up. You can’t rely on the directions.
But its posture display from the gyro is immensely helpful.
Even if [the robot] tilts just a little bit in one direction, a 3D image shows its posture, with the current inclination, on the PC screen.
Other 3D systems, the kinds that jump out at you, I don’t like those . . .
We operate [the robot] while looking at the image from the mounted cameras.
Because the power company [TEPCO] employees are all from the headquarters, they aren’t very familiar with the structures on-site.
Because it’s a confined area, those who are not used to the site even lose their bearings.
But we operators (eight to 10 of us, including myself, from two companies and the radiation management personnel) have the site structures totally in our heads.
We even know pretty much where each of the numerous cables and pipes are.
Even when having a discussion while just looking at a flat diagram, we can already see the structures in our mind’s eye.
When the power company employees start saying, “Something like . . . ,” we operators say, “Sure, you mean the so-and-so, right?”
As though we were experts.
(Although sometimes they [the power company employees] get an attitude because of this.)
POSTED: 9 MAY 2011 Trying With Cables
The young guys are doing basic training for going up and down stairs.
We are doing a wired (optical fiber) mock-up [exercise] for going up and down stairs.
There are two types of wireless devices using short wave (VHF). They are 2.4W and 4.9W. [The author probably meant 2.4 and 4.9 GHz.]
The 4.9W [device] is higher powered, but for some reason the one that seems to reach further and has better sensitivity is the 2.4W one.
By the way, since this is not a frequency that has been designated by the Ministry of Internal Affairs and Communications (since these radios are made in the U.S.), technically we are breaking the Japanese Radio Wave Law.
Changing the radio units of the robots wearing triple and quadruple layers of gloves, trying to do such fine manipulation work, I can’t seem to get used to this.
With wireless operation there is a problem with the reliability of communication. It appears that they [the officials in charge] would like us to try using cables within the buildings when going up and down stairs from now on, so we are preparing a mock-up location for training.
We will remove the wireless units of the robots and attach a reel that will automatically wind or unwind the [optical fiber] cable. It’s just like the flux wire in semi-automated welding machines.
The cable is approximately 300 meters [985 feet] long.
The robots have to climb by driving backwards up the stairs, and it is difficult to make the 180-degree turn in the small landing areas.
The key is to avoid stepping on the cable with the treads ("caterpillar treads" is what they call them).
We are good at handling the cable, but we are still not very good at keeping our balance with the arm.
We’re okay in a wide open space, but we are still not so good at operating on the stair landings where it’s tight and narrow.
POSTED: 11 MAY 2011 Robots in Action
Yesterday was my turn to be the pilot, and I went into nuclear reactor No. 3.
There was no going up and down stairs, but there was considerable rubble, so I had to use the arm or the body of the robot to move the rubble around to be able to advance.
The opening and closing of the double doors was a challenge (because it’s a narrow area), but in the end it went quite smoothly.
Today we switched to using cables and did the final up-and-down stair training (mock-up) using nuclear reactor building No. 5, and it all went pretty well, without any particular hang-ups. [Reactors No. 5 and 6 weren’t in full operation and suffered less damage.]
I know you are aware of the bad working conditions on the front line or at the front-line base from the Aichi University physician’s report.
The working conditions for us operators are much worse than the conditions for the power company employees (with the exception of our sleeping quarters).
As for stress, though we are also victims [of the earthquake and tsunami], they say our stress levels are two or three times higher than the average victim’s.
Frankly, even I am feeling considerable psychological and mental strain.
The No. 4 nuclear reactor building, where I was during the earthquake, is now a disaster because of the tsunami.
When I imagine what would have happened if I had been just 5 minutes late, even now I feel so fearful thinking of it.
The power company and the country have disseminated a considerable amount of information.
I don’t know if there is no interest or what, but the media is not broadcasting the important issues; instead, it’s broadcasting stuff that is irrelevant, or making mountains out of molehills about things that don’t matter.
Well, setting that aside for now, I’ll get back to the subject. I think there are but a very few people in the world who have come to operate robots so agilely.
I don’t think there are that many even among [the companies] that manufacture them.
Now is the time for us to show the world the power of Japan’s technical and human capabilities, to be able to face this calamity, this natural disaster.
I think this is also something that will save Japan and help it resurrect itself.
This kind of natural disaster could happen anywhere.
If there is a call for this same kind of work, I’ll go anywhere in the world!
POSTED: 19 MAY 2011 Stop Working
The robot missions have been suddenly halted.
Everyone on the front line, including our company, is anxious because of the decision from the Fukushima Prefecture.
Our morale dropped instantly.
The Fukushima Prefecture sure has stuck its nose where it doesn’t need to be!
Starting with the governor, they are all trouble.
This situation shouldn’t have affected the robot exploration.
Although the robot missions got cancelled, we’re not the types to just sit around staring at the walls.
I had the three young guys doing operator training on the PackBot (the original unit).
We ended by changing the battery and doing some light maintenance.
[The robot] is starting to wear out (although it is for military use, it has durability issues), so on Monday we’ll be doing a heavy-duty maintenance workshop, which will be conducted in Onahama by an American from the iRobot company.
There is no work until Monday, and I’m not due for bus duty, so I decided to take some time off.
I will be able to see my family for the first time in about two weeks.
POSTED: 30 MAY 2011 Today I’m a Truck Driver
I woke up with the sun this morning.
I feel like I was forced to get up rather than waking up.
The first one who got up to go to the bathroom, Mr. S., said, “Hey, there’s someone else here. Who’s that?” So I got up to look, and there was someone sleeping on the tatami floor. . .
I looked at his face and didn’t recognize him.
I woke him, asking, “Who are you? Are you in the right room?”
He was kind of confused as well.
“? ? ?”
“What room is this?” he said sleepily.
He smelled a little of alcohol, so he probably got drunk and slept in the wrong room.
The neighboring room to the left is for another company.
Although they are a different company, we work together on this project.
Or he may have even gone to the wrong floor.
POSTED: 31 MAY 2011 Today’s Work
Today we did some robot operator training.
It was training with all six of the operators of our company together.
We mainly trained the three young operators to go up and down stairs.
All of them are very good!
I don’t think there are that many people in the world who can operate a robot like this.
We will go anywhere in the world if there is a call!
- An injured person
I cannot be certain of the details because it is not related to our company, but I heard that there was a person injured at the Fukushima Dai-ichi plant.
They apparently transported the person by ambulance to the front-line base, and then to the hospital.
Again, the SDF [Japan’s Self-Defense Forces] did nothing . . .
Even though they have shrunk their numbers, for those of us who are working here, they are just in the way.
Instead of just sitting there, if they aren’t even going to move when there are injured people involved, I wish they would shrink even more . . .
POSTED: 1 JUNE 2011 IAEA
Today’s work is a continuation of the training for operating the robots.
Training for the three young ones.
It was training to learn how to open a door with two units and go through, or come out.
We practiced opening a door and going into the company’s locker room.
There were no destroyed lockers, but some had been dislodged, or their contents had spilled out [because of the earthquake].
Training that almost replicates the real on-site conditions is possible.
We were able to get the feel for a narrow space with rubble all over.
The clothing thrown all about gets entangled in the [robot’s] treads.
Removing [the clothing] by operating the arm manually is part of the training.
We changed the battery on the robot and repaired/adjusted the arm.
The robot made by iRobot seems to have a weak arm, or rather it has a problem with durability.
We are using it within standard usage parameters (not overloading it), but there’s a problem with durability.
Both arms from both of the units are beginning to show wear.
iBorot . . . [The author is making a play on words; boro means shabby, broken, or worn-out in Japanese.]
We plan to enter the No. 1 nuclear reactor on Friday to do a radiation survey.
We’ll enter through the receiving bay of the nuclear reactor building, but I hear this was an area with high radiation levels.
As for the title [of this post], the report from the IAEA [International Atomic Energy Agency] came out.
Apparently they will be making detailed reports for the government from now on.
The content of the report was:
There are some problems with the government’s response.
A. There are some issues with the independence of the regulatory authorities (nuclear safety, the industrial safety agency).
B. There are no issues with the response on-site from the time of the disaster up until now, and it has been very good (best).
C. The underestimation of the magnitude of the tsunami.
The report was a good report in general, except for C.
I personally was thinking that they would hand out a much more severe report.
And I would like to commend the media agencies that correctly reported these matters as well.
POSTED: 3 JUNE 2011 It’s . . . a . . . ghooooost!!!
As planned, we sent the robots into the No. 1 nuclear reactor for exploration.
We entered from the receiving bay of the building.
We settled into a rather low radiation area and operated [the robots] using the cameras and radio control.
I was one of the operators (one unit) today.
Because a power company employee said, “We will hand today’s images over to the media,” I think you will see the material, even on TV.
My robot, as usual, has had its alarm light on the head camera blinking on and off since yesterday.
I think you will be able to see this on TV: The robot with a red light blinking on and off on its camera is the robot I’m operating.
There is so much rubble and accumulated dust that the robots’ treads had quite a slippery time on the shellacked floors.
On the way back, the route to the receiving bay is an uphill climb, and the robots were unable to make it because the floor was too slippery.
We finally made it back by taking running starts, jostling the robot to find areas with more grip, clearing away scrap rubble that had slid under the robot, and using the flippers to raise the body.
Today we ran the robots with the arm, on which a [radiation] dosimeter is mounted, raised to the height of a person’s chest.
At one point, the robot had to climb over a slope that was only the width of the robot, and there was a glass door that was leaning over after collapsing during the tremors. In that situation, we used caution and lowered the robot’s center of gravity to get through.
We put smear filter paper on the grapple (gripper).
The smear paper will tear if we put it on just like that, so we balled up a vinyl bag and taped it up like a sphere. Then we pasted the smear paper onto the spherical vinyl bag and placed it between the grippers to collect samples.
With the smear method, you can analyze the contamination levels and the nuclides.
Not to toot my own horn, but I’m gaining quite a reputation for my operation of the robot.
There is a place in the nuclear reactor building where a pipe comes straight up from below and there is steam shooting out of the floor like hot springs.
The temperature is around 33 °C and not so very hot, and the radiation level is about 60 mSv/h, which is about the same as the surroundings.
The humidity is about 56%.
As it happened, we found a hot spot.
There was a maximum of 4 Sv/h (4,000 mSv/h).
But that was just a momentary value and cannot be considered accurate data.
It fluctuated between 2 and 4 Sv/h, so we measured it as accurately as possible, and the data result was 3.2 Sv/h (3,200 mSv/h).
There’s probably some kind of pulsating ghost.
I think we will need to do an investigation of this in the future.
When lowering the arm to the floor surface, it was about 30 mSv/h.
That's a big difference, even at only a meter away.
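The 3.2 Sv/h figure above came from a reading that bounced between 2 and 4 Sv/h. As a minimal sketch of that kind of reduction, using invented sample values (not the actual survey data), one might take repeated readings and report the mean along with the spread:

```python
# Hypothetical dosimeter readings in Sv/h, chosen for illustration only;
# a fluctuating value is sampled repeatedly and summarized by its mean.
readings = [2.1, 3.9, 2.8, 3.4, 4.0, 2.5, 3.6, 3.3]

mean = sum(readings) / len(readings)          # single reported figure
spread = max(readings) - min(readings)        # how much the value bounced
print(f"mean: {mean:.1f} Sv/h, spread: {spread:.1f} Sv/h")
```

With these invented samples the mean works out to 3.2 Sv/h, matching the figure in the post; reporting the spread alongside it makes clear how unstable the raw reading was.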
POSTED: 15 JUNE 2011 Migraines
Today we had hands-on operator training for the PackBot (made by iRobot), with [operators from] both companies together from early on.
It was more like Ethernet testing than training.
We mounted an antenna and Ethernet-type booster onto one unit, and we hooked up a LAN cable to it.
We were expecting to go from the double doors of the nuclear reactor building to the northeastern stairs, so we tested about 45 meters [150 feet] of cable.
Can one robot unit pull the LAN cable?
Furthermore, can it return collecting the LAN cable by itself?
Because it does not have a reel like the optical fiber, the operator (sub) has to release or retrieve the LAN cable while the robot is moving.
In other words, this unit alone will become like a wireless base.
This one unit will go to the first landing of the northeastern stairs.
Then another unit (one from our company) will go down the stairs to the very lowest basement level (basement floor 2) of the building.
We will go as far down as we can go in the No. 1 nuclear reactor building on [June] 23rd and check on the situation of the contaminated water.
I turned down the operation on the 23rd.
I will give the responsibility to a senior colleague who had done this prior to me.
I decided I will be the navigator.
Today the Warrior robot from the same iRobot company arrived at the Fukushima Dai-ichi plant.
For now, they brought only one unit.
Unless another unit arrives, it doesn’t make much sense.
Tomorrow we are going to do the operator training for the Warriors.
If we can shoot a video, I’ll do so.
It is just a voluntary operator training in the front area around our office, so I don’t think it should be a problem to shoot video.
Migraine headaches have bothered me since around noon.
POSTED: 16 JUNE 2011 Warrior
Today we had operator training for the Warrior from the U.S. iRobot company.
We went over basic operation skills and maintenance methods.
We tried climbing over rubble as well.
The weight is 250 kilograms [550 pounds] and it is six to seven times bigger than the PackBot we have been using.
It can even suspend itself.
In other words, in a vertical direction, it can lift 250 kilograms or higher with its own strength.
Apparently, lifting up to about 100 kilograms [220 pounds] is easily done.
It can move with people riding on it as well (a person of normal weight).
It is difficult to maintain its balance when going up and down stairs (or climbing over rubble), etc., because of its weight.
Of course, there were no instances of it falling over.
The basic control operations are the same, and it uses the same PC and game controller.
The button operations are a little different because the functions vary somewhat, but I didn’t get mixed up.
The height is about 3 meters [9.8 feet] max [images above].
Its speed is about 30 kilometers per hour [20 miles per hour], which is faster than the PackBot.
It covers more ground when going up and down stairs than a PackBot, but because its body is so big, I’m not sure it will make it around the stair landings.
We plan to verify that point tomorrow.
Unfortunately, I’m busy with my bus driving duties, so I won’t be able to attend the test.
The battery is the same as the PackBot.
The batteries for the PC and the robot are the same (1 battery/12 V).
One is used for the PC.
The PackBot has four, but the Warrior uses six.
For the PackBot the batteries were individual units, but for the Warrior, they are in a pack of six.
POSTED: 19 JUNE 2011 Isolation and Loneliness
Today, once again, a decision concerning our work was arbitrarily made without us.
I am one of the members of the robot team.
One of the employees who was confirming the duties for tomorrow asked me,
“Is there a task for the robot team to do tomorrow?”
Tomorrow there are so many [team members] who have the day off, or have to undergo physical examinations, that I was the only one able to work.
Since I can’t do anything by myself, I was planning on no activities for the robot team tomorrow.
Just as I was about to say, “No . . .” (there is no work for the robot team tomorrow), the special government official, who just got appointed today (a person who likes to order others around more than our director), pursed his lips and said, “I don’t understand everything, but I think we should have everyone do training tomorrow, so put it on the schedule!”
This person is one of those who if they think of something won’t budge an inch, so if I counter back, he would most likely say, “This is not the time to be taking days off! Call them up and have them show up for work!”
And that would just ruin the atmosphere. So right on the spot I called a junior employee on the phone, to show what a fuss [the official] was causing.
He is a junior employee who is a newlywed living in a rented apartment in Iwaki.
I said, “I’m so sorry, I know you have the day off . . . “
And as though he knew what was coming, he said, “I’ll take another day off at some other time. I’ll be there tomorrow.”
I don’t understand why [the official] would make this decision arbitrarily, without even discussing or checking with anyone, even when one of the robot team members is standing right there in front of his eyes . . .
I really felt a sense of isolation and loneliness.
[The robot] punched a hole in the plaster wall with its track while sliding down. . . . Director, I’m sorry!
—S.H., robot operator
POSTED: 20 JUNE 2011 Warrior Stair Training
We used the stairs at our company’s office (currently in a destroyed state) for training.
Because [the Warrior] is so big, the going-up-and-down-stairs training we had done with the PackBot didn’t work on the outside stairs . . . which means that it will not work on the stairs at the site (inside the building).
Furthermore, the controls [of the Warrior] vary a little from the PackBot’s and the center of gravity is different, so it cannot go up and down stairs in the same manner.
We will have to do this by skillfully changing the robot’s center of gravity, and because the controls are reversed [the robot goes up the stairs backwards], we can’t make fine movements, and it’s very cumbersome to use.
It has a lot of weight, so it is very scary . . .
At first, I was sweating all over, and I don’t know where that was coming from.
Because the gripping strength is different between the left and right treads, if you stop in the middle of the stairs like with the PackBot, you are in trouble.
And if it slips and starts to fall, then you’ve lost all control.
Even if you attempt to climb with the forward motion action, it will Dutch roll and come sliding down.
It punched a hole [below, left image] in the plaster wall with its track while sliding down (this was me!).
Director, I’m sorry!
I was the one who made this other hole [above, right], as well.
Just as the robot had got to the top [...], there was not enough room on the landing and the gripper (teeth) stuck into the plaster.
Well, that is why we use our company office that is now reduced to ruins, because we were anticipating such things.
. . . But what I’m thinking is that the Warrior was not intended for indoor use, was it???
POSTED: 23 JUNE 2011 Deployment of a National Robot
The domestic [made in Japan] robot Quince came in today.
This robot was handled by a colleague and two young operators, along with the help of a power company employee.
The Quince is doing a mock-up exercise at the No. 5 nuclear reactor building.
I am investigating the No. 3 nuclear reactor structure with the PackBot and three other members.
Around the No. 3 reactor, the radiation is very high, and the hydrogen explosion did a lot of damage, so the destruction is the worst.
There is almost no area where workers can protect themselves from the radiation, so we hunkered down in a 15-ton vehicle converted to a shielded vehicle to operate the robot.
In the shielded vehicle, there is a rather large generator that works with 200 V, and there is even air conditioning inside.
In the vehicle, there is also a pipe system (localized air-filtering machine) that collects the radioactive substances.
On top of the shielded vehicle there is an ITV (monitoring camera), set up so that we can verify surrounding conditions.
The operation of the Quince robot will be done by the power-company employees.
We are only there to assist when it comes to the Quince.
We will be transporting it and setting it up.
I know the site very well and much better than the power-company employees, so I will be the navigator.
I hear that the Quince (Queence?) was created through a joint project between Chiba Institute of Technology and Tohoku University.
It originally was designed to be a rescue robot for disaster recovery, but as with the Warrior, it doesn’t have the ability to transport people (injured).
A PC and game-style controller are used for controlling the robot.
The game controller is the same commercially available product (the kind sold at electronics stores) that the iRobot uses.
The way to operate the robot is basically the same too.
Perhaps the program is the same as well, because the program for the iRobot is also made by Toshiba.
The PC for the Quince is a Panasonic Toughbook (B5 size).
The OS is Windows.
I don’t know what the OS for the iRobot company is.
It is not Windows or Apple.
They may actually be using something like Japan’s TRON . . .
[Advantages of Quince]
* Because it has flippers (movable crawlers) on both the front and back, it can cover a lot more ground far more nimbly than the PackBot in such areas as going up and down stairs or steps.
* It has five mounted cameras, and you can see their images simultaneously on the PC. (The iRobot robot can display only two cameras on the PC at a time, and you have to switch to see the other cameras.)
* Every few seconds, all the cameras automatically create a record (a movie that you can advance frame by frame is automatically created). (With the iRobot robot, you have to select one camera at a time to record.)
* The camera images are very high resolution. (The iRobot cameras are made in Korea and the image resolution is poor and they break down a lot.) (However, because the iRobot cameras are radiation resistant, they have relatively low noise (flickering) due to the ionization effects . . . although when it gets to the level of 3-digit mSv/h, the noise (flickering) does appear, even for this robot . . .)
* Because it is a design created with the base and the arm separated, it can be used for other applications with some changes.
[Disadvantages of Quince]
* It has quite a bit of heft for its size (it’s heavy).
* The tracks are narrow, only 20 to 30 millimeters wide, and not suited for rough, jagged, rocky places or rubble.
* You can operate it only with wired cables. (They have said that it can operate wirelessly, but it is not set up that way. Perhaps it can be modified?)
* Some wiring cables have the core lines bare, and there is no coating or covering (this creates a problem with water resistance and dirt).
* It seems to have problems with dust and water resistance.
* I cannot assess the durability and design of the cameras and other functions. (They say that it is resistant up to a cumulative radiation level of 20 Sv (20,000 mSv) but since I don’t know the specifications, I cannot say for certain.) (I can’t tell if the camera is radiation resistant or not, either.)
Well, I would have to say the PackBot and the Quince each has its pros and cons.
POSTED: 23 JUNE 2011 Cool Vest
The cooling vest I got now is better than the previous model.
It is comfortable and the cooling packs last longer.
The previous model used two cooling packs; this one uses five.
Today the PackBots will be working in a high radiation area.
We will operate them from a 15-ton vehicle converted into a shielded vehicle.
Because we will need a radio relay, we will mount a booster [on one of the robots], and then transmit from the controller (PC) via Ethernet [to that robot].
We will use this unit as the transmission unit and another unit [the other robot] will go deeper into the area.
Midcourse, where the receiving dock has been destroyed, there is a large water puddle. We will select a shallow area and proceed through there.
The operators will be two of the young members. On the camera feed from our shielded vehicle, we see a remote-controlled dump vehicle coming at the robot head-on and a 4-ton truck coming from behind us.
We placed the Ethernet cable (LAN cable) along the side [of the road] so it wouldn’t be stepped on, and we also placed many cones.
A person got out of the 4-ton truck and started removing the cones, so I and a Tokyo Power Company employee hurriedly rushed to stop him.
He said, “I can’t get by,” ignored our requests to stop, and drove over the cable.
“Who is this guy, and where is he from?”
He said he’s from headquarters!
He proceeded forward, running over the cable two or three times. We asked him to “back up and move out straight,” but he just plowed through, running all over the cable.
Apparently (according to the operator), the transmission was cut off two or three times while [the cable] was run over, but fortunately the ground was mud and ash, and the cable didn’t break.
Workers from other companies are kindly being careful to take precautions when our company or others are setting up for a project, but if our own company people are acting this way . . . this happens because the people from the headquarters don’t attend the meetings!
Whether there are restraints or not, to run over a cable is out of the question!
On top of which, when we checked, it was an instrumentation guy from our own headquarters! The instrumentation guys would be appalled.
He did not accidentally run over the cables; he even moved the cones, ignored our requests to stop, and knowingly ran over the cable.
This was all done right in front of the power company employee. I did report this to my superiors.
If the cable had been severed, two robots would have been left in high radiation areas and would both be lost.
As for Quince (the Japanese robot), it went out today with one colleague and two young assistants.
The job was to set up a water gauge, but they had some trouble pulling out the gauge cable. The roller wasn’t sliding very well, and attempts to readjust it today didn’t work; they had to give up.
Tomorrow, they’ll prepare the roller surface and try again on Sunday.
We’re not sure yet.
The power-company employees are managing all of this, so if possible, we might have them do it. Given the results, things don’t seem to be going very well.
The company next door already had their instrumentation cable employees set up the water gauge by hand. It was faster, and in the end the radiation exposure was low.
But as the future of our national business is at stake, intentionally letting the robot do it, getting the robot to “perform” is necessary.
And then it will get on TV!
“Japanese robot works inside nuclear power plant!”
Or some kind of silliness like that!
POSTED: 28 JUNE 2011 Cleanup Robots
They are planning on sealing the pressure vessel (in the dry well) of the No. 3 reactor and adding nitrogen gas (N2), but the radiation levels are high near the instrumentation rack on the floor.
From our previous exploration mission, we know that rubble and fine sand/dust particles are still in that area, so our mission is to collect this dust.
In my group, there are four other operators and three others from another company. We will modify the robot [a Warrior].
A cleanup robot from iRobot . . .
I’m sure you are familiar with the famous Roomba!
The disc shaped automatic vacuum cleaner robot.
Well, that is good for household cleanup, but it’s not up to this job.
We are going to take the Warrior and attach a hose with a nozzle on it.
The other company is going to operate the PackBot.
The PackBot is going to pull along the Ethernet and boost the radio signal for the Warrior.
At my suggestion, we mounted a nozzle on the arm of the Warrior, as we did in yesterday’s mock-up.
After use, the nozzle and the hose will both be radiation contaminated, so we need to be able to take them off very easily.
We will wrap rags around the arm to protect it and keep it from getting scratched.
Also, we will wrap the nozzle with rubber matting, and secure it with wire (four wires) so that it won’t slide.
POSTED: 30 JUNE 2011 Shielded Vehicle
My main task is to set up and transport the shielded vehicle that was converted from a 15-ton three-axle flatbed truck.
I also have the job of being the navigator tomorrow inside the No. 3 nuclear reactor building.
I guess you can say I’m what they call a jack-of-all-trades.
It’s air-conditioned and pretty comfortable!
If there were no air conditioning, it would be a steel box in which we’d die. . .
Of course, there is ventilation as well.
Otherwise, we would suffocate . . .
There is a fan on the exhaust side, and it is naturally structured to pull in air on the intake side.
If it were only this ventilation system, radioactive material would get in, so a ULPA filter, which performs even better than a HEPA filter, is used.
Having said that, we still cannot take off our full clothing gear inside the shielded vehicle, and eating and smoking are prohibited.
The water in the PET bottle is from the condensation that drops from the air conditioner.
POSTED: 3 JULY 2011 Results From the Cleanup (Decontamination)
Today we went and used two PackBots to measure the radiation levels on the first floor portion of the No. 3 nuclear reactor building where the Warrior cleaned (decontaminated) yesterday.
As a result, the radiation levels dropped on average about 10%.
If you are just looking at the measurement results, they still look pretty high, but on the other hand, if you just look at the difference [in certain areas], you get the feel that it was quite effective.
There are places where the level dropped to a maximum of 80 Sv/h. [The author made a typo; it should be 80 mSv/h.]
The media is already running ahead to get the scoop.
I guess the media agencies are all striving to get the scoop first for themselves.
But that doesn’t mean they can just broadcast a bunch of nonsense and lies.
They need to accurately relay information to the citizens!
As I wrote in the article before, there is an element of compounded complex issues that come into play with regard to radiation and contamination, so it’s not that the radiation levels will drop drastically with this one clean-up (decontamination).
However, I think it can be said that it had a certain effectiveness in that [the radiation levels] dropped an average of 10% (maximum 80 mSv/h).
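The before/after comparison described in this post amounts to a simple percent-change calculation per survey point. As a sketch, with invented points and dose rates (not the actual Fukushima survey data, though chosen to land near the ~10% average the post reports):

```python
# Invented dose-rate readings (mSv/h) at the same survey points,
# before and after the Warrior's cleanup pass. Illustration only.
before = {"point A": 120.0, "point B": 80.0, "point C": 200.0}
after  = {"point A": 105.0, "point B": 74.0, "point C": 178.0}

def percent_drop(b, a):
    """Percent reduction from a 'before' reading b to an 'after' reading a."""
    return 100.0 * (b - a) / b

drops = {p: percent_drop(before[p], after[p]) for p in before}
avg_drop = sum(drops.values()) / len(drops)

for p, d in sorted(drops.items()):
    print(f"{p}: {before[p]:.0f} -> {after[p]:.0f} mSv/h ({d:.1f}% drop)")
print(f"average reduction: {avg_drop:.1f}%")
```

As the post notes, the absolute levels can still look high even when the point-by-point differences show the cleanup had a real effect; computing both views, as here, is what makes that distinction visible.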
POSTED: 4 JULY 2011 Arriving at My Sleeping Quarters
Robot operator collapses from heatstroke . . .
A wheel from a 15-ton shielded vehicle flies off its axle!
Now, those would be good headlines!
No, just kidding.
You know we have a sense of humor, right?
I guess I’ll give it my best again this week.
Images: sh-blog.at.webry.info; videos: YouTube user SH19760926
UPDATED: August 27, 9:52 a.m.: Corrected boro comment; August 28, 10:20 p.m.: Added 2.4/4.9 GHz link; September 1, 5:52 p.m.: Corrected several typos.
The reason that so many people are fixated on building and programming humanoid robots is simple: we live in a world designed for humanoids. If we were all turtles, roboticists would have a much easier time of it, but unfortunately, we're bipeds with sharp eyes and a good sense of balance. Much effort has been expended trying to get robots to be able to cope with the wild diversity and (if we're being honest) rampant randomness and disorder in our lives, but a much simpler solution that will get capable and reliable robots into our homes far sooner is to just make some little tweaks here and there to help them do their jobs, which used to be our jobs.
Designer Diego Trujillo Pisanty's With Robots project imagines how robot friendliness might alter the design and functionality of our stuff. In the above picture, for example, sheets have been printed with optical tags at their folding points to assist a robot with making the bed. Here are four other examples, along with Pisanty's captions:
Setting the table is another task robots might be doing, having everything ready for when the owner comes home. The plate does not require a “plate tag” printed on it; the tag has been replaced by a notch on the edge indicating that this object is a plate. The notch also doubles as a holding point; it makes us wonder about the shape of the tool the robot uses to manipulate this object. The edges of the table are marked, telling the machine where the limits are. In the scene presented, the robot has made a mistake and placed all the cutlery rotated by 90 degrees.
Concentrating a bit more on how robots manipulate objects, a cup with a robot-friendly handle was made. This object reveals a lot about the relationship between humans and robots; it creates a tension between the robotic and the human handle. The handle could become a design feature or it could be badly received, considered ugly and uncomfortable. The cupboard in which these cups rest has also been altered in order to accommodate the robot. Not only are there tags marking the position of objects, but the doors have also been removed as they were not fit for A.I.
Cooking robots have been a promise for more than 60 years, ranging from Falks robotic toaster to the Jetsons’ Rosie maid-bot. How desirable is robot precision in the kitchen? The situation presented shows how meat has been precisely cut into cubes without leaving any cut marks on the chopping board. The board itself has notches to facilitate robot interaction. In the background the meat package can be seen; it too has been labelled to suggest that the robots operate beyond a single house.
Every living space is different, not only in the architectural layout but also in the tasks that the tenants require robots to do. For this reason robots ship only partially programmed, so that through a learning algorithm they might adapt to the home they operate in. To accelerate the learning process, special learning tools have been designed to help the robot integrate into a 3D environment. The picture on this page shows a living room after a robot self-training session. We can see it has now mastered the physics of equilibrium. It is also evident that it has mistaken one of the house’s dinner plates, which it has broken with robotic precision to complete its piece. This scene intends to make us think of what the tenants would think when encountering such a display. It also tries to show what it would feel like to have objects that are useful only to robots in our house.
As far-fetched as some of this seems, if you think about it, redesigning our homes this way may make a lot of sense. Would you tolerate sheets or dishes with specific patterns on them if it meant you never had to set the table, clear the table, or wash dishes ever again, for example? And there's really no reason why we'd even have to be aware of such aesthetic compromises, as long as the markings are in UV or something. A fringe benefit would also be that implementing the changes pictured above would lower the stress levels of roboticists by about 100%.
Actually, several real-world robots are (or were) using household tags like this to make their jobs easier. The short-lived Ubot vacuuming robot relied on special flooring with embedded ultraviolet barcodes to do its localization, and more recently, Bosch has been using a t-shirt covered in optical tags to help it learn how to fold. Ubot never really took off and Bosch isn't planning on forcing us all to wear betagged clothing, but you know what, if it meant that I never had to do laundry or pick up after myself, I'd seriously consider changing my style to something a little more robot-friendly.
We've been wondering how BeatBots' new My Keepon toy is going to compare to the original (especially considering that it's on the order of a thousandth the cost of a research Keepon), but this picture seems to show the forthcoming toy and I'll be honest: on the outside, I can't tell the difference.
As far as what the toy will be able to do, we've got more info on that as well:
My Keepon has two modes, selected with buttons on his "stage."
My Keepon dances like no other toy! A built-in microphone and state-of-the-art beat detection allows My Keepon to dance in time with the rhythm of clapping, patting, or your music. With an uncanny sense of timing and incredibly fluid movement, My Keepon will have you mesmerized as he grooves to any style. My Keepon will never dance the same way twice, so you will never tire of watching (or joining in).
My Keepon has an array of invisible sensors underneath his textured skin. Poke, tap, squeeze, or tickle, and My Keepon will react. You can even make it sneeze - just scratch its nose! My Keepon's mood also changes in response to your touch, with emotions ranging from curious, to excited, to sleepy, and everything in between. With My Keepon's rich nonverbal "language" and impossibly cute movements, you will love getting to know his personality and making a new best friend.
My Keepon will also be able to remember different types of interactions and change its behavior based on past history. If you don't play with it for a while, it'll let out "an occasional cry for attention." Expect to see different outfits and accessories show up at some point down the line, as well as a way for programmers to interface with the toy directly.
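The "beat detection" touted in the quoted copy is, at its simplest, onset detection on short-window signal energy. Here is a minimal, hypothetical sketch (not BeatBots' actual algorithm): flag windows whose energy jumps well above the running average, then estimate the tempo from the spacing between those onsets.

```python
# Minimal energy-based beat detection sketch. All names and parameters
# here are invented for illustration; a real product would use a far more
# sophisticated pipeline.
import math

def detect_beats(samples, rate, win=0.02, threshold=3.0):
    """Return onset times (seconds) where short-window energy spikes."""
    n = max(1, int(rate * win))
    energies = []
    for i in range(0, len(samples) - n, n):
        frame = samples[i:i + n]
        energies.append(sum(s * s for s in frame) / n)
    avg = sum(energies) / len(energies)
    onsets, prev_hit = [], False
    for idx, e in enumerate(energies):
        hit = e > threshold * avg
        if hit and not prev_hit:          # rising edge = new onset
            onsets.append(idx * n / rate)
        prev_hit = hit
    return onsets

def tempo_bpm(onsets):
    """Median inter-onset interval converted to beats per minute."""
    gaps = sorted(b - a for a, b in zip(onsets, onsets[1:]))
    return 60.0 / gaps[len(gaps) // 2]

# Synthetic test signal: a short 440 Hz click every 0.5 s (i.e., 120 BPM).
rate = 8000
samples = [0.0] * rate * 4
for beat in range(8):
    start = int(beat * 0.5 * rate)
    for j in range(40):
        samples[start + j] = math.sin(2 * math.pi * 440 * j / rate)

beats = detect_beats(samples, rate)
print(round(tempo_bpm(beats)))  # expect ~120 BPM
```

Using the median gap rather than the mean makes the tempo estimate robust to the occasional missed or spurious onset, which matters for something reacting to clapping in a noisy room.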
While we're still waiting to see exactly what it's like to interact with My Keepon, Toys R Us apparently saw it in action and immediately bought the entire first production run, so I guess that's a good sign. Unfortunately, we do now know that My Keepon doesn't include cameras due to cost constraints, and instead relies on touch sensors and sound localization to detect people.
Look for the toy in stores in October for under $50 (it's 35 quid in the U.K.). Since Toys R Us is planning a huge marketing campaign, there's absolutely no way you (or your kids) will be allowed to forget about it. And here's something else not to forget: buying a My Keepon helps fund research Keepons for autistic children. Yay!
What high school student could resist the opportunity to use a robot to wreak havoc aboard the International Space Station? And by wreak havoc, I mean compete with other high school students to remotely control a little floating SPHERES robot under the close supervision of a real live astronaut.
The Zero Robotics competition, hosted by NASA and DARPA in cooperation with MIT, Aurora Flight Sciences, and TopCoder, tasks teams of high school students with programming one of those cute little SPHERES microgravity satellite robots to autonomously complete a "technically challenging" three-dimensional race against another robot programmed by another team. The programming can be done in a high-level graphical environment (with an instant-gratification Flash-based simulator), and as the teams get more comfortable with things, they can transition directly into C, meaning that the students may actually learn something practical. Woo!
Once the teams have tested their code out in simulation and physically in two dimensions (using robots at MIT), a full 3D simulation tournament will be held, and the top 27 teams from that event will watch live via webcast as the real SPHERES robots execute their code on board the International Space Station in December of this year.
If you want in, or if you're interested in mentoring a team, you can check out all the details and fill out an application on the Zero Robotics competition website here. The deadline is September 5.
Also, ISS astronauts, if you're reading this (you do read our blog, right?), you should totally have Robonaut pretend to juggle the SPHERES robots in microgravity and make a video. It would be awesome.
Nobody wants to venture into a mine after an accident, but the people who least want to be in there are those who might be already trapped inside. Rather than ask for human volunteers to go in and check things out, Sandia National Labs has developed a robotic platform that doesn't have a choice, called Gemini-Scout.
Collapsed mines offer all kinds of obstacles that threaten to suffocate, burn, crush, drown, electrocute, or otherwise impede even the toughest of robots. With this in mind, Sandia has done their best to mine-proof Gemini-Scout, which can clamber over rubble, wade through up to 18 inches of water, and not cause huge methane gas explosions thanks to sealed and spark-free electronics.
As capable as Gemini-Scout is, it doesn't come with an attached multidimensional quantum escape tunnel or anything. While the robot is theoretically capable of dragging a human behind it, its primary mission is to scout ahead to send back video and sensor readings to help an actual rescue team safely and quickly get where they need to go. In the short term, the robot can also deliver vital supplies such as food, water, medicine, air packs, radios, and Sony PSPs to any miners who may be trapped out of immediate reach.
It sort of seems like a robot with Gemini-Scout's capabilities would be good for more than just working in mines, and that may be the case, but Sandia seems to be pretty focused on optimizing Gemini-Scout for this one task. Their primary customer at this point is the Mine Safety and Health Administration, and miners can look forward to having a robot ready and willing to help them out when Gemini-Scout enters service next year.
I love weird UAVs. I don't mean "weird" in some kind of bad way, but more like, "different" and "creative" and "unexpectedly awesome." These two flying robots, Lockheed Martin's Samarai and Aerovel's Flexrotor, both display innovative and unique designs that highlight some of what's possible when you combine robots and imagination. And, you know, funding.
When we last talked to Lockheed Martin about their Samarai project, they were hard at work making their spinning, maple-seed-inspired robot into a practical surveillance flyer. Last year, the Samarai project's principal investigator, Kingsley Fregene, explained to us what exactly this design has to offer:
"The Samarai is inherently stable in hover, mechanically simple and has very few moving parts. This makes it a very robust aerodynamically clean airframe, just like nature’s samaras. It does not depend on fragile feathers, delicate wings or precision moving parts to operate. This design was chosen because of its versatility, ease of operation, multiple launch and recovery options (even in tight spaces) and its ability to hover and take-off/land vertically. The rotation of the entire aircraft offers opportunities to achieve omni-directional sensing in a much simpler, lighter-weight and cheaper package."
From the sound of things, those opportunities have been largely realized, since according to Lockheed Martin's press release, they just finished 3D printing a Samarai vehicle last week (!) that flies using a grand total of two moving parts and can stream back live 360-degree video without needing a gimbal.
There's been a video floating around this past week of the Samarai drone in action, but it's from last year. This video, however, shows the latest version being demoed at AUVSI, which just opened its doors for the 2011 expo yesterday. Apparently, this was the very first time that Samarai had been demoed to the public, and the fact that it was flown without a tether shows that Lockheed has either a) a reckless disregard for spectator safety or b) a lot of faith in their ability to control a robot that flies while spinning in circles really fast:
No decapitations! Brilliant!
Samarai's hover capability makes it a great surveillance bot, but the big trade-off is range, since helicopters (and helicopter-like robots) are simply not as efficient as aircraft that can utilize static wings for lift. This is the idea behind vehicles like the V-22 Osprey: It's a helicopter when you want to hover or land, and the rest of the time, it's an airplane. Aerovel's Flexrotor aircraft has taken this idea and condensed it into a "tabletop-sized" surveillance drone that can take off vertically, transition to forward flight, and then land vertically again on a specially built platform:
Slick. In this way, Flexrotor offers all those useful VTOL features, along with an endurance of 36 hours and a range of up to 3,000 kilometers. It's hard to say whether this technique is better or worse than some of the other innovative fixed-wing drone capture solutions that we've seen, but the fact that hovering is useful in many other situations makes Flexrotor potentially much handier to have around.
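As a quick back-of-the-envelope check on those two quoted numbers, the endurance and range figures together imply an average speed of around 83 km/h. This little sketch just does that division (the figures are Aerovel's quoted specs; real missions would mix hover and cruise, so actual cruise speed would be somewhat higher):

```python
# Implied average speed from Aerovel's quoted Flexrotor figures:
# 3,000 km of range over 36 hours of endurance. This is a lower
# bound on cruise speed, since any time spent hovering counts
# against endurance without adding range.

range_km = 3000.0     # quoted maximum range
endurance_h = 36.0    # quoted endurance

avg_speed_kmh = range_km / endurance_h
print(f"Implied average speed: {avg_speed_kmh:.1f} km/h")  # ~83.3 km/h
```

Which, for a "tabletop-sized" aircraft, is a pretty respectable clip.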
It was only a year ago that the University of Michigan's MABEL biped robot was breaking its ankles trying to walk over rough terrain. Now the robot is defying death once again by becoming the world's fastest bipedal robot, with the ability to sprint at up to 10.9 kph. More specifically, MABEL is the world's fastest "kneed" bipedal robot, which just means that it's the fastest robot that can run in a similar manner to us humans, leaving those Toyota robots (4.3 kph) and ASIMO (3.7 kph) in the dust.
MABEL is capable of such blistering speeds thanks to an innovative mechanical design which, although it may not look like it, incorporates a lot of the characteristics of a human runner. For example, MABEL has a torso that's substantially heavier than its legs, just like a human, and it's also got a system of springs that act like tendons. This gives MABEL a very human-like, bouncing gait, and the robot spends 40% of its time running in a "flight phase" with both feet off the ground, similar to humans:
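To get a feel for what a 40 percent flight phase means physically, here's a toy ballistic calculation. The 0.7-second stride period below is a made-up assumption for illustration, not a published MABEL spec; only the 40 percent flight-phase figure comes from the researchers:

```python
# Illustrative ballistics for a running gait that spends 40% of each
# stride airborne (the figure quoted for MABEL). The stride period is
# a hypothetical value chosen for the example, not a MABEL spec.

G = 9.81                  # gravity, m/s^2

stride_period_s = 0.7     # hypothetical stride period (assumption)
flight_fraction = 0.40    # fraction of stride with both feet off the ground

t_flight = flight_fraction * stride_period_s      # time airborne per stride
v_launch = G * t_flight / 2.0                     # vertical takeoff speed for that hang time
apex_rise_m = v_launch ** 2 / (2.0 * G)           # how far the hips rise at apex

print(f"Flight time per stride: {t_flight:.2f} s")
print(f"Apex rise: {apex_rise_m * 100:.1f} cm")   # ~9.6 cm under these assumptions
```

In other words, even a modest-looking bounce means the robot is lofting its entire (torso-heavy) mass several centimeters into the air on every stride, which is exactly why those tendon-like springs matter on landing.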
For reference, MABEL's top speed of 3 meters per second probably isn't enough to catch a tolerably in-shape human, as Olympic sprinters can run at up to 10 meters per second over short distances. But the thing about robots is that they're determined, so in the end, it's a good thing that MABEL is tethered to that pole. And that it doesn't have any arms to grab you with. Or any vision sensors, either. So even if you can't run, at least you can hide.
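The speed comparison above is just unit conversion and a ratio; a two-line sanity check confirms that 10.9 kph is indeed about 3 meters per second, and that a 10 m/s sprinter holds roughly a 3x edge:

```python
# Unit check: MABEL's 10.9 km/h top speed vs. the ~3 m/s figure
# quoted above, and vs. a 10 m/s Olympic sprinter.

KMH_TO_MS = 1000.0 / 3600.0   # km/h -> m/s

mabel_ms = 10.9 * KMH_TO_MS
sprinter_ms = 10.0

print(f"MABEL: {mabel_ms:.2f} m/s")                           # ~3.03 m/s
print(f"Sprinter advantage: {sprinter_ms / mabel_ms:.1f}x")   # ~3.3x
```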