Are We Automating Ourselves Into a Corner?

Automation Ironies and Paradoxes For the Public At Large



Most of the research into the ironies, and especially the paradoxes, of automation over the past thirty years or so has focused on areas where human safety is an overriding concern and the time to recover from a problem is short. Early research concentrated first on the automation used in aircraft, then on other forms of transportation, and now on fields like weather forecasting and health care.

My IEEE Spectrum online story, "Automated to Death," is about the ironies and paradoxes of automation. The ideas were articulated by researchers K.S. Bibby et al. in their 1975 paper, "Man’s Role in Control Systems" (Proc. 6th IFAC Congress, Boston), and later expanded upon by Lisanne Bainbridge in her 1983 Automatica paper, "Ironies of Automation."

Quoting from Bainbridge’s paper,

"The classic aim of automation is to replace human manual control, planning and problem solving by automatic devices and computers. However, as Bibby and colleagues (1975) point out : ‘... even highly automated systems, such as electric power networks, need human beings for supervision, adjustment, maintenance, expansion and improvement. Therefore one can draw the paradoxical conclusion that automated systems still are man-machine systems, for which both technical and human factors are important.’ This paper [Bainbridge’s] suggests that the increased interest in human factors among engineers reflects the irony that the more advanced a control system is, so the more crucial may be the contribution of the human operator."

For instance, the increased use of automation in health care, such as cueing aids for cancer screening, e-prescribing systems, and electronic health records, has also drawn the attention of researchers in the field of human-centered systems, because the same automation-operator issues encountered in the automation of cockpits are appearing here as well.

One research study found, for example, that screeners for breast cancer using computer-aided detection (CAD) software would sometimes use the displayed results not as attention cues (indicating that the screeners needed to examine the flagged potential cancer mass more closely) but instead as a decision aid for determining which patients should be called back for further screening. Using the CAD system as a decision aid was not a use intended (or foreseen) by the system’s designers.
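
To make the distinction concrete, here is a minimal sketch, in Python, of the difference between treating a CAD mark as an attention cue and treating it as a decision aid. The function names and data structures are hypothetical, not the actual screening software.

```python
# A minimal, hypothetical sketch of the distinction described above.
# Function names and data structures are illustrative only, not the actual CAD software.

def screen_with_attention_cues(image, cad_marks, radiologist_reads):
    """Intended use: CAD marks only direct the screener's attention.
    The recall decision still comes from the human reading of the image,
    informed by a closer look at each flagged region."""
    for region in cad_marks:
        radiologist_reads(image, focus=region)   # look harder where CAD points
    return radiologist_reads(image)              # human judgment decides recall

def screen_as_decision_aid(cad_marks):
    """Unintended use: the patient is recalled if and only if CAD flagged
    something, so the software's output replaces the human decision."""
    return len(cad_marks) > 0
```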

This over-reliance on automation is sometimes called "automation bias" or "automation misuse," and refers to using automation in ways that its designers never intended.

The good news is that previous research in aircraft automation and the like can be applied to these new fields; however, there is also a cautionary note.

Charles Perrow, professor emeritus of sociology at Yale University and author of the landmark book Normal Accidents: Living with High-Risk Technologies, states, "We've made strides in human factors but we're just barely keeping up, if not lagging behind, the engineering and technical strides being made. We've improved human factors but not faster than the need for human factors."

This is becoming an increasing concern as what were once expensive automated systems become available to the general public. As these systems spread, we can expect the ironies and paradoxes to become even more visible, along with their unpleasant side effects.

For instance, GPS has moved from use in military systems to cell phones. With its widespread introduction have come reports of people over-relying on what their GPS systems tell them.

In one case a few years back reported by Reuters, a German motorist drove his BMW straight into the Havel River in eastern Germany. The police reported that the driver drove his car past a stop sign, down a ferry ramp and into the river. The driver said he was following his car’s navigation system, which showed what appeared to be a path across the river but failed to note that a ferry was required to complete it.

More recently, as I blogged about here, one GPS system told a driver to drive off a cliff, while another recommended that the driver turn around into oncoming traffic.

Clifford Nass, a communication professor at Stanford University, makes the point that humans have a tendency to follow the instructions of machines over their own common sense: "Rather than trust our judgment of nature, we let technology tell us what's going on," he says.

In New York, Governor David A. Paterson is trying to fight this tendency the old-fashioned way: with tough, almost draconian, laws.

New York State officials point out that it is common knowledge that trucks and buses are not allowed on New York State Parkways. Yet for some reason, truckers in particular seem to ignore the law when their GPS units tell them to.

So a few months ago, Gov. Paterson announced legislation he plans to introduce in January 2010 that would require all large commercial trucks to use enhanced GPS devices that route them away from restricted roads. Some 81 percent of bridge overpass strikes by commercial vehicles in the state are caused by GPS (mis)guidance. By requiring the use of enhanced GPS, New York hopes to reduce its bridge repair costs. Drivers of commercial vehicles found by police not to be using an enhanced GPS system could not only receive a ticket but also have their vehicles confiscated.
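
As a rough illustration of what "enhanced" routing means in practice, here is a minimal sketch of a route filter that respects a commercial vehicle's restrictions. The road attributes, vehicle profile, and clearance values are assumptions for illustration, not the data model of any actual device mandated by the legislation.

```python
# Hypothetical sketch of commercial-vehicle route restriction.
# The road attributes, vehicle profile, and clearance values are assumptions
# for illustration; they are not the data model of any actual "enhanced GPS" unit.

from dataclasses import dataclass

@dataclass
class RoadSegment:
    name: str
    is_parkway: bool       # New York parkways bar trucks and buses
    clearance_m: float     # lowest overpass clearance on the segment, in meters

@dataclass
class VehicleProfile:
    is_commercial: bool
    height_m: float

def segment_allowed(seg: RoadSegment, veh: VehicleProfile) -> bool:
    """An enhanced unit drops segments the vehicle may not legally or
    physically use, instead of routing it as if it were a passenger car."""
    if veh.is_commercial and seg.is_parkway:
        return False
    if veh.height_m >= seg.clearance_m:
        return False
    return True

# A consumer unit that ignores the vehicle profile would happily send a
# 4.1 m box truck under a 3.3 m parkway overpass; this filter would not.
truck = VehicleProfile(is_commercial=True, height_m=4.1)
parkway = RoadSegment("Hutchinson River Pkwy", is_parkway=True, clearance_m=3.3)
print(segment_allowed(parkway, truck))   # False
```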

British maritime authorities probably share this frustration when it comes to some shipboard navigation systems. In January of this year, and for the second time, the UK's Marine Accident Investigation Branch (MAIB), part of the Department for Transport, had to issue a Safety Flyer warning "about the misuse of computerised navigation systems by ships' officers with insufficient training."

The flyer, which is meant for circulation to ferry and other ship operators, states in part that they need to review their training requirements and provision with respect to the use of electronic chart display and information systems (ECDIS), especially where a system that is not approved as the primary means of navigation is provided and sited prominently on the bridge.

As apparently happened in two recent maritime accidents (one involving the cross-Channel roll-on/roll-off passenger ferry Pride of Canterbury, which hit a submerged wreck in January 2008, and another involving the grounding of the cargo ship CFL Performer in May 2008), the ships' officers relied on electronic navigation systems they were not trained on and whose limitations they did not understand.

Other examples of the ironies and paradoxes of automation seemed to abound this year.

In February, Colgan Air Flight 3407, a Bombardier Dash 8-Q400, crashed on a nighttime approach to Buffalo Niagara International Airport, killing all 49 on board.

US National Transportation Safety Board (NTSB) investigators found that as the pilots, who were descending using the aircraft’s autopilot, extended the flaps for landing, the aircraft’s nose began to pitch radically up and down, and soon afterward the aircraft rolled left and right. The stick shaker, a mechanical warning device that indicates a plane is about to stall, vibrated the pilots’ control column and also turned off the autopilot.

If immediate correct action isn't taken or isn't effective, a stick pusher (an electrically activated system that pushes the control column forward) activates to push the aircraft’s nose down and increase airspeed to avoid a wing stall. That is what happened here.
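
To make the sequence easier to follow, here is a greatly simplified sketch of the shaker-then-pusher logic described above. The angle-of-attack thresholds and structure are illustrative assumptions, not the Q400's actual stall protection system.

```python
# Greatly simplified, hypothetical sketch of stick shaker / stick pusher logic.
# The angle-of-attack thresholds and behavior are illustrative assumptions,
# not the Dash 8-Q400's actual stall protection system.

SHAKER_AOA_DEG = 12.0   # assumed angle of attack that triggers the stick shaker
PUSHER_AOA_DEG = 15.0   # assumed angle of attack that triggers the stick pusher

def stall_protection(aoa_deg, autopilot_engaged):
    """Return (autopilot_engaged, shaker_on, pusher_commands_nose_down)."""
    shaker_on = aoa_deg >= SHAKER_AOA_DEG
    pusher_on = aoa_deg >= PUSHER_AOA_DEG
    if shaker_on:
        autopilot_engaged = False   # the stall warning also disconnects the autopilot
    return autopilot_engaged, shaker_on, pusher_on

# Approaching a stall: the shaker fires and the autopilot drops out; if the
# angle of attack keeps increasing, the pusher commands nose-down to regain speed.
print(stall_protection(12.5, autopilot_engaged=True))    # (False, True, False)
print(stall_protection(16.0, autopilot_engaged=False))   # (False, True, True)
```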

However, the pilots were never trained on how to use the stick pusher to avoid a stall, and apparently reacted in a way that was natural but the opposite of what was required. The pilot overrode the stick pusher by pulling back on the control column; the aircraft stalled and subsequently dived into the ground, killing one person on the ground as well.

In this case, the operators didn't rely on the automation when they should have (but didn't know they should have).

An NTSB animation of the crash can be found here.

Then in June, Air France Flight 447, traveling from Rio de Janeiro, Brazil, to Paris, crashed in the Atlantic Ocean, killing all 228 on board the Airbus A330-200. The aircraft’s black boxes haven't been recovered, but the automated onboard maintenance messages the aircraft sent indicated significant problems. First, the autopilot disengaged. Then a basic auto flight system warning message was sent. Next, there was a warning to the pilots that the flight control protection had been degraded. Then, warning flags appeared on the primary flight displays of the captain and co-pilot telling them that the information being displayed was degraded. And so on, over the course of one minute of flying time.

Suspicion has fallen on problems with the aircraft’s pitot tubes, or airspeed sensors. If an aircraft’s pitot tubes provide incorrect airspeed to the flight computer, it may command erroneous aircraft actions, as happened to several Northwest Airlines pilots flying Airbus A330s and to two USAF B-2 bombers, causing one to crash on takeoff in 2008 (a video of the crash can be found here).
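
One common line of defense against this failure mode is a consistency check across redundant airspeed sources. The sketch below is a simplified illustration of that idea, with an assumed tolerance and reversion behavior rather than the A330's actual air data or flight control logic.

```python
# Hypothetical sketch of an airspeed-disagree check across redundant pitot sources.
# The tolerance and the reversion behavior are illustrative assumptions,
# not the A330's actual air data or flight control logic.

DISAGREE_KT = 20.0   # assumed allowable spread between airspeed sources, in knots

def airspeed_consistent(speeds_kt):
    """Treat the airspeed data as unreliable if the redundant sources
    (e.g. three pitot-fed channels) disagree by more than the tolerance."""
    return max(speeds_kt) - min(speeds_kt) <= DISAGREE_KT

def flight_control_mode(speeds_kt):
    """If the sources disagree (as when pitot tubes ice over), stop acting on
    the bad airspeed: drop the autopilot and hand control back to the pilots."""
    if airspeed_consistent(speeds_kt):
        return "normal: autopilot and envelope protections available"
    return "degraded: autopilot off, reduced protections, pilots fly manually"

print(flight_control_mode([272.0, 271.0, 273.0]))   # normal
print(flight_control_mode([272.0, 180.0, 271.0]))   # degraded: one source reads low
```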

France's Bureau of Investigation and Analysis says that it believes pitot tubes are "a factor, but not the cause" of the crash. Recent reports indicate that a new search for Air France Flight 447’s black boxes will begin in February of next year.

There have been other aircraft automation incidents recently that have caught pilots off-guard, such as one involving an Eclipse 500 jet landing at Midway Airport in Chicago in June 2008 and two involving Qantas Airbus aircraft, one in October 2008 and another in December 2008.

Probably the most telling example of our sometimes misplaced faith in automation is found in the successful ditching of US Airways Flight 1549 in the Hudson River in January.

As co-pilot Jeff Skiles grabbed the Quick Reference Handbook to find the most appropriate procedure for the emergency, he discovered that US Airways had eliminated the tabs in the handbook to save money, a decision that cost the pilots precious time in finding the right emergency procedures. You can see Capt. Sully Sullenberger talk about it on the Daily Show with Jon Stewart here.

So what can be done about the paradoxes and ironies that come with highly reliable automation?

One way is to try to build even more reliable systems. However, Peter Neumann, Principal Scientist at SRI International, who knows as much about technology risk as anyone I know, points out that making IT systems more reliable tends to make those same systems more complex to design and understand, and potentially more error prone: another of automation’s ironies and paradoxes.

Peter also points out another disconcerting trend in IT: making reliability and security invisible to users.

"Look at what is happening," Peter told me. "There is the idea that security and reliability have to be made invisible, which means there won't be handles in the interface for anyone to deal with them because they will all be taken out of their hands."

Now, when trouble arises, system users won't have a clue what to look for or what to do.

Another way may be to try to sensitize system designers and engineers to the ironies and paradoxes of automation by redefining the causes of system failure, for instance, the routine blaming of pilot error for aircraft accidents. Professor Sidney Dekker and others like NASA's Key Dismukes suggest (see here and here) that this is what may be needed in the aviation industry, and it is likely needed in other industries as well.

Just yesterday, news reports said that Colgan Air is blaming pilot error for the Buffalo crash. The crew’s cockpit procedures definitely were not up to spec, but was the crash entirely their fault? Did aircraft automation play no part whatsoever?

A third way is the one I noted in the article. Maybe the best advice is that given by Martyn Thomas, a fellow of the Royal Academy of Engineering: we need to routinely turn off our automated systems and see what happens.

Perhaps then we will be more aware of our everyday dependence on computers, and be better prepared to deal with the consequences when they don't work as we expect them to.
