Last week, the Committee on Patient Safety and Health Information Technology at the Institute of Medicine released a very interesting report concerning its investigation into health IT and improving patient safety. The 197-page study, appropriately titled "Health IT and Patient Safety: Building Safer Systems for Better Care," focuses on assessing "... some of the important issues surrounding health IT and its introduction and to indicate the activities most likely to bring the potential value of health IT to the U.S. health care system."
The IOM study was necessarily narrow in scope, and therefore the Committee did not look at issues such as "... whether health IT should be implemented, access to health IT products, medical liability, privacy, security, and standards." The Committee recognizes these critical issues need to be addressed, but its mission was to examine "... the aspects of health IT directly pertaining to safety."
The Committee report stated that:
"It is widely believed that, when designed and used appropriately, health IT can help create an ecosystem of safer care while also producing a variety of benefits such as reductions in administrative costs, improved clinical performance, and better communication between patients and caregivers. In this view, it can be a positive, transformative force for delivering health care. However, the assumption that the aforementioned benefits are highly correlated with health IT has not been adequately tested and there are some indications that the features needed to acquire one benefit may actually frustrate efforts to achieve another. In particular, there is a growing concern that health IT designs that maximize the potential for administrative and economic benefit may be creating new paths to failure."
The Committee discovered that while "... specific types of health IT can improve patient safety under the right conditions," it also found that "... those conditions cannot be replicated easily and require continual effort to achieve."
More disheartening - but not surprising - is that the Committee found that sufficient evidence-based information to make informed judgments about how health IT improves - or even harms - patient safety is not available. The report states that:
"We tried to balance the findings in the literature with anecdotes from the field but came to the realization that the information needed for an objective analysis and assessment of the safety of health IT and its use was not available. This realization was eye-opening and drove the committee to consider ways to make information about the magnitude of the harm discoverable."
The report goes on to state that "... little published evidence could be found quantifying the magnitude of the risk."
The emphasis above is the report's.
The reasons for the lack of health IT-related safety data, the Committee report states, are due to a lack of "... measures and a central repository (or linkages among decentralized repositories) to collect, analyze, and act on information related to safety of this technology."
Furthermore, contracts between HIT vendors and users contain "... contractual barriers (e.g., nondisclosure, confidentiality clauses) that can prevent users from sharing information about health IT-related adverse events."
In addition, vendors often "... include language in their sales contracts and escape responsibility for errors or defects in their software (i.e., 'hold harmless clauses')." Users of HIT are therefore at legal risk when something goes wrong, instead of the vendor.
These and other barriers mean that the information needed to assess the safety of HIT is not available. The study group recommended that vendors be made to disclose safety risks with their products, and that users be encouraged to report HIT issues under a scheme that "is voluntary, confidential, and nonpunitive," similar to how pilot safety reports are made. Last year, another government health advisory panel called for a nearly identical scheme to be put into place.
The report also emphasized that HIT safety is more than just the technology. For instance, HIT "... software - such as an EHR [electronic health record] - is neither safe nor unsafe because safety of health IT cannot exist in isolation from its context of use. Safety is an emergent property of a larger system that takes into account not just the software but also how it is used by clinicians."
"The larger system - often called a sociotechnical system - includes technology (e.g., software, hardware), people (e.g., clinicians, patients), processes (e.g., workflow), organization (e.g., capacity, decisions about how health IT is applied, incentives), and the external environment (e.g., regulations, public opinion). Adopting a sociotechnical perspective acknowledges that safety emerges from the interaction among various factors. Comprehensive safety analyses consider these factors taken as a whole and how they affect each other in an attempt to reduce the likelihood of an adverse event, rather than focusing on eliminating one 'root cause' and ignoring other possible contributing factors."
The Committee made a number of recommendations, in addition to collecting safety data, that would help improve health IT in relation to safety. For instance, the Committee recommended that the US Department of Health and Human Services (HHS) "... fund a new Health IT Safety Council to evaluate criteria for assessing and monitoring the safe use of health IT and the use of health IT to enhance safety."
Another was HHS "... should specify the quality and risk management process requirements that health IT vendors must adopt, with a particular focus on human factors, safety culture, and usability..." as well as provide funds for future research into the issue of HIT safety.
As I mentioned earlier, the Committee did not look at the issue of IT security in relation to HIT in its report, but it is an important issue. According to a story last week in Government Executive, Senator Tom Coburn, the ranking Republican on the Senate Judiciary Privacy, Technology and the Law Subcommittee and a practicing obstetrician, wants IT security of EHRs to be given a "rethink." Senator Coburn apparently worries that the ongoing push - or rush - to implement EHRs for all US citizens by 2014 is giving short shrift to IT security, something I tend to believe as well.
Of course, it would have been nice to have the issues involving EHR safety and security addressed before tens of billions of taxpayer dollars were committed to providing incentives for their use.
Maybe next time.
Robert N. Charette is a Contributing Editor to IEEE Spectrum and an acknowledged international authority on information technology and systems risk management. A self-described “risk ecologist,” he is interested in the intersections of business, political, technological, and societal risks. Charette is an award-winning author of multiple books and numerous articles on the subjects of risk management, project and program management, innovation, and entrepreneurship. A Life Senior Member of the IEEE, Charette was a recipient of the IEEE Computer Society’s Golden Core Award in 2008.