There seems to be a slow but steady backlash growing among healthcare providers against the U.S. government’s $30 billion initiative to get all its citizens an electronic health record, initially set to happen by 2014 but now looking at 2020 or beyond. The backlash isn’t so much about the need for, or eventual benefits of, electronic health records but more about the perceived (and real) difficulties caused by the government's incentive program and a growing realization of the actual financial and operational costs involved in rolling out, using, and paying for EHR systems.

The backlash began to surface publicly last September when the U.S. government accused healthcare providers of “upcoding,” i.e., claiming with a single click on a field in an electronic health record to have provided a medical service or procedure that wasn’t actually performed. Kathleen Sebelius, the current HHS Secretary, and Eric Holder, the Attorney General, sent a letter to five major hospital trade associations (pdf) warning them that electronic health records were not to be used to “game the system” and “possibly” obtain “illegal payments” from Medicare. The letter said that Medicare billing is being scrutinized for fraud, and implied that those using EHRs to bill Medicare will be scrutinized even more carefully.

Healthcare providers were outraged by the accusations in the letter, and countered that billing had increased because EHRs made it easier to bill for services they had previously provided to the government without charging for them.

About the same time, professors Stephen Soumerai from Harvard Medical School and Ross Koppel from the University of Pennsylvania wrote an article for the Wall Street Journal contending that EHRs don’t save money as claimed. They wrote that, “… the most rigorous studies to date contradict the widely broadcast claims that the national investment in health IT—some $1 trillion will be spent, by our estimate—will pay off in reducing medical costs. Those studies that do claim savings rarely include the full cost of installation, training and maintenance—a large chunk of that trillion dollars—for the nation's nearly 6000 hospitals and more than 600 000 physicians. But by the time these health-care providers find out that the promised cost savings are an illusion, it will be too late. Having spent hundreds of millions on the technology, they won't be able to afford to throw it out like a defective toaster.”

The professors went on to say that, “We fully share the hope that health IT will achieve the promised cost and quality benefits. As applied researchers and evaluators, we actively work to realize both goals. But this will require an accurate appraisal of the technology's successes and failures, not a mixture of cheerleading and financial pressure by government agencies based on unsubstantiated promises.”

The professors’ conclusions soon came under attack by EHR vendors, but the article seemed to have struck a nerve with many EHR adopters.

Next, in November, the U.S. Department of Health and Human Services inspector general (IG) released a report that stated in part that the U.S. electronic health record incentive program administered by the Centers for Medicare & Medicaid Services (CMS) was “vulnerable” to fraud. The IG said that CMS “has not implemented strong prepayment safeguards” to keep healthcare providers from falsely claiming that they are meeting the required meaningful use standard [i.e., capture, use, and share data], and that CMS’s “ability to safeguard incentive payments postpayment [i.e., conduct audits] is also limited.”

CMS agreed with the IG that it needed to start verifying that healthcare providers are indeed meeting the meaningful use criteria, but disagreed that it should do more than the minimal cross-checking needed to determine whether healthcare providers are being truthful when submitting their claims for incentive payments. Healthcare providers, for their part, expressed their unhappiness about having to offer more proof that they were indeed meeting the meaningful use standards.

The backlash gained momentum when RAND published a new EHR study in January of this year that basically repudiated a key RAND EHR study from 2005. The 2005 study, paid for by several large EHR vendors, claimed that the U.S. could save at least $81 billion per year in health care costs, as well as drive down the rate of healthcare spending, through the widespread use of EHR systems. That study was a major selling point in getting Congress behind the U.S. EHR initiative.

For EHR vendors, the 2005 study was money well spent. However, the latest RAND study now admits that it was overly optimistic—or, more to the point, hopelessly unrealistic—as its critics at the time said. RAND’s latest report has studiously avoided putting any numbers on how much EHRs reduce (or increase) costs.

Also in January, the American Medical Association (AMA) sent a letter [pdf] to the Office of the National Coordinator for Health Information Technology (ONC) urging it to slow down and rethink the EHR “meaningful use” criteria that healthcare providers have to meet in order to get reimbursed for their EHR investments. The AMA wrote that while it “shares the administration’s goal of widespread EHR adoption and use, … we again stress our continuing concern that the meaningful use program is moving forward without a comprehensive evaluation of previous stages to resolve existing problems. A full evaluation of past stages and more flexible program requirements will help physicians in different specialties and practice arrangements successfully adopt and use EHRs.”

The basic AMA complaint is that the ONC is rushing the adoption of EHR technology at the expense of its effective use in realistic medical settings.

Then in February, a survey of over 17 000 EHR adopters found that some 17 percent are already considering changing their EHR vendor because their EHR systems fail to meet their basic needs. The opinion survey, conducted by Black Book Rankings, indicates that 2013 might be the “year of the great EHR vendor switch,” a story at Healthcare IT News reports. As was predicted, the U.S. government’s EHR incentive program created a “gold rush” mentality in which new EHR vendors popped out of the woodwork offering highly immature products along with extremely poor customer support, and healthcare practitioners bought them nevertheless, so as not to lose out on ONC EHR incentive payments.

Even those incentive payments to healthcare providers may not be enough to make EHR adoption worthwhile. Just a few days ago, a study published in the March issue of the journal Health Affairs and reported in HealthLeaders Media found that, “The average physician would lose $43 743 over five years after adopting EHRs and only 27 percent of physicians would profit through the transition away from paper records without federal financial aid. And even when the $44 000 in meaningful use incentives are added to the pot, only 41 percent of physicians would be in the black.”

The study also states, “The largest difference between practices with a positive return on investment and those with a negative return was the extent to which they used their EHRs to increase revenue, primarily by seeing more patients per day or by improved billing that resulted in fewer rejected claims and more accurate coding.” However, as noted above, using EHRs to increase revenue might be greeted with a fraud audit by the U.S. government.

"EHR: money loser, or federal government audit magnet?" is not exactly a good marketing slogan.

Another contributor to the perception that EHR conversion is a money-losing proposition is the fact that for the military, it is. The U.S. Departments of Veterans Affairs and Defense told Congress that the cost of integrating their two EHR systems had climbed from between $4 billion and $6 billion to some $12 billion, and so they were calling a halt to the effort, which was scheduled to be completed by 2017, until they figured out what to do next. Congress, which had been vigorously pushing for the integration since 2008, was decidedly not amused, especially since over $1 billion had already been spent with little to show for it.

When VA and DoD record systems will interoperate seamlessly is anyone’s guess, as is the cost to make it happen. As is the date when doctors, hospitals, and other health care providers—everyone but the EHR vendors themselves—will start seeing an adequate return on their investment in terms of time or money saved.

Photo: Cuneyt Hizal/iStockphoto
