This year, AI continued looming large in the software world. But more than before, people are wrestling with both its amazing capabilities and its striking shortcomings. New research has found that AI agents are doubling the length of the tasks they can complete every seven months, an astounding rate of exponential growth. But the quality of their work still suffers, clocking in at about a 50 percent success rate on the hardest tasks. Chatbots are assisting coders and even coding autonomously, but this may not help solve the biggest and costliest IT failures, which stem from managerial problems that have persisted for at least the past twenty years.
AI’s energy demands continue to be a major concern. To ease the strain, one startup is working to cut the heat produced in computation by making computing reversible. Another is building a computer out of living human brain cells, capable of running tests on drug candidates. And some are even considering sending data centers to the moon.
1. The Top Programming Languages 2025

While the rankings of programming languages this year were rather predictable (yes, Python is still number one), the future of software engineering is as uncertain as can be. With AI chatbots assisting many people with coding tasks, or simply coding themselves, it is becoming increasingly difficult to gather reliable data on what software engineers are working on day to day. People no longer post their questions on Stack Exchange or a similar site; they simply ask a chatbot.
This year’s top programming languages list does its best to work with this limited data, but it also poses a question: In a world where AI writes much of our code, how will programming languages change? Will we even need them, or will AI simply bust out optimized assembly code without the need for abstraction?
2. How IT Managers Fail Software Projects

Robert Charette, lifelong technologist and frequent IEEE Spectrum contributor, wrote back in 2005 about all the known, preventable reasons software projects end in disaster. Twenty years later, nothing has changed, except that trillions more dollars have been lost to software failures. In this 3,500-plus-word screed, Charette works through multiple case studies, backed by statistics, to document the sorry state of IT management as it is still practiced today. And to top it off, he explains why AI will not come to the rescue.
3. Human Brain Cells on a Chip for Sale

Australian startup Cortical Labs announced that it is selling a biocomputer powered by 800,000 living human neurons on a silicon chip. For US $35,000, you get what amounts to a mini-brain in a box that can learn, adapt, and respond to stimuli in real time. The company had already proved the concept by teaching lab-grown brain cells to play Pong (they often beat standard AI algorithms at learning efficiency). But the real application is drug discovery. This “little brain in a vat,” as one scientist put it, lets researchers test whether experimental drugs restore function to impaired neural cultures.
4. Large Language Models Are Improving Exponentially

It’s difficult to agree on a consistent way to evaluate how well large language models (LLMs) are performing. The nonprofit research organization Model Evaluation & Threat Research (METR) proposed an intuitive metric: tracking how long it would take a human to do the tasks an AI can complete. By this metric, LLM capabilities are doubling every seven months. If the trend continues, by 2030 the most advanced models could quickly handle tasks that currently take humans a full month of work. For now, though, the AI doesn’t always do a good job; for the longest and most challenging tasks, the chance the work will be done correctly is only about 50 percent. So the question is: How useful is a fast, cheap employee that produces garbage about half the time?
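To make the doubling math concrete, here is a rough Python extrapolation. The 7-month doubling time comes from METR's reported trend; the roughly 1-hour starting task horizon is an assumed, illustrative figure, not a number from the study.

```python
# Back-of-the-envelope extrapolation of METR's doubling trend (a sketch,
# not METR's actual model). ASSUMPTION: a current 50%-success task horizon
# of about 1 hour; only the 7-month doubling time is from the reported trend.
current_horizon_hours = 1.0          # assumed starting point (illustrative)
doubling_time_months = 7.0           # METR's reported doubling time
months_to_2030 = (2030 - 2025) * 12  # 60 months

doublings = months_to_2030 / doubling_time_months         # ~8.6 doublings
projected_hours = current_horizon_hours * 2 ** doublings  # ~380 hours
work_months = projected_hours / 170  # ~170 working hours in a month

print(f"~{doublings:.1f} doublings -> ~{projected_hours:.0f} hours, "
      f"or ~{work_months:.1f} work-months of human effort")
```

Even from a modest starting point, the compounding reaches month-scale tasks before 2030, which is exactly why the 50 percent reliability figure matters so much.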
5. Reversible Computing Escapes the Lab in 2025

There is a surprising principle that connects all software to the underlying physics of hardware: Erasing a bit of information in a computer necessarily costs energy, usually lost as heat. The only way to avoid losing this energy is to never erase information. This is the basic idea behind reversible computing—an approach that has remained in the academic sphere until this year.
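That energy cost is set by Landauer's principle: erasing one bit at temperature T dissipates at least k_B T ln 2. A quick sketch of the arithmetic in Python:

```python
# Landauer limit: the minimum energy dissipated to erase one bit.
import math

k_B = 1.380649e-23  # Boltzmann constant, joules per kelvin
T = 300.0           # room temperature, kelvin

E_bit = k_B * T * math.log(2)
print(f"Landauer limit at {T:.0f} K: ~{E_bit:.2e} J per erased bit")
# ~2.87e-21 J. Tiny per bit, but real chips dissipate many orders of
# magnitude more than this floor, and that gap is what reversible
# computing aims to close.
```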
After three decades of academic research, reversible computing is finally going commercial with startup Vaire Computing. Vaire’s first prototype chip recovers energy in an arithmetic circuit. The team claims that with their approach, they could eventually deliver a 4,000x energy efficiency improvement over conventional chips. The catch is that this requires new gate architectures, new design tools, and integrating MEMS resonators on chip. But with a prototype already in the works, reversible computing has graduated from “interesting theory” to “we’re actually building this.”
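To see what "reversible" means at the gate level, consider the Toffoli (CCNOT) gate, the textbook universal reversible gate. This is not Vaire's actual circuit design, just a minimal illustration of logic that never erases information:

```python
# Toffoli (CCNOT) gate: a universal, reversible logic gate. It is a
# bijection on 3-bit states, so no information is ever destroyed.
def toffoli(a: int, b: int, c: int) -> tuple[int, int, int]:
    """Flip c if and only if both a and b are 1; a and b pass through."""
    return a, b, c ^ (a & b)

# The gate is its own inverse: applying it twice recovers every input,
# which is precisely the property that makes energy recovery possible.
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            assert toffoli(*toffoli(a, b, c)) == (a, b, c)
print("All 8 inputs map back to themselves: the gate erases nothing")
```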
6. Airbnb’s Dying Software Gets a Second Life
Apache Airflow—the open-source workflow orchestration software originally built by Airbnb—was basically dead by 2019. Then, one enthusiastic open-source contributor stumbled across it while working in IoT and thought “this is too good to die.” He rallied the community, and by late 2020 they shipped Airflow 2.0. Now the project is thriving. It boasts 35 to 40 million downloads per month and over 3,000 contributors worldwide. And Airflow 3.0 launched with a modular architecture that can run anywhere.
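For readers who haven't used it: Airflow lets you declare data pipelines as ordinary Python. Here is a minimal sketch using the TaskFlow API (assuming Airflow 2.4 or later); the pipeline and task names are illustrative, not from the story:

```python
# Minimal Airflow DAG using the TaskFlow API (Airflow 2.4+).
# The pipeline and task names are made up for illustration.
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def example_pipeline():
    @task
    def extract() -> list[int]:
        return [1, 2, 3]  # stand-in for pulling records from a source

    @task
    def load(records: list[int]) -> None:
        print(f"loaded {len(records)} records")

    load(extract())  # Airflow infers the dependency from the data flow

example_pipeline()  # instantiating the DAG registers it with the scheduler
```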
7. The Doctor Will See Your Electronic Health Records Now

In 2004, President George W. Bush set a goal for the United States to transition to electronic health records (EHRs) by 2014, promising transformed healthcare and huge cost savings. Twenty years and over $100 billion later, we’ve achieved widespread EHR adoption and created a different nightmare. Doctors now spend an average of 4.5 hours per day staring at screens and clicking through poorly designed software instead of looking at patients.
The rush to adopt EHRs before they were ready meant ignoring warnings about systems engineering, interoperability, and cybersecurity. Now we’re stuck with fragmented systems that don’t talk to each other (the average hospital uses 10 different EHR vendors internally) and physicians experiencing record levels of burnout. On top of that, data breaches have exposed 520 million records since 2009. Healthcare costs haven’t bent downward as promised; they’ve hit $4.8 trillion, or 17.6 percent of GDP. The irony? AI scribes are now being developed to solve the problems that the last generation of technology created, allowing doctors to actually look at patients again instead of their keyboards.
8. Is It Lunacy to Put a Data Center on the Moon?

Dina Genkina is an associate editor at IEEE Spectrum focused on computing and hardware. She holds a PhD in atomic physics and lives in Brooklyn.



