A story in today's Wall Street Journal reports that over the past 18 months, coordinated attacks originating from Europe and China have successfully hacked into the computer systems of some 2,400 corporations and numerous governments.

According to the WSJ, the attack (which was discovered by the security company NetWitness) was much more extensive than the one recently launched against Google. Some 75,000 computers in 196 countries have reportedly been hacked. Cardinal Health and Merck & Co. have been identified as among those attacked. Merck said in the WSJ article that no sensitive information had been compromised.

The attack used a version of the botnet ZeuS, which is in the midst of a botnet war with a smaller rival that is trying to steal its data and then displace it.

You can see how botnets work in this BBC tutorial.
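In schematic terms, a botnet is a set of compromised machines that periodically poll a central command-and-control server for instructions. The following is a toy, entirely hypothetical simulation of that polling pattern (all names and structure invented for illustration; real botnets are vastly more sophisticated):

```python
# Toy simulation of botnet command-and-control (C&C) polling.
# The "server" here is just an in-memory list of published commands;
# each simulated bot remembers how many commands it has already seen.

commands = []  # stand-in for the C&C server's command queue

def cc_publish(cmd):
    """The botmaster publishes a command for all bots."""
    commands.append(cmd)

class Bot:
    def __init__(self, bot_id):
        self.bot_id = bot_id
        self.seen = 0  # index of the last command this bot fetched

    def poll(self):
        # Fetch any commands published since this bot's last poll.
        new = commands[self.seen:]
        self.seen = len(commands)
        return new

botnet = [Bot(i) for i in range(3)]
cc_publish("send_spam")
print([bot.poll() for bot in botnet])  # every bot picks up the command
```

The centralization shown here is also the weakness the article's "botnet war" exploits: whoever controls (or hijacks) the command channel controls every infected machine.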

This latest attack shows just how precarious the state of IT security is right now. The point was driven home again by several more stories this week.

First were the results of the latest cyberwar exercise, called "Cyber Shockwave," which took place earlier this week. It was developed by the Bipartisan Policy Center to see how senior government policy officials would react in real time to a major cyber attack. Ten former senior government officials played out a scenario they had not been briefed on. You can read more about it here.

The results were reportedly not good (see this article in the Los Angeles Times and this in Bloomberg News). The best summation of the current state of IT security as the scenario was being run and the officials debated what to do was this statement from one of the participants: "I'm not hearing any answers here as to how to fix this."

When Cyber Shockwave was complete, everyone agreed that the US is vulnerable and that a lot more needs to be done, including possible changes to US law regarding how the country can respond to a cyber attack.

Next, Mitre and the SANS Institute released their updated 2010 list of the Top 25 Most Dangerous Programming Errors. The new list takes a more focused approach, defining specific actions programmers can take to mitigate the errors.

The basic message is the same as last year: there are too many long known and easily fixable programming errors that are creating unnecessary IT security holes.

And my message on the subject is the same as last year: it is time to treat these dangerous programming errors as software never events.
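SQL injection is one of the "long known and easily fixable" errors on the Top 25 list, and it makes a good illustration of why these holes are so unnecessary. The sketch below (using Python's built-in sqlite3 module with an in-memory database; table and function names are invented for illustration) shows the classic mistake and its one-line fix:

```python
import sqlite3

# A throwaway in-memory database with one row of "sensitive" data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def lookup_unsafe(name):
    # Vulnerable: attacker-controlled input is spliced into the SQL text,
    # so the input can rewrite the query itself.
    query = "SELECT secret FROM users WHERE name = '%s'" % name
    return conn.execute(query).fetchall()

def lookup_safe(name):
    # Fixed: a parameterized query treats the input strictly as data.
    return conn.execute(
        "SELECT secret FROM users WHERE name = ?", (name,)).fetchall()

payload = "x' OR '1'='1"          # a classic injection string
print(lookup_unsafe(payload))     # leaks every secret in the table
print(lookup_safe(payload))       # returns nothing, as it should
```

The fix has been standard practice for years and costs essentially nothing, which is exactly why treating such errors as "never events" is a reasonable standard to hold developers to.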

Finally, a story in yesterday's ComputerWorld reports that some 784 PCs in the city of Norfolk, Virginia, were taken out by malware residing on an internal virtual print server. The malware activated when the PCs were shut down. When users tried to boot up again, they discovered that they could not - their C drives had been wiped clean.

City officials say that they don't know what attack caused the problem or where it came from, since that information was destroyed when the virtual print server was cleaned. The FBI has been called in to assist in the investigation.
