Risk Factor

NPfIT in Trouble?

The UK National Programme for IT (NPfIT), the national electronic health records program, may be in deep trouble. A report last week in ComputerWeekly said that "a board of an NHS trust has learned of a 'significant' risk of Fujitsu ending its £900m contract to supply and implement hospital systems across southern England."

If Fujitsu pulls out, the NPfIT roll-out would very probably be delayed, and the whole program called into question. At the very least, the high-level review that has been repeatedly called for would be hard to dodge any longer.

In 2006, Accenture pulled out of its £2bn contract with the project, writing off £230m in the process.

A year ago, a senior manager at Fujitsu gave a presentation at a conference on whether NPfIT could be made to work, and was severely beaten up for it. Maybe Fujitsu should have bailed out then, while the bailing was good.

Who Watches the Automated Watch Watchers?

The French bank Société Générale SA admitted that a "rogue trader" who lost $7.2 billion in trades was able to bypass five levels of controls for a year before finally slipping up and getting caught.

The trader, by the name of Jérôme Kerviel, hid the trades by making fake orders to balance each of the genuine orders he placed. Although the bank says he operated alone, many are skeptical. It is known that he used to work in the bank's back office, and therefore had detailed knowledge of how trades were processed and monitored.
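To see why offsetting fake orders can fool automated controls, here is a minimal, hypothetical sketch of a naive net-exposure check; the field names, numbers, and data structure are invented for illustration and are not how any real trading system works.

```python
# Hypothetical sketch: a fake offsetting trade hides exposure from a
# naive net-exposure check. All names and figures are invented.

def net_exposure(trades):
    """Sum signed notional values: positive = long, negative = short."""
    return sum(t["notional"] for t in trades)

# A genuine 50-million-euro long position...
book = [{"id": "real-1", "notional": 50_000_000, "fictitious": False}]
# ...masked by a fake short of the same size entered into the system.
book.append({"id": "fake-1", "notional": -50_000_000, "fictitious": True})

print(net_exposure(book))  # 0 -- the control sees no net risk

# Only by filtering out the fictitious entries does the true exposure appear.
real_only = [t for t in book if not t["fictitious"]]
print(net_exposure(real_only))  # 50000000
```

The point of the sketch is that a control which only sums reported positions is blind to an insider who can inject fictitious offsetting entries, which is why the control data itself needs independent verification.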

Apparently Kerviel spent time hacking the risk control system which enabled him to hide his trades. He was able to do so by using his colleagues' passwords, although how he got them has not been disclosed.

A determined person can probably circumvent any set of automated risk controls, which means the control systems themselves need to be monitored for signs of tampering. UK government financial regulators are now examining UK banks for this kind of problem.

Cable Company Loses Customers' Emails

A software error during routine maintenance at Charter Communications in St. Louis erased the email accounts of 14,000 of its customers last week. There is no way to recover any of them.

When a new Internet user joins Charter, the company provides the user a free e-mail account. However, some users don't activate it, so every three months the company deletes inactive accounts.

During last week's maintenance, Charter erroneously deleted active accounts along with the others.
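The purge process the article describes can be sketched in a few lines; this is a hypothetical illustration (the `Account` type and its fields are invented, and Charter's actual system is not public), but it shows the kind of guard whose absence produces exactly this failure.

```python
# Hypothetical sketch of a quarterly inactive-account purge like the one
# described in the article; the Account type and fields are invented.
from dataclasses import dataclass

@dataclass
class Account:
    user: str
    activated: bool  # has the user ever used the mailbox?

def purge_inactive(accounts):
    """Return the accounts to keep, deleting only never-activated ones.

    The test on `activated` is the critical guard: a purge job that
    selects accounts by the wrong criterion, or skips this check,
    deletes live mailboxes along with dormant ones.
    """
    return [a for a in accounts if a.activated]

accounts = [Account("alice", True), Account("bob", False)]
kept = purge_inactive(accounts)
print([a.user for a in kept])  # ['alice']
```

In practice a deletion job like this would also be expected to run against a backup or with a soft-delete grace period, which is presumably what Charter lacked, since the company said the accounts were unrecoverable.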

Said a spokesperson for Charter: "It's never happened before." The company says it is taking steps to make sure it never happens again.

"We really are sincerely sorry for having had this happen and do apologize to all those folks who were affected by the error," she said.

In 2007, Charter was named Cable Operator of the Year by CableWorld Magazine. I wonder if this will keep them out of the running for 2008.

Computer Problem Delays East Coast Flights

A computer problem at 1800 local time Wednesday forced the Nashua flight center, officially known as the Boston Air Route Traffic Control Center (or Boston Center), to shut down and reboot a computer system used to track flight routes, aircraft types, and other key information about planes flying in and out of the region, according to the Boston Globe. The system was down for about 45 minutes, although the National Air Traffic Controllers Association (NATCA) said it was longer.

During the outage, controllers went back to entering flight information manually, and needed to call other air traffic centers to obtain aircraft information for flights entering New England airspace. The Federal Aviation Administration (FAA) said there were no safety issues, but NATCA disagreed.

" 'This was, in every possible sense, a dangerously unsafe and chaotic situation,' said Kevin Bianchi, Boston Centerâ''s NATCA facility representative. 'Controllers were in essence working blind and, in many cases, actually had to question pilots to determine their location and routes of flight. Controllers were required to use a secondary backup system to safely track aircraft.' "

The problem caused delays to flights at Logan International Airport and other New England airports, as well as on several international routes that pass through New England airspace on the way in and out of New York.

Why the problem occurred is not known. The FAA said that it is now investigating.

Two-Tier Security: One for Celebs, One for the Common Folk

The London Telegraph over the weekend reported that, "Thousands of 'high profile' people have been secretly barred from using the online tax return system amid concerns that their confidential details would be put at risk."

HM Revenue and Customs (HMRC) admitted the on-line tax system used by 3 million "common folk" was not secure enough to be used by MPs, celebrities and the Royal Family.


Those barred from using the on-line system must send in hard copies of their tax returns.

The HMRC says that, "HMRC online services are designed with security as an integral part of the service. We use leading technologies and encryption software to safeguard data and operate strict security standards."

Come again?

The HMRC claims that the celebs need extra security, and the rumor is that there is a highly-secure tax database just for them. I guess the HMRC doesn't trust its employees too much, does it?

London Crash Update


Attention in last week's crash of the Boeing 777 in London now seems to be shifting away from computer error toward something wrong with the fuel, according to several reports.

The UK's Air Accident Investigation Branch released an update which said: "As previously reported, whilst the aircraft was stabilised on an ILS approach with the autopilot engaged, the autothrust system commanded an increase in thrust from both engines. The engines both initially responded but after about 3 seconds the thrust of the right engine reduced. Some eight seconds later the thrust reduced on the left engine to a similar level. The engines did not shut down and both engines continued to produce thrust at an engine speed above flight idle, but less than the commanded thrust."

"Recorded data indicates that an adequate fuel quantity was on board the aircraft and that the autothrottle and engine control commands were performing as expected prior to, and after, the reduction in thrust."

The AAIB goes on to say that, "All possible scenarios that could explain the thrust reduction and continued lack of response of the engines," will be investigated.

The computer as culprit theory, however, still is popular, as shown in this cockpit video.

FCS: Double the Code; Double the Success?

I neglected to mention in yesterday's post on Future Combat Systems that the Washington Post story also noted that Boeing's program manager on Future Combat Systems said, "The scope and scale of the software job was well understood from the start."

From this statement, one would have to assume that the 55 million lines of code in 5.5 years was known when the contract was let; that developing that much code in that much time was deemed suitable, acceptable, and feasible; and that the Army kept it under wraps because it wasn't (and isn't) believable.

This might go a long way towards explaining another FCS program history anomaly. According to former chief of staff of the Army General Peter Schoomaker, when the contract was let in 2003, the FCS program had only a 28 percent chance of success.

After the FCS program was restructured with phased deliveries and a longer schedule, the probability of success of the program climbed to 70%, according to Schoomaker.

Now, if the estimated software code was really only 34 million in 2003, and the program had a 28% probability of success, it hardly seems likely that nearly doubling the lines of code to 64 million would raise the probability of program success to 70%, even if the program was stretched out a few more years and bits and pieces of it spun off.
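The arithmetic behind that anomaly can be laid out explicitly; this back-of-envelope check just uses the rounded figures from the post (33.7 million rounded to 34 million, 63.8 to 64 million) and makes no claim about how the Army actually computed its probabilities.

```python
# Back-of-envelope check of the figures in the post (rounded as in the
# text: 33.7M -> 34M and 63.8M -> 64M lines of code).
loc_2003, loc_now = 34_000_000, 64_000_000
p_2003, p_now = 0.28, 0.70

growth = loc_now / loc_2003   # code size nearly doubled
p_ratio = p_now / p_2003      # claimed odds of success grew 2.5x

print(f"code grew {growth:.2f}x, success probability grew {p_ratio:.1f}x")
# code grew 1.88x, success probability grew 2.5x
```

In other words, the claim is that the job got nearly twice as big while the odds of pulling it off got two and a half times better, which is the tension the rest of this post is about.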

Therefore, the evidence points to the Army having known that it would need 55 million or more lines of software code, but not wanting to tell anyone, for reasons I outlined yesterday.

On the other hand, if the Army honestly estimated that FCS would need 34 million lines of code in 2003 and the program had a 28% probability of success, how did going to 64 million lines of code increase the program's probability of success by 42 percentage points?

Future Combat Systems: Did the Code Size Double, or Did Somebody Disassemble?


There is a long article in today's Washington Post on the US Army's $200 billion Future Combat Systems (FCS) and its dependence on software. The article points out a few of the "challenges" confronting the program, like making a 63.8 million line of code system work, let alone be hacker-proof.

One bit of the story caught my eye, however. The supposedly original 2003 estimate by the Army was for FCS to require 33.7 million lines of code; that has now blossomed into a current estimate of 63.8 million lines of code. However, Boeing, the program's lead contractor, claims that the original estimate was actually for 55 million lines of code, implying the code growth isn't that bad.

The reason I find this curious is that the 33.7 million lines of code estimate has been around for several years and appears in Congressional testimony many, many times. That number gave lots of folks pause in 2003, since the Army claimed at the time that it would complete FCS in five and a half years. Questions were raised then about whether that amount of code could be developed in that time frame, but the ever-confident Army said it could be accomplished.

I have never heard or seen that 55 million lines of code number mentioned before this article. If that was the true estimate at proposal time, did the contractor and the Army "forget" to let Congress, the Government Accountability Office (GAO), and a whole bunch of other people know the true system size, so that they wouldn't ask questions in 2003 like, "Tell me again how you plan to develop and integrate an average of 10 million lines of native and commercial-off-the-shelf software per year over each of the next five years?" "Can you point to any military software-intensive development of 10 million lines of code successfully completed in 5 years?" "Can you prove you are not legally insane?"

As Ricky used to say to Lucy, "Lucy, you got some explaining to do." If not Boeing, then certainly the Army needs to explain where this 55 million lines of code number came from, if it was the originally proposed number, and why it hasn't ever been disclosed before.

Second Life Becomes Second Swipe


Linden Lab's virtual world Second Life has had to crack down on virtual banking, giving in-world banks until yesterday to shut down their operations, according to the LA Times.

The reason for Linden Lab's action is that confidence and trust in Second Life's economic underpinnings have started to erode, following the infamous Ginko Financial Ponzi-scheme scam that blew up last August, Linden Lab's cool response to those scammed once the scheme was discovered, and the continued appearance of "too good to be true" investment offers. Linden Lab must have figured out that if people started viewing Second Life as a place to be scammed, major corporations might start rethinking their association with it.

Are Future US Programmers Being Taught to be Unemployable?

In an article titled "Computer Science Education: Where Are the Software Engineers of Tomorrow?" in this month's CrossTalk (the Journal of Defense Software Engineering), and in a subsequent interview in Datamation under the title "Who Killed the Software Engineers?", two emeritus computer science professors from New York University argue that universities are so desperate to keep computer science enrollments up that they are dumbing down the curriculum to attract prospective students. This dumbing down, professors Robert B.K. Dewar and Edmond Schonberg say, is producing software engineers with a "set of skills insufficient for today's software industry (in particular for safety and security purposes), and, unfortunately, matches well what the outsourcing industry can offer. We are training easily replaceable professionals."

Dewar says in the interview that, " 'A lot of it is, "Let's make this [computer science and programming] all more fun." You know, "Math is not fun, let's reduce math requirements. Algorithms are not fun, let's get rid of them. Ewww, graphic libraries, they're fun. Let's have people mess with libraries. And [forget] all this business about "command line"; we'll have people use nice visual interfaces where they can point and click and do fancy graphic stuff and have fun.' "

Dewar goes on, " 'Universities tend to be in the raw numbers mode. Oh my God, the number of computer science majors has dropped by a factor of two, how are we going to reverse that?' "

Dewar and Schonberg point out in their article that companies like UK-based Praxis (see an article on the company published in IEEE Spectrum), which use formal methods to develop safety-critical systems, are having a hard time finding people with the proper mathematical training, even though formal methods are taught more in the UK than in the US.

I blogged a few months ago about Cambridge University having trouble recruiting computer science students, partly because the program, in Cambridge's words, "is a rigorous and demanding course." Yesterday's Globe and Mail also had a story about computer science enrollments dropping at many Canadian universities by 36% to 64%.

The article has caused a stir in the defense community, with Dewar saying that he has received a lot of support for the position in their CrossTalk article.

But is the situation as dire as professors Dewar and Schonberg claim? Is it a natural issue of supply and demand? Is it overblown, one of those "When I was your age, I had to walk fifty miles to school" arguments? Or is it something else?


IEEE Spectrum's risk analysis blog, featuring daily news, updates and analysis on computing and IT projects, software and systems failures, successes and innovations, security threats, and more.
