Risk Factor

Two-Tier Security: One for Celebs, One for the Common Folk

The London Telegraph over the weekend reported that, "Thousands of 'high profile' people have been secretly barred from using the online tax return system amid concerns that their confidential details would be put at risk."

HM Revenue and Customs (HMRC) admitted the on-line tax system used by 3 million "common folk" was not secure enough to be used by MPs, celebrities and the Royal Family.

Lovely.

Those barred from using the on-line system must send in hard copies of their tax returns.

The HMRC says that, "HMRC online services are designed with security as an integral part of the service. We use leading technologies and encryption software to safeguard data and operate strict security standards."

Come again?

The HMRC claims that the celebs need extra security, and the rumor is that there is a highly-secure tax database just for them. I guess the HMRC doesn't trust its employees too much, does it?

London Crash Update


Attention in the investigation of last week's crash of the Boeing 777 in London now seems to be moving away from computer error and toward something wrong with the fuel, according to several reports.

The UK's Air Accident Investigation Branch released an update which said: "As previously reported, whilst the aircraft was stabilised on an ILS approach with the autopilot engaged, the autothrust system commanded an increase in thrust from both engines. The engines both initially responded but after about 3 seconds the thrust of the right engine reduced. Some eight seconds later the thrust reduced on the left engine to a similar level. The engines did not shut down and both engines continued to produce thrust at an engine speed above flight idle, but less than the commanded thrust."

"Recorded data indicates that an adequate fuel quantity was on board the aircraft and that the autothrottle and engine control commands were performing as expected prior to, and after, the reduction in thrust."

The AAIB goes on to say that, "All possible scenarios that could explain the thrust reduction and continued lack of response of the engines," will be investigated.

The computer-as-culprit theory, however, is still popular, as shown in this cockpit video.

FCS: Double the Code; Double the Success?

I neglected to mention in yesterday's post on Future Combat Systems that the Washington Post story also noted that Boeing's program manager on Future Combat Systems said, "The scope and scale of the software job was well understood from the start."

From this statement, one would have to assume that the need for 55 million lines of code in 5.5 years was known when the contract was let; that developing that much code in that much time was deemed suitable, acceptable, and feasible; and that the Army kept it under wraps because it wasn't (and isn't) believable.

This might go a long way towards explaining another anomaly in the FCS program's history. According to former Army Chief of Staff General Peter Schoomaker, when the contract was let in 2003, the FCS program had only a 28 percent chance of success.

After the FCS program was restructured with phased deliveries and a longer schedule, the probability of success of the program climbed to 70%, according to Schoomaker.

Now, if the estimated software size was really only 34 million lines of code in 2003, and the program had a 28% probability of success, it hardly seems likely that nearly doubling the lines of code to 64 million would raise the probability of program success to 70%, even if the program was stretched out a few more years and bits and pieces of it spun off.

Therefore, the evidence points to the Army having known that it would need 55 million or more lines of software code but not wanting to tell anyone, for the reasons I outlined yesterday.

On the other hand, if the Army honestly estimated that FCS would need 34 million lines of code in 2003 and the program had a 28% probability of success, how did going to 64 million lines of code increase the program's probability of success from 28% to 70%?
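Just to put numbers on that objection, here is a minimal back-of-the-envelope sketch in Python using the figures cited above; the variable names and the calculation itself are mine, purely for illustration.

# Back-of-the-envelope comparison of FCS code growth vs. claimed success probability.
# Figures come from the reporting cited above; names and calculation are illustrative only.
original_sloc_m = 33.7   # 2003 estimate, millions of lines of code
current_sloc_m = 63.8    # current estimate, millions of lines of code
p_success_2003 = 0.28    # Schoomaker: probability of success when the contract was let
p_success_now = 0.70     # Schoomaker: probability after the program was restructured

growth_factor = current_sloc_m / original_sloc_m    # roughly 1.9x more code
probability_gain = p_success_now - p_success_2003   # 42 percentage points

print(f"Code estimate grew by a factor of {growth_factor:.2f}")
print(f"Claimed probability of success rose by {probability_gain:.0%}")

The point of the arithmetic is simply that the code estimate and the claimed probability of success moved sharply in the same direction, which is the opposite of what one would normally expect.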

Future Combat Systems: Did the Code Size Double, or Did Somebody Disassemble?


There is a long article in today's Washington Post on the US Army's $200 billion Future Combat Systems (FCS) program and its dependence on software. The article points out a few of the "challenges" confronting the program, like making a 63.8-million-line-of-code system work, let alone making it hacker-proof.

One bit of the story caught my eye, however. The supposedly original 2003 estimate by the Army was for FCS to require 33.7 million lines of code; it has now blossomed to a current estimate of 63.8 million lines of code. However, Boeing, the program's lead contractor, claims that the original estimate was actually 55 million lines of code, implying that the code growth isn't that bad.

The reason I find this curious is that the 33.7 million lines of code estimate has been around for several years and appears in Congressional testimony many, many times. That number gave lots of folks pause in 2003, since the Army claimed at the time that it would complete FCS in five and a half years. Questions were raised then about whether that amount of code could be developed in that time frame, but the ever-confident Army said it could be accomplished.

I have never heard or seen that 55 million lines of code figure mentioned before this article. If that was the true estimate at proposal time, did the contractor and the Army "forget" to let Congress, the Government Accountability Office (GAO), and a whole bunch of other people know the true system size so that they wouldn't ask questions in 2003 like, "Tell me again how you plan to develop and integrate an average of 10 million lines of native and commercial-off-the-shelf software per year over each of the next five years?" "Can you point to any military software-intensive development of 10 million lines of code successfully completed in 5 years?" "Can you prove you are not legally insane?"
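As a rough sanity check on that first question, here is a minimal sketch of the development-rate arithmetic in Python; the 5.5-year schedule and the two size estimates come from the article, while the calculation itself is mine.

# Rough development-rate arithmetic for FCS, using the figures cited above.
schedule_years = 5.5
for label, sloc_millions in [("Army 2003 estimate", 33.7), ("Boeing claimed estimate", 55.0)]:
    rate = sloc_millions / schedule_years
    print(f"{label}: {sloc_millions} million SLOC over {schedule_years} years "
          f"= about {rate:.1f} million SLOC per year")

Under the 55-million-line figure, that works out to the roughly 10 million lines of code per year mentioned above; even the 33.7-million-line figure implies more than 6 million lines per year.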

As Ricky used to say to Lucy, "Lucy, you got some explaining to do." If not Boeing, then certainly the Army needs to explain where this 55 million lines of code number came from, whether it was the originally proposed number, and why it hasn't ever been disclosed before.

Second Life Becomes Second Swipe


Linden Lab, operator of the virtual world Second Life, has had to crack down on virtual banking, giving in-world banks until yesterday to shut down their operations, according to the LA Times.

The reason for Linden's action is that confidence and trust in Second Life's economic underpinnings have started to erode, following the infamous Ginko Financial Ponzi scheme that blew up last August, Linden Lab's cool response to those who were scammed once the scheme was discovered, and the continued appearance of "too good to be true" investment offers. Linden Lab must have figured out that if people started viewing Second Life as a place to be scammed, major corporations might start rethinking their association with it.

Are Future US Programmers Being Taught to be Unemployable?

In an article titled "Computer Science Education: Where Are the Software Engineers of Tomorrow?" in this month's CrossTalk (the Journal of Defense Software Engineering), and in a subsequent interview in Datamation under the title "Who Killed the Software Engineers", two emeritus computer science professors from New York University argue that universities are so desperate to keep computer science enrollments up that they are dumbing down the curriculum to attract prospective students. This dumbing down, professors Robert B.K. Dewar and Edmond Schonberg say, is producing software engineers with a "set of skills insufficient for today's software industry (in particular for safety and security purposes), and, unfortunately, matches well what the outsourcing industry can offer. We are training easily replaceable professionals."

Dewar says in the interview that, " 'A lot of it is, "Let's make this [computer science and programming] all more fun." You know, "Math is not fun, let's reduce math requirements. Algorithms are not fun, let's get rid of them. Ewww - graphic libraries, they're fun. Let's have people mess with libraries. And [forget] all this business about the command line - we'll have people use nice visual interfaces where they can point and click and do fancy graphic stuff and have fun." ' "

Dewar goes on, " 'Universities tend to be in the raw numbers mode. Oh my God, the number of computer science majors has dropped by a factor of two, how are we going to reverse that?' "

Dewar and Schonberg point out in their article that companies like UK-based Praxis (see an article on the company published in IEEE Spectrum), which use formal methods to develop safety-critical systems, are having a hard time finding people with the proper mathematical training, even though formal methods are taught more in the UK than in the US.

I blogged a few months ago about Cambridge University having trouble recruiting computer science students, part of the reason being that the program, in Cambridge's words, "is a rigorous and demanding course." Yesterday's Globe and Mail also had a story about computer science enrollments dropping at many Canadian universities by 36% to 64%.

The article has caused a stir in the defense community, with Dewar saying that he has received a lot of support for the position in their CrossTalk article.

But is the situation as dire as professors Dewar and Schonberg claim? Is it a natural issue of supply and demand? Is it overblown, one of those "When I was your age, I had to walk fifty miles to school" arguments? Or is it something else?

UK Loses Same Personal Data Twice

In a highly embarrassing, politically damaging and somewhat bizarre admission, the UK government over the past few days announced that (at least) three Ministry of Defence (MoD) laptops containing the personal details of hundreds of thousands of military personnel and recruits have been lost.

An MoD laptop containing details of over half a million individuals who applied to join the military over the past decade was lost in October 2006. Another laptop, lost in December 2005, held the details of 500 individuals. And then there was the one lost on the 9th of January of this year, which contained the personal details of 153,000 potential recruits, as well as the banking details of 3,700 service members.

What has made members of Parliament furious is that the data was not encrypted; much of the same data apparently has been lost twice; no one can explain to them why personal information was on these laptops in the first place; and the gravest sin of all is that members were never told about the 2005 and 2006 incidents until this week. They only came to light because of the investigation into the 2008 lost laptop incident.

Promises by the MoD to safeguard information in the future have been met with skepticism - to put it mildly.

I wonder if we are witnessing a UK government - or at least a Prime Minister - ready to fall because of failure to protect its citizens' personal information. All it may take is one more loss of good size to do it, I think.

Utilities at Risk of Being Hacked: CIA


A story that appeared over the weekend in the Washington Post and elsewhere tells of a CIA warning to US utilities that hackers have broken "into the computer systems of utility companies outside the United States and made demands, in at least one case causing a power outage that affected multiple cities."

The warning was made by Tom Donahue, the CIA's top IT security analyst, last Wednesday at a trade conference in New Orleans sponsored by the SANS Institute.

According to the Post story, " 'We suspect, but cannot confirm, that some of the attackers had the benefit of inside knowledge,' Donahue said. He did not specify where or when the attacks took place, their duration or the amount of money demanded. Little said the agency would not comment further."

The warning was taken more seriously than most because the CIA is normally pretty mum on what it knows or is doing in the area of cyber-security.

As a footnote, the Post said that, "On Thursday, the Federal Energy Regulatory Commission approved eight cybersecurity standards for electric utilities. They involve identity controls, training, security 'perimeters,' physical security of critical cyber equipment, incident reporting and recovery." You can read more about the standards here and see the 221 pages of detail here.

Boeing Crash: Speculation Continues Unabated


The cause of last week's crash of a British Airways Boeing 777 at London's Heathrow Airport is still unclear. Crash investigators promise a preliminary report within a month.

Speculation about the cause currently runs from a problem with the airplane's electrics, avionics system, and/or engine control automation (reported in the Sunday Times and yesterday's London Guardian) to something wrong with either the aircraft's fuel system or the fuel itself that led to fuel starvation (Sunday Express). Just about every British paper has a theory, it seems.

What is known is that about 2 miles from the airport and 600 feet up, "the autothrottle demanded more thrust. It was a normal procedure, a small adjustment intended to keep the plane at the correct speed and height. Nothing happened. The computer system again ordered more thrust. Again, no response." The pilots apparently then tried to increase the throttle manually, and again, no response. Skilled airmanship brought the 777 into what one could call a semi-controlled crash, which, fortunately, didn't result in any loss of life.

The plane's wreckage is being moved to British Airways' Hatton Cross engineering facility about 500 meters from the crash site for further investigation. If a rare software anomaly is found to be the problem - as it was in the Malaysian 777-200 incident of 2005 (see the Australian Transport Safety Bureau incident report, and a brief description of it in today's Sunday Times) - then expect some additional fallout for the Boeing 787 development.

UPDATE: Peter Ladkin points out that a preliminary crash report is required within 30 days (I wrote promised, which implies something else). As Peter noted, the UK is an International Civil Aviation Organization (ICAO) signatory, and ICAO signatories are required to produce accident reports according to a general standard format; they are also required to issue a preliminary report within 30 days of the accident.

UPDATE 1: Today's London Times is claiming that, "British Airways technical staff believe that the Boeing aircraft's computerised control system caused both engines to fail during its final descent towards Heathrow on Thursday." We shall see.

Boeing B787 Network Certification Requirement

Greetings, folks. I am Peter Ladkin and hope to be contributing on safety matters, especially in transportation.

Bob wrote recently about the FAA's new certification requirement on the Boeing B787 "Dreamliner" networks. I checked it out.

The FAA makes regulatory requirements (which are administrative law) by publishing a Notice of Proposed Rulemaking (NPR) in the Federal Register (FR), collecting comments, and implementing the rule in the light of comments. The NPR was published in FR 72(71) on April 13, 2007, eight months ago. The FAA received comments from Airbus and from the Air Line Pilots Association, and issued the rule, unchanged, with answers to the comments, in FR 73(1) on January 2, 2008, whence the brouhaha in Wired.

So far, this all looks routine. Let's look at what the rule does.

There are three "domains" for networks in the B787: the Aircraft Control Domain (ACD), the Airline Information Domain (AID) and the Passenger Information and Entertainment Domain (PIED). The ACD is the safety-critical bit. The PIED is the passenger network. The rule says "the design shall prevent all inadvertent or malicious changes to, and all adverse impacts upon, all systems, networks, hardware, software, and data in the Aircraft Control Domain and in the Airline Information Domain from all points within the Passenger Information and Entertainment Domain." It would be hard to get any more stringent than that.

Why are the FAA doing this now? Because they have perceived a gap in existing regulation which needs to be filled. And it needs to come now because Boeing are certifying the aircraft now. Airbus wanted more generally applicable conditions along with guidance on how to comply. The FAA replied that they are working on that, but the B787 needs it right now.

A colleague suggested the least expensive way of fulfilling this criterion might be to separate the domains physically. Well, I am not sure that can be done, since some of the AID as well as PIED are wireless. In some current fleets, for example, sensor data and other data in the aircraft control networks is siphoned off to go to, amongst other things, the Quick Access Recorder (QAR), which records data on the flight for airline flight quality control and maintenance. At least one major airline downloads the QAR data at the end of each flight directly through the local cell phone network at the destination. So one already has potential interconnections between public networks and aircraft control networks in which all the bad stuff must be controlled (and is, by obvious means).
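To make the requirement concrete, here is a minimal, purely hypothetical sketch in Python of a deny-by-default inter-domain policy of the sort the special condition seems to demand. It is emphatically not Boeing's design, which is not public; only the domain names come from the rule, and the single allowed flow is an assumption based on the QAR example above.

# Hypothetical deny-by-default inter-domain policy, sketched from the FAA special condition.
# NOT Boeing's actual design; it only illustrates the requirement that nothing originating
# in the passenger domain may reach or affect the other two domains.
ACD, AID, PIED = "Aircraft Control Domain", "Airline Information Domain", "Passenger Domain"

# Allowed one-way flows (source, destination). Everything not listed is denied.
ALLOWED_FLOWS = {
    (ACD, AID),   # e.g., sensor data siphoned off to the Quick Access Recorder
}

def flow_permitted(source: str, destination: str) -> bool:
    """Return True only if traffic from source to destination is explicitly allowed."""
    if source == PIED and destination in (ACD, AID):
        return False   # the special condition: no path from the passenger domain inward
    return (source, destination) in ALLOWED_FLOWS

assert not flow_permitted(PIED, ACD)   # passenger network can never reach flight controls
assert flow_permitted(ACD, AID)        # operational data may still flow outward

The hard part, of course, is not writing such a policy but assuring that every physical and wireless path in the aircraft actually enforces it, which is precisely the monster question raised below.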

Why aren't the FAA requiring something similar for ACD/AID interaction? They are; they say this is covered by existing regulation as well as by other special conditions (which I haven't yet seen).

So this all looks like routine admin stuff. I don't see anything below the surface. Except, of course, for the monster question of how one assures the absolute security that appears to be required. I don't know who can answer that question, and I doubt Boeing's answer will enter the public domain.
