Canadian Auditor General Michael Ferguson’s latest assessment of the country’s misbegotten attempt to develop a new government-wide payroll system was blunt: “The building and implementation of Phoenix was an incomprehensible failure of project management and oversight… Overall, we found that there was no oversight of the Phoenix project, which allowed Phoenix executives to implement the system even though they knew it had significant problems.”
As a result of world-class mismanagement of the Phoenix project, the Canadian government now owns and operates a payroll system “that so far has been less efficient and more costly than the 40-year-old system it replaced,” Ferguson states.
Exactly what went wrong, and why? Until last week, when Ferguson published his second audit of the Phoenix project, the answers had only been hinted at in government documents. While his first audit focused on the project’s operational impacts on Canadian civil servants, this latest audit focuses on the management decisions made during its development and go-live period. Those decisions are directly responsible for the cost of the system rising from the original C$310 million estimate to at least C$1.2 billion through 2019—with tens of millions more likely to be spent before its hoped-for replacement comes online in 2025.
The Canadian government finished rolling out Phoenix in April 2016, despite numerous problems that became evident when it initially went live in February of that year. The decision to move forward, despite growing operational problems and calls to suspend its deployment, has made life a Dante’s inferno for some 193,000 civil servants. That group, more than half of Canada’s federal workers, has at various times over the past two years received either too much, too little, or no pay at all.
As of April 2018, there were still some 372,000 Phoenix payroll-related transactions waiting to be processed and corrected. Unless the current clearance rate improves, the last of the payroll problems won’t be fixed for another five years.
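The audit’s numbers imply a clearance rate, even though it doesn’t state one. A quick back-of-envelope calculation (my own, purely illustrative, and assuming the backlog shrinks at a steady net rate) shows the pace the five-year projection takes for granted:

```python
# Rough, illustrative arithmetic only: the 372,000-transaction backlog and the
# five-year projection come from the audit; the assumption of a steady net
# clearance rate is mine.
backlog = 372_000        # pay transactions awaiting processing, April 2018
years_to_clear = 5       # audit's projection at the current pace

net_per_year = backlog / years_to_clear
net_per_week = net_per_year / 52

print(f"Implied net clearance: ~{net_per_year:,.0f} transactions/year, "
      f"~{net_per_week:,.0f}/week")
```

The point is simply that “five years” assumes the net rate never worsens: new pay transactions arrive continuously, so the backlog shrinks only by the margin between items cleared and new problem cases created.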
Ferguson’s audit describes what is essentially a manual for senior managers desiring to sabotage IT projects. Phoenix executives decided to defer or remove more than 100 of Phoenix’s 984 pay processing functions, restoring them only after the system was fully deployed. The executives scaled back Phoenix’s functionality to save both development time and money because the estimated software development cost was C$119 million more than the C$155 million originally budgeted.
Instead of asking for more money, which would undoubtedly have led to a lot of uncomfortable questions from politicians who were wary of the project in the first place, Phoenix executives worked with the prime contractor, IBM, to force-fit the project into the existing budget. This required reducing its functionality, testing, schedule, and project development staffing. How much the development of Phoenix was compromised by these decisions was never communicated to the departments and agencies whose employees would bear the brunt of the program’s defectiveness.
The audit report makes clear that during the development of Phoenix, which was understood to be high risk from the start, a string of decisions turned those risks into insurmountable problems. For instance, in reviewing a sample of 81 pay processing functions, the auditors found that 20 percent failed testing. Worse, the functions that failed were never retested. Furthermore, the system was never subjected to end-to-end testing. The wrapping paper and bow on those miscues was the decision to scrub the sole Phoenix pilot rollout, which was supposed to be conducted with one department to assess how well the system worked under real-world conditions. The rationale: to save money.
Executives decided to launch Phoenix with known significant security weaknesses, the audit reveals. Although deployment began in February 2016, there was no plan to address these high-risk security weaknesses until December 2016. Similarly, a separate set of system weaknesses putting civil servants’ personal privacy at risk was not fully assessed before deployment. This led to nearly a dozen documented privacy breaches post-launch.
Moreover, a contingency plan existed in name only. The audit report states that the “plan” was finalized less than two weeks before Phoenix went live. But it “did not explain how problems would be resolved, what specific tasks would be needed to carry out the contingency plan, and who would be responsible for these tasks.” It’s no surprise that the contingency plan was never tested to see whether it would work.
Brimming with confidence based on nothing, Phoenix executives ignored the advice offered by an outside risk assessment and shut down the old payroll system when Phoenix went live rather than running the two systems in parallel.
As if the rollout wasn’t already a perfect storm, they based Phoenix on a version of PeopleSoft that will not be supported by Oracle beyond 2018. And it would have been completely uncharacteristic of the Phoenix executives if they had bothered to make plans to upgrade the software to a newer version. They apparently believed that software upgrades were unnecessary for the payroll system to remain operationally effective and secure for at least a decade.
The audit concludes that those executives “had received more than enough information and warning that Phoenix was not ready to be implemented, and therefore, they should not have proceeded as planned.” However, the executives actively dismissed all negative information and decided to go forward anyway, “prioritiz[ing] meeting schedule and cost over other critical elements, such as functionality and security.”
Just as damning: Even as Phoenix was obviously failing by the summer of 2016, Phoenix executives continued to downplay the rampant payroll problems, proclaiming instead that Phoenix was “functioning as designed.” Unfortunately, because of the project’s poor setup and lack of oversight, that statement was entirely true. It took until early this year before the government finally realized that, no matter how much money was thrown at it, there was no way to ever make Phoenix work properly.
I have read and documented many examples of incredibly poor IT management decisions over the years, such as those made in relation to the Queensland Health payroll system project and the U.S. Coast Guard electronic health record system project. However, the total number of poor executive decisions made in relation to the Phoenix effort ranks among the greatest I have ever seen on a single project.
Ferguson’s audit report lists a number of obvious recommendations for avoiding a similar debacle in the future, which Public Services and Procurement Canada, the government organization in charge of implementing Phoenix, has promised to implement. However, as Ferguson admits, putting his recommendations into place will require a major culture shift in the behavior of both government managers and line workers. This means creating a culture in which senior executives, when presented with evidence that their IT-related decisions may be wrong, actually admit it, and in which civil servants are willing to speak out publicly when their managers make senseless decisions.
Unfortunately, IT history is filled with similar auditor recommendations issued in the wake of major governmental IT project failures. One will never lose money betting against cultural shifts of the kind Ferguson recommends ever actually taking hold.
Robert N. Charette is a Contributing Editor to IEEE Spectrum and an acknowledged international authority on information technology and systems risk management. A self-described “risk ecologist,” he is interested in the intersections of business, political, technological, and societal risks. Charette is an award-winning author of multiple books and numerous articles on the subjects of risk management, project and program management, innovation, and entrepreneurship. A Life Senior Member of the IEEE, Charette was a recipient of the IEEE Computer Society’s Golden Core Award in 2008.