A controversy has erupted in England over a report that the Norfolk Constabulary has been quietly experimenting with a computer algorithm to decide whether a burglary is worth police efforts to investigate. The algorithm, developed in conjunction with the University of Cambridge, examines some 29 factors and then suggests whether or not there is a realistic prospect of solving the crime and recovering the goods.
According to the reports, the trial began in January after university scientists analyzed thousands of burglary cases in Norfolk and identified factors that seem to indicate the “solvability” of the crime. Earlier this year, the university touted its work in helping the Durham Constabulary make prisoner custodial decisions based on another AI-based algorithm that used the histories of 104,000 bail decision cases.
After the news broke, the Norfolk Constabulary tried to downplay the importance of the algorithm in making investigative decisions. A police spokesperson said in a statement, “In all cases of house burglary an officer will attend the scene and carry out an initial investigation. An assessment will then be made as to whether further inquiries are required and it is at this stage that we test the use of the algorithm.” The statement also emphasized that the algorithm’s recommendation could be overridden by Norfolk’s Investigation Management Unit.
The Constabulary did not state, however, how many cases have been investigated with the aid of the algorithm, nor how many cases it indicated were not solvable but were subsequently overridden by the Investigation Management Unit. During the past 12 months, there have been 4,012 burglaries in the Norfolk area, so the algorithm has likely been employed hundreds of times so far this year.
Norfolk’s Constabulary also hasn’t been forthcoming about the specific 29 factors being used. Speculation is that they include the “availability and quality of CCTV footage, forensic clues such as fingerprints, footwear marks or blood left at the crime scene, and similarity of the offence to others committed by known criminals.”
One can suppose that the algorithm’s exact factors and how they are weighted will be kept secret, not only for intellectual property reasons but also to keep criminals from possibly gaming the algorithm. That may be more effort than it’s worth, however, given that the current solve rate for burglary across England and Wales is around 10 percent, and already some 65 percent of cases are closed without further investigation.
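To make concrete what such a factor-based “solvability” screen might look like in principle, here is a minimal sketch of a weighted checklist score. Norfolk’s actual factors, weights, and decision threshold are unpublished, so every factor name, weight, and cut-off below is an illustrative assumption, not the Constabulary’s method.

```python
# Hypothetical sketch of a weighted "solvability" checklist.
# All factor names, weights, and the threshold are assumptions for
# illustration; the real Norfolk model's 29 factors are not public.

FACTORS = {
    "cctv_footage_available": 3.0,   # assumed weight
    "fingerprints_recovered": 2.5,
    "footwear_marks_found": 1.5,
    "blood_or_dna_at_scene": 2.5,
    "matches_known_offender_mo": 2.0,
}

THRESHOLD = 4.0  # assumed cut-off for a "realistic prospect" of solving


def solvability_score(case: dict) -> float:
    """Sum the weights of the factors present in this burglary case."""
    return sum(weight for factor, weight in FACTORS.items() if case.get(factor))


def recommend_investigation(case: dict) -> bool:
    """True if the weighted score clears the threshold.

    In the Norfolk trial, any such recommendation could still be
    overridden by the force's Investigation Management Unit.
    """
    return solvability_score(case) >= THRESHOLD


case = {"cctv_footage_available": True, "fingerprints_recovered": True}
print(recommend_investigation(case))  # 3.0 + 2.5 = 5.5 >= 4.0 → True
```

A real system would likely be a statistical model fitted to historical case outcomes rather than hand-set weights, but the basic shape — evidence factors in, a score and a yes/no recommendation out — is the same.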
One reason Norfolk Constabulary’s use of an algorithm to determine whether it’s worth police time to investigate has struck a nerve is that burglary has increased across England and Wales during a period in which police forces have experienced dramatic cutbacks in resources. According to the latest statistics, domestic burglary has risen 32 percent over the past year, while the number of police has been slashed by over 15 percent since 2009. The Norfolk police force itself has been reduced some 31 percent over the same period, with over 100 staff let go this year alone. Some are now wondering whether the years of cutbacks have reached a dangerous tipping point.
While the use of the algorithm may indeed help the Norfolk Constabulary best allocate its increasingly scarce police assets, it may be creating its own set of public trust issues as well. For example, the head of the Police Federation, John Apter, called the use of the algorithm “insulting” to burglary victims and warned that it risked alienating the public. Public trust in the police is not high across England and Wales, and Apter cautioned that the force’s increasing dependence on technology erodes that trust even further.
Apter’s sentiment was echoed this week by the incoming president of the British Science Association, James Al-Khalili. He cautioned that using artificial intelligence algorithms risks creating a public backlash if there isn’t more transparency about where, when, and how these algorithms are being used. The fact that the Norfolk Constabulary’s test of its burglary algorithm only came to light after a newspaper story about it is a case in point.
The U.K. government has had a long infatuation with applying technology as a means to reduce the cost and increase the effectiveness of policing. For instance, in the mid-1990s, the government touted CCTV cameras as an effective and inexpensive substitute for the “bobbies on the beat” method of policing under its Partners Against Crime Initiative. Successive governments have since poured several hundred million pounds into procuring hundreds of thousands of CCTV cameras claiming that they not only saved money but also deterred crime, helped solve crimes, and made citizens feel safer.
These contentions are debatable, especially given that government-directed police force cutbacks have reduced the number of CCTV cameras being actively monitored. In addition, some police forces, such as Dyfed-Powys Police, have acknowledged over the past few years that CCTV is not as effective for crime prevention or postcrime investigation as expected, and have decided to put more bobbies back on the beat.
The use of CCTVs by police forces has also faced criticism that they force officers into a passive, detached oversight role concealed in the police station instead of being highly visible and active members of the community they police. This, along with the force reductions, is seen as one reason why one in three people reported they hadn’t seen a police officer walking a beat in over a year.
Nevertheless, other British police forces haven’t given up on their CCTV investments, and instead have tried to increase their effectiveness with still more technology, namely combining them with automated facial recognition (AFR) software. These experiments have not exactly been an overwhelming success.
However, that hasn’t dampened the police forces’ enthusiasm for AFR or other emerging AI technologies like predictive policing. That unbridled enthusiasm only seems to reinforce the British public’s belief that the police are more interested in deploying their latest technological toys than in interacting with the communities they serve.
While the Norfolk Constabulary claims that it is only testing the use of its burglary solvability algorithm, it would be surprising if it gave up using the algorithm after the trial ends. The ongoing police resource reductions make using such a technology almost an imperative.
The real question is whether the Constabulary decides—given that future budget reductions are almost inevitable—to yield to the algorithm the ultimate authority to decide whether to send an investigator to the scene of a burglary in the first place based on its solvability score. If that happens, the British public may begin to seriously question, as it has already started to do, whether they really need a police force at all.
Robert N. Charette is a Contributing Editor to IEEE Spectrum and an acknowledged international authority on information technology and systems risk management. A self-described “risk ecologist,” he is interested in the intersections of business, political, technological, and societal risks. Charette is an award-winning author of multiple books and numerous articles on the subjects of risk management, project and program management, innovation, and entrepreneurship. A Life Senior Member of the IEEE, Charette was a recipient of the IEEE Computer Society’s Golden Core Award in 2008.