“Police are innovating in silos”: Use of algorithms in the justice system needs “urgent oversight”
A year after the Law Society’s legal and regulatory policy director Sophia Adams Bhatti conceived of the idea for a commission on the use of AI in the criminal justice system, the Society has published its final report, which finds that the ad hoc use of complex algorithms in the justice system requires urgent oversight in the face of “a worrying lack of oversight or framework.”
The Technology and Law Public Policy Commission set out last year to review how AI is used in the justice system and its ethical and rule-of-law implications, led by Law Society president and commissioner Christina Blacklaws.
Having conducted 121 interviews, the Commission sets out the report’s key findings:
There is a lack of explicit standards, best practice, and openness or transparency about the use of algorithmic systems in criminal justice across England and Wales.
There are significant challenges of bias and discrimination, opacity and due process, consistency, amenability to scrutiny, effectiveness, and disregard of qualitative and contextual factors, against the backdrop of these systems’ potential to fundamentally change how the law evolves. The Commission recommends that a National Register of Algorithmic Systems should be created as a crucial initial scaffold for further openness, cross-sector learning and scrutiny.
Some systems and databases operating algorithms in the criminal justice system, such as facial recognition in policing and some uses of mobile device extraction, lack a clear and lawful basis. “This must be urgently examined, publicly clarified and rectified if necessary,” the report finds. “While the United Kingdom has more explicit provisions covering algorithmic systems than many other parts of the world, these contain significant omissions and loopholes that need joined-up consideration.”
Several clarifications and changes to data protection legislation, procurement codes, freedom of information law, equality duties and statutory oversight and scrutiny bodies are needed to provide key safeguards to the integrity of criminal justice in the digital age.
“Many of the heavily individualised, legal safeguards proposed to algorithmic systems in commercial domains, such as individual explanation rights, are unlikely to be very helpful in criminal justice, where imbalances of power can be extreme and are exacerbated by dwindling levels of legal aid,” the report says.
“Societal, systemic oversight must be placed at the forefront of algorithmic systems in this sector, which will require innovative and world-leading policies. The United Kingdom has a window of opportunity to become a beacon for a justice system trusted to use technology well, with a social licence to operate and in line with the values and human rights underpinning criminal justice. It must take proactive steps to seize that window now.”
While many deployments are in a pilot or experimental stage, the Commission notes that the technologies being deployed are not so technically novel that they cannot be critically assessed by multi-disciplinary teams for their effectiveness, their conformity to real challenges, and their potential for unintended and undesirable side effects, particularly from optimising for some goals or aspects of an issue to the detriment of others.
“It is key that in-house capacity is built and retained for overseeing and steering these systems, and that coordination occurs across the justice system to ensure this capacity is world-class,” the report finds.
Deciding on ‘value-laden choices’ and resolving tensions will require the involvement of a broad range of stakeholders in civil society, academia and technology firms, as well as the justice system more broadly.
The Commission has published an interactive map (see link below) that allows the public to see for the first time the beginnings of an overview of where algorithms are being used to assist decision-making across the justice system in England and Wales.
The report recommends:
Oversight – A range of new mechanisms and institutional arrangements should be created and enhanced to improve oversight of algorithms in the criminal justice system.
Strengthening Algorithmic Protections in Data Protection – The protections concerning algorithmic systems in Part 3 of the Data Protection Act 2018 should be clarified and strengthened.
Protection beyond Data Protection – Existing regulations concerning fairness and transparency of activities in the justice sector should be strengthened in relation to algorithmic systems.
Procurement – Algorithmic systems in the criminal justice system must allow for maximal control, amendment and public-facing transparency, and be tested and monitored for relevant human rights considerations.
Lawfulness – The lawful basis of all algorithmic systems in the criminal justice system must be clear and explicitly declared in advance.
Analytical Capacity and Capability – Significant investment must be made to support the ability of public bodies to understand the appropriateness of algorithmic systems and, where appropriate, how to deploy them responsibly.
“Police, prisons and border forces are innovating in silos to help them manage and use the vast quantities of data they hold about people, places and events,” said Blacklaws.
“Complex algorithms are crunching data to help officials make judgement calls about all sorts of things – from where to send a bobby on the beat to who is at risk of being a victim or perpetrator of domestic violence; who to pick out of a crowd, let out on parole or which visa application to scrutinise.
“While there are obvious efficiency wins, there is a worrying lack of oversight or framework to mitigate some hefty risks – of unlawful deployment, of discrimination or bias that may be unwittingly built in by an operator.
“These dangers are exacerbated by the absence of transparency, centralised coordination or systematic knowledge-sharing between public bodies. Although some forces are open about their use of algorithms, this is by no means uniform.”
You can read the report in full here: http://www.legaltechnology.com//wp-content/uploads/2019/06/algorithms-in-criminal-justice-system-report-2019.pdf
See which police forces are using algorithms and for what: https://www.lawsociety.org.uk/support-services/research-trends/algorithms-in-the-justice-system/