The New Zealand government claims to have become the first in the world to outline a set of standards guiding the use of algorithms by public agencies, once again shining a light on what governments around the world are doing to ensure scrutiny of tools increasingly relied on in crucial areas such as criminal risk assessment and sentencing.

New Zealand’s Minister for Statistics, James Shaw, on 28 July launched the Algorithm Charter for Aotearoa New Zealand, commenting: “We live in a data-rich world where algorithms play a crucial role in helping us to make connections, and identify relationships and patterns across vast quantities of information. This helps to improve decision-making and leads to benefits such as the faster delivery of targeted public services.

“However, using algorithms to analyse data and inform decisions does not come without its risks. It is important, therefore, that people have confidence that these algorithms are being used in a fair, ethical, and transparent way. And that’s what this Charter is all about.”

Where algorithms are being employed by government agencies in a way that can significantly affect people’s wellbeing, or where there is a high likelihood that many people will suffer an unintended adverse impact, the Charter is intended to apply.

The commitment includes a need to clearly explain how decisions are informed by algorithms, including publishing information about how data is collected, secured and stored. There is also a commitment to make sure that data is fit for purpose by understanding its limitations and managing bias.

The Charter has been signed by 21 agencies, including the Ministry for the Environment, Ministry of Education, the Department of Internal Affairs, Ministry of Justice, and Inland Revenue. The Charter commits these agencies to a range of measures, including explaining how decisions are informed by algorithms and embedding a Te Ao Māori perspective in the development and use of algorithms.

“Most New Zealanders recognise the important role algorithms play in supporting government decision-making and policy delivery, however they also want to know that these systems are being used safely and responsibly,” said Shaw. “The Charter will give people that confidence. It will help to build public trust over the long term, meaning that we can unlock the full potential of data to improve people’s lives.

“Today we have set a world-leading example of how government can work with diverse groups of people, communities and organisations to improve transparency and accountability in the use of data. It is an example that we hope others will follow.”

Government agencies often do not write their own algorithms and may have limited insight into how the software makes decisions. However, such tools are increasingly relied on in areas such as facial recognition and criminal sentencing. In Wisconsin v Loomis, for example, the judge handed Eric Loomis a lengthy sentence in connection with a drive-by shooting, relying in part on the COMPAS risk assessment tool, while disallowing his challenge to examine how the algorithm worked.

Commenting on the news of the new Algorithm Charter, Stevie Ghiassi, CEO of Legaler, said on LinkedIn: “In light of all the hype, hopes and concerns around OpenAI’s GPT-3 #artificialintelligence, here’s a timely world-first from the New Zealand government (why do they always have to shame everyone else?)

“They are the first in the world to outline a set of standards to guide the use of algorithms by public agencies. Will this lay the foundation for other jurisdictions and maybe even the private sector?”

In June, a heavyweight group of AI experts from the likes of Google and Microsoft, together with researchers and academics, urged against the publication of a new study that claims to identify or predict criminality from biometric or criminal legal data, saying that such studies are inherently racially biased and naturalise discriminatory outcomes.

The publication in question is “A Deep Neural Network Model to Predict Criminality Using Image Processing”, planned for publication by Springer Publishing. In a letter dated 22 June to the Springer Editorial Committee, the group of around 1,700 expert researchers and practitioners said: “We urge the review committee to publicly rescind the offer for publication of this specific study, along with an explanation of the criteria used to evaluate it.” The group also wants “Springer to issue a statement condemning the use of criminal justice statistics to predict criminality, and acknowledging their role in incentivising such harmful scholarship in the past”, and asks that “all publishers refrain from publishing similar studies in the future.”