Algorithmic transparency in government
The government is increasingly using algorithms to speed up work processes and improve services. This also applies to the police, for example in the case of the Intelligent Crime Reporting Tool: an algorithmic recommendation system intended to facilitate and speed up the online reporting process for both citizens and the police. The use of algorithms by public organisations offers many advantages, such as more efficient and effective work processes and services. At the same time, there are risks, such as biased or opaque decision-making. Transparency about algorithms is often mentioned as a solution to provide insight into how they work, what data they use and what impact they have. However, this emphasis on transparency raises questions that Esther Nieuwenhuizen investigated.
On Friday 19 September, Esther Nieuwenhuizen successfully defended her PhD thesis Algorithmic transparency in government. A multi-level perspective on transparency of and trust in algorithm use by governments.
In public debates, we often hear the optimistic view that making government algorithms transparent will automatically lead to trust. In her dissertation, Esther Nieuwenhuizen investigates how transparency actually influences citizens' trust in algorithm use by public organisations, such as the police. Although transparency is often mentioned as the key to trust, it is unclear what exactly this means and how it works. She investigates this in four sub-studies, based on a literature review, document analysis, interviews and survey experiments.
No one-size-fits-all solution for transparency
First of all, Nieuwenhuizen shows that transparency has a communicative function. Transparency can directly lead to increased trust by providing citizens and other stakeholders with information about why algorithms result in a certain outcome.
In addition to this communicative function, transparency also has a disciplinary function. Because public organisations have to provide insight into their algorithm use, they often improve their algorithmic processes. This can indirectly create more trust.
Finally, the findings in her dissertation show that designing transparency practices is difficult. Unfortunately, there is no one-size-fits-all solution. You cannot serve all target groups (citizens, regulators, journalists, NGOs, politicians and others) with the same information. The information you make public must be meaningful to the group you want to inform.
Thinking about transparency prior to the use of algorithms
In her dissertation, Nieuwenhuizen calls for transparency-by-design: she hopes that her insights will encourage public organisations to think about how they can be transparent about the functioning and use of their algorithms before they develop and deploy an algorithm. If an organisation cannot explain why an algorithm comes to a certain decision, or how employees use it in decision-making processes, the question arises as to whether it should be used at all.
More information
Esther Nieuwenhuizen is Senior Inspector Police at the Inspectorate of Justice and Security and was a PhD student at the Utrecht University School of Governance (USG).
E.N. Nieuwenhuizen, Algorithmic transparency in government. A multi-level perspective on transparency of and trust in algorithm use by governments.
You can find the full text via the Utrecht University Repository.