For many years, the Netherlands was proud of its advanced digitisation, but the scandal resulting from an algorithm-based policy to detect benefits fraud shows that the haphazard use of such algorithms in data processing can have far-reaching consequences. Marlies van Eck, lawyer and researcher of technology and tax law at Radboud University, sounded the alarm several years ago. “For a long time, we did not want to face how interdependent all those systems are and how enormous the consequences could be if something went wrong.”

“We must now learn lessons from the past to prevent bigger problems in the future,” Van Eck warned in a Universiteit van Nederland video in 2018. That people have been slow to learn these lessons is evident from the benefits scandal: in recent months and years it became clear, slowly but surely, how the house of cards built on algorithms was about to collapse. “In the Netherlands we have been working with algorithms for so long that we sometimes deal with them too uncritically.”

Van Eck was awarded her PhD in 2018 for her thesis on automated chain decisions by the government. These automated decisions were first used in the Netherlands almost 50 years ago, when the government introduced a vehicle registry to automate the motor vehicle tax. Since then, a huge network of separate systems has developed, all of which send data to each other and influence each other. For example, the Basisregistratie Personen (Personal Records Database) keeps track of your personal data, such as the composition of your family, which in turn influences the amount of your government allowance, which in turn influences your income tax, and so on.

From flexible laws to rigid algorithms

This chain approach seems very efficient, at least on paper.
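The chain effect described here, one registry automatically feeding the next, can be sketched in a few lines of Python. This is a purely illustrative toy: the registry names, the subscription mechanism, and the rates are invented for the example and do not correspond to any real government system.

```python
# Illustrative toy of chained automated decisions: each registry derives
# its value from an upstream input and pushes the result downstream.
# Names and rates are hypothetical, not any real Dutch system.

class Registry:
    def __init__(self, name, derive):
        self.name = name
        self.derive = derive       # how this registry computes its value
        self.subscribers = []      # downstream registries notified of changes
        self.value = None

    def update(self, source_value):
        self.value = self.derive(source_value)
        for sub in self.subscribers:   # propagation is automatic...
            sub.update(self.value)     # ...and so is any error

# Hypothetical chain: household size -> allowance -> tax adjustment
persons = Registry("persons", lambda n: n)
allowance = Registry("allowance", lambda household: 100 * household)
tax = Registry("tax_adjustment", lambda amount: round(amount * 0.37))
persons.subscribers.append(allowance)
allowance.subscribers.append(tax)

persons.update(3)                  # one correct entry reaches everything
print(allowance.value, tax.value)  # 300 111

persons.update(-1)                 # ...and so does one bad entry
print(allowance.value, tax.value)  # -100 -37
```

The point of the sketch is the last two lines: the same mechanism that spreads a correct record within 24 hours spreads an incorrect one just as quickly, with no human step in between to catch it.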
While many other countries still require you to go physically from one government agency to another with a document in hand, in the Netherlands a single change in your personal records reaches all affiliated government agencies within 24 hours. But that also makes it more difficult to undo a mistake. An incorrectly recorded detail in one corner of the system can cause chaos in your life for years to come, as the benefits scandal has shown. This is especially the case if the government chooses to use all those algorithms to detect fraud, because, as Van Eck emphasises, that is a major weakness in the chain decision system.

“The benefits scandal resulted from a combination of public administrators who were under pressure to tackle fraud, and who then decided to act, assuming that technology could easily deal with the problem. In a manner of speaking, you can stop the benefits of thousands of people with a few clicks of a button, without having any insight into the consequences. This would have been impossible in the past, when officials had to manually search hundreds of thousands of applications for signs of fraud. Now you simply tell the computer what to look for, and the technology will do the rest.”

However, this does not tie in with how administrative law works in practice. Van Eck: “In a certain sense, the laws themselves are flexible. Laws are often formulated with terms such as ‘if reasonable’ and ‘where appropriate’.” But an algorithm has difficulty seeing such underlying factors and must in fact interpret everything in black and white. It is up to politicians and other policy makers to take sufficient account of the human dimension when using algorithms.

“For a long time, we assumed that algorithms are objective and actually prevent discrimination. And to some extent that is true.
I have students from South America who are happy that in the Dutch system your identity is reduced to your social security number, and not, for example, to your last name. In their experience, this provides more assurance of equal opportunities. But it is a mistake to assume that the law can be directly translated into computer code. This is because laws are rarely so detailed that all steps of a process are specified.” And that is precisely what an algorithm needs to perform its job effectively.

Of course, it is then up to the courts to take account of that human dimension, which is something that went wrong during the benefits scandal. If a victim of the scandal could not submit the correct invoices quickly enough, repayment of all benefits was demanded almost immediately. Van Eck also came across this in her research. “Everyone thought there was a reason for everything. If the system detected a problem, they assumed the system was correct. During my research, everyone kept sending me to someone else who they thought was responsible. Without the human dimension, we suffer from our own regulatory mania.”

Simplification

Unfortunately, it will be quite a job to get the genie back into the bottle. The Netherlands has progressed so far with digitisation that the process cannot be reversed. “The days when thousands of civil servants manually checked everything are over. It is, in fact, the ‘paradox of the early adopter’. In the Netherlands, we started digitisation so early that we began stacking algorithms on top of each other. For every new situation we devised a new registry, with new algorithms to connect it all. As a result, we have old systems – some of them up to 40 years old – that still have to work with new systems.”

It is time to simplify, she argues. “People must again be central. Stop the compartmentalisation that results in new solutions being sought for everything.
Many government agencies still create a new system for every situation, but that only makes the problem worse in the long term.” That is also what the Netherlands Institute for Human Rights recently recommended: map out the impact of chain decisions and other algorithms on human rights, and then ensure that lawyers and technicians can arrive at a solution that takes people into account. This requires clear guidelines and standards for new algorithms, and inspections to ensure compliance.

Supermarket

Some researchers see a solution in a central government office that citizens can contact if an algorithm makes mistakes in processing their data. They believe that everyone should understand the entire decision-making process. “As far as I am concerned, such a central office is not a good solution, because it again leaves it to individual citizens to take action. Citizens should be able to assume that the government is doing its job properly,” says Van Eck.

Algorithms and the artificial intelligence that surrounds them can be incredibly complicated, the lawyer admits. In fact, she regularly calls on fellow AI researchers to explain the precise operation of a system. “Transparency is now cited by some as the main goal, the solution to the problem. But citizens don't need to understand all the details. Please leave that to the people who understand it, on both the legal and the technical side.”

“It is better for the government to make clear agreements and implement guidelines about algorithms, and to perform regular checks to ensure compliance. When I buy milk in the supermarket, I don't need to understand every step from cow to cooler. I know that everything sold in the supermarket meets certain quality requirements and is therefore safe. Why should it not work that way for algorithms too?”

Education and research

Van Eck also emphasises that part of the change must start with education and research.
“The law curriculum is inadequate regarding contemporary technological issues. Public administration has undergone unprecedented changes in the past 20 years, but the corresponding legal education at most universities has hardly changed. Knowledge of technology is essential to understand modern law. I graduated in Nijmegen in 1995 with a dual degree in administrative law and law & information technology. But technology now plays a major role in every area of law and should be part of everyday education.”

“The benefits scandal is a wake-up call for researchers in this field. Let's think carefully about which phenomena deserve our attention.” When she began her PhD study, the outside world had little interest in her field of research, so she had to work as an external PhD student. “Supplements and other benefits have never received much attention, yet almost everyone in our society has to deal with them. We invest millions in blockchain, in artificial intelligence, in quantum computers, but very little in safeguarding these older technologies that continue to have a tremendous impact on our daily lives.”

Photo by Rene Mensen via Flickr