The algorithm register, a useful solution in practice?
The Municipality of Amsterdam has announced the launch of an algorithm register, with the objective of increasing transparency about the algorithms in use and of strengthening public support for them. But what exactly is the 'problem' with the use of algorithms? And does such a register (sufficiently) address the public debate surrounding this topic and the rights of individuals in practice?
The City of Amsterdam has, together with the Finnish capital Helsinki, launched an algorithm register. The aim is to increase openness and transparency about algorithms and to gain public support for their use. The register enables everyone to see which algorithms are being used and to participate in identifying or tackling their potential risks.
It is well known that the use of algorithms by businesses and governments is the subject of much debate. The subject is often equated with artificial intelligence (AI). At the end of 2019, Dutch State Secretary Keijzer informed the Lower House of Parliament about the Strategic Action Plan for AI. In addition, a white paper from the European Commission on AI leaked in early 2020. Both documents focus on the use of AI – or rather, on the use of algorithms in practice – and on the need for more far-reaching legislation and regulation to guide this practice in the right direction. But what are we really talking about?
Algorithms are basically nothing more than a predetermined set of instructions for a computer, which determine how to deal with information or data. Depending on the algorithm used, the results can be (re)used to train or improve it. Certain algorithms can therefore, over time, become 'smarter' or better suited to the purpose for which they are used. In addition, the computer does not distinguish between cases and will follow the same set of instructions in every situation. That sounds promising. The Municipality of Amsterdam, for example, has already used algorithms in the context of parking controls and the fight against illegal housing rentals.
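To make the idea of 'a predetermined set of instructions applied uniformly' concrete, the sketch below shows a purely hypothetical rule-based check in Python. It is an illustration only; the function name, rules and parameters are invented for this example and bear no relation to the actual systems used by the Municipality of Amsterdam.

```python
# Illustrative only: a hypothetical rule-based parking check.
# The rules and thresholds are invented; they do not reflect any
# actual municipal system.
def parking_fine_due(has_permit: bool, minutes_parked: int,
                     free_minutes: int = 10) -> bool:
    """Apply the same fixed instructions to every observed vehicle."""
    if has_permit:
        return False
    return minutes_parked > free_minutes

# The same instructions are applied uniformly to every case:
print(parking_fine_due(has_permit=True, minutes_parked=120))   # permit holder: no fine
print(parking_fine_due(has_permit=False, minutes_parked=5))    # within grace period
print(parking_fine_due(has_permit=False, minutes_parked=45))   # fine due
```

The point of the sketch is simply that the computer evaluates every vehicle against exactly the same rules, with no discretion per case.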
However, there is also a downside. The lack of customization can lead to unreasonable or unfair outcomes, certainly in the relationship between the government and citizens. That risk may arise, for example, from the bias of the party developing the algorithm. Another possibility is that the cause of an improper outcome lies in the data itself. The use of algorithms therefore requires careful consideration of i) the factors that play a role in the decision-making process on the one hand, and ii) the data on the basis of which the analysis is carried out on the other.
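The point that a flaw may lie in the data itself can be illustrated with a toy sketch. The data below is entirely fictitious: if past inspection decisions were skewed towards one neighbourhood, any 'risk score' derived from those decisions inherits that skew, regardless of actual compliance.

```python
# Toy sketch with fictitious data: a score derived from historical
# decisions simply reproduces any bias present in those decisions.
historical_decisions = [
    # (neighbourhood, was_flagged_for_inspection) -- invented records
    ("north", True), ("north", True), ("north", True), ("north", False),
    ("south", False), ("south", False), ("south", True), ("south", False),
]

def learned_flag_rate(neighbourhood: str) -> float:
    """Derive a 'risk score' from past flags in a neighbourhood."""
    flags = [flagged for n, flagged in historical_decisions
             if n == neighbourhood]
    return sum(flags) / len(flags)

# If inspectors historically over-flagged 'north', the derived score
# carries that skew forward into future decisions:
print(learned_flag_rate("north"))  # 0.75
print(learned_flag_rate("south"))  # 0.25
```

This is why, as noted above, both the decision-making factors and the underlying data require careful scrutiny: the instructions may be applied uniformly, yet still produce unfair outcomes.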
However, the problem with algorithms is that it is not always possible to identify such defects. The so-called 'black box' phenomenon, in which the outcome can no longer be explained, lurks in the use of algorithms. This is precisely why, for example, the General Data Protection Regulation (GDPR) seeks to guarantee the right to human intervention in fully automated decision-making (i.e. decision-making based on algorithms). The purpose is precisely to prevent data subjects from being legally affected, or otherwise significantly affected, by an outcome based on an algorithm, without any human dimension.
Another guarantee offered by the GDPR is that data subjects must be meaningfully informed about the underlying logic and the expected consequences of using an algorithm. There is a lot of discussion surrounding the way in which these obligations must be implemented in practice. In general, it is agreed that both human intervention and information must be 'meaningful' for the individual concerned, but what is actually meant by this is - of course - subject to discussion.
In view of the social and academic debate on this subject, the algorithm register of the Municipality of Amsterdam seems a valuable initiative. In any case, it offers (concerned) citizens and companies the opportunity to take a look behind the scenes and to ask critical questions about the policy being pursued. Its practical usefulness, however, has yet to be demonstrated. It is doubtful whether the information provided about the algorithms will actually offer sufficient insight to limit the most impactful risks, and whether such information can be understood by the average citizen. Another question is what will happen once a risk has been identified. Can human intervention or an adaptation of the algorithm then be enforced, or is this a choice left to the municipality? Is the affected individual not entitled to a complete reconsideration of the result if the algorithm turns out to be flawed, and should the process not also be adjusted to prevent future errors? And if no one examines the appropriateness of the algorithms, can the municipality then hide behind the transparency provided? Is it then up to citizens and the business community to prevent the municipality from making mistakes in the application of algorithms in practice?
The register and its use can in any case contribute to valuable insights. In particular because the use of algorithms is indispensable in practice and the call for more far-reaching regulation of them persists. The initiative for the register and the pursuit of transparency and openness is therefore commendable.
Loyens & Loeff has set up a dedicated and multidisciplinary Digital Economy Team. This firm-wide team unites top experts from our various practice groups across all of our jurisdictions. Together, our specialists represent a broad variety of technology companies, from major online platforms to smaller but innovative start-ups. It is our forward-thinking and practical advice that helps our clients stay ahead in the digital world and find solutions to any (potential) Digital Economy-related concerns that may arise. If our team can be of any assistance to you, please do reach out to us!
Nina Orlić – Associate, Attorney at law
Nina Orlić, attorney at law, is a member of the Competition & Regulatory practice group. She specialises in data protection and privacy law, telecommunications law, regulated markets and (international) contract law, and has experience in employment law.
T: +31 20 578 53 44
M: +31 653 38 97 74
E: [email protected]