Automated decision-making in Kela

Register number: OKV/131/70/2020
Date of issue: 19.4.2021
Decision-maker: Chancellor of Justice
Subject: Government or ministry; social welfare and social insurance
Measure: Opinion

The Chancellor of Justice investigated Kela’s automated decision-making concerning social security. He informed Kela of his statements and opinions concerning, among other things, the inadequate legal basis of automated decision-making, the transparency of decision-making and the justification of automated decisions, as well as the impact assessments of the processing activities carried out in automated decision-making.

The Chancellor of Justice also informed the Ministry of Social Affairs and Health, which bears the primary responsibility for the benefit legislation applicable to and applied by Kela, of his statements and opinions concerning the inadequacies of the legal basis of automated decision-making.

Kela’s duty is to implement the right to social security, laid down as a basic right, by making administrative decisions concerning social security benefits. The Chancellor of Justice concluded that these decisions and the related decision-making procedures and processes should, irrespective of the decision-making method, be lawful and secure the rights of customers, as required by the basic right to protection under the law referred to in the Constitution. According to the Chancellor of Justice, expanding digitalisation, developing automation and the applications they enable have, in general, raised the question of how protection under the law is realised when automation is used in administration and decision-making. Fundamentally, the realisation of the procedural rights concerning protection under the law serves to protect substantive rights; in other words, to ensure that people have access to their rights and that their cases are resolved lawfully.

The Chancellor of Justice concluded that the matter is ultimately about the inviolability of human dignity and standing, and about securing genuinely effective opportunities to exercise influence in a digital operating environment where matters of significance to individuals are increasingly processed through automation. The implementation of the basic rights related to good governance and data protection requires that legislation concerning automation and the information systems applying it, as well as their use in administration, comply with the EU General Data Protection Regulation (hereinafter the General Data Protection Regulation) with regard to securing data protection by default and by design and, correspondingly, good governance and the realisation of the rights of the individual by default and by design. He deemed that the provisions of the Constitution of Finland concerning the inviolability of human dignity and the freedoms and rights of the individual, as well as the obligation of the public authorities to promote an individual’s opportunities to influence decision-making concerning themselves, direct, for their part, the interpretation of the public authorities’ obligations in implementing human-centric good governance that emphasises people’s agency. The Chancellor of Justice examined automation in Kela’s decision-making concerning social security from this perspective.

Based on the available information, the Chancellor of Justice concluded that Kela’s decisions fall, at least in part, under automated individual decision-making as referred to in the General Data Protection Regulation, which is prohibited without the legal basis required by the Regulation. Automated decision-making by Kela does not have the legal basis required by the Finnish Constitution and the General Data Protection Regulation.

According to the Chancellor of Justice, the general legislation concerning automated decision-making being prepared by the Ministry of Justice will contribute to solving problems associated with automated decision-making. However, based on the information available to the Chancellor of Justice at the time of his decision, it alone was not adequate to resolve the issues related to Kela’s use of automation. The Chancellor of Justice deemed that functioning and adequate legislation is a means of ensuring that good governance and customers’ protection under the law are included by default and by design in automated decision-making and in the decision-making systems it requires. To correct the unregulated situation and to secure both the opportunities to develop automated decision-making and the rights of customers, he considered it extremely important that the need for regulation concerning automated decision-making be investigated in full. According to him, swift measures should therefore also be taken to determine the need for special legislation on automated decision-making in Kela.

The Chancellor of Justice concluded that although the General Data Protection Regulation does not regulate administrative procedure, it sets requirements for decision-making when decision-making is based on the processing of personal data and, specifically, on solely automated processing of personal data. According to the Chancellor of Justice, the Administrative Procedure Act and the General Data Protection Regulation share the starting point of transparency and openness: open, understandably presented information, stemming from the needs of the customer/data subject, about the processing of personal data that affects the person’s rights and obligations, in other words about decision-making in the matter being processed. For the protection of their rights, customers should receive information about automated decision-making in the manner required by the General Data Protection Regulation, as well as a decision justified on the basis of substantive legislation in accordance with the Administrative Procedure Act. Therefore, according to the Chancellor of Justice, information about automated decision-making, as well as the decision-making rules applied in it, should form part of a consistent set of decisions, so that a decision both realises transparency as required by the safeguards of the General Data Protection Regulation, among other things, and is a linguistically and structurally clear, understandable and justified administrative decision as required by the Administrative Procedure Act.

The Chancellor of Justice concluded that the transparency and openness of automated decision-making emphasise human-centric good governance based on people’s agency, which builds trust in decision-making and in the activities of the authorities overall. According to him, trust is key to the justification and acceptability of using automation in decision-making and in administrative activities overall.

In automated decision-making by Kela, the constitutional liability for acts in office is, according to Kela’s report, realised in such a way that the unit and official responsible for a decision can always be determined on the basis of documentation derived from the information systems development method and from the definitions concerning the division of work, spheres of competence and operating models. In the Chancellor of Justice’s view, this kind of determination of responsibility may, depending on the case, blur and distance the causal relation between act and consequence required, in particular, by criminal liability for acts in office, and turn it into such a complex chain that, in practice, assigning responsibility is no longer possible, or at least highly improbable. The liability for acts in office would then, at worst, be indirect and largely ostensible, and could not be realised in all situations in practice. However, the starting point of the provision on liability for acts in office in the Finnish Constitution is that its realisation should, in practice, be possible also when administrative decisions are made automatically. He stated that the realisation of liability for acts in office in automated decision-making should be assessed carefully.

According to the General Data Protection Regulation, the controller must be able to demonstrate compliance with data protection legislation, a starting point of which is identifying and preparing for the risks that the processing of personal data poses to privacy and to the protection of personal data (a risk-based approach). The impact assessment laid down in the Regulation is a method for examining the risks that the processing of personal data may pose to the rights and freedoms of the data subject. By fulfilling their accountability, the controller demonstrates that they have fulfilled this examination duty. According to Kela’s report, very few impact assessments have been performed on the automated decision-making it uses.

In the Chancellor of Justice’s view, the General Data Protection Regulation requires that Kela, if it has not already done so, examine whether impact assessments of its processing activities concerning decision-making are necessary and perform such assessments where needed. The situation concerning impact assessments could also be brought into the state required by the General Data Protection Regulation potentially as part of special legislation on automated decision-making by Kela. The Chancellor of Justice concluded that irrespective of the implementation method, the matter should be brought into the state required by the General Data Protection Regulation with regard to conducting impact assessments and meeting the Regulation’s accountability requirement.

The Chancellor of Justice asked the Ministry and Kela to inform him of the measures arising from his opinion.

In its statement, the Ministry of Social Affairs and Health concluded that it had reviewed the need for special legislation in its own administrative sector while participating in the work of the working group preparing general legislation on automated decision-making. The need for special legislation on potential automated decision-making by Kela is also being reviewed and examined in cooperation with Kela.

In the report it provided as a result of the decision, Kela stated, among other things, that the general legislation being prepared on automated decision-making may cover Kela’s needs related to automated decision-making, and that the need for further regulation depends on the extent of the general legislation. According to the report, Kela is conducting an extensive impact assessment of the processing activities under way.