By Cynthia Jansen

Dutch tax authorities fined by the Dutch Data Protection Authority

The Dutch Data Protection Authority (DDPA) has imposed a € 2.75 million fine on the Dutch tax authorities for the unlawful processing of personal data over a period of years. The DDPA reached this decision because the Tax Administration had for many years processed data in a ‘fraud identification facility’, which it used as a blacklist to register indications of fraud. The data processed on the (dual) nationality of childcare benefit applicants was handled in an unlawful and improper manner. Over the course of a long investigation, the DDPA uncovered multiple violations of the GDPR.

The Dutch childcare benefits scandal

The Dutch tax authorities introduced an algorithmic system in 2013 to detect incorrect applications for childcare benefits and potential fraud. The system used certain personal information, such as the nationality of a citizen. Non-Dutch nationals and nationals with dual nationality received higher risk scores. The parents and caregivers selected by the algorithmic system were subjected to investigations and a rigid interpretation of the law. Those selected not only had to pay back the childcare benefits they had received, but also had to pay large administrative fines. Some people on the blacklist were wrongly branded as tax fraudsters, and in some cases people were not offered debt rescheduling. This led to devastating financial problems for the families affected. The Dutch childcare benefits scandal gained great national and international attention and is still a topic of discussion. Some families have still not received any form of compensation or reparation.

The algorithm used by the Dutch tax authorities was a self-learning algorithm that reinforced an existing bias linking race and nationality to fraud. The self-learning algorithm was adapted over time, with no human oversight. It operated as a ‘black box’: there was no information about why the algorithm had generated a certain score.[1] This made it difficult to investigate where exactly mistakes were made, because there was no insight into how or why the system arrived at its ratings.
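To make concrete why using nationality as an input is problematic, the following toy sketch shows how a protected attribute can systematically inflate risk scores for one group. This is an entirely hypothetical illustration: the features, weights, and scoring logic are invented for this example and do not reflect the actual system used by the Tax Administration.

```python
# Hypothetical illustration only: a toy linear risk scorer.
# All feature names and weights are invented; none come from the
# actual Tax Administration system described above.

def risk_score(application: dict) -> float:
    """Return a toy fraud-risk score for a benefits application."""
    score = 0.0
    score += 0.2 * application["income_irregularity"]  # hypothetical feature
    score += 0.1 * application["missing_documents"]    # hypothetical feature
    # Using a protected attribute as a feature: every non-Dutch
    # applicant gets a higher score regardless of actual behaviour.
    if application["nationality"] != "NL":
        score += 0.5
    return score

# Two applications that are identical except for nationality.
dutch = {"income_irregularity": 1, "missing_documents": 0, "nationality": "NL"}
non_dutch = {"income_irregularity": 1, "missing_documents": 0, "nationality": "MA"}

print(risk_score(dutch))      # → 0.2
print(risk_score(non_dutch))  # → 0.7
```

Two otherwise identical applications receive different scores purely because of nationality; in a self-learning system without human oversight, such a gap can be amplified over time as the flagged group generates more investigations and thus more "fraud" labels.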

The legal framework on data protection in the EU

At the center of legal data protection in the EU lies the General Data Protection Regulation (GDPR). The GDPR has a wide scope of application and a very broad definition of ‘personal data’. It also regulates how data should be processed: personal data may only be processed in accordance with the core principles of data protection enshrined in Article 5 of the GDPR. These core principles are formulated in a general manner and are further concretized in other articles of the GDPR. Article 6 of the GDPR lists the legal grounds required for the processing of personal data. In the public sector, as with the childcare benefits, the GDPR allows national bodies to process personal data without the consent of citizens if there is a legal basis in national legislation.[2] The state administration is subject to a proportionality test, which ensures that national bodies do not collect and process more personal data than is needed to achieve their purpose.

The purpose limitation principle of the GDPR requires that personal data may only be collected for legitimate purposes that are explicitly specified.[3] Personal data cannot be further processed in a way that is incompatible with those purposes. The function of the purpose limitation principle is to limit the processing of personal data (“data minimisation”). Article 5 GDPR also states that the processing of personal data must be done in a transparent manner.[4] Section 2 of Chapter III of the GDPR sets out the requirements flowing from the transparency principle. Under these transparency rights, citizens who are beneficiaries of welfare have the right to be informed by the controller about their personal data and the purpose of the data processing.[5] The data controller has an active duty to be transparent about the use of the data.

The use of algorithms in the Dutch childcare benefits-case

The Dutch Data Protection Authority stated that the processing of personal data in the system used by the Dutch tax authorities in the childcare benefits case can be qualified as ‘profiling’ within the meaning of Article 4(4) GDPR.[6] The DDPA stated that detecting and investigating fraud with childcare benefits is a public task of the Dutch tax authorities. However, processing data on nationality to indicate whether a citizen was Dutch or non-Dutch was not necessary for achieving that purpose. The DDPA further stated that the processing of the nationality of the citizens who used the childcare benefits was discriminatory and went against the core principles of the GDPR.

Following the investigation by the DDPA, the Dutch Tax Administration has begun to clean up its records and in 2020 deleted the data on the dual nationalities of Dutch nationals.

[2] Articles 6(1)(c) and (e) GDPR.

[3] Article 5(1)(b) GDPR.

[4] Article 5(1)(a) GDPR.

[5] Articles 13(1), 14(1) and 15(1) GDPR.

