A blog about Japanese cooking and banter, from a kitchen based in South West London

Machine Risk Assessment Gone Wrong in Dutch Welfare

This article discusses an algorithm used to score welfare fraud risk.

The first flaw I notice in this algorithm is the way it semantically defines “risk”. After taking into account your marital status, where you live, age, sex, education level, employment status, and more, it outputs a number that supposedly represents your “welfare fraud risk”.
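To make the semantic problem concrete, here is a minimal sketch of what a demographic risk scorer like this might look like. The real Dutch system is not public, so every feature name and weight below is invented for illustration; the point is only that the output is an arbitrary number with no built-in connection to fraud.

```python
# Hypothetical sketch of a demographic risk scorer.
# Features and weights are invented; nothing here measures fraud.

def risk_score(person: dict) -> float:
    # Each demographic trait simply nudges the score upward.
    weights = {
        "single_parent": 0.30,
        "disabled": 0.25,
        "non_native": 0.20,
        "lives_in_outskirts": 0.15,
        "unemployed": 0.10,
    }
    return sum(w for key, w in weights.items() if person.get(key))

applicant = {
    "single_parent": True,
    "disabled": True,
    "non_native": True,
    "lives_in_outskirts": True,
}
print(risk_score(applicant))  # 0.9 -- "high risk", yet no fraud was measured
```

Notice that the score rises purely because of who the applicant is, not because of anything they did.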

The article claims that a disabled, non-native single mother living on the outskirts of the city will instantly be classified as a risk. Someone who is disabled or a single mother is certainly more likely to need welfare. But is that a fraud risk?

We can say that someone who has the odds stacked against them in life faces far more insecurity, because they have fewer safety nets if something goes wrong. When a country or an organization takes in someone with high insecurity factors, that liability rubs off on them. It is an unfortunate reality that even a healthy refugee arrives with language barriers, a lack of connections, an unfamiliar cultural environment, and no housing, all of which raise the likelihood of needing welfare.

What I believe happened here is that the Dutch algorithm generated a “liability risk value” – the risk that you will seek help. Because single mothers are overrepresented among welfare recipients to begin with, the raw “welfare fraud” statistics are already skewed against them.
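A quick bit of arithmetic shows how this base-rate skew works. The numbers below are made up for illustration: even if single mothers commit fraud at exactly the same rate as everyone else, their overrepresentation among recipients makes them look like a large share of fraud cases.

```python
# Hypothetical numbers illustrating base-rate skew.
# Assume single mothers are 40% of recipients, and that BOTH groups
# have the same 1% fraud rate.

recipients = {"single_mothers": 40_000, "others": 60_000}
fraud_rate = 0.01  # identical in both groups (assumed)

fraud_cases = {group: n * fraud_rate for group, n in recipients.items()}
print(fraud_cases)  # {'single_mothers': 400.0, 'others': 600.0}

# Single mothers account for 40% of detected fraud -- yet an individual
# single mother is no more likely to commit fraud than anyone else.
share = fraud_cases["single_mothers"] / sum(fraud_cases.values())
print(share)  # 0.4
```

An algorithm trained on those raw case counts will learn “single mother” as a fraud signal, when all it has really learned is who needs welfare.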

This is getting eerily close to Minority Report. Your dad was mugged and killed in a poorer part of town? Sorry kiddo, the computer says your mom is likely a criminal now, and we have to interrogate her.

I am not against using algorithms, under very specific conditions, to compensate for human weaknesses. There was a story a while back where an algorithm was more accurate than a judge at predicting whether a criminal would reoffend, just by looking at past records. Another study found that a defendant's education level correlated with the leniency of their sentence, rather than the facts of the crime itself. We clearly have intrinsic biases towards charming people, and computers can help with that.

However, when we do not understand the methods and semantics behind the numbers, they can quickly become a weapon of abuse. DNA testing had a similar problem in the past, and many people were accused of crimes they did not commit.