How A Flawed Algorithm In UK's Welfare Program Is ‘Pushing People Into Poverty’

01/10/2020

Digitizing and automating tasks has helped give rise to what is known as the "digital welfare state".

This way, governments can improve their ability to provide financial or other aid to individuals and groups who cannot support themselves.

However, a UN human rights expert has expressed concerns about the emergence of this trend, saying that all too often the real motives behind such programs are to slash welfare spending, set up intrusive government surveillance systems and generate profits for private corporate interests.

"As humankind moves, perhaps inexorably, towards the digital welfare future it needs to alter course significantly and rapidly to avoid stumbling zombie-like into a digital welfare dystopia," the Special Rapporteur on extreme poverty and human rights, Philip Alston, said a report, back in 2019.

And in 2020, that warning has proved prescient.

Human Rights Watch has warned that a flawed algorithm is determining the social security benefits received by people in the UK.

Universal Credit
'Universal Credit', which is a payment to help citizens in the UK with their living costs, is building a digital barrier between some individuals and their social rights. (Credit: Alamy Stock Photo)

The model was created to calculate the benefits people are entitled to each month based on changes in their earnings. But Human Rights Watch discovered a defect in the system, finding that the algorithm only analyzes the wages people receive within a calendar month and ignores how often they are paid.

What this means is that people who receive multiple paychecks in a single month, which is common for those in irregular or low-paid work, can have their earnings overestimated and their payments dramatically decreased.
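To illustrate the kind of mismatch Human Rights Watch describes, here is a minimal sketch. The figures, dates, and the simple calendar-month window are hypothetical, not the actual Universal Credit rules: a worker paid 1,000 GBP every four weeks occasionally receives two paychecks in a single calendar month, so a purely month-based assessment doubles their apparent earnings that month.

```python
from datetime import date

# Hypothetical worker paid 1,000 GBP every four weeks (13 paydays a year).
PAY_PER_CHECK = 1000
paydays = [date(2020, 1, 3), date(2020, 1, 31), date(2020, 2, 28)]

def calendar_month_income(paydays, amount, year, month):
    """Sum every paycheck whose payday happens to fall inside the given
    calendar month -- the kind of assessment the article describes."""
    return sum(amount for d in paydays if (d.year, d.month) == (year, month))

print(calendar_month_income(paydays, PAY_PER_CHECK, 2020, 1))  # 2000: two paydays land in January
print(calendar_month_income(paydays, PAY_PER_CHECK, 2020, 2))  # 1000: back to a single payday
```

On the same steady wage, the assessed income jumps from 1,000 to 2,000 GBP purely because of where the paydays fall, which is what drives the benefit cuts described above.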

Human Rights Watch said the system should be improved by using shorter periods of income assessment, or by averaging earnings over longer periods of time.
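As a rough sketch of the averaging remedy, again with hypothetical figures rather than the department's actual formula, smoothing assessed income over several months keeps a single two-payday month from dominating the calculation:

```python
def averaged_income(monthly_incomes):
    """Average assessed income over the last few months so that one
    month containing an extra payday no longer skews the result."""
    return sum(monthly_incomes) / len(monthly_incomes)

# The January spike from the example above is smoothed against its neighbours:
print(averaged_income([1000, 2000, 1000]))  # ~1333 rather than a 2000 spike
```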

While the government is evaluating these proposals, the watchdog has called for urgent measures to be implemented, because the flawed algorithm is causing hunger, debt, and psychological distress.

“The government has put a flawed algorithm in charge of deciding how much money to give people to pay rent and feed their families,” said Amos Toh, senior AI and human rights researcher at Human Rights Watch. “The government’s bid to automate the benefits system – no matter the human cost – is pushing people to the brink of poverty.”

The algorithm is a core part of Universal Credit, a revamp of the UK’s welfare system that combines six benefits into one monthly sum.

The system was designed to streamline payments, but has been widely criticized since its launch in 2016.

For example, applicants have to wait at least five weeks before they receive their first payment. And people with limited digital skills or internet access have struggled to manage the online system.

The digital welfare state is designed to ensure that citizens can benefit from new and emerging technologies, experience more efficient government services, and enjoy higher levels of wellbeing.

But Alston said that the digitization of welfare has very often been used to promote the opposite of its original intention.

"Digital welfare states thereby risk becoming Trojan Horses for neoliberal hostility towards social protection and regulation," said the UN Special Rapporteur.

"Moreover, empowering governments in countries with significant rule of law deficits by endowing them with the level of control and the potential for abuse provided by these biometric ID systems should send shudders down the spine of anyone even vaguely concerned to ensure that the digital age will be a human rights friendly one".

Human Rights Watch wants the UK government to take a more human-centered approach to addressing this issue, and urges policymakers not to leave the algorithms to do all the work.