UK to stop using ‘racist’ visa algorithm after legal challenge

Britain’s government announced on Tuesday that it will stop using a computer algorithm to streamline visa applications after NGOs labelled it “racist”.

The Home Office’s decision, which will take effect on August 7, comes after the Joint Council for the Welfare of Immigrants (JCWI) and Foxglove, a digital rights group, launched a legal challenge against the use of the algorithm.

The Home Office said in a statement that it has “been reviewing how the visa application streaming tool operates and will be redesigning our processes to make them even more streamlined and secure.”

“We do not accept the allegations the Joint Council for the Welfare of Immigrants made in their Judicial Review claim and whilst litigation is still on-going it would not be appropriate for the Department to comment any further,” it added.

The redesign of the system is to be completed by October 30 at the latest, it said.

‘Entrenched racism’

Foxglove welcomed the government’s announcement, saying “it’s great news because the algorithm entrenched racism and bias into the visa system”.

The system had been in place since at least 2015 and used a traffic-light approach to grade every entry visa application to the UK, assigning a Red, Amber or Green risk rating to each applicant, according to the JCWI.

“The visa algorithm discriminated on the basis of nationality — by design,” the organisation said in a statement.

According to the two NGOs, a number of “suspect” countries had been blacklisted so that applications from those countries were automatically graded Red.

“Their applications received intense scrutiny by Home Office officials, were approached with more scepticism, took longer to determine, and were more likely to be refused,” the JCWI added.
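As a purely hypothetical illustration of the mechanism the NGOs describe, the Python sketch below shows how a nationality blacklist can force a Red rating by design; the country names, risk score and threshold are invented placeholders based only on the NGOs' public description, not the Home Office's actual rules.

```python
# Hypothetical sketch of a nationality-based traffic-light "streaming" rule,
# based only on the NGOs' public description. All names, countries and
# thresholds below are invented for illustration.
from enum import Enum

class Rating(Enum):
    GREEN = "Green"
    AMBER = "Amber"
    RED = "Red"

# Placeholder blacklist of "suspect" nationalities (invented values).
SUSPECT_NATIONALITIES = {"CountryA", "CountryB"}

def stream_application(nationality: str, other_risk: float) -> Rating:
    """Assign a traffic-light rating; nationality alone can force Red."""
    if nationality in SUSPECT_NATIONALITIES:
        # Applications from blacklisted countries are graded Red outright,
        # regardless of anything else in the application.
        return Rating.RED
    # An arbitrary threshold stands in for whatever other factors
    # a real tool might weigh.
    return Rating.AMBER if other_risk > 0.5 else Rating.GREEN
```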

Foxglove added that “it got so bad that academic and nonprofit organisations told us they no longer even tried to have colleagues from certain countries visit the UK to work with them”.

Foxglove also argued that the algorithm suffered from a feedback loop, “where past bias and discrimination, fed into a computer programme, reinforce future bias and discrimination”.
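A minimal sketch of such a loop, under the assumption that a nationality's future risk score is derived from its past refusal rate; every name and number here is invented for illustration:

```python
# Hypothetical sketch of the feedback loop Foxglove describes: refusals
# driven by extra scrutiny raise a nationality's risk score, which invites
# yet more scrutiny and refusals. Everything here is invented for illustration.
from collections import defaultdict

decisions = defaultdict(int)  # total past decisions per nationality
refusals = defaultdict(int)   # past refusals per nationality

def risk_score(nationality: str) -> float:
    """Future risk derived purely from past outcomes (the feedback input)."""
    total = decisions[nationality]
    return refusals[nationality] / total if total else 0.0

def record_decision(nationality: str, refused: bool) -> None:
    """Every refusal pushes the nationality's future risk score higher."""
    decisions[nationality] += 1
    if refused:
        refusals[nationality] += 1
```

Under these assumptions, a nationality whose past refusals were driven by biased scrutiny keeps an elevated score regardless of genuine risk, so the bias compounds over time.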

Furthermore, they denounced the system’s lack of transparency.

In a letter sent to Foxglove, the government argued that it “had already moved away from the use of the Streaming Tool in many application types”.

“Indeed recently, its use has been limited to applications for visit visas and a small number of other entry clearance routes including Short-Term Study, Overseas Domestic Worker and applications made overseas from non-EEA family members,” it went on.