This article appears in the Witness section of the Winter 2019 issue of the New Humanist.

Artificial intelligence is big business, and it already has a wide range of uses across public life. One of the many organisations making use of AI is the Home Office, which uses an algorithm to filter visa applications to the UK. Exactly how this technology works is not clear. That is why the Joint Council for the Welfare of Immigrants and Foxglove, an advocacy group promoting justice in the new technology sector, have brought a legal case to force the Home Office to explain on what basis the algorithm “streams” visa applicants. The Home Office insists that the algorithm is used only to allocate applications, and that final decisions remain in the hands of human caseworkers.

Cori Crider, a director at Foxglove, summed up their concerns: “The Home Office insists its ‘visa streaming’ algorithm has no racial bias, but that claim is pretty threadbare. We’re told the system uses nationality to ‘stream’ applicants green, yellow and red – and it’s easy to guess who ends up in the green queue and who gets pushed to the back of the bus in red. If your algorithm singles out people for a digital pat-down and offers speedy boarding to white people, well, that’s unlawful.”

The case taps into a broader concern that many of the algorithms that make decisions about our lives are trained on data sets that do not include a diverse range of people, leading to inbuilt racial bias. This phenomenon is well documented. A recent study of COMPAS, an algorithm widely used in the US to guide sentencing by predicting the likelihood that a defendant will reoffend, found evidence of racial bias. Another recent study, at MIT, found that facial recognition technology identifies the gender of white men far more accurately than that of other groups.

As journalist and software developer Meredith Broussard told New Humanist online in May 2018, algorithms are “constrained by hardware and software – in other words, they break. Computational systems are only as good as the people who make them.”