
Gordon Campbell on racial profiling at Immigration NZ

Published: Fri 6 Apr 2018 11:33 AM
So Immigration New Zealand has been using racial profiling to guide its decisions on which kinds of immigrants it should deport.
Oh sure, they’re saying that race is “only” one of the indices that INZ considers, as if that somehow makes it acceptable. (Only a little bit of institutionalized racism here, folks. Nothing to see here, move on.)
Apparently… age, gender, ethnicity and country of origin are all “data sets” that go into the model guiding how INZ makes its decisions about which immigrants should be allowed access to this country’s resources and opportunities… and as Alistair Murray, INZ’s head of Compliance and Investigations, helpfully told RNZ yesterday, the profiling involved even extends to “interactions” with Police, as to whether a Police decision to prosecute should simply be superseded by an INZ decision to deport the person in question. Yep, no danger there of confirming each other’s biases, surely.
The Murray interview is hair-raising, and can be heard here.
It seems that INZ has been loading data into its “harm model” for the past 18 months, on such things as unpaid hospital bills, use of healthcare, or criminal activity. Thus, the demographic attributes of Person A, who has been responsible for such costs in the past, get loaded into the model, and if and when another immigrant showing some of the same data characteristics (age, gender, country of origin) comes over the horizon then bam! … “We’ll move to deport them at the first opportunity,” Murray says, “so that they don’t have the chance to do that sort of harm.”
That’s right. People are being deported for future health costs they haven't yet incurred, or for crimes they haven’t yet committed, on the basis of the risk factors that the algorithms spit out. This month, anyway. (The data keeps changing month to month.) Murray was precise about how it works. “It [the model] is predicting how someone is most likely to behave, based on how their predecessors have behaved.” Just like in this Tom Cruise movie.
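To make the mechanics Murray describes concrete, here is a minimal, purely illustrative sketch in Python of how a demographic “harm” score of this general kind could be computed. Every attribute, weight and threshold below is invented for illustration; none of it is INZ’s actual model or data.

# Purely illustrative sketch of a demographic "harm" score of the kind described.
# The attributes, weights and threshold are invented; they are not INZ's model or data.
HYPOTHETICAL_WEIGHTS = {
    ("age_band", "18-25"): 0.4,
    ("gender", "male"): 0.3,
    ("country_of_origin", "Country X"): 0.5,  # placeholder, not a real country
    ("unpaid_hospital_bill", True): 0.8,
    ("police_interaction", True): 0.6,
}

def harm_score(person: dict) -> float:
    """Sum the weights of whichever attributes this person shares with past
    cases that incurred costs, i.e. score the individual by the behaviour
    of their 'predecessors'."""
    return sum(
        weight
        for (attribute, value), weight in HYPOTHETICAL_WEIGHTS.items()
        if person.get(attribute) == value
    )

def flag_for_deportation(person: dict, threshold: float = 1.0) -> bool:
    # The flag rests on predicted future harm, not on anything this person has done.
    return harm_score(person) >= threshold

The point of the sketch is that nothing in the score depends on what the individual in front of the officer has actually done; it depends entirely on which demographic boxes they share with earlier cases.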
As RNZ’s Guyon Espiner pointed out yesterday to Immigration Minister Iain Lees-Galloway, making deportation decisions based on age, gender and racial profiling would surely contravene our Human Rights Act.
There’s good reason why this sort of modeling should be discouraged. History is full of examples of the social harm done to immigrants by preconceptions about wily Orientals, drunken Irish, avaricious Jews and criminally inclined Italians. Of course, the proponents of the data-driven approach to social decision-making will argue that computers will apply the algorithms to every case consistently, and in ways that will eliminate discrimination and unconscious bias. That’s the claim. Predictive models hold out the promise of a more effective way of allocating scarce resources – mainly, by mining data that will enable us to infer the future actions of individuals based on how ‘similar’ people have behaved in the past.
That’s the naïve hope. The naiveté lies in assuming that the data inputs and the modeling frames are themselves bias-free when, in fact, they’re drenched in the values and policy priorities of their creators. Models, as data scientist Cathy O’Neil wrote in her 2016 book Weapons of Math Destruction, are simply “opinions embedded in mathematics.”
That becomes clear if and when anyone cares to unpack the logic of how the harms identified by INZ might be administered in practice. INZ has said that it profiles immigrants who are (a) not paying their health bills, (b) using the health system and (c) committing crimes. These, at least, comprise some of the “harms” considered to be relevant to it. So… which “countries of origin” are we talking about here? We know that obesity is linked to diabetes and to other calls upon the health system, and that (relatively speaking) Polynesians are more likely to be overweight. There is a range of genetic and social reasons for this – so do those reasons render them liable to deportation? It would seem so.
Does this also mean that INZ will treat Pacific Islanders differently to other migrants when it comes to deportation decisions? Again, it would seem so. But here’s the thing. What “data sets” does INZ use to measure attitudes on race held by thin white South Africans, and what risk is that pre-history deemed to pose to social integration within this avowedly bi-cultural nation of ours? Yes, that question invokes a stereotype about South Africans that many of them do not deserve; but that’s the whole point. Such stereotypes, masquerading as science, are highly selective. Tonga, Samoa and China appear to be the main targets of this INZ ‘pilot programme’, which the new government should stop in its tracks. It really has no place here.
Social Investment, Export Version
We shouldn’t be unduly surprised that INZ is using data crunching in this fashion, to remove unwanted immigrants. Much the same modeling has also been developed to guide decision-making on removing children from their families, within New Zealand. From 2015 onwards, Bill English and Paula Bennett were the political champions of “social investment” – which entailed the use of number matching and statistical surveillance to track the life choices of beneficiary families, and to direct welfare case work accordingly. We’ve even exported this model to the US, to pilot programmes in southern California and in Allegheny County, Pennsylvania.
In 2015, Werewolf warned about the dangers of using Big Data in this way. It is a process prone to racial stereotyping and circular reasoning, in that the algorithms will concentrate the state’s surveillance and enforcement responses upon social groups who already feel stigmatised and marginalised, and whose privacy rights tend to be discounted by the authorities. In a January 2018 Wired report on the Allegheny County experiment, Virginia Eubanks pointed out how class difference affects the compilation of the data sets:
A family living in relative isolation in a well-off suburb is much less likely to be reported to a child abuse or neglect hotline than one living in crowded housing conditions. Wealthier caregivers use private insurance or pay out of pocket for mental health or addiction treatment, so they are not included in the county’s database.
Imagine the furore if Allegheny County proposed including monthly reports from nannies, babysitters, private therapists, Alcoholics Anonymous, and luxury rehabilitation centers to predict child abuse among middle-class families. “We really hope to get private insurance data. We’d love to have it,” says Erin Dalton, director of Allegheny County’s Office of Data Analysis, Research and Evaluation. But, as she herself admits, getting private data is likely impossible. The professional middle class would not stand for such intrusive data gathering.
As Eubanks goes on to argue, this approach readily leads to a confusion of cause and effect (and of parallel occurrence), with the result that parents get blamed for their own deprivation, and for its impacts on their children:
The privations of poverty are incontrovertibly harmful to children. They are also harmful to their parents. But by relying on data that is only collected on families using public resources, the AFST unfairly targets low-income families for child welfare scrutiny. “We definitely oversample the poor,” says Dalton. “All of the data systems we have are biased. We still think this data can be helpful in protecting kids.”
We might call this poverty profiling. Like racial profiling, poverty profiling targets individuals for extra scrutiny based not on their behaviour but rather on a personal characteristic: they live in poverty. Because the model confuses parenting while poor with poor parenting, the AFST views parents who reach out to public programs as risks to their children.
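The selection bias that Eubanks describes is easy to show in a few lines of code. If the training data only ever records contacts handled through public systems, then two families with identical behaviour can receive very different scores simply because one of them is visible to the system and the other is not. The example below is a hypothetical illustration, not the AFST itself:

# Illustrative only: two families with the same underlying events, but only the
# family using public services has those events recorded in the database.
def recorded_events(family: dict) -> list:
    # Only events handled through public programs ever reach the data.
    return [e for e in family["events"] if family["uses_public_services"]]

def risk_score(family: dict) -> int:
    # A crude count of recorded events - invented for illustration, not the AFST.
    return len(recorded_events(family))

low_income_family = {
    "uses_public_services": True,
    "events": ["addiction_treatment", "mental_health_visit"],
}
wealthier_family = {
    "uses_public_services": False,  # same events, handled privately and invisibly
    "events": ["addiction_treatment", "mental_health_visit"],
}

print(risk_score(low_income_family))  # 2: every contact is visible, and counted
print(risk_score(wealthier_family))   # 0: identical behaviour, invisible to the model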
As Emily Keddell of Otago University usefully warned last month (in an article that echoes many of the concerns raised above), there are also transparency problems that can become particularly acute when private companies engage with the state in this data-driven form of welfare delivery:
The problem with private companies partnering with public bodies, as in a recent Chicago case, is that the weighting of variables and identification of outcomes that the algorithm is trying to predict are beyond the reach of transparent inspection – it happens in a ‘black-box’ that we can’t open. We can’t check. We can’t see what is happening.
This can lead to unregulated maverick practices, and contributes to our inability to offer people identified by algorithms the ‘right to an explanation’ about decisions made about them, recently written into the European General Data Protection Regulation.
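For what it is worth, the “explanation” that the GDPR right gestures at need not be mysterious: for a simple weighted model it can be as little as an itemised list of how much each variable contributed to the decision, and that breakdown is exactly what a proprietary black box withholds. A minimal sketch, with feature names and weights invented for illustration:

# Minimal sketch of a per-decision explanation for a simple weighted model.
# Feature names and weights are invented for illustration only.
WEIGHTS = {"missed_appointments": 0.75, "prior_referrals": 0.5, "household_income_band": -0.25}

def explain_decision(case: dict) -> dict:
    """Return each feature's contribution to the final score: the sort of
    breakdown a 'right to an explanation' implies, and the sort of detail
    a black-box model keeps out of reach."""
    contributions = {name: WEIGHTS[name] * case.get(name, 0) for name in WEIGHTS}
    contributions["total_score"] = sum(contributions.values())
    return contributions

print(explain_decision({"missed_appointments": 3, "prior_referrals": 1, "household_income_band": 2}))
# {'missed_appointments': 2.25, 'prior_referrals': 0.5, 'household_income_band': -0.5, 'total_score': 2.25}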
Similar regulatory safeguards are necessary here. For now though, the entire approach seems suspect. Reportedly, the proponents of predictive modelling – and the key international figure here is economics professor Dr Rema Vaithianathan of Auckland University – argue that the humans using these models should still be encouraged to query the results, if the outcomes appear to be perverse. Yet in practice… one can readily imagine how the model would remain likely to drive the process, in that overworked case workers will increasingly be required to devise a pretty good “why not?” argument to challenge the data, and its inbuilt assumptions. As yet, the coalition government doesn’t seem to have decided just what its stance is towards the “social investment” approach to welfare delivery.
As the INZ furore suggests, that horse may well have bolted, already.
Daphne and Celeste, Redux
Around 1999-2000, the duo of Daphne and Celeste released two of the most irritating/ingenious singles ever to disappear – blessedly for most of us – down the memory hole. Remember “Ooh Stick You” and “U.G.L.Y”??? Then the pair went to the Reading Festival, started a riot in which they got pelted with bottles, and vanished.
Well, they’re back. They’ve been working with UK EDM artist Max Tundra, and so far, the results are really good. This cut “BB” doesn’t actually name-check Ed Sheeran, but he seems to be the most deserving target of their ire…
In addition, that has to be the best use of the word “hetero-normative” in a pop song this year. The track is no fluke, either. Here’s the first thing they did with Max Tundra, three years ago:
And finally, here’s footage of the infamous bottling episode at Reading, 18 years ago:
