The Privacy Commissioner says businesses should take great care when using facial recognition technology because there
is a high risk of misidentification.
Inquiries about a Dunedin man mistakenly identified as a shoplifter at a New World store revealed that New Zealand's
largest supermarket company has rolled out facial recognition CCTV technology in some of its North Island stores.
The man was reportedly misidentified due to human error, and Foodstuffs NZ said facial recognition was not used in the South Island.
However, the Otago Daily Times reported a different security system that "bridges the gap between businesses and the police" was now used at the
Centre City New World in Dunedin, among other South Island stores.
Privacy Commissioner John Edwards said that despite the technology being "cutting edge", it's not particularly reliable
- even in law enforcement.
A report from the Independent found that the facial recognition software used by London Metropolitan Police came up with false positives 98 percent
of the time.
"If a major law enforcement organisation in one of the biggest cities in the world can't get it right, then you'd have to query the diligence that a domestic commercial user in New Zealand has applied to the equipment," Mr Edwards told Morning Report.
"If you've got a higher level of confidence in the accuracy of your technology than it is able to deliver, you can
accuse somebody of being untrustworthy, you can exclude them from your store, you can take all sorts of actions that are
completely unjustified.
"If they are throwing money at technology that is potentially going to get them into more trouble than it solves, then
that would seem like a false economy," he said.
Mr Edwards also pointed to troubling research from the US which showed that data sets used to teach the programmes can
be influenced by an ethnic bias.
A recent study showed one piece of software could identify white males 99 percent of the time, but it was less reliable with other genders and ethnicities. Women of colour were accurately identified only 65 percent of the time.
"That's pretty significant, and I doubt the software we're talking about here has been generated in New Zealand so you'd
have to wonder whether it's been tested on a New Zealand population to a degree that gives a user sufficient confidence
that it's not going to tarnish the customer group that it tries to stop," Mr Edwards said.
In order to look into the issue, the commission would probably need someone to complain about the use of the technology,
Mr Edwards said.
However, he said, consumers should be told if they are entering shops and premises where their faces are being filmed
and their data stored, and why it was happening.
"The interesting question is whether there's an obligation to provide a bit more transparency about the technology
that's being plied over those images - and that's something we haven't addressed yet," he said.
"That's something that maybe the select committee will take an opportunity to look at with a new privacy bill in the
house."