
The Nation: Lisa Owen interviews Diane Robertson

The former head of the Auckland City Mission, Dame Diane Robertson, has a new challenge on her hands.
She's heading the Government's Data Futures Partnership, which aims to encourage more collection and sharing of data.
But weighing the benefits of data sharing against the desire for personal privacy makes it a controversial issue.
Lisa Owen: Dame Diane Robertson joins me now. Good morning.
Diane Robertson: Good morning.
What kind of data do we need to be collecting and sharing, do you think?
The reality is we are swimming in data. Whether we are collecting it through cameras that are on the streets, whether we’re collecting it through census, whether we’re collecting it through our Fitbits, there is just so much data that we probably haven’t had access to before.
And so what would be of most benefit to us? What kind of information?
Well, there’s benefit from all sorts of things. We can actually use data to get better health outcomes. We could use data from our Fitbits to decide whether we need to have an intervention in hospital, we can use government data to decide what we’re going to do around families living in poverty, and so there are just so many uses for data that perhaps we haven’t even thought of before.
I suppose one of the fundamental problems with this is that a lot of people really are concerned about the collection of their personal information, the storing of it, the transfer of it. Do you understand that concern?
Oh, absolutely. On the one hand, we are really concerned, and people express concerns about their data being used. On the other hand, we all sign up to Facebook and to Google and to our telco accounts, and we allow them to have access to a huge amount of data, and most of that data is held overseas.
Mm. This level of information that is collected can create dilemmas, so I want to talk about a real example: a predictive data tool that you’ve talked to us about, which was developed to identify abused children or children who may be abused in the future. What happened around that?
Well, what happened around it was that there was a lot of work done about whether that was ethical to use, so it was signed off by an ethics committee. It was legally a thing that could be done. But actually, what happened was it didn’t get social licence. And social licence is whether people will actually think it’s an okay use of data. And on that particular occasion, it didn’t.
So, this was a tool that researchers basically felt would identify kids two years before they were abused – is that right?
Well, two years before other agencies knew that they were abused. So before abuse was reported or before a child turned up in hospital. So there were a lot of predictive things that said if a child lives in a really low-income house, if there was a parent who was in jail, or where there had been previous interventions from Child, Youth and Family, then the younger child would be at risk, just predicting which children were more likely to come to the attention of those agencies.
So if we could do that, why wouldn’t we use it?
Well, I think it’s one of those examples of we can, but people may actually say that’s not a use of data that we want. It might be okay for you to say that that’s a good thing to do, but if a lot of other people say it’s not a good thing to do, then we have to be very cautious about how we proceed with it.
So how do you get people on board? You talked about social licence – that’s just people agreeing to allow this to happen. Well, how do you get it? How do you get social licence?
Oh, social licence is just situational. So you don’t go out and say I’m going to get social licence to use all of your data for everything I want to use it for; you go out and get social licence for particular issues. So we will be talking to people, saying, ‘Would you be happy for your data to be shared by a number of health professionals?’ I mean, a really good example is that your doctor’s records are held by your doctor. Would you be happy for other doctors to see that, for your record to be held by a hospital as well, so that when you go from one medical practice to another or to a pharmacy, would you be okay about your medical records being shared amongst them? So we’ll be having conversations like that with New Zealanders.
I suppose the thing is there is a line between using data to inform policy, and critics would argue the other end of that spectrum is social engineering. So who decides where that line is?
Well, I think that’s something that New Zealanders have to decide where that is. I think we’ve had social licence conversations in New Zealand about many things other than data. We have social licence conversations about abortion, about gay marriage, about divorce, and it is where there is a general consensus that those things are okay. I mean, we’re having a really big social licence conversation at the moment about euthanasia. And so the same conversations will happen about data. I think what people don’t realise is that data actually is a commodity that we’ve got. It’s like our water. It’s like everything else – it’s a precious resource that can be used for good or it could be used for bad. I think with data, the thing that people don’t realise is that it’s not just used once; it can be used again and again and again for different purposes. So we need to be very careful about how.
You talk about the prospect that it can be used for good or bad, so how do you stop it being used against people? Because, for example, a school might not want to take on a child that has a bunch of indicators that show they could potentially have poor outcomes. An employer might not want that person. An insurance company might not want them either, or a landlord. So how do you stop it being used in that way?
I think that’s one of the things that we’re actually looking at at the moment. We often talk about the question of privacy, about where my data is used, but what the question really is, is, ‘How is it going to be used, and is it going to disadvantage me or is it going to exclude me?’ And that’s really the biggest reason the Data Futures Partnership has been established: to look at those cases, find ways and recommend back to government ways that we can make sure that people don’t get excluded because of poor data or that they’re penalised because of it.
Great to have you on the show this morning. Thanks for joining us, Dame Diane Robertson.
Transcript provided by Able. www.able.co.nz

© Scoop Media
