Christchurch Call – Expert Reaction
14 May 2019


Prime Minister Jacinda Ardern and French President Emmanuel Macron are hosting a summit in Paris on May 15 (local time) with world leaders and tech company CEOs to discuss how they can prevent social media from being used to organise and promote terrorism.


Ardern hopes the attendees will agree to a pledge called the ‘Christchurch Call’, which aims to eliminate terrorist and violent extremist content online.

The Science Media Centre asked experts to comment.

Dr Kevin Veale, Lecturer in Media Studies, Massey University, comments:


“I think the ‘Christchurch Call’ is a fantastic place for this discussion to be starting, and it’s good that Jacinda Ardern is bringing the conversation to such prominence.

“However, this isn’t the first time that social media platforms have been implicated in terrorism. This is the first time that a terrorist attack in a ‘Western’ country was broadcast via the internet, but Facebook has been a significant factor in the genocide of Rohingya Muslims in Myanmar. This is not an isolated case: previous studies have demonstrated a link between Facebook use and violence against refugees in Germany, and YouTube’s complicity in propagating and profiting from neo-Nazi and white-supremacist content through its service.


“I hope the summit draws attention to these cases, and to the fact that social media platforms profit both from an indifference to harassment and from harassment itself. It falls within the realm of corporate responsibility to deal with these problems, which have been known for a substantial amount of time, yet the companies have done nothing to remedy their contributions to harassment campaigns.

“Potentially, pressure from governments and the threat of regulation will mean there is some movement. However, I expect that the social media companies themselves will offer primarily technological solutions based on filtering and algorithms, which can be, and visibly are, gamed by bad actors.

“Possibly the discussion will turn to removing anonymity from social media services or the internet, despite the evidence that many people involved in online abuse are comfortable doing so under their own names.

“New Zealand and other countries do get some benefit from social media platforms, but we also need to ask where the scales are set: what do we REALLY get out of allowing them to connect pervasively to so many aspects of our societies? There will have been high-level policy discussions weighing the benefits and risks involved in participating in the ‘Five Eyes’ surveillance network, but have similar policy discussions considered Facebook’s capacity to gather personal information and communication? What would happen if we – and potentially other countries connected to the ‘Christchurch Call’ discussions – flatly said that we were blocking Facebook from operating entirely in our territory until concessions were made?

“The issue, to an extent, is political will: if we cannot expect to tax Facebook and other social media giants based on their profit within our countries, we cannot expect to have enough leverage to change them in other ways either. In Germany and France, local law requires Twitter to block and filter neo-Nazi content; for some reason, Twitter has elected to only apply such a filter to those countries. Legislative action and regulation can have an impact, as we can see in examples like this.”

No conflict of interest. Dr Veale gave a talk today about how social media companies profit from racism, abuse and harassment.

Associate Professor Alistair Knott, Department of Computer Science, University of Otago, comments:


“It’s great to see Jacinda Ardern taking the initiative in calling for regulation of social media companies in the wake of the Christchurch attacks. Jacinda’s focus is on preventing the posting and dissemination of violent extremist content on social media sites. That’s understandable, given the trauma caused by the Christchurch video – and given its potential as propaganda and precedent for other like-minded extremists. However, removing videos of atrocities is essentially a reactive process that happens after the event. We should also be thinking about proactive reforms to social media platforms that prevent the growth of extremism.

“What turns people into extremists? Videos of attacks have an effect at one end of the extremist spectrum – but we should also be thinking about processes that move people towards extremism from more neutral positions. Obviously these are complex processes, but here again, the way information is shared in social networks may play a role. On platforms like Facebook and Twitter, users are shown the kinds of items they have previously shown some interest in. There is some evidence that this pushes users into ‘bubbles’ of increasingly narrow political or religious viewpoints. When Jacinda and colleagues consider how to regulate social media companies, they might want to think not just about removing depictions of terrorist atrocities, but also about exercising some control over the algorithms that choose items for users’ feeds. Small changes could potentially have large effects in reducing the polarisation of opinions that lead to extremism.

“Social media companies have become hugely powerful distributors of information in our society. In some ways, the policies of these companies are like government policies: they affect everyone, and small tweaks can have big effects. At present, tech companies’ policies are dictated solely by commercial considerations, rather than the public good. There are good arguments that governments should get more involved in their operation.”
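To make the feed-ranking mechanism Knott describes concrete, the sketch below (in Python) shows a toy recommender that weights candidate items by how often a user has engaged with each topic before. Every name, field and number here is a hypothetical illustration, not any platform’s actual algorithm.

    from collections import Counter

    def rank_feed(candidates, history, top_n=10):
        """Rank candidate items by how often the user engaged with each topic before."""
        topic_weights = Counter(item["topic"] for item in history)
        return sorted(candidates,
                      key=lambda item: topic_weights[item["topic"]],
                      reverse=True)[:top_n]

    # Example: past engagement with politics crowds other topics out of the feed.
    history = [{"topic": "politics"}, {"topic": "politics"}, {"topic": "sport"}]
    candidates = [{"id": 1, "topic": "politics"},
                  {"id": 2, "topic": "science"},
                  {"id": 3, "topic": "sport"}]
    print(rank_feed(candidates, history, top_n=2))  # the "science" item never surfaces

Because every new engagement raises the weight of its own topic, each ranking is narrower than the last – the feedback loop behind the ‘bubbles’ described above.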

No conflict of interest.

Associate Professor Dave Parry, Head of Department, Computer Science, AUT, comments:

“In technical terms, simply making preference setup clearer, allowing people to have a ‘whitelist’ of approved sources, and only allowing uploads by verified users would go a long way towards reducing the viewing of despicable videos like the one recorded in Christchurch. A set of expectations for this and for takedown response times, including automated systems to detect suspicious behaviour, could form the basis for a reasonable set of rules that can be enforced.

“Although not perfect, this could be enforced at a national level, and issues of different levels of censorship avoided. The key element is that social media companies will have to take steps to verify users, including their age, and to ensure that, at least within the company, users can be linked to real people. Because of privacy issues, this also brings the need for regulations to stop companies simply using this information to increase advertising revenue.

“A set of ‘best practice’ guidelines may be the most we can hope for from the current meeting, along with some sharing of techniques for suspicious activity detection. Unfortunately, automatic recognition techniques are extremely useful to intelligence agencies, and it is unlikely that much will be revealed from those sources.

“The major conflict in these cases is not between free speech and censorship; it is between convenience and harm. These interventions will make it slightly more difficult to upload your snowboarding exploits, but they will also reduce the number of people who could be greatly distressed and damaged by offensive material. If the leaders can make this point, then the meeting will be a success and lead to sensible and acceptable measures.”
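As a rough illustration of the controls Parry suggests – uploads only from verified accounts, and a personal ‘whitelist’ of approved sources – the Python sketch below shows what such checks might look like. The class, function and field names are assumptions made for illustration, not an existing platform API.

    from dataclasses import dataclass, field

    @dataclass
    class Account:
        name: str
        is_verified: bool = False                            # identity and age checked by the platform
        approved_sources: set = field(default_factory=set)   # the viewer's personal whitelist

    def may_upload(uploader: Account) -> bool:
        # Only verified accounts are allowed to publish video at all.
        return uploader.is_verified

    def is_visible(viewer: Account, source_name: str) -> bool:
        # If the viewer keeps a whitelist, only approved sources appear in their feed.
        return not viewer.approved_sources or source_name in viewer.approved_sources

Takedown-time expectations and automated detection of suspicious behaviour would sit alongside checks like these, but they depend on each platform’s internal systems rather than anything that can be sketched generically.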

No conflict of interest.


ends

© Scoop Media

 
 
 