Facebook reports removing millions of Christchurch mosque shooting posts
Facebook has taken down millions of images and articles worldwide depicting the Christchurch terror attacks, which killed 51 people.
The company has been under pressure to remove objectionable material since the shooter livestreamed the 15 March massacre.
The social media company's fourth Community Standards Enforcement Report details the number of times it removed content violating its policies.
The company's vice president of integrity, Guy Rosen, said that between 15 March and 30 September it took down 4.5 million pieces of content related to the Christchurch attack, and that 97 percent of these were removed before being reported.
Mr Rosen said some of the content taken down would not ordinarily violate the company's policies, such as news media publishing stills of the attack.
Mr Rosen said Facebook decided to remove all content globally out of respect for the victims of the tragedy.
Internet New Zealand engagement director Andrew Cushen told Morning Report the large figure showed the scale of the challenges the social networking site has had to deal with.
"They
had to make some pretty extreme decisions about limiting
news reporting and other uses of the platform" - Andrew
Cushen, Internet New Zealand engagement director duration
3:29
from
Morning Report
Click a link to play audio (or right-click to
download) in either
MP3 format or in OGG format.
"These stats show how Facebook has learned and adapted and responded to that challenge to minimise that harm of sharing this content around the world."
He said the fact they had extended those policies to news outlets' content showed they had to make some pretty extreme decisions.
"That for me raises an interesting set of questions about how you support an organisation like Facebook to make decisions like that, and how rules like that should be applied around the world to make sure we are doing the right thing and minimising harm, and learning from an attack like this in the best way possible."
In May, Facebook signed up to the Christchurch Call in Paris, co-chaired by Prime Minister Jacinda Ardern and French President Emmanuel Macron.
It later announced changes including limiting who could use Facebook Live, and ensuring anyone in New Zealand who looked at extremist content on the site would in the future be directed to websites helping people to leave hate groups.