On March 15, at least 50 Muslims were killed in two mosques in Christchurch, New Zealand. The shooter live-streamed the attack on Facebook, and since then, the company has been scrambling to remove copies of the video across its platform.
Now, a CNN report reveals videos of the shooting can still be found on both Facebook and Instagram. Eric Feinberg — a founding member of the Global Intellectual Property Enforcement Center — found nine videos. Each of them had originally been put up the week of the attack.
One copy of the video on Instagram triggered the platform's sensitive-content warning. Still, the video had been viewed more than 8,000 times and was only taken down after CNN showed it to Facebook on Wednesday.
In the first 24 hours after the shooting was broadcast, Facebook removed 1.5 million videos globally, of which 1.2 million were blocked at upload. On March 21, Facebook published a blog post explaining how it handled videos of the Christchurch shooting.
Generally, platforms like Facebook rely on artificial intelligence to identify which videos need to be removed. The company also turned to video hashing, which breaks a video into keyframes. Each of those frames is then assigned its own signature, known as a hash.
By storing these hashes, Facebook can compare them against other videos on the platform. That should allow the company to find all versions of the video, even ones that have been edited. That's a position backed up by Hany Farid, a professor at Dartmouth and an expert in digital forensics and image analysis.
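The keyframe-hashing idea described above can be sketched in a few lines of Python. This is a minimal illustration, not Facebook's actual system: the average-hash scheme, the tiny four-pixel "frames," and the distance threshold are all assumptions chosen to show how hashed keyframes can match an edited or re-encoded copy of a video.

```python
# Illustrative sketch of keyframe hashing (not Facebook's real pipeline).
# A "frame" here is just a short list of grayscale pixel intensities (0-255).

def average_hash(frame):
    """Hash a frame into bits: each bit says whether that pixel
    is brighter than the frame's average intensity."""
    mean = sum(frame) / len(frame)
    return tuple(1 if p > mean else 0 for p in frame)

def hamming(h1, h2):
    """Count the bit positions where two hashes differ."""
    return sum(a != b for a, b in zip(h1, h2))

def video_signature(keyframes):
    """A video's signature is the list of its keyframe hashes."""
    return [average_hash(f) for f in keyframes]

def matches(sig_a, sig_b, max_dist=2):
    """Two videos 'match' if every pair of corresponding keyframe
    hashes is within a small Hamming distance, which tolerates
    minor edits such as brightness shifts or re-encoding noise."""
    return all(hamming(a, b) <= max_dist for a, b in zip(sig_a, sig_b))

original = [[10, 200, 30, 180], [90, 90, 200, 10]]
# A re-encoded copy: pixel values shifted slightly, same structure.
reencoded = [[12, 198, 33, 177], [92, 88, 197, 12]]

print(matches(video_signature(original), video_signature(reencoded)))  # True
```

Because the hash records only which pixels sit above the frame's average, small uniform changes from re-encoding leave the signature intact, which is what lets a platform flag copies of a known video at upload time.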
“The problem is that their hashing technology doesn’t work the way it is supposed to,” Farid told CNN. “When Facebook tells you that artificial intelligence is going to save them and us, you should ask how that is if they can’t even deal with the issue of removing previously identified content.”
Facebook isn’t the only social media company that had to fight the video’s spread. In addition to a live stream, the shooter uploaded a 17-minute video to Facebook, Instagram, Twitter, and YouTube.
The Christchurch shooting illuminated a deeper issue of how hate is allowed to fester online. It showed that the rhetoric found on internet forums can easily be carried into real life, with deadly consequences.
Social media companies have a responsibility to tackle Islamophobia and variations of white supremacy on their sites. Even though other platforms hosted versions of the video, Facebook is in a unique position because it's where the shooter live-streamed the attack in the first place.