On March 15, shootings at two mosques during Friday prayers in New Zealand left 51 Muslims dead. The massacre itself was horrific, but what made it even worse was that the shooter live-streamed the attack on Facebook. Tech companies scrambled to remove the video after it appeared online, but they were unable to contain its spread. Less than two weeks after the shooting, New Zealand officially banned people from sharing the Christchurch shooter’s manifesto or video. Consequences for owning or sharing the video weren’t initially made clear. Then on Tuesday, a Neo-Nazi was sentenced to 21 months in prison for sharing a video of the Christchurch massacre, Gizmodo reported. Philip Arps pleaded guilty to two counts of distributing objectionable material, New Zealand outlet RNZ reported. Arps sent the video to about 30 people, but it didn’t end there. He told the judge overseeing his case that the video — in which the bodies of dead children are visible — was “awesome.” In addition, RNZ reported that Arps asked...
Solving most of today’s problems with artificial intelligence is no easy feat, especially for social media companies. Facebook came under the spotlight after the Christchurch shootings in March because it was unable to sufficiently and efficiently stop the video from being shared online. Facebook CEO Mark Zuckerberg told audiences at the F8 conference that the company would tackle violent videos with automation and AI, but now one Facebook exec is saying that it will take years to execute, according to Bloomberg. “This problem is very far from being solved,” Facebook’s chief AI scientist Yann LeCun said during a talk at Facebook’s AI Research Lab in Paris. Additionally, LeCun said there is not enough data to train automated systems to detect violent live streams because they “thankfully” don’t have many examples of people shooting others. Facebook’s timing could raise some issues with users, and its live-streaming feature could be weaponized again. Since the Christchurch shooting, the company...
During Friday prayers on March 15th, fifty Muslims were killed in two New Zealand masjids in Christchurch. The massacre by itself was horrifying, but made even more so by the realization that the shooter live-streamed everything on Facebook. Videos of the shooting spread across social media before the death count was even confirmed. As an attack, Christchurch noticeably revolved around social media. It was designed to go viral, with the shooter not only live-streaming but uploading a 17-minute video to Facebook, Instagram, Twitter, and YouTube. But, perhaps most alarmingly, the shooter also released a manifesto in which he espoused white nationalist rhetoric. The shooter identified popular YouTubers as a source of inspiration. Since the shooting, tech companies — including Facebook, Twitter, Google, and YouTube — have scrambled to remove videos of the Christchurch shooting, but some were still found lingering on platforms in May. On Sunday, following a day-long summit...
In March, Facebook’s live stream feature was used to broadcast the Christchurch shooting, in which fifty people were killed in two mosques in New Zealand. Since the shooting, Facebook has understandably come under fire and scrutiny from both the public and government officials. In April, the company was even called to testify before the House Judiciary Committee alongside Google on the rise of white nationalism online. Today, Facebook announced it’s implementing a new “one-strike” rule. Any users who violate Facebook’s most serious policies — such as the Dangerous Organizations and Individuals policy — will be prohibited from using Facebook Live for a set period of time. “Following the horrific terrorist attacks in New Zealand, we’ve been reviewing what more we can do to limit our services from being used to cause harm or spread hate,” Facebook’s VP of Integrity, Guy Rosen, wrote. The company plans on extending its restrictions to other areas of its platform, starting with preventing...
On March 15, at least 50 Muslims were killed in two mosques in Christchurch, New Zealand. The shooter live-streamed the attack on Facebook, and since then, the company has been scrambling to remove copies of it across its platforms. Now, a CNN report reveals videos of the shooting can still be found on both Facebook and Instagram. Eric Feinberg — a founding member of the Global Intellectual Property Enforcement Center — found nine videos. Each of them had originally been put up the week of the attack. One copy of the video on Instagram triggered the platform’s “Sensitive content” feature. Still, the video had been viewed more than 8,000 times and was only taken down after CNN showed it to Facebook on Wednesday. In the first 24 hours after the shooting was broadcast, Facebook removed 1.5 million videos globally, of which 1.2 million were blocked at upload. On March 21, Facebook published a blog post explaining how it handled videos of the Christchurch shooting...
The Christchurch massacre’s live-stream on Facebook — and its subsequent spread across the internet — further illuminated the hate problem on social media that critics have pointed to for years. It left countries across the world scrambling to force tech companies to answer for their role in white nationalism’s presence online. On Wednesday, New Zealand’s Prime Minister Jacinda Ardern announced that she’s planning a summit in Paris alongside French President Emmanuel Macron. The summit’s goal is to have industry and world leaders agree to a pledge called the “Christchurch Call” to eliminate terrorist and violent extremist content online. In the announcement, Ardern said: “The March 15 terrorist attacks saw social media used in an unprecedented way as a tool to promote an act of terrorism and hate. We are asking for a show of leadership to ensure social media cannot be used again the way it was in the March 15 terrorist attack…We all need to act, and that includes social media providers taking...
Last week, Facebook refused to remove a video of Canadian white nationalist Faith Goldy lamenting white “replacement” and the invasion of white European countries. The decision came despite Facebook’s recent ban on white nationalism and white separatism. Now, the company is partially reversing course by banning a number of Canadian white nationalist groups from Instagram and Facebook, but only under rules preventing hate groups, as reported by the Hill. The groups were not pulled under the white nationalism and white separatism ban, even though that’s exactly the kind of rhetoric they engage in. The ban will extend to Faith Goldy, Kevin Goudreau, the Canadian Nationalist Front, Aryan Strikeforce, Wolves of Odin, and Soldiers of Odin, according to the Hill.
Since the Christchurch shooting, Facebook has scrambled to do damage control. The company has now banned white nationalism on its site — a policy that it doesn’t seem to be following — and has begun “exploring restrictions” for its live-streaming feature. Despite all these changes, a meeting with lawmakers always seemed inevitable. On April 9, House Democrats will question both Facebook and Google on hate crimes and the rise of white nationalism online. In a press release, the House Judiciary Committee wrote: Communities of color and religious minorities have long been subject to discrimination and have been targeted by groups who affiliate with ideologies of hate. White identity groups have a long history of oppressing racial and religious minorities and promote individual expressions of violence with the aim of preserving white racial and political hegemony. Social media platforms have served as world-wide conduits to spread vitriolic hate messages into every home and country.
Since the Christchurch shooting was broadcast live on its platform, Facebook has received a lot of questions. On Saturday, the New Zealand Herald published a letter from Facebook Chief Operating Officer Sheryl Sandberg answering questions and outlining restrictions for live video in the future. Describing the attacks as “pure evil,” Sandberg acknowledged that people “rightfully questioned” the use of online platforms to spread the video. She said the company was “committed to reviewing what happened” and went on to add: “We have heard feedback that we must do more – and we agree. In the wake of the terror attack, we are taking three steps: strengthening the rules for using Facebook Live, taking further steps to address hate on our platforms, and supporting the New Zealand community.” The first step involves “exploring restrictions” on who can use Facebook Live “depending on factors such as prior Community Standard violations.” In addition, Facebook is investing in better...
Recently, the Australian government introduced a tough new bill to “prevent the weaponizing of social media platforms” by holding social media executives responsible for violent content on their platforms. Under the proposed law, social media platforms that fail to “expeditiously” remove “abhorrent violent material” (such as terrorism, murder, and rape) are subject to punishment. This can include a fine of up to 10 percent of a company’s annual earnings, and executives could also be imprisoned for up to three years. In a press release, Australia’s Prime Minister Scott Morrison said: “Big social media companies have a responsibility to take every possible action to ensure their technology products are not exploited by murderous terrorists. It should not just be a matter of doing the right thing. It should be the law.” Any platform that becomes aware of violent content on its site is required to notify the Australian Federal Police. If it doesn’t, it could face fines up to...
Since the Christchurch shooting was live-streamed on its platform, Facebook has faced increased public scrutiny. That wasn’t helped by a report earlier this week showing that Facebook allowed Neo-Nazi groups to remain on the platform because they “do not violate community standards.” Now, the company has suddenly changed its tune. On Wednesday, Facebook shared in a blog post that conversations with academics and civil rights groups have led it to (finally) ban white nationalism, writing: “Today we’re announcing a ban on praise, support and representation of white nationalism and separatism on Facebook and Instagram, which we’ll start enforcing next week. It’s clear that these concepts are deeply linked to organized hate groups and have no place on our services.” It’s important to note that Facebook previously excluded white nationalism and white separatism from its ban on white supremacy, as reported by The Associated Press. This is a big move for the same company that...
The French Council of the Muslim Faith (CFCM) has filed a lawsuit against Facebook and YouTube for their mishandling of videos showing the Christchurch shooting, according to Agence France-Presse. Agence France-Presse reported that CFCM’s complaint said it was suing the French branches of the two companies for “broadcasting a message with violent content abetting terrorism, or of a nature likely to seriously violate human dignity and liable to be seen by a minor.” Those types of acts are punishable by three years’ imprisonment and an $85,000 fine, according to Agence France-Presse. The shooting was originally broadcast on Facebook Live. Facebook said it removed 1.5 million videos of the New Zealand shooting in the 24 hours after it streamed. However, Facebook couldn’t identify all of them before upload, and videos exploded across social media. In addition to the livestream, the shooter uploaded a 17-minute video to Facebook, Instagram, Twitter, and YouTube. Each of those...
Since the Christchurch shooting, tech companies have scrambled to keep video of it offline — and they haven’t really succeeded. On Sunday, Microsoft’s president Brad Smith published a blog post addressing the tech industry’s role in tragedies like this one. Many companies have said that they simply weren’t prepared for an event like Christchurch. The video originally streamed on Facebook Live, and the company later said its artificial intelligence couldn’t detect the video in time to stop its spread. However, in his blog post, Smith argued that tech companies should have been prepared in advance. “Ultimately, we need to develop an industrywide approach that will be principled, comprehensive and effective,” Smith wrote. “The best way to pursue this is to take new and concrete steps quickly in ways that build upon what already exists.” Smith noted that “individuals are using online platforms to bring out the darkest sides of humanity,” as demonstrated by Christchurch. The attack was designed to go viral...
A report found that Facebook allowed various Neo-Nazi groups to remain on its platform because they “do not violate community standards,” according to recent reporting from The Independent. The Counter Extremism Project, a nonprofit combating extremist groups, reported 35 pages to Facebook, according to The Independent. Although the company said it’d remove six of them, the other requests were met with this response: “We looked over the page you reported, and though it doesn’t go against one of our specific community standards, we understand that the page or something shared on it may still be offensive to you and others.” – The Independent. The groups reported included international white supremacist organizations, with many making racist or homophobic statements. Some groups also had images of Adolf Hitler and other fascist symbols. Although this is particularly troublesome following the Christchurch shooting — which was broadcast on Facebook Live — this has been a...
On Saturday, New Zealand’s Office of Film & Literature Classification officially banned the Christchurch shooter’s manifesto. By labeling it as “objectionable,” the government is treating the ban as a justifiable limit on freedom of expression. Under the ban, it’s now illegal to have a copy of either the video or the document, or to share them with others — including via online links. The New Zealand government urges people to report social media posts, links, or websites displaying the video or manifesto here. If someone is found to have the manifesto, they can face up to ten years in prison, and those distributing it could face up to 14 years, as reported by Business Insider. The penalties for owning or sharing the video, though, remain unclear. Although the full video is banned, that doesn’t mean screenshots or other still images from it automatically fall under the ban. The Office of Film & Literature Classification website notes images from the video “depicting scenes of violence, injury or...