Last summer, Snoop Dogg and Australian winery 19 Crimes formed a partnership to release a special red wine blend in honor of the Hip-Hop legend’s legacy. For their latest release, the two entities have now introduced Snoop Cali Rosé, a new California blend of Grenache and Zinfandel that the rapper hopes will become the drink of the summer, Wine Industry Network reports. So far, 19 Crimes and Snoop Dogg’s collaboration has been well-received among consumers and has given the Australian wine brand a contemporary feel inspired by the rapper’s West Coast style. “We did it big with 19 Crimes Cali Red, so you know we had to do it again – and this time, I was thinking pink,” Snoop Dogg shared in a statement. “I can’t wait for everyone to sip on my Snoop Cali Rosé and bring those fresh feels from spring into summer and beyond. I hope when you open a bottle of this wine you take a little mind trip to my Cali home. This is how we Rosé the Snoop Dogg way!” John Wardley — TWE Marketing Vice...
In August 2017, over 500 white supremacists descended upon Charlottesville, Virginia for a “Unite the Right” rally. There, Neo-Nazi James Fields plowed through a crowd of counterprotesters, leaving dozens injured and killing Heather Heyer. The events of the rally were harrowing, not only because they confirmed what many people of color already knew about white supremacists’ ability to organize and march freely, but also because of how it was planned. To organize the Unite the Right rally, white supremacists used Discord, a popular private platform. The use of Discord to plan a white supremacist rally — and the subsequent exposure of just how deep hate on the platform went — speaks to a larger trend of hate on gaming platforms. Initially designed for the gaming community, Discord’s use as a safe space for white supremacists online should spur questions about how and why gaming spaces, in particular, are used to promote and facilitate white supremacist violence....
In early March, a shooter massacred 51 Muslims at mosques in Christchurch, New Zealand during Friday prayers. The horrifying nature of the act became worse when people learned that the shooter had live streamed 17 minutes of the attack through Facebook Live. Tech companies scrambled to delete videos of the shooting — which could still be found on both Facebook and Instagram in early May. The shooting eventually posed deeper questions around the dangers of live streaming. Live streaming was introduced to the world as a way to help show people experiences in real-time. Despite its original intentions, it has become yet another way that people can use the internet to spread hate. Following the Christchurch shooting, Facebook came under fire for its lack of regulations, which allowed the video to both stream and spread. The company ended up imposing tighter restrictions on live streaming to prevent future abuse. A key aspect of Facebook’s restrictions was its new “one-strike” policy. If...
Over the past few years, hate crimes have steadily increased across the country. Despite being notoriously underreported — or simply not taken seriously when brought to authorities — the FBI still found that between 2016 and 2017, hate crimes rose 17 percent. It’s no stretch to correlate the rise of hate crimes with the United States’ political climate, including the election of President Donald Trump and the subsequent rhetoric that his administration ushered in. With the political climate continuing to worsen, it’s important to be aware of how hate operates online too. According to a survey from the Anti-Defamation League, 2018 turned out to be a record year for online hate speech. Social media platforms ranked especially high, with over half of all respondents (56 percent) saying they experienced hate on Facebook; meanwhile, Twitter and YouTube clocked in at 19 percent and 17 percent, respectively. Often, people separate what happens online from “real life,” as if the digital...
On March 15, shootings at two mosques during Friday prayers in New Zealand left 51 Muslims dead. The massacre itself was horrific, but what made it even worse was that the shooter live streamed the event on Facebook. Tech companies scrambled to remove the video after it appeared online, but they were unable to do so. Less than two weeks after the shooting, New Zealand officially banned people from sharing the Christchurch shooter’s manifesto or video. Consequences for owning or sharing the video weren’t initially made clear. Then on Tuesday, a Neo-Nazi was sentenced to 21 months in prison for sharing a video of the Christchurch massacre, Gizmodo reported. Philip Arps pled guilty to two counts of distributing objectionable material, New Zealand outlet RNZ reported. Arps sent it to about 30 people, but that’s not where it ended. He told the judge overseeing his case that the video — where bodies of dead children are visible — was “awesome.” In addition, RNZ reported that Arps asked...
On March 15, at least 50 Muslims were killed at two mosques in Christchurch, New Zealand. The shooter live-streamed the event on Facebook, and since then, the company has been scrambling to remove copies of it across its platform. Now, a CNN report reveals videos of the shooting can still be found on both Facebook and Instagram. Eric Feinberg — a founding member of the Global Intellectual Property Enforcement Center — found nine videos. Each of them had originally been put up the week of the attack. One copy of the video on Instagram triggered the platform’s “sensitive content” feature. Still, the video had been viewed more than 8,000 times and was only taken down after CNN showed it to Facebook on Wednesday. In the first 24 hours after the shooting was broadcast, Facebook removed 1.5 million videos globally, of which 1.2 million were blocked at upload. On March 21, Facebook then created a blog post explaining how it handled videos of the Christchurch shooting....
The Christchurch massacre’s live-stream on Facebook — and its subsequent spread across the internet — further illuminated the hate problem on social media that critics have pointed to for years. It left countries across the world scrambling to force tech companies to answer for their role in white nationalism’s presence online. On Wednesday, New Zealand’s Prime Minister Jacinda Ardern announced that she’s planning a summit in Paris alongside French President Emmanuel Macron. The summit’s goal is to have industry and world leaders agree to a pledge called the “Christchurch Call” to eliminate terrorist and violent extremist content online. In the announcement, Ardern said: “The March 15 terrorist attacks saw social media used in an unprecedented way as a tool to promote an act of terrorism and hate. We are asking for a show of leadership to ensure social media cannot be used again the way it was in the March 15 terrorist attack…We all need to act, and that includes social media providers taking...
The Sri Lankan government has shut down several social media platforms out of fear that misinformation could spread following the deadly attacks on Easter Sunday that killed 300 people and injured more than 500. Facebook, WhatsApp, Instagram, YouTube, Viber, and Snapchat were all blocked in the country as of yesterday. This is the second time Sri Lanka has shut down social media in the country. In March of last year, the country banned Facebook, WhatsApp, and Instagram for a week to impede the spread of hateful posts and comments toward Muslims during protests. The move is part of a larger trend of big tech companies not being trusted to distribute reliable information or protect their platforms from being used for harm after a violent event. Weeks earlier, following the New Zealand mass shooting that was live-streamed on Facebook, conspiracy theories and copies of the video flooded platforms. New Zealand’s government and social media platforms began taking down the videos. Plus, the...
Last week, Sony announced it would be rolling out a new feature that allows users to change their PSN name — the ID that follows you throughout the PlayStation gaming world — freely and openly. It’s a big step for Sony, since Microsoft has allowed Xbox users to do so for years. While gamers are very excited about the change (not being tied to one name while you’re owning people in Overwatch is very exciting), there’s one issue. The new feature has the potential to stir up one of the gaming world’s oldest problems: hate speech and offensive language. Sony recognizes the problems that allowing people to change their PSN names more frequently could bring, and the company is trying to get ahead of it. A company blog post last week outlined that any PSN names using racial slurs, profanity, or any other offensive language that breaks Sony’s terms of service would be automatically changed to “TempXXX.” This punishment is actually light, considering that PlayStation used to ban people...
The French Council of the Muslim Faith (CFCM) has filed a lawsuit against Facebook and YouTube for their mishandling of videos showing the Christchurch shooting, according to the Agence France-Presse. Agence France-Presse reported that CFCM’s complaint said they were suing the French branches of the two companies for “broadcasting a message with violent content abetting terrorism, or of a nature likely to seriously violate human dignity and liable to be seen by a minor.” Those types of acts are punishable by three years imprisonment and an $85,000 fine, according to the Agence France-Presse. The shooting was originally broadcast on Facebook Live. Facebook said it removed 1.5 million videos of the New Zealand shooting in the 24 hours after it streamed. However, Facebook couldn’t identify all of them before upload, and videos exploded across social media. In addition to the livestream, the shooter uploaded a 17-minute video to Facebook, Instagram, Twitter, and YouTube. Each of those...
Since the Christchurch shooting, tech companies have scrambled to keep video of it offline — and they haven’t really succeeded. On Sunday, Microsoft’s president Brad Smith published a blog post addressing tech and its role in tragedies like this. Many companies have said that they simply weren’t prepared for an event like Christchurch. The video originally streamed on Facebook Live and later, the company said its artificial intelligence couldn’t detect the video to stop its spread. However, in his blog post, Smith argued that tech companies should have been prepared in advance. “Ultimately, we need to develop an industrywide approach that will be principled, comprehensive and effective,” Smith wrote. “The best way to pursue this is to take new and concrete steps quickly in ways that build upon what already exists.” Smith noted that “individuals are using online platforms to bring out the darkest sides of humanity,” as demonstrated by Christchurch. The attack was designed to go viral...
A report found that Facebook allowed various Neo-Nazi groups to remain on its platform, citing that they “do not violate community standards,” according to recent reporting from The Independent. The Counter Extremism Project, a nonprofit combating extremist groups, reported 35 pages to Facebook, according to The Independent. Although the company said it’d remove six of them, the other requests were met with this response: “We looked over the page you reported, and though it doesn’t go against one of our specific community standards, we understand that the page or something shared on it may still be offensive to you and others.” – The Independent. The groups reported included international white supremacist organizations, with many making racist or homophobic statements. Some groups also had images of Adolf Hitler and other fascist symbols. Although this is particularly troublesome following the Christchurch shooting — which was broadcast on Facebook Live — this has been a...
On Saturday, New Zealand’s Office of Film & Literature Classification officially banned the Christchurch shooter’s manifesto. By labeling it as “objectionable,” the government considers the ban a justifiable limit on freedom of expression. Under the ban, it’s now illegal to have a copy of either the video or the document and to share it with others — including online links. The New Zealand government urges people to report social media posts, links or websites displaying the video or manifesto here. If someone is found to have the manifesto, they can face up to ten years in prison, and those distributing it could face up to 14 years, as reported by Business Insider. The consequences for owning or sharing the video, though, are unclear. Although the full video is banned, that doesn’t mean screenshots or other still images from it fall under the ban. The Office of Film & Literature Classification website notes images from the video “depicting scenes of violence, injury or...
On Wednesday, in response to continued pressure from multiple countries, Facebook published a blog by Guy Rosen, Facebook’s VP of product management, providing further details on the company’s response to the Christchurch shooting in New Zealand that left 50 people dead. Companies like Facebook often use artificial intelligence to identify content that should be removed. However, an overreliance on AI can lead to exactly what happened with the Christchurch video. The shooting was broadcast on Facebook Live, and afterward, videos continued to spread across the internet. Although Facebook continues to cite that the video was only viewed about 200 times during its live broadcast — and that nobody flagged it to moderators — those excuses aren’t cutting it for users and lawmakers. Now, the company is saying it tried to use an experimental audio technology to catch copies of the video that its AI missed. Facebook wrote that it “employed audio matching technology to...
Since the Christchurch massacre, social media platforms have scrambled to keep video of it off their platforms. Days after the attack, it’s not difficult to find clips or still images from it. To many, this opens up questions about tech companies’ failures to regulate hate on their platforms, and who shares responsibility in moments like this. After the shooting, in which at least 50 Muslims were killed in two New Zealand mosques, archives of the alleged shooter’s page revealed only 10 people had tuned into his Facebook Live broadcast of the event, according to The Wall Street Journal. Although the original video didn’t have many viewers, it exploded across social media in the days following the attack. Facebook, which has faced the brunt of criticism due to its site hosting the livestream, says it removed 1.5 million videos of the New Zealand shooting in the 24 hours after it was broadcast. In the first 24 hours we removed 1.5 million videos of the attack globally, of...