Lawsuit against Snap over fentanyl deaths can proceed, judge rules

logo of mobile app Snapchat

Image Credits: Denis Charlet / AFP / Getty Images

A lawsuit blaming Snapchat for a series of drug overdoses among young people can proceed, a Los Angeles judge ruled this week.

Family members of children and teens who overdosed on fentanyl sued Snapchat maker Snap last year, accusing the social media company of facilitating illicit drug deals involving fentanyl, a synthetic opioid many times deadlier than heroin. Fentanyl, which is cheap to produce and often sold disguised as other substances, can prove lethal in even extremely small doses.

The parents and family members involved in the lawsuit are being represented by the Social Media Victims Law Center, a firm that specializes in civil cases against social media companies in order to make them “legally accountable for the harm they inflict on vulnerable users.”

The lawsuit, originally filed in 2022 and amended last year, alleges that executives at Snap “knew that Snapchat’s design and unique features, including disappearing messages… were creating an online safe haven for the sale of illegal narcotics.”

“Long before the fatal injuries giving rise to this lawsuit, Snap knew that its product features were being used by drug dealers to sell controlled substances to minors,” Matthew P. Bergman, who founded the Social Media Victims Law Center, said at the time.

Snap disputed the claims, saying that it is “working diligently” to address drug dealing on its platform in coordination with law enforcement. “While we are committed to advancing our efforts to stop drug dealers from engaging in illegal activity on Snapchat, we believe the plaintiffs’ allegations are both legally and factually flawed and will continue to defend that position in court,” a Snap representative told TechCrunch.

In the ruling on Tuesday, Los Angeles Superior Court Judge Lawrence Riff rejected Snap’s effort to get the case dismissed. Snap had argued that the case should be thrown out on the grounds that the social media app is protected by Section 230 of the Communications Decency Act, a law that shields online platforms from liability for user-generated content.

“Courts in California and the Ninth Circuit have explicitly held that Section 230 immunity applies to communications about illegal drug sales and their sometimes-tragic consequences—the exact circumstances here—because the harm flows from third-party content that was exchanged by third parties on the defendant’s social media platform,” Snap’s lawyers argued in their brief last year.

Riff did dismiss four counts against Snap but rejected the company’s efforts to throw out more than 10 others, including negligence and wrongful death claims. He also waded into Section 230’s relevance to the case, but did not conclude that the law’s legal shield should protect Snap outright:

Both sides contend that the law is clear and the legal path forward obvious. Not so. The depth of disagreement is revealed by the parties’ inability jointly to label Snap’s social media presence and activities: “service,” “app,” “product,” “tool,” “interactive course of conduct,” “platform,” “website,” “software” or something else.

What is clear and obvious is that the law is unsettled and in a state of development in at least two principal regards: (1) whether “section 230” (a federal statute) immunizes Snap from potential legal liability under the specific allegations asserted and (2) whether concepts of strict products liability – usually applicable to suppliers of tangible products – already do or now should extend to specified alleged conduct of Snap.

That interpretation is likely to prove controversial, and the ruling is the latest in a flurry of recent cases in which a judge allowed a lawsuit to proceed that might otherwise have been tossed out on Section 230 grounds.

CEOs from Meta, TikTok, Snap, X and Discord head to Congress for kids' online safety hearing

US Capitol building

Image Credits: Bryce Durbin/TechCrunch

CEOs from some of the biggest social platforms will appear before Congress on Wednesday to defend their companies against mounting criticism that they have done too little to protect kids and teens online.

The hearing, set to begin at 10 a.m. ET, is the latest in a long string of congressional tech hearings stretching back for years, with little in the way of new regulation or policy change to show for the efforts.

The Senate Judiciary Committee will host the latest hearing, which is notable mostly for dragging five chief executives across the country to face a barrage of questions from lawmakers. Tech companies often placate Congress by sending legal counsel or a policy executive, but the latest hearing will feature a slate of CEOs: Meta’s Mark Zuckerberg, X (formerly Twitter) CEO Linda Yaccarino, TikTok’s Shou Chew, Discord’s Jason Citron and Snap’s Evan Spiegel. Zuckerberg and Chew are the only executives who agreed to appear at the hearing voluntarily, without a subpoena.

While Zuckerberg is a veteran of these often lengthy, meandering attempts to hold tech companies to account, Wednesday’s televised hearing will be a first for Yaccarino, Spiegel and Citron. Snap and X have sent other executives (or, in X’s case, its former chief executive) in the past, but Discord — a chat app originally designed for gamers — is making its first appearance in the hot seat. All three first-timers could produce some interesting off-script moments, particularly Yaccarino. In recent interviews as X’s top executive, Elon Musk’s pick to lead the company has appeared flustered and combative — a world apart from heavily media-trained peers like Zuckerberg and Chew.

Discord is a very popular app among young people, but it’s still an unusual name to come up in one of these hearings. The committee’s decision to include Discord is likely a result of a report last year from NBC News exploring sextortion and child sexual abuse material (CSAM) on the chat platform. The company’s inclusion is notable given the absence of more prominent algorithm-powered social networks like YouTube — often inexplicably missing from these events — and of Amazon-owned livestreaming giant Twitch.

Wednesday’s hearing, titled “Big Tech and the Online Child Sexual Exploitation Crisis,” will cover much more ground than its narrow title suggests. Lawmakers will likely dig into an array of concerns — both recent and ongoing — about how social platforms fail to protect their young users from harmful content. That includes serious concerns around Instagram openly connecting sexual predators with sellers advertising CSAM, as the WSJ previously reported, and the NBC News investigation revealing that Discord has facilitated dozens of instances of grooming, kidnapping and other forms of sexual exploitation in recent years.

Beyond concerns that social platforms don’t do enough to protect kids from sexual predation, expect lawmakers to press the five tech CEOs on other online safety concerns, like fentanyl sellers on Snapchat, booming white supremacist extremism on X and the prevalence of self-harm and suicide content on TikTok. And given the timing of X’s embarrassing failure to prevent a recent explosion of explicit AI-generated Taylor Swift imagery and the company’s amateurish response, expect some Taylor Swift questions too.

The tech companies are likely to push back, pointing lawmakers to platform and policy changes in some cases designed to make these apps safer, and in others engineered mostly to placate Congress in time for this hearing. In Meta’s case, that looks like an update to Instagram and Facebook last week that prevents teens from receiving direct messages from users they don’t know. Like many of these changes from companies like Meta, it raises the question of why these safeguards continue to be added on the fly instead of being built into the product before it was offered to young users.

KOSA looms large

This time around, the hearing is part of a concerted push to pass the Kids Online Safety Act (KOSA), a controversial piece of legislation that ostensibly forces tech platforms to take additional measures to shield children from harmful content online. In spite of some revisions, the bill’s myriad critics caution that KOSA would aggressively sanitize the internet, promote censorship and imperil young LGBTQ people in the process. Some of the bill’s conservative supporters — including co-sponsor Sen. Marsha Blackburn — have stated outright that KOSA should be used to effectively erase transgender content for young people online.

The LGBTQ advocacy group GLAAD expressed its concerns about the hearing and related legislation in a statement provided to TechCrunch, urging lawmakers to ensure that “proposed solutions be carefully crafted” to avoid negatively impacting the queer community.

“The US Senate Judiciary Committee’s hearing is likely to feature anti-LGBTQ lawmakers baselessly attempting to equate age-appropriate LGBTQ resources and content with inappropriate material,” GLAAD said. “… Parents and youth do need action to address Big Tech platforms’ harmful business practices, but age-appropriate information about the existence of LGBTQ people should not be grouped in with such content.”

The ACLU and digital rights organization the EFF have also opposed the legislation, as have other groups concerned about the bill’s implications for encryption. Similar concerns have followed the Children and Teens’ Online Privacy Protection Act (now known as “COPPA 2.0“), the STOP CSAM Act and the EARN IT Act, adjacent bills purporting to protect children online.

The bill’s proponents aren’t all conservative. KOSA enjoys bipartisan support at the moment and the misgivings expressed by its critics haven’t broken through to the many Democratic lawmakers who are on board. The bill is also backed by organizations that promote children’s safety online, including the American Academy of Pediatrics, the National Center on Sexual Exploitation and Fairplay, a nonprofit focused on protecting kids online.

“KOSA is a needed corrective to social media platforms’ toxic business model, which relies on maximizing engagement by any means necessary, including sending kids down deadly rabbit holes and implementing features that make young people vulnerable to exploitation and abuse,” Josh Golin, executive director of Fairplay, said in a statement provided to TechCrunch. Fairplay has also organized a pro-KOSA coalition of parents who have lost children in connection with cyberbullying, drugs purchased on social platforms and other online harms.

As of last week, KOSA’s unlikeliest supporter is one of the companies that the bill seeks to regulate. Snap split from its peers last week to throw its support behind KOSA, a move likely intended to endear the company to regulators that could steer its fate — or perhaps more importantly, the fate of TikTok, Snap’s dominant rival, which sucks up the lion’s share of screen time among young people.

Snap’s decision to break rank with its tech peers and even its own industry group on KOSA echoes a similar move by Meta, then Facebook, to support a controversial pair of laws known as FOSTA-SESTA back in 2018. That legislation, touted as a solution to online sex trafficking, went on to become law, but years later FOSTA-SESTA is better known for driving sex workers away from safe online spaces than it is for disrupting sex trafficking.

Snap recalls discontinued Pixy drone over fire risk

Snap Pixy Drone

Image Credits: Snap Inc.

Snap and the Consumer Product Safety Commission (CPSC) are recalling the company’s discontinued Pixy drone and telling owners to stop using it or charging the batteries because of a fire hazard. This comes after the safety agency received “four reports of the battery overheating and bulging, resulting in one minor battery fire and one minor injury.”

The recall, first spotted by The Verge, involves a refund of at least $185, rising to $250 for those who bought the “flight pack” bundle. Owners can also get between $40 and $50 for the Pixy’s extra charger and battery combo, and between $16 and $20 for spare batteries. More information about the recall is available at support.pixy.com. Anyone who purchased the drone from Snap or Amazon, or received it as a gift, is eligible.

Snap revealed the palm-sized drone camera in April 2022 after years of rumors, and apparently moved tens of thousands of units, according to the CPSC’s recall listing (though that figure includes batteries sold separately) — not a shabby accomplishment considering the company backed away from the product only four months later.

The Pixy was always limited in functionality. It was designed to be used without a controller, or even a memory card. But CEO Evan Spiegel had high hopes for the project, once claiming the market for personal drones was even bigger than camera glasses — the other hardware category Snap has stumbled through.