
Update on Social Media Addiction Litigation

By Susan Barfield
September 7, 2022

Susan Barfield (00:06):
Hello everyone. Thank you for joining another Case Works stream. Today, we are joined by Matthew Bergman. Matthew is the founder of the Social Media Victims Law Center, the leading national firm on social media litigation. Matthew works to hold social media companies legally accountable for the harm they inflict on vulnerable users. The Social Media Victims Law Center seeks to apply principles of product liability to force social media companies to elevate consumer safety and design safer platforms that protect users from foreseeable harm. Matthew, thanks so much for joining us today.

Matthew Bergman (00:46):
Well, thanks for asking me, yeah. Happy to be here.

Susan Barfield (00:50):
Yeah. Awesome. Well, for those who don’t know, could you please share with our listeners some of your background and what made you decide to start tackling some of these complex societal issues tied to social media?

Matthew Bergman (01:00):
Well, I’ve been a product liability lawyer for 25 years. Primarily, my practice has been representing the victims of asbestos disease. But I wanted to transition my practice and focus on not simply compensating plaintiffs and victims, but trying to do something so that people didn’t become victims in the first place. And about that time, Francis Haugen came out with her incredible revelations about what the social media companies knew. And the surgeon general came out with his report on the mental health crisis that American teens are experiencing. And to me, this was just kind of a modern incarnation of the asbestos litigation. This was the Sumner Simpson papers all over again. This was Dr. Selikoff all over again. And yet the harms were so much greater.

Matthew Bergman (01:54):
We were talking about children who are just being ravaged by these products, and companies so knowledgeable about what their products are doing to our kids, and so willing to continue to profit from them, that they make the asbestos companies look like choir boys. And so I decided to form the Social Media Victims Law Center for the purpose of representing parents whose children are injured or in some cases killed through social media addiction and abuse. And in that time, we’ve taken the lead in filing. We now have 30 cases filed across the country. We have hundreds of clients and parents who we’re working with to try to hold these companies accountable. This is new litigation. This is not a sure thing and this is not a cakewalk. But we’re committed to moving forward to hold these companies accountable to protect our kids.

Susan Barfield (02:48):
Yeah. Well, for those who maybe haven’t been following you in the news or haven’t read all the stories, will you share what this litigation’s about and some of the specific cases that have been making the press? I read a little bit about Selena Rodriguez. Maybe that’s a case you could share, or maybe another one that comes to mind.

Matthew Bergman (03:05):
Well, the first case that we filed involved Selena Rodriguez, a child who took her life at the age of 11 after becoming so addicted to social media that she’d become physically violent when her phones were taken away or the Wi-Fi was cut off. This 11-year-old child was being subjected to sexual abuse online, being encouraged and paid to expose herself in horrible ways, being kept up at all hours of the day and night. And ultimately she took her life. And she did so on Snapchat, posting it for everyone to see. Horrifically, this is a phenomenon that we’re seeing in many cases – children taking their lives on social media, or holding their phones when they breathe their last breath. And this is a direct consequence of the design decisions that these companies have made. They have designed these products to maximize user engagement over all other things. They use operant conditioning and artificial intelligence to make their products addictive. They know that content that is psychologically discordant is more likely to be engaged with by children. So they deliberately design algorithms that make their products progressively more damaging to children. A 13-year-old girl who’s interested in exercise is, in short order, led through these algorithms to sites that encourage anorexic behavior – anorexic porn, for lack of a better word. And these children in many cases develop severe eating disorders and in some cases lose their lives. These algorithms are pernicious in other ways. An African American young man who’s interested in music is profiled and directed to sites that promote gun violence and gang violence. This is not an accident and it’s not a coincidence. We’ve also filed cases involving three children who died of the TikTok blackout challenge. That’s a challenge that circulates on TikTok and encourages children to choke themselves and then release themselves. In many cases, children are unable to release themselves and die. And again, there’s nothing accidental and there’s nothing coincidental about it. This is woven into the warp and woof of these products’ design, and yet, unless they’re held accountable, they’re gonna keep doing it, because their profits dictate that they put engagement over safety. And we’re here to put an end to that.

Susan Barfield (05:53):
Yeah. You know, you mentioned suicide and social media addiction, and you shared a couple of other cases, but what other negative psychological implications are you seeing in these cases?

Matthew Bergman (06:05):
We’re seeing a number of kids, predominantly girls but not only girls, who are led toward anorexic content. We have a case of a 12-year-old girl who got on TikTok, was interested in exercise, and was shortly being subjected to sites that encouraged her to starve herself. She spent three weeks in the hospital. We have other children who are continuing to struggle with eating disorders. We also have a pervasive problem of children, predominantly girls but not only girls, being direct messaged online by adult predators, sent obscene pictures of male genitalia, and encouraged to expose themselves. And this is the rule rather than the exception. We have over a thousand clients, and 70% of our cases involve young girls. In 85% of those cases, these girls are subjected to unsolicited sexual misconduct. In some cases, they even progress toward exposing themselves, and in some cases toward engaging in sexual trafficking themselves. And again, this is all off-the-shelf technology. It can be prevented – they just don’t do it because it would cost them money. We also have a lot of cases involving attempted suicide, severe depression, and anxiety.

Susan Barfield (07:40):
I feel like I know the answer to this, but is it your belief that social media, or the specific platforms, are defective products and/or malicious in design?

Matthew Bergman (07:53):
Well, both. I mean, this is so much worse than the asbestos companies, you know? Asbestos was a chronic problem, and the companies knew that over a period of time, individuals who were exposed significantly would develop disease. In this case, kids are being exposed right now, and their own documents confirm that. Meta’s documents confirm that a third of the girls who are on Instagram have negative body image. They know that 7% of the unwanted direct messages toward children are sexual content. They know this and they don’t change it. And they don’t change it because it would somewhat decrease their profitability. And currently, in the absence of product liability, they have no incentive to change their behavior. So that’s what we’re trying to change. One of the Meta documents says, and I quote, “tweens are herd animals.” And I would submit that a company that refers to our children as animals is not a company that’s gonna respond to moral persuasion or social responsibility. What they’re gonna respond to is their bottom line. When they actually have to pay the costs of what their defective products are doing to our kids, they’ll change their behavior – not before and not until then. And that’s what we’re here to do.

Susan Barfield (09:13):
Will you talk to us a little bit about Section 230 and how you’re planning to combat it?

Matthew Bergman (09:20):
Well, without going into any details of our legal strategy: Section 230 is a statute that was enacted in the nineties, ironically, to protect kids from online abuse. It provided certain types of immunity for third-party content hosted on social media – well, social media didn’t exist then; it was basically online bulletin boards. And it was enacted at a time when Netscape was the largest online provider. Google didn’t exist. Facebook didn’t exist. Mark Zuckerberg was in junior high school. And it’s been broadly interpreted to provide extreme protections to social media companies for all sorts of terrible things that befall their users. I think there’s a widespread consensus in the halls of Congress, and more and more among jurists, that Section 230 has been interpreted far beyond its original legislative intent. And I believe it has been used to facilitate these horrific design decisions that these companies make. Look, everybody – every man, woman, and child in the United States – has a duty of reasonable care. Everybody has a duty to be careful, right? And everybody has a duty to use reasonable care to avoid foreseeable harms – except the social media companies. They know their products are hurting people. Snapchat knows that its disappearing messages are being used as a conduit for drug dealers and pedophiles. They know this, and they have no incentive to change it because they feel that they’re immune. We don’t believe this is what Congress intended, and we’re seeing growing consensus throughout all realms of society, the academy, and the judiciary that a more realistic interpretation of the statute has to prevail, more in keeping with its original statutory intent.

Susan Barfield (11:35):
I’m sure the social media platforms blame or point the finger at the parents. What can parents do to protect their children, given the limitations that are put in place to prevent parental control and access, and how are you helping people?

Matthew Bergman (11:50):
We’re all in favor of parental responsibility. The problem is that these products are deliberately designed to evade and thwart parents’ exercise of responsibility and oversight over their kids. Instagram encourages kids to open multiple accounts. They call it, in their own words, “a value add.” And the age verification that children have to do – they can say whatever age they want. If I’d been able to go into a convenience store when I was in high school, say “I’m 21,” and get whatever I wanted, I think my high school career would’ve been a little different. Yet that’s what these companies are allowing kids to do. The other thing they’re not doing is verifying age and identity. You know what’s interesting? Apps like Tinder and Grindr – hookup apps – actually have technology embedded in them to confirm the identity of the people who are engaging with each other. And I’d submit that if technology that’s set up for people to hook up can provide these protections, why aren’t the same protections being provided for our children? But the other thing is that they make it impossible for parents who know that their children are having trouble online to do anything about it if the parents don’t know the username and password. And, you know, show me a 14-year-old kid who gives their parents their username and password, and I’ll show you a kid who flunks Teenager 101. There are so many things that these companies could easily do to encourage and empower parents, and instead they’re making it very difficult for parents. But in terms of examples for parents, a couple of things come to mind. First, parents should lead by example. Parents who are always online and aren’t engaging with their spouses or their families are setting a bad example for their kids. I think families need to have a “phone-free time” during meals and socializing, just so that there’s engagement. I think parents need to be very – and I hate to use this word – invasive in monitoring their kids’ social media use. This is not a place where parents can have a hands-off approach. If parents only knew that their 10- and 12-year-olds are getting – and I hate to use the word – dick pics from adults, and that this is pervasive, they’d say “enough is enough.” So I think parents need to do everything they can to educate their kids and to foster open relationships with their kids, so that they can talk freely about embarrassing things and really understand how pernicious and deadly these products are to impressionable young kids.

Susan Barfield (14:52):
Yeah. Now, what is the “No Parental Consent” campaign you and your team are facilitating?

Matthew Bergman (14:59):
We have a program online – our website is socialmediavictims.org – where parents can notify social media companies that they don’t consent to their kids using their products. You know, some of the products, like Snapchat, actually say you require parental consent to use them – it’s in their user agreement – but there’s no effort made to ensure that the parents actually consent. So this is an opportunity for parents to direct an email to the leadership of the company saying, “We don’t consent to our kids’ use – take them off.” And that’s something I think every parent should do, because there’s very little good that can come from kids’ unregulated social media use.

Susan Barfield (15:47):
Right. What’s your thought about these cases being consolidated into an MDL?

Matthew Bergman (15:53):
Well, it’s currently pending before the MDL panel. There’s gonna be a hearing on the 29th, and the panel typically rules fairly quickly thereafter. We think there are a lot of good reasons for consolidation, but we also think there are some reasons for concern. We’re supportive of consolidation, but we understand that the panel might not feel that it’s opportune at this time. And so we are committed to going forward with these cases, in a consolidated form or not – this has to move forward. Something needs to be done. These companies have to be held accountable, or else we’re going to continue to see the carnage that we’re seeing. I mean, there’s been a 146% increase in suicide among 12-to-16-year-olds since social media came to the fore. This is code red. Something needs to be done, not just in law, but in the halls of Congress and in the minds and hearts of parents. And we’re committed, regardless of whether or not the cases are consolidated, to moving forward with alacrity to try to hold these companies accountable and do something to save our kids. You know, I’ve been a plaintiff’s lawyer for 25 years, and I’ve been blessed to represent clients who are worthy and thoughtful and careful and meritorious people who have been harmed and deserve compensation. But I’ve never had clients less concerned about money and more concerned about social good than the people I’m privileged to represent. Just yesterday, I spoke with three parents who lost their children to social media addiction and abuse. And every one of those parents, when asked, “Why are you going forward with this case?”, said, “To save other kids from what’s happening.” And they say that if we can save one family, it’s been worth it. And look, the Talmud says, “If you save one life, it’s like you saved the world.”

Susan Barfield (18:00):
Yeah.

Matthew Bergman (18:01):
And we’re convinced that if through our efforts we save even one child, it’s been worth it. And that’s why we’re doing this. It’s the most rewarding, challenging, and heart-rending work I’ve ever done in 30 years of law practice. But I can’t imagine myself doing anything but this.

Susan Barfield (18:21):
Yeah. I can only imagine the heaviness of those conversations you’re having day in, day out, but like you said, it’s life-altering. And to be able to help prevent this for another family, another child, is just amazing. Thinking about the attorneys listening to this webinar – what do you say to attorneys who may be considering litigating these cases, and how do they contact you?

Matthew Bergman (18:50):
Well, socialmediavictims.org is our website. And we provide not just litigation information, but a depth of information for parents and practitioners on what the state of the art is regarding social media and teen harms, some of the things to watch out for, and the like. But this is new litigation. This is very much like what asbestos litigation was like in 1966, when Ward Stephenson filed the [inaudible] case in Orange County, Texas. This is the first, you know. And so we’re very much at the forefront. I think it’s important that our lawyers work together, that they share ideas, and that they’re very thoughtful about how these cases are brought and how they’re prosecuted, and about what types of cases merit prosecution and what don’t.

Susan Barfield (19:43):
Mm-hmm.

Matthew Bergman (19:44):
You know, if the standard for a case is “did somebody have an adverse emotional impact from social media?”, we’d be talking about every American 30 years or younger.

Susan Barfield (19:57):
Sure.

Matthew Bergman (19:57):
So I think it’s important to identify harms that are legally cognizable in the context of social media addiction and abuse, to think very carefully about the cases that get filed, and to move very judiciously, doing a lot of due diligence on each case and making sure that it’s the right case to go forward at this time. You know, we’re at the forefront of something significant. We’re confident that ultimately the legal system will adapt to provide an avenue for these kinds of claims. But the only certainty at this juncture is that it’s gonna be a long, hard fight – to paraphrase Thomas Hobbes, “This is gonna be nasty, brutish, and long.” So I don’t think it’s something that someone wants to go into lightly. I think it’s important to have colleagues, co-counsel, and supporters to work through these difficult issues and bring these cases, hopefully, to a successful conclusion.

Susan Barfield (20:57):
Yeah. Well, Matthew, this has been very insightful and informative. I really appreciate you taking time to talk about this topic, about what you and your team are doing, and about the families that you’re helping. So thank you so much for taking time today to connect with me and to share your insights on this litigation with all of our listeners.

Matthew Bergman (21:17):
Well, thank you very much. And thank you for taking this time. This is a very important issue and we’re happy to be here.

Susan Barfield (21:23):
Yeah, absolutely.
