#facebook meta lawsuit mental health
penpoise · 1 year ago
Text
Ellie Mental Health: Navigating the Path to Well-Being
I. Meet Ellie: Your Friendly Mental Health Companion
Welcome to the world of Ellie, a unique ally on your journey to better mental health.
A. A Friend in Need
Discover the story behind Ellie, born from the idea of creating a supportive friend to navigate the twists and turns of mental health.
B. The Magic of Ellie's Features
Unravel the enchanting features that make Ellie more than just an…
0 notes
tomorrowusa · 2 months ago
Text
Being a content moderator on Facebook can give you severe PTSD.
Let's take time from our holiday festivities to commiserate with those who have to moderate social media. They witness some of the absolute worst of humanity.
More than 140 Facebook content moderators have been diagnosed with severe post-traumatic stress disorder caused by exposure to graphic social media content including murders, suicides, child sexual abuse and terrorism.
The moderators worked eight- to 10-hour days at a facility in Kenya for a company contracted by the social media firm and were found to have PTSD, generalised anxiety disorder (GAD) and major depressive disorder (MDD) by Dr Ian Kanyanya, the head of mental health services at Kenyatta National hospital in Nairobi.
The mass diagnoses have been made as part of a lawsuit being brought against Facebook's parent company, Meta, and Samasource Kenya, an outsourcing company that carried out content moderation for Meta using workers from across Africa.
The images and videos, including necrophilia, bestiality and self-harm, caused some moderators to faint, vomit, scream and run away from their desks, the filings allege.
You can imagine what now circulates on Elon Musk's Twitter/X, which has ditched most of its moderation.
According to the filings in the Nairobi case, Kanyanya concluded that the primary cause of the mental health conditions among the 144 people was their work as Facebook content moderators, as they "encountered extremely graphic content on a daily basis, which included videos of gruesome murders, self-harm, suicides, attempted suicides, sexual violence, explicit sexual content, child physical and sexual abuse, horrific violent actions just to name a few". Four of the moderators suffered from trypophobia, an aversion to or fear of repetitive patterns of small holes or bumps that can cause intense anxiety. For some, the condition developed from seeing holes in decomposing bodies while working on Facebook content.
Being a social media moderator may sound easy, but you will never be able to unsee the horrors which the dregs of society wish to share with others.
To make matters worse, the moderators in Kenya were paid just one-eighth what moderators in the US are paid.
Social media platform owners hold wealth rivaling the GDPs of some countries. They are among the greediest leeches in the history of money.
33 notes · View notes
stupittmoran · 1 year ago
Text
The state of New Mexico has sued social media giant Meta and its CEO Mark Zuckerberg for “knowingly” exposing children to ‘sexual exploitation and mental health harm.’
In a Tuesday court filing, the New Mexico Attorney General's (NMAG) Office revealed that it had conducted an undercover investigation, creating fake accounts posing as minors that were then used to fish for offending content, according to a press release reported by the Daily Caller.
“Meta and its CEO tell the public that Meta’s social media platforms are safe and good for kids,” reads the lawsuit. “The reality is far different. Meta knowingly exposes children to the twin dangers of sexual exploitation and mental health harm. Meta’s conduct has turned New Mexico children who are on its platforms into victims. Meta’s motive for doing so is profit.”
Meta is accused of allowing Facebook and Instagram to become “a marketplace for predators in search of children upon whom to prey.”
56 notes · View notes
head-post · 6 months ago
Text
Instagram makes teen accounts private for safety purposes
Instagram is making teen accounts private by default in an attempt to keep the platform safer for children amid a growing backlash, according to AP News.
Starting Tuesday in the US, UK, Canada and Australia, anyone under 18 who signs up for Instagram will be put into restricted teen accounts. Those who already have accounts will be transferred over the next 60 days. Teens in the European Union will see their accounts adjusted later this year.
Meta acknowledges that teens can lie about their age and says they will be required to confirm their age in more cases. The Menlo Park, California-based company has also stated that it is developing technology that preemptively finds teen accounts posing as adults and automatically puts them into restricted teen accounts.
Teen accounts will be private by default. Private messages are limited, so teens can only receive them from people they follow or are already connected to.
"Sensitive content" will be limited, Meta stated. Teens will also receive notifications if they are on Instagram for more than 60 minutes. "Sleep mode" will turn off notifications and send auto-replies to private messages from 10 p.m. to 7 a.m.
Naomi Gleit, head of product at Meta, stated:
The three concerns we're hearing from parents are that their teens are seeing content that they don't want to see, or that they're getting contacted by people they don't want to be contacted by, or that they're spending too much time on the app. So teen accounts is really focused on addressing those three concerns.
The announcement came amid lawsuits from dozens of US states accusing the company of harming young people and contributing to the youth mental health crisis. Instagram and Facebook allegedly intentionally create features that tie children to their platforms.
Meta’s latest changes give parents more control over their children’s accounts. Those under the age of 16 will need permission from a parent or guardian to change their settings to be less restrictive.
According to Nick Clegg, Meta’s president of global affairs, teen accounts will create a “big incentive for parents and teens to set up parental supervision.”
Parents will be able to see, via the family center, who is messaging their teen and hopefully have a conversation with their teen. If there is bullying or harassment happening, parents will have visibility into who their teen is following, who is following their teen, and who their teen has messaged in the past seven days, and hopefully have some of these conversations and help them navigate these really difficult situations online.
Read more HERE
3 notes · View notes
hunter-rodrigez · 1 year ago
Text
33 US states are suing Facebook!!!
(October 24th 2023)
Dozens of US states, including California and New York, are suing Meta Platforms Inc. for harming young people and contributing to the youth mental health crisis by knowingly and deliberately designing features on Instagram and Facebook that addict children to its platforms.
A lawsuit filed by 33 states in federal court in California claims that Meta routinely collects data on children under 13 without their parents’ consent, in violation of federal law. In addition, nine attorneys general are filing lawsuits in their respective states, bringing the total number of states taking action to 41, plus Washington, D.C.
LET'S FUCKING GOOOOOO
(Source)
5 notes · View notes
srkshaju · 1 year ago
Text
Content Moderation: A Toxic Cocktail of Algorithms and Burnout
Meta's dirty secret is finally spilling out in a Spanish courtroom.
A Barcelona court has ruled that the social media giant's local subcontractor, CCC Barcelona Digital Services, is responsible for the psychological damage suffered by a worker exposed to graphic content on Facebook and Instagram.
This landmark decision exposes the dark underbelly of content moderation, where human beings are treated as disposable filters for the internet's sewer.
Imagine facing a barrage of murders, suicides, and torture every day. That's the reality for countless low-paid workers hidden behind the shiny screens of Facebook and Instagram.
Our 26-year-old Brazilian friend, after five years of this mental assault, is suffering from panic attacks, anxiety, and a crippling fear of death. Meta, however, seems content to deny the problem, treating his illness as a mere "common ailment."
But the court saw through their charade. This ruling is a major victory, not just for this brave individual, but for all workers whose mental health is sacrificed on the altar of social media engagement.
Espacio Jurídico Feliu Fins, the worker's law firm, put it perfectly: "Meta and social media in general must recognize the magnitude of this problem... They must accept that this horrific reality is as real as life itself."
Meta's outsourcing game is sickening. They dump the dirty work of filtering their toxic content onto third-party subcontractors, who in turn exploit young, vulnerable workers for pennies.
Remember the $52 million settlement in the US? Or the lawsuit in Africa against Sama, another Meta subcontractor? This is not an isolated incident; it's a systemic pattern of exploitation and neglect.
Meta's defense? A laughable cocktail of excuses and empty promises. They claim their contracts with subcontractors include provisions for counseling and support, but workers tell a different story.
They talk of "very insufficient" support, grueling performance quotas, and a constant threat of termination.
These "tools" to limit exposure become meaningless when meeting targets is paramount.
Shift work, burnout, and high churn are baked into this model. It's essentially outsourced burn-out-as-a-service, where workers are treated like disposable batteries for the content moderation machine.
This ruling, however, throws a wrench in the works.
Legal accountability might finally force these companies to take responsibility for the human cost of their toxic algorithms.
This is just the beginning. More lawsuits will follow, more voices will be heard.
The curtain is being pulled back on the dark side of social media, and we're finally seeing the human cost of their endless pursuit of clicks and engagement.
It's time for Meta and other tech giants to take responsibility for the mental health of the workers they exploit, to clean up their platforms, and to stop treating human beings like disposable filters in their toxic cesspool.
Let's make the internet a safer place, not just for users, but for the people who clean up its mess.
3 notes · View notes
beardedmrbean · 2 years ago
Text
SALT LAKE CITY (AP) — Children and teens in Utah would lose access to social media apps such as TikTok if they don’t have parental consent and face other restrictions under a first-in-the-nation law designed to shield young people from the addictive platforms.
Two laws signed by Republican Gov. Spencer Cox Thursday prohibit kids under 18 from using social media between the hours of 10:30 p.m. and 6:30 a.m., require age verification for anyone who wants to use social media in the state and open the door to lawsuits on behalf of children claiming social media harmed them. Collectively, they seek to prevent children from being lured to apps by addictive features and from having ads promoted to them.
The companies are expected to sue before the laws take effect in March 2024.
The crusade against social media in Utah's Republican-supermajority Legislature is the latest reflection of how politicians' perceptions of technology companies have changed, including among typically pro-business Republicans.
Tech giants like Facebook and Google have enjoyed unbridled growth for over a decade, but amid concerns over user privacy, hate speech, misinformation and harmful effects on teens’ mental health, lawmakers have made Big Tech attacks a rallying cry on the campaign trail and begun trying to rein them in once in office. Utah’s law was signed on the same day TikTok’s CEO testified before Congress about, among other things, the platform's effects on teenagers’ mental health.
But legislation has stalled on the federal level, pushing states to step in.
Outside of Utah, lawmakers in red states including Arkansas, Texas, Ohio and Louisiana and blue states including New Jersey are advancing similar proposals. California, meanwhile, enacted a law last year requiring tech companies to put kids’ safety first by barring them from profiling children or using personal information in ways that could harm children physically or mentally.
The new Utah laws also require that parents be given access to their child's accounts. They outline rules for people who want to sue over harms they claim the apps cause. If implemented, lawsuits against social media companies involving kids under 16 will shift the burden of proof and require social media companies show their products weren’t harmful — not the other way around.
Social media companies could have to design new features to comply with parts of the laws that prohibit promoting ads to minors and showing them in search results. Tech companies like TikTok, Snapchat and Meta, which owns Facebook and Instagram, make most of their money by targeting advertising to their users.
The wave of legislation and its focus on age verification has garnered pushback from technology companies as well as digital privacy groups known for blasting their data collection practices.
The Electronic Frontier Foundation earlier this month demanded Cox veto the Utah legislation, saying time limits and age verification would infringe on teens' rights to free speech and privacy. Moreover, verifying every user's age would hand social media platforms even more data, like the government-issued identification required, the group said.
If the law is implemented, the digital privacy advocacy group said in a statement, “the majority of young Utahns will find themselves effectively locked out of much of the web."
Tech industry lobbyists decried the laws as unconstitutional, saying they infringe on people’s right to exercise the First Amendment online.
“Utah will soon require online services to collect sensitive information about teens and families, not only to verify ages, but to verify parental relationships, like government-issued IDs and birth certificates, putting their private data at risk of breach,” said Nicole Saad Bembridge, an associate director at NetChoice, a tech lobby group.
What’s not clear in Utah's new law and those under consideration elsewhere is how states plan to enforce the new regulations. Companies are already prohibited from collecting data on children under 13 without parental consent under the federal Children’s Online Privacy Protection Act. To comply, social media companies already ban kids under 13 from signing up for their platforms — but children have been shown to easily get around the bans, both with and without their parents’ consent.
Cox said studies have shown that time spent on social media leads to “poor mental health outcomes” for children.
“We remain very optimistic that we will be able to pass not just here in the state of Utah but across the country legislation that significantly changes the relationship of our children with these very destructive social media apps,” he said.
The set of laws won support from parents groups and child advocates, who generally welcomed them, with some caveats. Common Sense Media, a nonprofit focused on kids and technology, hailed the effort to rein in social media's addictive features and set rules for litigation, with its CEO saying it “adds momentum for other states to hold social media companies accountable to ensure kids across the country are protected online.”
However, Jim Steyer, the CEO and founder of Common Sense, said giving parents access to children’s social media posts would “deprive kids of the online privacy protections we advocate for." Age verification and parental consent may hamper kids who want to create accounts on certain platforms, but does little to stop companies from harvesting their data once they're on, Steyer said.
The laws are the latest effort from Utah lawmakers focused on the fragility of children in the digital age. Two years ago, Cox signed legislation that called on tech companies to automatically block porn on cellphones and tablets sold in the state, after arguments about the dangers it posed to children found resonance among Utah lawmakers, the majority of whom are members of The Church of Jesus Christ of Latter-day Saints. Amid concerns about enforcement, lawmakers ultimately revised that legislation to prevent it from taking effect unless five other states passed similar laws.
The regulations come as parents and lawmakers grow increasingly concerned about kids' and teenagers' social media use and how platforms like TikTok, Instagram and others are affecting young people's mental health. The dangers of social media to children are also emerging as a focus for trial lawyers, with addiction lawsuits being filed throughout the country.
7 notes · View notes
forsakebook · 2 years ago
Link
5 notes · View notes
voiceofentrepreneurlife · 4 months ago
Text
Judge Clears Mark Zuckerberg in Lawsuit Over Social Media's Impact on Kids
A federal judge has ruled that Meta Platforms CEO Mark Zuckerberg cannot be held personally liable in ongoing lawsuits against the company, which allege it has endangered children by fostering social media addiction. On Thursday, Judge Yvonne Gonzalez Rogers dismissed claims that Zuckerberg played a key role in concealing the mental health risks associated with Facebook and Instagram for young users. The plaintiffs, representing children across various states, argued that Zuckerberg ignored internal warnings and downplayed social media’s potential dangers.
Judge Rogers concluded that the plaintiffs did not provide concrete evidence linking Zuckerberg directly to the alleged misconduct. She emphasized that controlling a company’s actions does not automatically equate to personal liability. However, this ruling does not affect the broader lawsuits still proceeding against Meta.
Read more: https://voiceofentrepreneur.life/judge-exonerates-meta-ceo-mark-zuckerberg-in-lawsuit-regarding-social-medias-impact-on-children/
0 notes
technewssolution · 5 months ago
Text
Meta faces lawsuits over allegations of teen social media addiction
Meta, the parent company of Facebook and Instagram, is being sued in more than 40 states for allegedly contributing to teen social media addiction and causing mental health issues. The complaints allege that Meta built its platforms to be addictive, particularly for younger users, and ignored these risks despite being aware of them.
0 notes
socialservices147 · 10 months ago
Text
States Suing Meta And Instagram Over Mental Health of Kids
On October 24, 2023, a group of 33 states filed a lawsuit against Meta, the parent company of Facebook and Instagram, alleging that the company's social media platforms are harming the mental health of children and teenagers. The lawsuit alleges that Meta has knowingly designed its platforms to be addictive and to exploit the vulnerabilities of young people. It also alleges that Meta has failed to take adequate steps to protect children from harmful content on its platforms.
The lawsuit is the latest in a growing number of legal challenges against Meta over the company's role in harming the mental health of young people. In recent years, Meta has been accused of contributing to the increase in anxiety, depression, and suicide among children and teenagers.
"The more time kids spend on the app, the more addicted our kids become and the more money Meta makes," North Carolina Attorney General Josh Stein said Tuesday during a news conference at the Charlotte-Mecklenburg Government Center, which was streamed on the Facebook page of the state Department of Justice and attorney general's office.
In a statement, Meta said it shares "the attorneys general's commitment to providing teens with safe, positive experiences online, and have already introduced over 30 tools to support teens and their families."
"We're disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path," the company added. Meta has denied the allegations in the lawsuit. The company has said that it is committed to protecting young people on its platforms and that it has taken a number of steps to do so, such as developing parental controls and removing harmful content.
However, the states that have filed the lawsuit argue that Meta has not done enough to protect children. The states are seeking a court order requiring Meta to change its practices and to pay damages to the families of children who have been harmed by the company's platforms. The outcome of the lawsuit could have a significant impact on Meta and on the way social media platforms are designed and regulated. If the states are successful, Meta could be forced to make major changes to its platforms, and other social media companies could face similar lawsuits.
In Tuesday's cases, Meta could face civil penalties of $1,000 to $50,000 for each violation of various state laws, an amount that could add up quickly given the millions of children and teenagers who use Instagram.
Much of the focus on Meta stemmed from a whistleblower's release of internal documents in 2021 that showed the company knew Instagram, which began as a photo-sharing app, was addictive and worsened body image issues for some teen girls. The lawsuit by the 33 states alleged that Meta has strived to ensure that young people spend as much time as possible on its platforms despite knowing that they are susceptible to the need for approval in the form of "likes" from other users about their content.
The lawsuit is also a reminder of the potential dangers of social media for children and teenagers. Parents should be aware of the risks and take steps to protect their children from harm.
TIKTOK, YOUTUBE ALREADY FACE LAWSUITS
The cases are the latest in a string of legal actions against social media companies on behalf of children and teens. Meta, ByteDance's TikTok and Google's (GOOGL.O) YouTube already face hundreds of lawsuits filed on behalf of children and school districts over the addictiveness of social media.
Mark Zuckerberg, Meta's chief executive, has previously defended his company's handling of content that some critics find harmful.
Here are some tips for parents on how to protect their children from the potential dangers of social media:
- Talk to your children about the risks of social media.
- Set limits on how much time your children spend on social media.
- Monitor your children's social media activity.
- Talk to your children about what they see and do on social media.
- Encourage your children to come to you if they see or experience something harmful on social media.
If you are concerned about the mental health of your child, please reach out to a qualified mental health professional.
Participating in Tuesday's multistate federal suit are California, Colorado, Connecticut, Delaware, Georgia, Hawaii, Idaho, Illinois, Indiana, Kansas, Kentucky, Louisiana, Maine, Maryland, Michigan, Minnesota, Missouri, Nebraska, New Jersey, New York, North Carolina, North Dakota, Ohio, Oregon, Pennsylvania, Rhode Island, South Carolina, South Dakota, Virginia, Washington, West Virginia and Wisconsin. The additional suits filed in state courts were brought by the District of Columbia, Massachusetts, Mississippi, New Hampshire, Oklahoma, Tennessee, Utah and Vermont.
The use of social media among teens is nearly universal in the US and many other parts of the world. Up to 95% of youth ages 13 to 17 in the US report using a social media platform, with more than a third saying they use social media "almost constantly," according to the Pew Research Center.
0 notes
usnewsper-politics · 1 year ago
Text
Meta's Addictive Design: Lawsuit Links Facebook to Youth Mental Health Crisis
#MetaLawsuit #FacebookAddictiveDesign #YouthMentalHealthCrisis #SocialMediaAndMentalHealth #BigTechResponsibility
0 notes
yooha87 · 1 year ago
Text
Instagram was accused of providing a platform for sexual abuse of children
The New Mexico Attorney General has filed a new lawsuit against Meta, calling Facebook and Instagram a breeding ground for sex offenders.
According to the Wall Street Journal, a new complaint has been filed by the New Mexico Attorney General against Meta. The lawsuit alleges that Meta and its CEO, Mark Zuckerberg, have allowed Facebook and Instagram to become "platforms for child sexual abusers." According to the complaint, filed in state court on Tuesday, Meta's algorithms recommend sexual content to children.
The New Mexico Attorney General's Office conducted an investigation that included creating a number of fake test profiles on Facebook and Instagram posing as children or teenagers. The investigation found that each of the decoy accounts not only received inappropriate recommendations (including from accounts that publicly posted adult pornography) but also attracted sexual harassers and predators.
A test account claiming to be a 13-year-old girl gained more than 6,700 followers, most of whom were adult men. Some accounts asked the user to contact them privately or meet them offline.
According to the complaint, the fake account of the 13-year-old girl received "messages full of sexual images and videos." The account reported many of these posts and accounts, but Meta responded that it found no violations of its community standards.
The complaint states: "Meta's Facebook and Instagram platforms are breeding grounds for criminals who prey on children through human trafficking, the distribution of sexual images and prostitution. Teenagers and children can easily register without any restrictions, as there is no age verification process for them, and Meta recommends inappropriate content to them."
The Wall Street Journal recently published findings on how Facebook enables and promotes groups that share child sexual abuse content. Based on this, Meta developed terms, phrases and emojis related to child safety and used them to identify criminal networks. It also stopped recommending groups with members "behaving in a suspicious manner."
A Meta spokesperson told the Wall Street Journal: "We use advanced technology to root out abusers, hire child safety experts, report content to the National Center for Missing and Exploited Children, and share information with other companies, law enforcement agencies and state attorneys general."
New Mexico Attorney General Raúl Torrez claimed that Meta downplays the dangers children face on its platforms and continues to prioritize engagement and advertising revenue over the safety of the most vulnerable members of society.
Meta is currently facing dozens of lawsuits alleging that its platform harms the mental health of children. Zuckerberg and the CEOs of other major social platforms are also set to testify before the US Senate in January about their failure to protect children.
0 notes
influencermagazineuk · 1 year ago
Text
Legal Woes Mount for Meta as US States Sue Over Child Safety Concerns
In a recent development, the state of New Mexico has filed a lawsuit against Meta Platforms, the parent company of Facebook and Instagram, along with CEO Mark Zuckerberg. New Mexico Attorney General Raúl Torrez contends that Meta has inadequately protected children from sexual abuse, online solicitation, and human trafficking. Torrez stated, "Our investigation into Meta’s social media platforms demonstrates that they are not safe spaces for children but rather prime locations for predators to trade child pornography and solicit minors for sex." The lawsuit alleges that Meta's platforms facilitated adults in finding, contacting, and coercing children into providing explicit content.
In response, Meta emphasized its use of advanced technology, employment of child safety experts, reporting of content to the National Center for Missing and Exploited Children, and collaboration with law enforcement to combat predatory behavior.
This legal action follows similar suits from other states, such as Montana, which accused Meta of intentionally designing Instagram to be addictive, especially to minors. In October, over 40 U.S. states filed lawsuits against Meta, asserting that the company fueled a youth mental health crisis by fostering addictive social media platforms. The attorneys general from 33 states, including California and New York, alleged that Meta misled the public about the risks of its platforms, knowingly encouraging addictive social media use among young children and teenagers. Eight additional states and Washington, D.C. lodged similar complaints.
These lawsuits represent a broader trend of legal actions against social media companies, including Meta, TikTok's ByteDance, and Alphabet's YouTube, on behalf of children and teenagers. Notably, Meta is also accused of intentionally evading children's privacy laws, prompting U.S. Senators Ed Markey and Bill Cassidy to call for a halt to such practices.
Read the full article
0 notes
head-post · 4 months ago
Text
TikTok sued in France over harmful content that allegedly led to teen suicides
Seven French families have filed a lawsuit against social media giant TikTok on Monday, accusing the platform of exposing their teenage children to harmful content that caused two of them to commit suicide at the age of 15, French media reported.
The families’ lawsuit claims that TikTok’s algorithm showed the seven teens videos promoting suicide, self-harm and eating disorders. The lawyer for the affected families said:
“The parents want TikTok’s legal liability to be recognised in court. This is a commercial company offering a product to consumers who are, in addition, minors. They must, therefore, answer for the product’s shortcomings.”
TikTok has long faced criticism over content control on its app, as have a host of other social networks.
Like Meta's Facebook and Instagram, it faces hundreds of lawsuits in the US accusing it of enticing and addicting millions of children to its platform, harming their mental health.
Reporters could not immediately reach the company for comment on the allegations.
The company has previously said it takes children’s mental health issues seriously. CEO Shou Zi Chew told US lawmakers this year that the company had invested in measures to protect young people using the app.
Earlier, French Minister of Digital Transition and Telecommunications Jean-Noël Barrot said social network X could be banned in the European Union if the platform fails to comply with new EU rules against misinformation. In his view, the social network plays an important role in public debate, but disinformation remains one of the threats to democracy.
Read more HERE
0 notes
mentalhealthinpdx · 1 year ago
Text
#4 News
The solution to the lack of connection
Framing the differing views on the causes of youth mental health problems in Oregon as opposing opinions isn't the best approach; everyone is fighting toward the same goal, just on different fronts. The state of Oregon joined dozens of other states in suing Meta, claiming "app features including infinite scrolling, like buttons, push notifications, and 'rabbit hole' algorithms are harmful for children and teens' mental and physical health, adding to the 'youth mental health crisis'" (Bourgeois 2023). Social media's influence on teens grew during the pandemic, a time when many were isolated and sought connection online. That lack of connection is a big factor in teens' mental health problems, a point on which both the state of Oregon and Jill Baker, OHA Youth Suicide Prevention Coordinator, agree.
The state of Oregon condemned social media, and Meta in particular, for predatory practices and misleading data collection, but offered no alternatives to keep teens connected with their communities. Jill Baker describes the lack of connection as a growing problem among teens and lists several online resources as solutions. In her solution-focused interview with the Oregon Health News Blog, Baker also offers other remedies, including in-person outreach to friends and family members who might be experiencing suicidal thoughts. She recommends Oregon's suicide prevention training QPR (Question, Persuade, Refer) as an invaluable resource that everyone should take.
Both articles acknowledge there is a problem with teen mental health and wellbeing in Oregon, but one offers solutions while the other places blame.
sources:
Bourgeois, M. (2023, October 24). Oregon, Washington join suit against Meta Over Kids’ health concerns. KOIN.com. https://www.koin.com/news/oregon-washington-join-suit-against-meta-over-kids-health-concerns/ 
To prevent youth suicide, connection is key. Oregon Health News Blog. (2023, June 8). https://covidblog.oregon.gov/to-prevent-youth-suicide-connection-is-key/ 
0 notes