In a polarized Congress, a rare bipartisan deal is being forged to protect the mental health of young people with a slew of bills aimed at regulating tech companies.
Growing efforts by lawmakers to target social media companies come alongside warnings from the US surgeon general about the effect of platforms on the mental health of minors. In May, Dr Vivek Murthy called for ‘immediate action’ from tech companies and lawmakers to protect children’s mental health after issuing a public health alert highlighting the effects of social media use on young people across the country.
In a report released in February, the Centers for Disease Control and Prevention found that adolescent girls were experiencing record levels of violence, sadness and suicide risk, with nearly 3 in 5 feeling persistently sad or hopeless in 2021. That figure represents an increase of almost 60% and the highest level recorded in the past decade.
While addressing America’s growing mental health crisis, especially among young people, has become a priority in Washington, there’s still no consensus on how to rein in social media companies. Bills that aim to hold tech companies accountable through content moderation and fines have garnered more support than others that seek to ban minors from using the platforms.
Lawmakers try to hold tech companies accountable
Lawmakers who pushed to ban children from some social media platforms altogether have faced some criticism.
“We don’t think the bills that keep children off social media are necessarily realistic, and even more we think that ultimately those are essentially putting the problem back on the feet of parents and young people,” Josh Golin, executive director of the child safety group Fairplay, told USA TODAY.
“What we need is legislation that actually changes how these platforms engage with young people,” he added.
The Kids Online Safety Act, otherwise known as KOSA, a bipartisan effort reintroduced this year by Sens. Marsha Blackburn, R-Tenn., and Richard Blumenthal, D-Conn., is one of two bills supported by the children’s advocacy group.
The bill would provide families with the tools, safeguards and transparency they need to protect themselves from threats to children’s health and well-being online. It would also require platforms to put children’s interests first, according to a memo on the bill.
According to the text, under the bill:
- Social media platforms would be required to provide children with options to protect their information, disable addictive product features (including rewards for time spent on the platform and autoplay of media content) and disable algorithmic recommendations.
- Parents would have access to new controls to help spot harmful behavior, which includes a dedicated channel for reporting harmful content to the platform.
- Platforms would have to undergo an annual independent audit assessing risks to minors, compliance with the legislation and whether platforms are taking steps to prevent harmful impacts, including sexual exploitation and abuse.
- Social media platforms would be responsible for preventing and mitigating harm and harmful content, such as violence and the promotion of drugs and alcohol, to minors and would be subject to penalties for violations.
Advocates, including Golin, have also rallied in favor of the Children and Teens Online Privacy Protection Act, also known as COPPA 2.0.
The bill aims to update current online data privacy rules to help combat the youth mental health crisis by outlawing practices that target minors through algorithms and toxic content, according to a May press release.
While the CDC found a marked increase in mental health problems among adolescent girls, it also found that all adolescents reported increases, including in experiences of violence and suicidal thoughts and behaviors, according to data from the 2021 Youth Risk Behavior Survey.
COPPA 2.0 would build on the Children’s Online Privacy Protection Rule, or COPPA, by prohibiting tech companies from collecting information about users ages 13 to 16 without their consent, according to the bill.
The bill would also:
- Prohibit advertising targeted to minors.
- Require companies to allow users to delete a minor’s personal information whenever possible.
- Establish a Digital Marketing Bill of Rights for Teens, which would limit the collection of teens’ personal information.
A problem that Congress cannot solve alone
Despite widespread support for the congressional action, some parents and educators aren’t convinced it will be enough, including Chicago high school teacher Max Bean.
Exposure to harmful content and data privacy are not primary concerns for Bean. Instead, he is concerned about social media’s ability to supplant in-person interaction and the consequences of less face-to-face contact.
“The problem here is a total change in how humans interact, and deleting harmful content isn’t going to fix that. I think humans need to interact face-to-face,” Bean, 41, told USA TODAY, adding that along with legislation, there also needs to be social change in how social media is used.
Though the high school math and physics teacher is skeptical that congressional efforts will make a difference, he sees Washington’s action as a first step in addressing the harmful impacts of social media.
“I don’t think Congress alone will solve this problem. But if Congress takes action, it encourages other people to do things, and I think that’s a step,” Bean said.
Bean’s perspective is one shared by Kailan Carr, a 39-year-old mom of two from Bakersfield, California.
“I think Congress has a place in this puzzle,” Carr told USA TODAY. “I feel Congress needs to step in and hold these big tech companies accountable, but I also feel it has to be a broader effort that doesn’t put so much emphasis on social media.”
Image Source : www.usatoday.com