
A bipartisan pair of senators reintroduced the Kids Online Safety Act (KOSA) on Tuesday with an update that aims to address concerns that the bill could inadvertently harm the very young internet users it seeks to protect. But some activists who raised those issues say the changes are still insufficient.
The bill aims to make the internet a safer place for children by putting the onus on social media companies to prevent and mitigate harms caused by their services. The new version of the bill defines a specific list of harms that platforms must take reasonable steps to mitigate, including preventing the promotion of suicide, eating disorders and substance abuse. It would require those companies to undergo annual independent audits of their risks to minors and to enable the strongest privacy settings for children by default.
Congress and President Joe Biden have made clear that online safety for children is a major priority, and KOSA has become one of the leading pieces of legislation on the subject. KOSA has garnered more than 25 co-sponsors, and the first version of the bill passed unanimously out of the Senate Commerce Committee last year. The new version has received support from groups such as Common Sense Media, the American Psychological Association, the American Academy of Pediatrics and the Eating Disorders Coalition.
In a virtual press conference Tuesday, Sen. Richard Blumenthal, D-Conn., who introduced the bill with Sen. Marsha Blackburn, R-Tenn., said Senate Majority Leader Chuck Schumer, D-N.Y., is “a hundred percent” behind the bill and efforts to protect children online.
While Blumenthal acknowledged that it is ultimately up to Senate leadership to decide on timing, he said, “I have every expectation and hope that we will have a vote this session.”
A spokesperson for Schumer did not immediately respond to a request for comment.
At the end of last year, dozens of civil society groups warned Congress against passing the bill, arguing that it could further endanger young internet users in several ways. For example, the groups feared the bill would ramp up pressure on online platforms to “over-moderate, including from state attorneys general seeking to make political points about what kind of information is appropriate for young people.”
Blumenthal and Blackburn made several changes to the text in response to criticism from outside groups. Those include more carefully crafting the duty-of-care requirements for social media platforms so they are limited to a specific set of potential harms to mental health, based on evidence-backed medical information.
They also added protections for support services, such as the national suicide hotline, substance abuse groups and LGBTQ+ youth centers, to ensure they are not inadvertently hampered by the bill’s requirements. Blumenthal’s office said it did not believe the duty of care would have applied to those kinds of groups, but opted to clarify the point regardless.
But the changes haven’t been enough to placate some civil society and industry groups.
Evan Greer, director of the digital rights nonprofit Fight for the Future, said Blumenthal’s office never met with the group or shared the updated text before the reintroduction, despite multiple requests. Greer acknowledged that the co-sponsors’ offices met with other groups, but said in an emailed statement that “it appears they intentionally excluded groups that have specific issue-area expertise in content moderation and algorithmic recommendation.”
“I have read it and can state unequivocally that the changes that have been made do not address the concerns we raised in our letter,” Greer wrote. “The bill still contains a duty of care that covers content recommendation, and it still effectively allows state attorneys general to determine what content platforms can recommend to minors.”
“The ACLU strongly opposes KOSA because it would ironically expose the very children it seeks to protect to increased harm and surveillance,” ACLU senior policy counsel Cody Venzke said in a statement. The group joined last year’s letter warning against the bill’s passage.
“KOSA’s core approach still threatens the privacy, security and free expression of both minors and adults by deputizing platforms of all stripes to police their users and censor their content under the guise of a ‘duty of care,’” Venzke said. “To accomplish this, the bill would legitimize platforms’ already pervasive data collection to identify which users are minors, when it should instead be seeking to curb those data abuses. Parental guidance in minors’ lives is important, but KOSA would mandate surveillance tools regardless of minors’ home situations or safety. KOSA would be a step backward in making the internet a safer place for children and minors.”
At the press conference, in response to a question about Fight for the Future’s criticism, Blumenthal said the duty of care had been “pretty purposefully narrowed” to target a specific set of harms.
“I think we’ve met that kind of suggestion very directly and effectively,” he said. “Obviously, our door remains open. We’re willing to listen and talk about other kinds of suggestions. And we have talked to a number of groups that were very critical, and many have actually dropped their opposition, as I think you’ll hear today. So I think our bill has been clarified and revised in a way that meets some of the criticism. We’re not going to solve all the world’s problems with one bill. But we’re making a measurable, very important start.”
The bill also faced criticism from several groups that receive funding from the tech industry.
NetChoice, which has sued California over its Age-Appropriate Design Code Act and whose members include Google, Meta and TikTok, said in a press release that despite lawmakers’ efforts to respond to concerns, “unfortunately, how this bill would work in practice still requires an age verification mechanism and data collection on Americans of all ages.”
“Determining how young people should use technology is a difficult question, and it has always been one best answered by parents,” Carl Szabo, NetChoice vice president and general counsel, said in a statement. “KOSA instead creates an oversight board of D.C. insiders that would replace parents in deciding what is best for children.”
“KOSA 2.0 raises more questions than it answers,” Ari Cohn, free speech counsel at TechFreedom, said in a statement. “What constitutes reason to know that a user is under 17 is entirely unclear and undefined by the bill. In the face of that uncertainty, platforms will inevitably age-verify all users to avoid liability, or worse, avoid gaining any knowledge whatsoever and leave minors without any protection at all.”
“The online safety of young people is a broadly shared goal,” said Matt Schruers, president of the Computer and Communications Industry Association, whose members include Amazon, Google, Meta and Twitter. “But it would contradict the goals of a bill like this to impose compliance obligations that undermine teens’ privacy and security. Policymakers should avoid compliance requirements that would compel digital services to collect more personal information about their users, such as geolocation data and government-issued identification, particularly when responsible companies are instituting measures to collect and store less data on customers.”
WATCH: Sen. Blumenthal accuses Facebook of adopting Big Tobacco’s playbook