A bipartisan pair of senators reintroduced the Kids Online Safety Act on Tuesday with updates aimed at addressing concerns that the bill could inadvertently cause more harm to the young internet users it seeks to protect. But some activists who raised those issues say the changes are still insufficient.
The bill aims to make the internet a safer place for minors by putting the onus on social media companies to prevent and mitigate harms that may come from their services. The new version of the bill defines a set list of harms that platforms must take reasonable steps to mitigate, including by preventing the spread of posts promoting suicide, eating disorders, substance abuse and more. It would require those companies to undergo annual independent audits of their risks to minors and require them to enable the strongest privacy settings by default for minors.
Congress and President Joe Biden have made clear that online protections for kids are a key priority, and KOSA has become one of the leading bills on the topic. KOSA has racked up a long list of more than 25 co-sponsors, and the earlier version of the bill passed unanimously out of the Senate Commerce Committee last year. The new version of the bill has gained support from groups such as Common Sense Media, the American Psychological Association, the American Academy of Pediatrics and the Eating Disorders Coalition.
At a virtual press conference on Tuesday, Sen. Richard Blumenthal, D-Conn., who introduced the bill alongside Sen. Marsha Blackburn, R-Tenn., said that Senate Majority Leader Chuck Schumer, D-N.Y., is "one hundred percent behind this bill and efforts to protect kids online."
While Blumenthal acknowledged it's ultimately up to Senate leadership to determine timing, he said, "I fully hope and expect we'll have a vote this session."
A Schumer spokesperson didn't immediately respond to a request for comment.
Late last year, dozens of civil society groups warned Congress against passing the bill, claiming it could further endanger young internet users in other ways. For instance, the groups worried that the bill would add pressure on online platforms to "over-moderate, including from state attorneys general seeking to make political points about what kind of information is appropriate for young people."
Blumenthal and Blackburn made several changes to the text in response to critiques from outside groups. They sought to more narrowly tailor the legislation, limiting the duty of care requirements for social media platforms to a specific set of potential harms to mental health based on evidence-backed medical information.
They also added protections for support services such as the National Suicide Hotline, substance abuse groups and LGBTQ+ youth centers to ensure they are not unintentionally hampered by the bill's requirements. Blumenthal's office said it didn't believe the duty of care would have applied to those kinds of groups, but opted to clarify it regardless.
But the changes haven't been enough to placate some civil society and industry groups.
Evan Greer, director of digital rights nonprofit Fight for the Future, said Blumenthal's office never met with the group or shared the updated text in advance of the introduction despite multiple requests. Greer acknowledged the co-sponsors' offices met with other groups, but said in an emailed statement that "it seems they intentionally excluded groups that have specific issue-area expertise in content moderation, algorithmic recommendation, etc."
"I've read through it and can say unequivocally that the changes that have been made DO NOT address the concerns that we raised in our letter," Greer wrote. "The bill still contains a duty of care that covers content recommendation, and it still allows state Attorneys General to effectively dictate what content platforms can recommend to minors," she added.
"The ACLU remains strongly opposed to KOSA because it would ironically expose the very children it seeks to protect to increased harm and increased surveillance," ACLU Senior Policy Counsel Cody Venzke said in a statement. The group joined the letter warning against its passage last year.
"KOSA's core approach still threatens the privacy, security and free expression of both minors and adults by deputizing platforms of all stripes to police their users and censor their content under the guise of a 'duty of care,'" Venzke added. "To accomplish this, the bill would legitimize platforms' already pervasive data collection to identify which users are minors when it should be seeking to curb those data abuses. Furthermore, parental guidance in minors' online lives is critical, but KOSA would mandate surveillance tools without regard to minors' home situations or safety. KOSA would be a step backward in making the internet a safer place for children and minors."
At the press conference, in response to a question about Fight for the Future's critiques, Blumenthal said the duty of care had been "very purposefully narrowed" to focus on certain harms.
"I think we have met that kind of suggestion very directly and effectively," he said. "Obviously, our door remains open. We're willing to hear and talk about other kinds of suggestions that are made. And we have talked to most of the groups that had great criticism, and a number have actually dropped their opposition, as I think you'll hear in response to today's session. So I think our bill is clarified and improved in a way that meets some of the criticism. We're not going to solve all the problems of the world with a single bill. But we're making a measurable, very significant start."
The bill also faced criticism from several groups that receive funding from the tech industry.
NetChoice, which has sued California over its Age-Appropriate Design Code Act and whose members include Google, Meta and TikTok, said in a statement that despite lawmakers' attempts to respond to concerns, "unfortunately, how this bill would work in practice still requires an age verification mechanism and data collection on Americans of all ages."
"Understanding how young people should use technology is a difficult question and has always been best answered by parents," NetChoice Vice President and General Counsel Carl Szabo said in a statement. "KOSA instead creates an oversight board of DC insiders who will replace parents in deciding what's best for children," Szabo added.
"KOSA 2.0 raises more questions than it answers," Ari Cohn, free speech counsel for TechFreedom, a think tank that has received funding from Google, said in a statement. "What constitutes reason to know that a user is under 17 is entirely unclear and undefined by the bill. In the face of that uncertainty, platforms will clearly have to age-verify all users to avoid liability — or worse, avoid obtaining any knowledge whatsoever and leave minors without any protections at all."
"Protecting young people online is a broadly shared goal. But it would contradict the goals of bills such as this to impose compliance obligations that undermine the privacy and safety of teens," said Matt Schruers, president of the Computer & Communications Industry Association, whose members include Amazon, Google, Meta and Twitter. "Governments should avoid compliance requirements that would compel digital services to collect more personal information about their users — such as geolocation information and a government-issued identification — particularly when responsible companies are instituting measures to collect and store less data on customers."