Will lawmakers really act to protect children online? Some say yes.


In the final minutes of a congressional hearing Wednesday in which technology CEOs were chastised for failing to protect children online, Sen. Richard J. Durbin, D-Ill., urged lawmakers to act to protect younger Internet users.

“There are no excuses,” he said.

Lawmakers have long made similar statements about holding tech companies accountable, and have little to show for it. Both Republicans and Democrats have stated at various points that it was time to regulate the tech giants on issues such as privacy and antitrust laws. Yet for years, that’s where it ended: no new federal regulations for companies to follow.

The question is whether this time will be different. And there are already indicators that the issue of child online safety may gain more legislative force.

At least six legislative proposals waiting in the wings in Congress target the spread of child sexual abuse material online and would require platforms like Instagram, Snapchat and TikTok to do more to protect minors. The efforts are supported by emotional accounts of children who were victimized online and died by suicide.

The only federal Internet law passed in recent years, SESTA-FOSTA (the Stop Enabling Sex Traffickers Act and the Allow States and Victims to Fight Online Sex Trafficking Act), which made it easier for victims of sex trafficking to sue websites and online platforms, was approved in 2018, also after heartbreaking testimony from a victim’s mother.

Child safety is a visceral and personally relatable issue that is easier to sell politically than other issues, lawmakers and online safety experts said. At Wednesday’s hearing, faced with stories of children who had died after sexual exploitation, Meta’s Mark Zuckerberg said he was sorry the families had suffered.

“Just like the tobacco industry, it took a series of embarrassing tobacco hearings, but Congress finally acted,” said Jim Steyer, president of Common Sense Media, a nonprofit children’s advocacy group. “The dam finally broke.”

Any legislative progress on child online safety would be a counterpoint to the gridlock that has engulfed Congress in recent years on other technology issues. Time and time again, proposed rules to govern tech giants like Google and Meta have failed to become law.

In 2018, for example, Congress questioned Zuckerberg over a leak of Facebook user data to Cambridge Analytica, a company that created voter profiles. Outrage over the incident led to calls for Congress to pass new rules to protect people’s online privacy. But while California and other states eventually passed online privacy laws, Congress did not.

Lawmakers have also attacked a legal statute, Section 230 of the Communications Decency Act, which protects online platforms like Instagram and TikTok from many lawsuits over content posted by their users. Congress has not substantially changed the statute, beyond making it more difficult for platforms to employ the legal shield when they are accused of significantly aiding sex trafficking.

And after companies like Amazon and Apple were accused of being monopolies and abusing their power over smaller rivals, lawmakers proposed a bill to outlaw some of their business practices. An effort to get the legislation across the finish line failed in 2022.

Sens. Amy Klobuchar, D-Minn., and Josh Hawley, R-Mo., as well as other lawmakers, have blamed the power of tech lobbyists for killing the proposed rules. Others have said that technology regulations have not been a priority for congressional leaders, who have focused on spending bills and measures aimed at subsidizing American companies that make crucial computer chips and harness renewable energy.

The Senate Judiciary Committee, which hosted Wednesday’s hearing, discussed five child safety bills targeting tech platforms ahead of the hearing. The committee passed the bills last year; none have become law.

Among the proposals were the STOP CSAM Act (Strengthening Transparency and Obligations to Protect Children Suffering from Abuse and Mistreatment Act), which would give victims new avenues to report child sexual abuse material to Internet companies, and the REPORT Act (Revising Existing Procedures On Reporting via Technology Act), which would expand the types of potential crimes that online platforms must report to the National Center for Missing and Exploited Children.

Other proposals would make it a crime to distribute an intimate image of someone without that person’s consent and force authorities to coordinate investigations into crimes against children.

A separate proposal passed last year by the Senate Commerce Committee, the Kids Online Safety Act, would create a legal duty for certain online platforms to protect children. Some of the legislative proposals have been criticized by digital rights groups such as the Electronic Frontier Foundation, which say they could encourage platforms to remove legitimate content in an effort to comply with the law.

Klobuchar, who grilled the tech executives at Wednesday’s hearing, said in an interview that the session “felt like a breakthrough.” She added: “As someone who has taken on these companies for years, it’s the first time I felt hope for movement.”

Others were skeptical. Any proposal will need the support of congressional leaders to pass, and bills that cleared committee last year will have to be reintroduced and go through that process again.

Hany Farid, a professor at the University of California, Berkeley, who helped create the technology used by platforms to detect child sexual abuse material, said he had watched Congress hold hearing after hearing on protecting children online.

“This is something we should be able to agree on: that we have a responsibility to protect children,” he said. “If we can’t do this right, what hope do we have for anything else?”
