Experts say social media firms should not be left to enforce their own standards.

While more and more people acknowledge the harms caused by troubling online content, there is also a growing understanding that the real danger lies in the systems that curate and serve this content to users. 

A consensus is emerging among policymakers and privacy advocates that regulation should take a systemic approach, targeting these underlying digital mechanisms.

Digital platforms' structural designs, particularly their algorithms and content-recommendation systems, play a pivotal role in shaping user experiences. 

“Content can harm, but not all harms come from content,” says Alice Dawkins, executive director of Reset.Tech Australia, a tech accountability research and policy organisation.

She says research shows varied efficacy in platforms' attempts to curb harmful content. TikTok, for instance, has blocked harmful material effectively, while Instagram and X have struggled. X has even been found to rapidly steer users from dangerous content towards even worse material.

The current regulatory regime, which has been heavily influenced by tech executives, focuses on reactive measures like content takedowns. 

This approach “suits bombastic tech executives to keep regulatory fights to reactive measures”, Dawkins argues, noting that real change would require these companies to fundamentally reassess themselves, which they are not incentivised to do. 

Dawkins also criticises the influence the industry holds over its own regulation.

“The industry ‘holds the pen’ on drafting”, she says, which allows it to shape regulations that are often toothless and easily manipulated by the very companies they are supposed to govern.

Experts are calling for industry's role in these regulatory processes to be diminished, and for greater inclusion of insights from public interest researchers and affected communities.

The gap between the promises made by these platforms under voluntary codes and their actual practices is telling. 

Despite the platforms' pledges to curb misinformation, studies by groups like Reset.Tech Australia have found that they fail to adhere to their own standards.

“Across the board, there were some significant gaps between statement and practice,” Dawkins says.

“Harm will continue to happen while we wait for industry-led regulation to fail,” she says, calling for government intervention that targets the root of the problem rather than its symptoms.