What is Section 230 and Why It Matters
Physical Harm and Digital Harm
A past E. coli outbreak and the way McDonald’s responded show how differently society treats physical harm compared to digital harm.
When food makes children sick, the reaction is immediate. Investigations begin right away. Public statements are issued. Products are pulled from shelves. The source is traced, and safeguards are put in place to prevent it from happening again. There is clear action and clear accountability.
Now compare that to the way we respond to digital harm.
When online platforms expose children to addictive design, sexualized content, self-harm material, or predatory contact, the response is often slower and far less decisive. The conversation frequently shifts toward “parent responsibility” rather than platform accountability.
Let me be clear: parents are primarily responsible for the digital well-being of their children. But all the pressure shouldn’t be on the parents. Big Tech should be held accountable and liable for the egregious harm that happens on their platforms, just as we would hold a physical establishment liable for serving food contaminated with E. coli.
In a healthy and just society, children are protected by multiple layers of safeguards in physical spaces and products. We do not leave it solely up to parents to regulate alcohol, tobacco, unsafe toys, or contaminated food. Society has collectively decided that children deserve broader protection. For physical things.
So why should the digital world be any different?
The “Attractive Nuisance” Principle
In law, there is a concept known as the attractive nuisance doctrine. It applies when something in the physical world is likely to draw children in, such as a swimming pool or water slide. Because children are naturally curious and still developing judgment, property owners are required to take precautions.
That’s why we require fences, locked gates, and warning signs. We’ve collectively decided those safeguards are wise and necessary. For physical spaces.
But that same principle rarely seems to apply to online platforms. Even though that’s where kids are spending much of their time.
Children often encounter infinite scrolling (a never-ending feed), algorithmic amplification, sexualized content, and intentionally addictive design. In many ways, these features function like the digital equivalent of an unfenced swimming pool or water slide.
What About Kids Without Protective Parents?
Not every child has a parent who understands digital risks, has the time to supervise, knows how to install filters, or even recognizes the problem. And we shouldn’t blame those parents: many of the controls, toggles, and settings are complex and difficult to set up.
If you can relate, consider viewing our Device Guides and App Reviews for step-by-step instructions on how to keep your kid safe.
The German theologian Dietrich Bonhoeffer once wrote, “The test of the morality of a society is what it does for its children.” His words still carry tremendous weight today.
Why Policy Still Matters
Many people, including myself, are not eager for more government involvement in our lives. But policy can play an important role when it creates consequences for companies that knowingly harm children. We don’t want the government to parent for us, but it should help hold companies accountable and liable.
That could include auditable design standards, something the United Kingdom has already begun implementing. It could involve meaningful age verification systems, ideally built on privacy-preserving technologies like zero-knowledge proofs. It could also mean real liability for companies that ignore safety concerns, a direction Australia has been exploring.
In the United States, legislation such as the Kids Online Safety Act (KOSA) continues to generate debate about the responsibilities platforms should carry when it comes to protecting young users. Section 230 of the Communications Decency Act is the primary provision that allows tech companies to dodge accountability and liability when a user is harmed on their platform.
The “Digital Playhouse” Thought Experiment
Researcher Michael Salter offers a powerful thought experiment that helps illustrate the difference between how we regulate physical spaces and digital ones.
Trigger warning: references to child sexual abuse below.
“Thought Experiment:
I buy some land on a busy road and build a playhouse stocked with toys and games. I throw the doors open, and anyone can come in for free. I make money by selling advertising space on the walls of the playhouse.
Children love the playhouse, and they are a lucrative market, so I encourage them in. I don’t want to pay for staff to check ID at the door or supervise the playhouse. It’s fairly common that men come in off the street to sexually abuse children in the playhouse.
The government has decided that my playhouse is a special kind of company. They have provided me with immunity from civil liability or criminal charges linked to abuse in my playhouse. I don’t want to change my lucrative business model, and I don’t have to.
However, it’s important for my reputation that I’m seen to care about abuse in the playhouse. So I blame parents - it’s their fault for not coming to the playhouse with their child. And I encourage children to become “playhouse literate” - to know the warning signs of abuse.
All the while, I’m making huge sums of money running the playhouse and selling advertising space. My business model depends on my open-door policy, and the last thing I want to do is spend money on supervision, checking ID at the door, etc.
Nonetheless, I express vocal concern about abuse. I join “multi-stakeholder” dialogues about abuse. I found anti-abuse organizations where I sit on the board and control their strategic direction. Meanwhile, I plan to build more playhouses on the same model.
Obviously, this is a reprehensible scenario, but it’s the online status quo. Why is it wrong to build physical structures where children are abused, but fine and legal to build online structures where children are abused?
Why do we blame religious institutions for creating environments with no oversight where children can be abused, but we are unconcerned when technology companies do exactly the same thing?
Why is an offline or physical child-focused product or service subject to stringent safety requirements while its online equivalent is exempt?”
Final Thoughts
Some might respond to these concerns by saying, “Parents just shouldn’t allow their kids into the digital playhouse.” In many ways, I agree. But shouldn’t we also live in a society where we create meaningful policy and change to hold businesses like the “digital playhouse” accountable for what happens inside? We certainly think so.
When it comes to social media, we should wait. We should delay all addictive tech for as long as we can. You might feel like the only one, but hold your ground. Many families, including ours, choose a slower approach to technology. Not anti-tech, but choosing the right tech at the right time.
Why? Because what kind of parents would we be if we weren’t paying attention to what happens inside that playhouse?

What if I have more questions? How can I stay up to date?
Two actions you can take!
- Subscribe to our tech trends newsletter, the PYE Download. About every 3 weeks, we’ll share what’s new, what the PYE team is up to, and a message from Chris.
- Ask your questions in our private parent community called The Table! It’s not another Facebook group. No ads, no algorithms, no asterisks. Just honest, critical conversations and deep learning! For parents who want to “go slow” together. Become a member today!

Featured in Childhood 2.0
Honored to join Bark and other amazing advocates in this film.
World Economic Forum Presenter
Joined a coalition of global experts to present on social media's harms.
Testified before Congress
We shared our research and experience with the US Senate Judiciary Committee.







