
Safety: general
April 5, 2026

What Is Section 230 and Why It Matters

Physical Harm and Digital Harm

A past E. coli outbreak and the way McDonald’s responded show how differently society treats physical harm compared to digital harm.

When food makes children sick, the reaction is immediate. Investigations begin right away. Public statements are issued. Products are pulled from shelves. The source is traced, and safeguards are put in place to prevent it from happening again. There is clear action and clear accountability.

Now compare that to the way we respond to digital harm.

When online platforms expose children to addictive design, sexualized content, self-harm material, or predatory contact, the response is often slower and far less decisive. The conversation frequently shifts toward “parent responsibility” rather than platform accountability.

Let me be clear: parents are primarily responsible for the digital well-being of their children. But all the pressure shouldn’t fall on parents alone. Big Tech should be held accountable and liable for the egregious harm that happens on its platforms, just as we would treat a physical establishment that served food containing E. coli.

In a healthy and just society, children are protected by multiple layers of safeguards in physical spaces and products. We do not leave it solely up to parents to regulate alcohol, tobacco, unsafe toys, or contaminated food. Society has collectively decided that children deserve broader protection. For physical things.

So why should the digital world be any different?

The “Attractive Nuisance” Principle

In law, there is a concept known as the attractive nuisance doctrine. It applies when something in the physical world is likely to draw children in, such as a swimming pool or water slide. Because children are naturally curious and still developing judgment, property owners are required to take precautions.

That’s why we require fences, locked gates, and warning signs. We’ve collectively decided those safeguards are wise and necessary. For physical spaces.

But that same principle rarely seems to apply to online platforms. Even though that’s where kids are spending so much of their time.

Children often encounter infinite scrolling (a never-ending feed), algorithmic amplification, sexualized content, and intentionally addictive design. In many ways, these features function like the digital equivalent of an unfenced swimming pool or water slide.

What About Kids Without Protective Parents?

Now, not every child has a parent who understands digital risks, has the time to supervise, knows how to install filters, or even recognizes the problem. And we can hardly blame those parents, since many of the controls, toggles, and settings are complex and difficult to set up.

If you can relate, consider viewing our Device Guides and App Reviews for step-by-step instructions on how to keep your kid safe.

The German theologian Dietrich Bonhoeffer once wrote, “The test of the morality of a society is what it does for its children.” His words still carry tremendous weight today.

Why Policy Still Matters

Many people, including me, are not eager for more government involvement in our lives. But policies can play an important role when they create consequences for companies that knowingly harm children. We don’t want our government to parent for us, but it should help hold companies accountable and liable.

That could include auditable global design standards, something the United Kingdom has already begun implementing. It could involve meaningful age verification systems, ideally using privacy-preserving technologies like zero-knowledge proofs. It could also mean real liability for companies that ignore safety concerns, a direction Australia has been exploring.

In the United States, legislation such as the Kids Online Safety Act (KOSA) continues to generate debate about the responsibilities platforms should carry when it comes to protecting young users. Meanwhile, Section 230 of the Communications Decency Act remains the primary provision that shields tech companies from accountability and liability when a user is harmed on their platform.

The “Digital Playhouse” Thought Experiment

Researcher Michael Salter offers a powerful thought experiment that helps illustrate the difference between how we regulate physical spaces and digital ones.

Trigger warning: references to child sexual abuse below.

“Thought Experiment:

I buy some land on a busy road and build a playhouse stocked with toys and games. I throw the doors open, and anyone can come in for free. I make money by selling advertising space on the walls of the playhouse.

Children love the playhouse, and they are a lucrative market, so I encourage them in. I don’t want to pay for staff to check ID at the door or supervise the playhouse. It’s fairly common that men come in off the street to sexually abuse children in the playhouse. 

The government has decided that my playhouse is a special kind of company. They have provided me with immunity from civil liability or criminal charges linked to abuse in my playhouse. I don’t want to change my lucrative business model, and I don’t have to.

However, it’s important for my reputation that I’m seen to care about abuse in the playhouse. So I blame parents - it’s their fault for not coming to the playhouse with their child. And I encourage children to become “playhouse literate” - to know the warning signs of abuse.

All the while, I’m making huge sums of money running the playhouse and selling advertising space. My business model depends on my open-door policy, and the last thing I want to do is spend money on supervision, checking ID at the door, etc.

Nonetheless, I express vocal concern about abuse. I join “multi-stakeholder” dialogues about abuse. I found anti-abuse organizations where I sit on the board and control their strategic direction. Meanwhile, I plan to build more playhouses on the same model.

Obviously, this is a reprehensible scenario, but it’s the online status quo. Why is it wrong to build physical structures where children are abused, but fine and legal to build online structures where children are abused?

Why do we blame religious institutions for creating environments with no oversight where children can be abused, but we are unconcerned when technology companies do exactly the same thing?

Why is an offline or physical child-focused product or service subject to stringent safety requirements while its online equivalent is exempt?”

Final Thoughts

Some might respond to these concerns by saying, “Parents just shouldn’t allow their kids into the digital playhouse.” In many ways, I agree. But shouldn’t we also live in a society where we create meaningful policy and change to hold businesses like the “digital playhouse” accountable for what happens inside? We certainly think so.

When it comes to social media, we should wait. We should delay all addictive tech for as long as we can. You might feel like the only one, but hold your ground. Many families, including ours, choose a slower approach to technology. Not anti-tech, but choosing the right tech at the right time.

Why? Because what kind of parents would we be if we weren’t involved with that playhouse?

Also see our Instagram and Facebook posts about this topic.
