Content and feature risks in the app.
Sora App Review
What is the Sora App?
Sora, built by OpenAI, is a social video-creation platform powered by advanced AI that lets users generate highly realistic short videos and share them online.
Key features:
- Users can type in text prompts (for example, “a person riding a skateboard in a city at sunset”) and Sora will generate a short video that matches the prompt.
- The app also includes a “cameo” or likeness-upload feature: users can upload their own face/voice (or consent to others using their likeness) so that they, or their friends, can appear in AI-generated video scenes. (The Verge) This feature is a huge part of Sora’s appeal. Ordinary AI-generated videos are no longer very novel, but the ability to put your own face, your friends’ faces, or famous people’s faces into the videos you ask Sora to make is generating a lot of interest, and it should come with serious consideration.
- It also acts like a social media feed: users can view videos generated by others, remix them or share them, follow friends, etc. The design mimics popular short-form video apps. (IntuitionLabs)
- The app launched initially on iOS (in the U.S./Canada) and then expanded later to Android. (TechCrunch)
- Sora is built on an underlying AI video model (sometimes referred to as Sora or Sora 2) by OpenAI. (OpenAI)
How does Sora work?
Here’s a breakdown of how Sora functions from a user/parent perspective.
User experience:
- You (or your child) install the app, create an account, and then can either view existing videos in the feed or create new ones.
- To create a video: simply type a text prompt (what you want to see), optionally upload an image or your face for a cameo, and the AI will generate a short video. For example, the app states: “Turn your ideas into videos and drop yourself into the action.” (App Store)
- The “cameo” feature requires you to upload a one-time video and audio recording of yourself, similar to setting up Face ID on an Apple device, except that it also asks you to say a few phrases to capture your voice. Once complete, Sora knows how you look and sound, and you or approved friends can generate scenes with your likeness. We strongly recommend not allowing everyone to access your cameo; limiting access to a few trusted friends may be okay.
- Videos can be shared in the feed, where others can like, comment on, or remix them (depending on permissions), and users can follow each other. The feed is algorithmic, similar to other short-form video apps. (IntuitionLabs)
Safety/Guardrails (what the company says):
- OpenAI says they built Sora with “safety from the start” and have features like visible watermarks, provenance metadata (to tag videos as AI-generated), and internal tools to trace back videos. (ABC News)
- There are content policies: for example, the app prohibits generating certain categories of content, such as explicit sexual material or depictions of public figures without permission. In practice, however, some users find ways around these rules. Users do get to decide whether others can use their cameo to generate videos, which helps.
- OpenAI also states that for younger users (teens), there are extra feed-filters, a limit on continuous scrolling, a toggle on personalized feed, etc.
Limitations/concerns from experts:
- Even with guardrails, some experts say the protections are minimal and not robust enough for kids. For example, reports indicate that non-consensual use or peer misuse of a face/voice likeness can still occur. (Parents)
- Because the video generation is very realistic, it becomes harder for younger users to distinguish “real” from “AI-generated fake,” which raises concerns about trust, identity misuse, bullying, etc.
- Some older or more critical reviews raise worries about the potential for harassment, deepfakes, misuse of likeness, etc. (Gabb)
What Else do Parents Need to Know about Sora?
Here are important considerations for parents when it comes to Sora:
Age & permissions:
- The app’s terms require users to be at least 13 years old; anyone under 18 is supposed to get parental permission.
- But simply stating a minimum age doesn’t guarantee safe use in real life. Tech-savvy teens may bypass restrictions.
Privacy & likeness risks:
- The cameo/face-upload feature means the app stores or uses your child’s face and voice. Once your likeness is in the system, even if you delete it, there are questions about what remains. Experts warn that uploading your face/voice has risks of misuse, impersonation, or bullying.
- Even if your child isn’t using Sora personally, they may still see Sora-generated videos on other platforms (TikTok, Instagram, YouTube) because people share them outside Sora. That means exposure even without direct access.
Feed & content exposure:
- The feed is algorithmically driven and can show lots of short, attention-grabbing video content (similar to TikTok). This brings typical risks of screen time, scroll addiction, and content that is potentially disturbing or manipulative.
- Experts cite that Sora has been rated by Common Sense Media as an “unacceptable risk” for kids because of the lack of strong parental controls and the possibility of harmful content.
Content creation & sharing:
- Kids may be excited to generate content (which can be fun and creative) but they should understand the consequences of sharing: once a video is out, copies may spread, may be remixed by others, and their likeness may be used beyond what they expect.
- There is potential for peer pressure: friends may say, “Hey, let’s make a crazy video with your face,” or someone could mislead or manipulate another user into uploading their face or voice.
- The lines between “fun remix” and hurtful content are thin: even non-violent, non-explicit scenes can be humiliating or used for bullying. Experts point out that the ability to create ultra-realistic scenes means that misuses can fly under the radar. (Gabb)
Digital literacy & conversation:
- Rather than just relying on blocking or banning the app, experts recommend open conversations with kids: what it means to have your face/voice in the app, how to tell if a video is real or AI-generated, how peer pressure works, and how sharing works. (Parents)
- It’s also a moment to talk about digital footprints: once you upload something (face/voice/image), you might lose a degree of control.
- Parents should monitor which apps their children install, include Sora in conversations about “what if a video looks real but is fake,” and help children learn to tell the difference.
Technical safeguards & what to check:
- Check the app’s settings: whether message features are enabled, whether continuous scroll is on/off, and whether your child has uploaded a cameo.
- Encourage setting privacy settings together: ensuring their account is private, their likeness cannot be used by everyone, and permissions are understood.
- Keep devices in shared spaces, use screen-time tools if needed, and talk about what content the child is seeing and creating.
- Review together what videos your child is making and sharing, including whether they’ve given their likeness to others or allowed unknown users to reuse it.
Bottom Line: Is Sora Safe for Kids?
At this time, Sora carries significant risks for kids and teens. It is not inherently safe for younger children without very active parental supervision and rules.
Why:
- The creative and fun potential is real: Sora allows users to generate imaginative videos, insert themselves into scenes, remix, and share. That can be exciting.
- But the safety infrastructure is still limited: the cameo/face upload feature is inherently risky, the feed can show content that is difficult to moderate, parental controls are minimal, and exposure to remixing/sharing means loss of control.
- Experts and watchdogs (like Common Sense Media) already flagged Sora as “unacceptable risk” for kids due to minimal controls and high potential for misuse. (Common Sense Media)
- The realism of the videos means that children may be less able to distinguish what is real vs. AI-generated, raising concerns around self-esteem, bullying, identity misuse, and consent.
What we recommend to parents:
- If your child is under 16, consider not allowing them to use Sora independently right now unless you’re sitting with them and reviewing what they generate and share.
- If you allow use, make sure you have clear rules: no uploading face/voice without discussing it; set permissions for cameo use; review their feed and sharing; talk about what they see; keep devices in common family areas.
- Emphasize the principle: Just because you can generate something doesn’t mean you should share it.
- Use the presence of Sora as a teachable moment about the wider digital world: how AI creates, how identity works online, how sharing can have long-term consequences.
Sora has a lot of creative promise. But for kids and teens, it is definitely not “safe by default.” It requires active parenting, digital literacy conversations, and monitoring. At this stage, we would advise treating Sora like a high-risk app rather than a typical social platform. With strict rules and involvement, older teens might use it responsibly — but younger children should likely be restricted or supervised closely.
To learn more about AI, consider the following resources:
What if I have more questions? How can I stay up to date?
Two actions you can take!
- Subscribe to our tech trends newsletter, the PYE Download. About every 3 weeks, we’ll share what’s new, what the PYE team is up to, and a message from Chris.
- Ask your questions in our private parent community called The Table! It’s not another Facebook group. No ads, no algorithms, no asterisks. Just honest, critical conversations and deep learning! For parents who want to “go slow” together. Become a member today!
