Child Safety Standards
At MyOrbit, protecting children is our highest priority. We maintain a zero-tolerance policy for child sexual abuse material (CSAM) and child sexual abuse and exploitation (CSAE) on our platform. This document outlines our comprehensive approach to child safety, including prevention, detection, reporting, and compliance measures.
This policy works alongside our Safety Policy, Terms of Service, Privacy Policy, and Community Guidelines to create a safe environment for all users.
To report child safety concerns immediately: Contact us at safety@myorbit.ai or use our in-app reporting tools.
Table of Contents
1. Our Commitment to Child Safety
2. CSAM Prevention and Detection
3. Age Verification and Minor Protections
4. Reporting Child Safety Concerns
5. Parental Controls
6. Legal Compliance
7. Contact Information
1. Our Commitment to Child Safety
1.1 Zero Tolerance Policy
MyOrbit maintains an absolute zero-tolerance policy for:
- Child sexual abuse material (CSAM) of any kind
- Child sexual abuse and exploitation (CSAE)
- Grooming or predatory behavior toward minors
- Any content that sexualizes minors
- Trafficking or exploitation of children
- Any attempt to contact minors for inappropriate purposes
Immediate Action: Any content or behavior violating these standards results in immediate account termination, content removal, evidence preservation, and reporting to the National Center for Missing & Exploited Children (NCMEC) and law enforcement authorities.
1.2 Age-Appropriate Content
We enforce strict content tiers to ensure age-appropriate experiences:
- Tier 0 (All users 13+): General, educational content only
- Tier 1 (13+ with guidance): Contextual content with appropriate warnings
- Tier 2 (18+ only): Adult content restricted to verified adults
- Tier 3 (18+ verified): Explicit adult content with additional verification
Minors (ages 13-17) are automatically restricted from accessing Tier 2 and Tier 3 content.
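Conceptually, tier access is a single rule over the user's age and verification status. The sketch below is an illustration only, not our production logic; the tier numbers and the 13/18 thresholds come from the list above, while the function and field names are hypothetical.

```python
from datetime import date

# Content tiers as listed above; higher tiers require older, more verified users.
TIER_MINIMUM_AGE = {0: 13, 1: 13, 2: 18, 3: 18}
ADULT_VERIFIED_TIERS = {2, 3}    # Tier 2 and 3 require verified adult status
EXTRA_VERIFICATION_TIERS = {3}   # Tier 3 requires additional verification

def age_on(today: date, birth_date: date) -> int:
    """Whole years between birth_date and today."""
    years = today.year - birth_date.year
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def can_access_tier(birth_date: date, tier: int,
                    adult_verified: bool = False,
                    id_verified: bool = False) -> bool:
    """Return True if a user may access content in the given tier."""
    age = age_on(date.today(), birth_date)
    if age < TIER_MINIMUM_AGE.get(tier, 18):
        return False  # minors never reach Tier 2 or Tier 3
    if tier in ADULT_VERIFIED_TIERS and not adult_verified:
        return False
    if tier in EXTRA_VERIFICATION_TIERS and not id_verified:
        return False
    return True
```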
2. CSAM Prevention and Detection
2.1 Detection Technologies
We employ industry-leading technologies to detect and prevent CSAM:
- PhotoDNA: Microsoft's hash-matching technology to identify known CSAM images
- AI-powered detection: Machine learning models trained to identify potentially harmful content
- Behavioral analysis: Pattern detection for grooming and predatory behavior
- Text analysis: Natural language processing to identify inappropriate conversations involving minors
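At a high level, hash matching computes a fingerprint of each uploaded image and checks it against a database of fingerprints of known CSAM. The sketch below illustrates only that lookup step, using an ordinary SHA-256 digest as a stand-in; PhotoDNA itself is a proprietary perceptual hash designed to also match resized or re-encoded copies. All names here are hypothetical.

```python
import hashlib

# Hypothetical store of fingerprints of known CSAM, supplied by hash-sharing programs.
KNOWN_HASHES: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint. Production systems use perceptual hashes
    (e.g. PhotoDNA) so that slightly altered copies still match."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_csam(image_bytes: bytes) -> bool:
    """True if the upload matches a known-CSAM fingerprint; a match
    blocks the upload and triggers the reporting workflow."""
    return fingerprint(image_bytes) in KNOWN_HASHES
```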
2.2 Proactive Scanning
In our default AI Encryption mode:
- All uploaded images are scanned against known CSAM databases
- Conversations are monitored for child safety threats
- AI companions are programmed to refuse inappropriate requests
- Suspicious patterns trigger automatic review
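One way to picture how these signals combine is a single escalation decision per upload or conversation: a hash match is blocked and reported outright, while softer signals route to human review. This is a hypothetical sketch, not our production logic; the threshold and field names are illustrative.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Action(Enum):
    ALLOW = auto()
    HUMAN_REVIEW = auto()       # suspicious pattern -> Trust & Safety queue
    BLOCK_AND_REPORT = auto()   # known-CSAM match -> remove, preserve, report

@dataclass
class ScanResult:
    hash_match: bool            # matched a known-CSAM hash database
    classifier_score: float     # 0..1 from the ML detection model
    grooming_flag: bool         # behavioral or text-analysis flag

REVIEW_THRESHOLD = 0.7          # hypothetical cutoff for automatic review

def decide(result: ScanResult) -> Action:
    """Map scan signals to the actions described above."""
    if result.hash_match:
        return Action.BLOCK_AND_REPORT
    if result.grooming_flag or result.classifier_score >= REVIEW_THRESHOLD:
        return Action.HUMAN_REVIEW
    return Action.ALLOW
```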
2.3 Human Review
Our dedicated Trust & Safety team:
- Reviews flagged content within 1 hour for child safety concerns
- Operates 24/7 for immediate response to urgent threats
- Receives specialized training in child safety and trauma-informed practices
- Coordinates with law enforcement when required
3. Age Verification and Minor Protections
3.1 Age Verification System
We implement progressive age verification:
- Basic (All users): Phone number verification and date of birth confirmation
- Intermediate: Photo-based age estimation for sensitive features
- Advanced: Government ID verification for adult content access
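The progressive model can be read as an ordered set of verification levels, each unlocking more sensitive functionality. A minimal sketch follows; the feature names are hypothetical.

```python
from enum import IntEnum

class Verification(IntEnum):
    BASIC = 1         # phone number + date of birth
    INTERMEDIATE = 2  # photo-based age estimation
    ADVANCED = 3      # government ID check

# Hypothetical mapping from feature to the minimum verification level it needs.
REQUIRED_LEVEL = {
    "general_use": Verification.BASIC,
    "sensitive_features": Verification.INTERMEDIATE,
    "adult_content": Verification.ADVANCED,
}

def may_use(feature: str, level: Verification) -> bool:
    """True if the user's verification level unlocks the feature."""
    return level >= REQUIRED_LEVEL.get(feature, Verification.ADVANCED)
```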
3.2 Protections for Minors (Ages 13-17)
Users identified as minors automatically receive:
- Private profiles by default
- Restricted access to adult content and features
- Enhanced content filtering
- Limited ability to receive messages from unknown adults
- Blocked access to monetization features
- Age-appropriate AI companion interactions only
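In practice this amounts to a stricter set of account defaults applied automatically whenever a user's age places them in the 13-17 range. The sketch below is illustrative only; for adults, the same settings remain subject to the content tiers and verification rules in Sections 1.2 and 3.1.

```python
from dataclasses import dataclass

@dataclass
class AccountSettings:
    profile_private: bool
    adult_content_allowed: bool
    strict_content_filter: bool
    messages_from_unknown_adults: bool
    monetization_enabled: bool

def defaults_for(age: int) -> AccountSettings:
    """Settings applied automatically when a user is identified as 13-17."""
    minor = 13 <= age < 18
    return AccountSettings(
        profile_private=minor,
        adult_content_allowed=not minor,      # adults still need verification
        strict_content_filter=minor,
        messages_from_unknown_adults=not minor,
        monetization_enabled=not minor,
    )
```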
3.3 Under 13 Policy
MyOrbit is not intended for children under 13. In compliance with COPPA:
- We do not knowingly collect data from children under 13
- Accounts suspected of being under 13 are suspended pending verification
- Parents can request immediate deletion of any data inadvertently collected
4. Reporting Child Safety Concerns
4.1 How to Report
If you encounter child safety concerns on MyOrbit:
In-App Reporting
- Tap the report button on any content, profile, or message
- Select "Child Safety" as the concern type
- Provide any additional context
- Reports are prioritized and reviewed immediately
Direct Contact
- Email: safety@myorbit.ai
- Subject line: "Child Safety Report - URGENT"
- Response time: Within 1 hour for child safety matters
Emergency Situations
- If a child is in immediate danger: Contact local law enforcement (911 in the US)
- NCMEC CyberTipline: www.cybertipline.org
4.2 What Happens After a Report
- Immediate review: Child safety reports are escalated instantly
- Content action: Violating content is removed immediately
- Account action: Offending accounts are suspended pending investigation
- Evidence preservation: All relevant data is preserved for authorities
- NCMEC report: Confirmed CSAM is reported to NCMEC within 24 hours
- Law enforcement: We cooperate fully with any investigation
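Treated as a workflow, a confirmed child-safety report moves through those steps in a fixed order, with the NCMEC filing tied to a 24-hour deadline. The sketch below is a hypothetical outline of that ordering; the report object and its fields are illustrative.

```python
from datetime import datetime, timedelta, timezone

NCMEC_DEADLINE = timedelta(hours=24)   # confirmed CSAM is reported within 24 hours

def handle_child_safety_report(report) -> list[str]:
    """Return the ordered actions for a confirmed child-safety report.
    `report` is a hypothetical object with id, content_id, account_id, is_csam."""
    actions = [
        f"escalate report {report.id} to the priority review queue",
        f"remove content {report.content_id}",
        f"suspend account {report.account_id} pending investigation",
        "preserve all relevant data for authorities",
    ]
    if report.is_csam:
        deadline = datetime.now(timezone.utc) + NCMEC_DEADLINE
        actions.append(f"file NCMEC CyberTipline report by {deadline.isoformat()}")
    return actions
```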
4.3 Reporter Protection
- Reports can be made anonymously
- Reporter identity is never disclosed to the reported user
- No retaliation for good-faith reports
5. Parental Controls
5.1 Available Controls
Parents and guardians can:
- Link to their minor's account for oversight
- View activity summaries (with the minor's knowledge)
- Set daily usage time limits
- Review and approve connections
- Receive alerts for concerning activity
- Request complete data deletion
5.2 How to Set Up Parental Controls
1. Create your own MyOrbit account (must be 18+)
2. Go to Settings → Family → Link Child Account
3. Enter your minor's account information
4. Your minor will receive a request to confirm the link
5. Once confirmed, parental controls become available
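Behind these steps is a simple link between the two accounts that stays inactive until the minor confirms it. A minimal sketch, with hypothetical names:

```python
from dataclasses import dataclass
from enum import Enum, auto

class LinkState(Enum):
    PENDING = auto()     # request sent, waiting for the minor to confirm
    CONFIRMED = auto()   # oversight controls are now available to the parent

@dataclass
class FamilyLink:
    parent_account: str
    minor_account: str
    state: LinkState = LinkState.PENDING

    def confirm(self) -> None:
        """Called when the minor accepts the link request (step 4 above)."""
        self.state = LinkState.CONFIRMED

    def controls_available(self) -> bool:
        return self.state is LinkState.CONFIRMED
```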
5.3 Educational Resources
We provide resources to help parents:
- Guide to talking to children about online safety
- Warning signs of online exploitation
- How AI companions work and appropriate use
- Setting healthy boundaries with technology
6. Legal Compliance
6.1 Regulatory Compliance
MyOrbit complies with all applicable child safety laws, including:
- COPPA (Children's Online Privacy Protection Act): US law protecting children under 13
- KOSA (Kids Online Safety Act): Enhanced protections for minors
- 18 U.S.C. § 2258A: Mandatory reporting of apparent CSAM to NCMEC
- UK Online Safety Act: Proactive child safety duties
- EU Digital Services Act: Enhanced protections for minors
- Australian Online Safety Act: Basic Online Safety Expectations
6.2 NCMEC Reporting
As required by US law, MyOrbit reports all apparent CSAM to the National Center for Missing & Exploited Children (NCMEC):
- Reports submitted via NCMEC's CyberTipline
- All required information included per 18 U.S.C. § 2258A
- Evidence preserved for a minimum of 90 days
- Full cooperation with subsequent law enforcement requests
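Internally, such a report can be represented as a record that captures the required details and pins the evidence-retention window to the submission date. The fields below are illustrative, not NCMEC's actual submission schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

EVIDENCE_RETENTION = timedelta(days=90)   # minimum preservation period

@dataclass
class CyberTiplineRecord:
    """Internal record of a CyberTipline submission (illustrative field names)."""
    incident_time: datetime
    content_ids: list[str] = field(default_factory=list)
    account_ids: list[str] = field(default_factory=list)
    submitted_at: datetime | None = None

    def preserve_until(self) -> datetime:
        """Evidence is retained for at least 90 days after submission."""
        base = self.submitted_at or datetime.now(timezone.utc)
        return base + EVIDENCE_RETENTION
```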
6.3 Law Enforcement Cooperation
We work with law enforcement agencies worldwide:
- Respond to valid legal process within 24 hours
- Emergency disclosure for imminent threats to child safety
- Preserve evidence upon request
- Provide testimony when required
- Participate in industry coalitions against child exploitation
6.4 Transparency
We publish quarterly transparency reports including:
- Number of CSAM reports submitted to NCMEC
- Accounts removed for child safety violations
- Law enforcement requests received and fulfilled
- Child safety feature improvements
7. Contact Information
Child Safety Team
For all child safety concerns, CSAM reports, and questions about our child protection practices:
- Email: safety@myorbit.ai
- Response time: Within 1 hour for urgent matters
- Available: 24/7
General Inquiries
- Email: support@myorbit.ai
- Privacy questions: privacy@myorbit.ai
External Resources
- NCMEC CyberTipline: www.cybertipline.org
- National Child Abuse Hotline (US): 1-800-422-4453
- Internet Watch Foundation (UK): www.iwf.org.uk
Our Promise
Child safety is not just a policy at MyOrbit—it is a core value that guides every decision we make. We are committed to continuous improvement of our safety systems and will always prioritize the protection of children over any other consideration.
We actively participate in industry initiatives, collaborate with child safety organizations, and invest in new technologies to stay ahead of threats. If you have suggestions for how we can better protect children, please contact us at safety@myorbit.ai.
Version: 1.0.0
Last Review: December 8, 2025
Next Review: March 8, 2026