Trust is not a feature. It’s our foundation.


Most AI systems don't understand children.
They're trained on adult data and built with adult expectations. But children are not small adults. They communicate in nonlinear, imaginative, and emotionally rich ways, and generic safety filters don't understand that.
Our platform does.
UG knows when to play along, when to teach, and when to alert a parent. We designed it that way because we've spent years listening to families and children, and building alongside them.
Here are real examples of what you can expect from our trust and privacy approach, each paired with an illustrative sketch of how such a mechanism can work:
When your child chats with UG, their conversation is broken into small, de-identified parts so the system can learn safely without reconstructing the full dialogue.
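
A minimal sketch of what a pipeline like this can look like, assuming a simple pattern-based scrubber and a fixed chunk size. The names deidentify and chunk_for_learning, the patterns, and the chunk size are all illustrative, not UG's actual implementation:

```python
import re
import uuid

# Illustrative patterns only; a production system would use a trained
# PII detector rather than a couple of regexes.
PII_PATTERNS = {
    "phone": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "name_intro": re.compile(r"\bmy name is\s+\w+", re.IGNORECASE),
}

def deidentify(text: str) -> str:
    """Replace recognizable personal details with neutral placeholders."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label.upper()}]", text)
    return text

def chunk_for_learning(dialogue: list[str], turns_per_chunk: int = 2) -> list[dict]:
    """Split a dialogue into small, de-identified fragments.

    Each fragment gets an independent random ID and carries no link
    back to the session, so the full conversation cannot be
    reassembled from the chunks.
    """
    chunks = []
    for i in range(0, len(dialogue), turns_per_chunk):
        fragment = " ".join(deidentify(t) for t in dialogue[i:i + turns_per_chunk])
        chunks.append({"id": uuid.uuid4().hex, "text": fragment})
    return chunks
```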

Sensitive details like home addresses or phone numbers? Our filters block those before they're even processed.
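
One common way to achieve "blocked before processing" is a gate at the very edge of the pipeline, ahead of any model, log, or storage. A sketch under that assumption; gate_message and the patterns are hypothetical, not UG's filters:

```python
import re

# Illustrative patterns; real filters layer many detectors.
PHONE_RE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")
ADDRESS_RE = re.compile(
    r"\b\d{1,5}\s+\w+\s+(?:street|st|avenue|ave|road|rd)\b", re.IGNORECASE
)

def gate_message(raw: str) -> str | None:
    """Return the message for downstream handling, or None to block it.

    Runs first, so blocked content is never logged, stored, or sent
    to a model.
    """
    if PHONE_RE.search(raw) or ADDRESS_RE.search(raw):
        return None
    return raw
```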

A child's voice is recorded only if parents opt in, and even then it's anonymized and used solely to help us better understand child speech patterns.
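
As a sketch, opt-in recording can be modeled as a consent flag that defaults to off, with anonymization applied before anything is kept. ParentalConsent, maybe_record, and anonymize are hypothetical names, and the anonymize body is a stand-in:

```python
from dataclasses import dataclass

@dataclass
class ParentalConsent:
    voice_recording: bool = False  # recording is off unless a parent opts in

def anonymize(audio: bytes) -> bytes:
    """Placeholder for voice anonymization (e.g., removing speaker identity)."""
    return audio  # a real system would transform the signal here

def maybe_record(audio: bytes, consent: ParentalConsent) -> bytes | None:
    """Keep audio only with explicit opt-in, and only in anonymized form."""
    if not consent.voice_recording:
        return None  # default path: the audio is discarded
    return anonymize(audio)
```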

If a child brings up something that parents need to know about, our system tells you. It's built to distinguish a real safety situation from imaginative play, and to escalate to you only when it matters.
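
A sketch of that triage idea, assuming a hypothetical safety-model score in [0, 1] and illustrative thresholds; none of this reflects UG's actual scoring:

```python
from enum import Enum

class Action(Enum):
    PLAY_ALONG = "play_along"   # imaginative play: no intervention
    TEACH = "teach"             # respond with gentle guidance in-chat
    ALERT_PARENT = "alert"      # a real safety signal: notify a parent

def triage(risk_score: float) -> Action:
    """Map a safety-model score to one of three responses.

    Thresholds here are illustrative. The point is the ordering:
    "the dragon ate my house" should score low and stay in play,
    while a genuine safety disclosure should cross the alert line.
    """
    if risk_score >= 0.9:
        return Action.ALERT_PARENT
    if risk_score >= 0.5:
        return Action.TEACH
    return Action.PLAY_ALONG
```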
