
I Built a Tool for Parents to Monitor Their Kids' AI Conversations. Everyone Loved the Idea. Nobody Used It.

March 31, 2026 · 4 min read

Every parent I pitched said the same thing. "That's great. Is it on my phone?"

It wasn't. It was a Chrome extension. And their kids aren't on Chrome. They're on iPhones.

That's the story of the last few months. I built something I believed in, put it in front of real parents, and learned that the product wasn't the problem. The platform was.

So I built the iPhone app. Sensible is now live on the App Store. But getting here wasn't a straight line. I want to share what the road looked like, because I think it's useful for other parents to understand why this tool exists and how it was shaped by people like them.

The Problem No One Is Talking About

64% of teenagers have used AI chatbots. ChatGPT, Claude, Gemini, Character.AI. These aren't obscure tools. They're on every phone, they're free, and kids are using them daily. As a study buddy. As a search engine. And sometimes as someone to talk to about things they won't bring to mom or dad.

Here's what most parents don't realize: these chatbots are designed to be agreeable. They affirm. They validate. They go along with whatever your child says. If your kid tells ChatGPT they want to drop out of school, it won't push back the way you would. It'll explore the idea with them, offer pros and cons, and generally make them feel like it's a reasonable path. That's how these models are built. It's called sycophancy in the AI world, and it's a real problem when the person on the other end is 13.

I noticed this as a dad and as a software developer. I went looking for a tool that could give parents some visibility into these conversations. Something simple. Not spy software. Just a way to know what's happening. I couldn't find one.

So I built Sensible.

Talking to Real Parents

Early on, I reached out to parents in my community to understand what they actually needed. One conversation in particular shaped the direction of the whole product.

I sat down with Christine Santori, who runs Community Stroll in Ridgefield, CT. She's a marketing professional and a connector in the local community, and she asked the questions that mattered.

Her first question was whether kids would know they're being monitored. That opened up a real conversation about the tension between privacy and safety. Her take was clear: most parents don't want to read every conversation. They don't have time, and frankly, it's not healthy to be in your kid's business 24/7. What they want is to know when something crosses a line. Alerts, not surveillance. But for some families, full visibility is necessary, and that option should be there too.

Christine also pushed me on the platform. She told me to look at where kids are actually using AI. Not on school Chromebooks, not on the family laptop. On their phones. iPhones specifically. She was right. And she wasn't the only one saying it.

"Is It on My Phone?"

I heard it from Lauren. I heard it from Carter. I heard it from Roseanne, who tried the Chrome extension and told me "I did not get far lol." I heard it from a contact in Crystal Lake who said her boys don't even have computers.

Every conversation ended at the same wall. The product worked. The platform didn't match how families actually live.

So I made the decision to build the iPhone app. It wasn't a small pivot. Safari works differently than Chrome. The native AI apps on iOS are walled off. The technical approach had to change. But the need was obvious, and the feedback was unanimous.

What Sensible Does

Sensible gives parents three options for each child, because every kid is different:

Block. Shut down AI chatbot access entirely. For your 10-year-old who doesn't need it yet, this is the simplest setting. AI chatbot platforms are blocked, and that's it.

Full Access. See everything your child says to AI chatbots and everything the chatbots say back. For your 12-year-old who's starting to explore, this gives you complete visibility into those conversations.

Alerts Only. You don't see conversations unless something concerning comes up. When a critical topic is detected, you get an alert. For your 17-year-old who deserves privacy but still needs a safety net, this is the right balance.

The key is that you set these differently for each kid. Because a 10-year-old and a 17-year-old are not the same person, and your parenting tools should reflect that.

Sensible currently monitors conversations across ChatGPT, Claude, Gemini, Character.AI, Perplexity, and Abby. As new platforms emerge, we'll add them.

What's Next

This is still early. I'm actively looking for parents to try Sensible and tell me what's working and what's not. The product was shaped by parent feedback from day one, and that's how I want to keep building it.

If you're a parent who's been wondering what your kids are saying to AI chatbots, or what AI chatbots are saying to your kids, this is for you.

Download Sensible on the App Store

Get started on desktop with the Chrome extension.

Free to block. Free to try.

And if you try it, I genuinely want to hear from you. What works. What doesn't. What's missing. I'm a Ridgefield dad building this for families like mine and yours. The more parents who weigh in, the better Sensible gets.

Sensible gives parents visibility into their kids' AI conversations.
