I Used AI to Build Something Simpler Than What the Market Offers

A reflection on trust, data, and AI.

7 min read · December 15, 2025
Hero image
Photo by Nicolas Solerieu on Dribbble

01 Conviviality & Surveillance

In 1973, philosopher Ivan Illich wrote about what he called convivial tools: tools that increase your autonomy rather than your dependence. Tools anyone can use, that don't require experts, that let you do what you want rather than what the system wants.

In 2019, Shoshana Zuboff wrote about surveillance capitalism: you get convenience, they get your data. We've had manifestos, exposés, documentaries, congressional hearings. Everyone knows the deal by now. The critique is well-established. What's been missing is the exit.

Convivial tool vs Surveillance illustration
An illustration generated by Gemini on the concept of convivial tools versus surveillance.

02 The ER Visit

A couple of days ago, I went to the ER for an abscess under my nose. A bit dramatic, I know, but it was quite big and painful. I was honestly scared.

During check-in, the nurse asked me if I wanted to hurt myself. Standard screening. Then she asked if I wanted to hurt others!? I get why the questions exist: liability, protocol, safety. But there's something strange about being treated as a potential risk before you're treated as a person who needs help.

Anyway, I got my abscess drained (very, very painful), and a prescription: oral antibiotics, 10 days, two pills a day. Simple enough.

I got home, still in pain from the draining, and realized I had a problem. I already take a few daily meds and vitamins, and I couldn't afford to miss a single pill of my antibiotic course. I needed a system: something to help me track this 10-day course alongside my regular medications without mixing things up.

So I immediately disregarded the "jot it down on a piece of paper" idea :0, and searched for medication tracking apps. There are a lot of them. I didn't download a single one.

My little pharmacy system
My little pharmacy system.

03 The Search for Simple

I didn't need much. A calendar view. A way to mark off doses. Maybe add or remove a medication. That's it. I wasn't looking for a health management platform. I wasn't looking for AI-powered insights into my wellness journey. I just needed to remember whether I took my antibiotics this morning.

But every app I looked at came bundled with stuff I didn't ask for. Reminders, gamification, engagement features, gentle nudges. And behind all of it: where does my data go?

04 What's Actually at Stake

In 2019, the Wall Street Journal reported that Flo, a period-tracking app used by over 100 million people, had been sharing intimate health data with Facebook. When users logged their periods, when they indicated they were trying to get pregnant, when they reported symptoms, all of it was being transmitted to Facebook's analytics division for advertising purposes. The app had promised users their health data would remain private. It didn't.

The FTC eventually forced Flo to settle, requiring them to notify affected users and instruct third parties to destroy the data. But by then, the data had already been shared.

In 2023, the FTC went after GoodRx, a prescription discount app, for sharing users' medication data with Facebook and Google, including information about specific prescriptions that could reveal conditions like erectile dysfunction. The company used that data to target users with ads. If you'd searched for a medication for a sensitive condition, you might have seen an ad about it on Facebook. The company paid a $1.5 million penalty, but again: the data was already out there.

A BMJ study found that 79% of top medication-related apps share user data in ways that may violate privacy. The data includes names, medications, email addresses, and device identifiers. That information gets shared with app developers, parent companies, and third-party advertisers—and potentially onward to what researchers call 'fourth parties.'

What happens with that data? Insurance companies have started acquiring non-health data from data brokers—purchasing habits, addresses, court records—to build predictive models about healthcare costs. While they claim they're not using these models to set individual premiums, no law clearly prevents them from doing so. Health data that escapes an app doesn't stay in one place. It becomes part of a profile that follows you.

And after Roe v. Wade was overturned, the stakes became even more concrete. Period tracking data could theoretically be subpoenaed. Health information that once seemed merely private became potentially dangerous.

This is what I was thinking about when I looked at those medication tracking apps. Not 'will this definitely happen to me?' but 'why am I being asked to trust a system I don't understand, for a problem a piece of paper could solve?'

05 Why Simple Doesn't Exist

As a product designer, I was taught to design products that fit the user's mental model, informed by user research data, blah blah blah—oh, and make money.

Simple apps don't retain users, unless they're part of the OS. They don't convert free users to premium tiers. They don't collect valuable data. So app developers aren't incentivized to build minimal tools. Even if they do, users can't simply add new features if they need them.

However, aren't those 'extra features' in medication apps there because UX research shows that people do forget doses and do need reminders? Isn't some of that complexity earned?

Maybe. But there's a difference between complexity that serves users and complexity that serves business models. A reminder feature is useful. A reminder feature that also tracks your location, analyzes your usage patterns, and shares data with advertising partners is something else entirely. The problem isn't features; it's that the features come bundled with surveillance, and you can't separate the two.

Meanwhile, the labor of managing illness falls on individuals. You're sick, your brain isn't working well, and you're expected to figure out your own care infrastructure. Find an app. Learn its interface. Hope it's trustworthy. Read a privacy policy written by lawyers to obscure rather than clarify. Make a decision while you're in pain.

A convivial tool would let you solve this yourself. But the market doesn't build convivial tools.

06 The Build

So I built an app for myself.

Using an AI code editor, I had a working web app in about 15 minutes. Calendar view. Medication list. Mark doses as taken. Add or remove meds. Different tracking methods for different medications. A clean, simple Material Expressive design system. The data is saved locally in an SQL database on my machine.

Medtracker Web App Interface
The initial web app built in 15 minutes.
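For the curious, local-only storage like this really is just a few lines. Here's a minimal sketch of the idea, not my actual code: the table and column names are illustrative, and it assumes Python with the standard-library sqlite3 module. The whole "backend" is one file on disk that never leaves the machine.

```python
import sqlite3

# One table of medications, one of logged doses. The database is a plain
# local file; there is no server, no account, and nothing to share.
conn = sqlite3.connect("meds.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS medications (
    id            INTEGER PRIMARY KEY,
    name          TEXT NOT NULL,
    doses_per_day INTEGER NOT NULL
);
CREATE TABLE IF NOT EXISTS doses (
    medication_id INTEGER REFERENCES medications(id),
    taken_at      TEXT NOT NULL  -- ISO-8601 timestamp
);
""")

def add_medication(name: str, doses_per_day: int) -> int:
    cur = conn.execute(
        "INSERT INTO medications (name, doses_per_day) VALUES (?, ?)",
        (name, doses_per_day),
    )
    conn.commit()
    return cur.lastrowid

def mark_taken(medication_id: int, timestamp: str) -> None:
    conn.execute(
        "INSERT INTO doses (medication_id, taken_at) VALUES (?, ?)",
        (medication_id, timestamp),
    )
    conn.commit()

def doses_on(day: str) -> list[tuple[str, str]]:
    # All doses logged on a given day (YYYY-MM-DD), for the calendar view.
    return conn.execute(
        "SELECT m.name, d.taken_at FROM doses d "
        "JOIN medications m ON m.id = d.medication_id "
        "WHERE d.taken_at LIKE ?",
        (day + "%",),
    ).fetchall()
```

That's the entire data layer. No privacy policy needed, because there's no one on the other end.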

The speed isn't the interesting part. What's interesting is the choices I made—or rather, the choices I didn't have to make.

I didn't have to decide whether to enable notifications (I didn't want them). I didn't have to figure out how to disable location tracking (there wasn't any). I didn't have to read a privacy policy (there wasn't one, because there was no one to share data with). I didn't have to create an account. I didn't have to remember a password. I didn't have to trust anyone but me.

A couple of days later, I got tired of starting a local server every time I wanted to log a dose. So I spent two more hours turning it into a mobile app, with Supabase as the backend, and deployed it to TestFlight. Now I can log my meds from my phone without any friction. I know exactly where my data lives, and it's mine alone.

Medtracker Mobile App Interface
The mobile app version running on iOS.

That's what Illich meant by convivial. Just a tool that extends what I can do, without extracting anything in return.

In 2019, sustainability researcher André Reichel asked whether AI could ever be convivial. His answer: not yet. But it could be—if designed to empower people to build for themselves, beyond the market. That's what I did, accidentally.

07 AI for Building Less

The dominant AI narrative is about building more. More features, more automation, more 'value'.

I used AI to build less. Fewer features. Less data collection. Less opacity. A tool that does exactly one thing.

Building your own tools feels like an exit. For a moment, it felt like I'd found one.

But I built this with a cloud-based AI coding tool. That company saw my prompts. They know I made a medication tracker. There's a difference, though. The AI saw me building once; the medication app would have seen me using it daily, indefinitely. One's a snapshot, the other's a feed. But still.

What about running models locally? Sometimes I use local models for text and images. Those prompts and outputs never leave my machine. Maybe that's the actual exit. Maybe it's not. Local AI has tradeoffs. I don't know if it scales to what I needed here, or if it ever will for most people.

Illich argued that tools should increase autonomy rather than manage users. Surveillance capitalism depends on persistent use and continuous extraction. When AI is used briefly and instrumentally, rather than as a permanent system, it begins to resemble a convivial tool. The value comes from limiting duration, scope, and dependency, rather than maximizing engagement or data collection. In that sense, convivial AI may be less a product and more a practice.
