Confessions of a Targeted Ad Intern: How I Lost Control to the Algorithm

From Intern to Insider: An Ethical Awakening in the World of Targeted Ads.

5 min read · Dec 6, 2024
Photo by Shutter Speed on Unsplash

The Seven-Ad Revelation

It was a Monday morning team meeting at the marketing agency. Our CEO casually said, 'If someone sees an ad seven times, they'll probably buy.' Everyone else nodded. I froze.

Seven ads?

I scribbled it down, but it didn’t feel right. Was that what we were doing—wearing people down until they gave in? I felt proud our work was effective, but also uneasy. Were we crossing a line?

Photo by Kenny Eliason on Unsplash

Life as a Content Strategist Intern

During college, I joined the agency excited to learn about digital marketing. My job was to manage ads, analyze performance, and adjust campaigns using tools like Google Analytics and Facebook Ads Manager.

At first, I enjoyed the work. We could target ads to just the right people—college students, new parents, you name it. When a campaign did well, we celebrated. The goal was simple: get better results for less money.

But the more I worked, the more something started to feel off. Every part of the job pushed me to prioritize one thing: performance. There was no space to ask whether what we were doing was right.

Photo by Patrick Perkins on Unsplash

When Optimization Overruled Ethics

One day, I was setting up a campaign for a supplement brand. The platform suggested expanding the audience: 'Include people who are similar to your current audience.'

Sounded smart. But who were these 'similar' people? I had no idea. I clicked the button anyway. The results improved. Still, I felt a twist in my gut. I was trusting a black-box system over my own judgment.
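Under the hood, that "similar audience" button is usually a lookalike model: the platform builds a profile of existing customers and ranks everyone else by how closely their behavior matches it. The platforms don't publish their exact methods, so this is only a minimal illustrative sketch, with invented feature vectors and a simple cosine-similarity ranking standing in for whatever proprietary model actually runs:

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors (0.0 if either is zero)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# Invented behavioral features for existing buyers, e.g.
# [visits/week, pages/session, bounce rate].
customers = [[1.0, 0.8, 0.1], [0.9, 0.7, 0.2]]

# "Seed" profile: the average existing customer.
seed = [sum(col) / len(customers) for col in zip(*customers)]

# Rank prospects by similarity to the seed profile.
prospects = {"alice": [0.95, 0.75, 0.15], "bob": [0.1, 0.2, 0.9]}
ranked = sorted(prospects, key=lambda p: cosine(seed, prospects[p]), reverse=True)
```

The unsettling part, which the sketch makes concrete, is that "alice" gets targeted purely because her tracked behavior resembles past buyers, a judgment the advertiser never sees or reviews.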

The platforms made suggestions with labels like 'Recommended' or 'Best Practice.' I followed them without thinking. They weren’t just helping me; they were guiding me.

Everyone around me—managers, clients, platforms—wanted results. Faster. Better. Cheaper. No one ever said, 'Is this ethical?'

My doubts grew. Were we treating people like human beings, or just as numbers on a dashboard?

The Turning Point

Photo by Ian Schneider on Unsplash

One night, I was building a slide deck showing how our campaign improved sales. I listed techniques like retargeting, industry speak for following people around the web with the same ad, and frequency capping, which limits how many times they see it.
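Those two buzzwords are, mechanically, just counters attached to a user ID. A rough sketch, with a hypothetical `FrequencyCapper` class and the cap set at the seven impressions my CEO quoted (real platforms do this server-side at much larger scale):

```python
from collections import defaultdict

class FrequencyCapper:
    """Serve an ad to a user only until a per-user impression cap is hit."""

    def __init__(self, cap):
        self.cap = cap
        self.impressions = defaultdict(int)  # (user_id, ad_id) -> count

    def should_serve(self, user_id, ad_id):
        key = (user_id, ad_id)
        if self.impressions[key] >= self.cap:
            return False  # capped: stop repeating the ad to this person
        self.impressions[key] += 1
        return True

capper = FrequencyCapper(cap=7)
served = [capper.should_serve("user_1", "ad_42") for _ in range(10)]
# The same person sees the ad exactly seven times, then it stops.
```

Seeing it reduced to a dictionary of per-person counters was part of what unsettled me: the "people" in the campaign exist only as keys being incremented.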

Then I stopped. These numbers weren’t just metrics. They were people. We had followed them, tracked them, nudged them—until they bought something. And I helped do it.

I realized I wasn’t just working in the system. I was shaped by it. I had let algorithms and company goals tell me what to do. I had stopped asking questions.

This wasn’t just about advertising. It was about surveillance capitalism: the idea that companies track and use people’s behavior to shape what they do, often without them knowing.

Now I had a name for what felt wrong. And I couldn’t unsee it.

Looking Back: Who Was in Control?

Photo by Immo Wegmann on Unsplash

Reflecting later, I saw how my actions weren’t fully my own. My goals were set by the company. My tools pushed me in certain directions. The system rewarded one thing: profit.

I thought I was using the platforms. But they were shaping me too. Every 'tip' or 'insight' they gave nudged me to act in their interest, not mine.

This doesn’t mean I’m off the hook. But it does show how easy it is for good people to make questionable choices in systems designed to optimize, not to care.

As someone who wants to design better tech in the future, this hit hard. If we don’t think about ethics in our design, we might build tools that hurt without meaning to.

A Question for Us All

This isn’t just my story. Many systems today are built to influence us: what we see, what we buy, what we think.

So, in a world shaped by data and algorithms, how do we make sure we keep ethics and human dignity in the picture?
