Umer Kazi

AI is Making Me Stupid

I'm going to say something that feels strange to admit given that I run an AI company: I think AI is making me dumber. Not in some abstract "technology is ruining society" way, but specifically, measurably, and personally dumber. I've watched it happen to myself over the past year and I'm only now starting to think about it.

To preface, this isn't a take on whether AI is good or bad for the world. This is about what it's doing to my brain, and the uncomfortable realization that the tool I've been building my company around is quietly eroding the thing that made me good at this work in the first place.

---

The slow rot

It didn't happen all at once. It began with programming. I used to be able to hold entire architectures in my head: the data flow, the edge cases, the way one service talked to another. I didn't need to look things up because I'd internalized them through repetition, pain, and late nights debugging shit that didn't make sense until it suddenly did.

Then Claude Code and Codex became part of my daily workflow, and they're absolutely incredible. The speed at which I could ship went through the roof. Features that used to take me a day were done in an hour. I felt more productive than I'd ever been. But at some point, productive and capable quietly stopped being the same thing.

I caught myself one afternoon trying to write a basic API route, something I've done hundreds of times, and my first instinct wasn't to write it; it was to describe it to AI and let it write it for me. Not because it was hard, not because I was tired, but because the muscle memory was gone. The syntax that used to flow out of my fingers without thinking now required me to stop and remember, and stopping felt slower than just prompting.

That was the first moment I realized something was wrong.

---

Beyond code

The really unsettling part is that it didn't stay in VS Code; it bled into everything.

I started drafting emails with AI. Not complex, high-stakes emails, but mundane ones: a reply to a client, a follow-up to a prospect, a quick note to a teammate. Messages I would have written in thirty seconds a year ago were now being routed through ChatGPT because I couldn't be bothered to find my own words, or worse, because I genuinely wasn't sure my own words were good enough anymore.

I'd catch myself pasting AI outputs into Slack messages without even reading them, just skimming over the length, the tone, and hitting send. Someone would reply and reference something specific in the message, and I'd have to go back and read what I apparently said because I hadn't actually written it.

That's not using a tool; that's outsourcing your thinking and pretending it's efficiency.

The irony isn't lost on me. My entire website, the one you're reading this on right now, is built around the idea that I need to understand how things work before I'm willing to leave them alone, and here I was, letting a language model handle the most basic expression of my own thoughts because it was faster than thinking.

---

The creativity problem

The part that scares me most isn't the forgetting. Memory fades, but you can always rebuild it. It's the loss of creativity.

When I used to sit down to solve a problem, there was a process. A messy, inefficient, sometimes frustrating process where I'd stare at the problem, turn it over in my head, try a bad approach, realize why it was bad, and arrive at something better because I'd gone through the work of failing first. That friction was where the insight lived.

Now I describe the problem to Claude and get a reasonable answer in ten seconds. The answer is usually fine, sometimes good, occasionally even brilliant, but I never went through the process of arriving at it myself. I never sat in the discomfort of not knowing. I never built the mental model that comes from struggling with a problem long enough to actually understand it.

I'm getting answers without earning them, and over time, that's made me worse at the thing I used to be best at: figuring shit out from scratch.

---

The bigger picture

I don't think I'm alone in this. I think there's an entire generation of developers, especially the ones who adopted these tools early and enthusiastically, who are quietly getting worse at their craft while feeling more productive than ever. The output is up, but the understanding is down, and nobody's talking about it because the output is the thing that gets measured.

This is the trade-off nobody warned us about. Not that AI would take our jobs; that's the dramatic version. The subtler version is that AI would take our competence, slowly, while we thanked it for making us faster.

---

Why I'm writing this

This blog exists because of this problem, not in spite of it.

I needed something that forced me to use my own brain again. Not for code, not for work output, but for the simple act of organizing my thoughts into words and putting them somewhere. No AI drafting the sentences or cleaning up my grammar. Just me, sitting with a blank screen, remembering what it feels like to struggle with how to say something.

It's uncomfortable. This post took me significantly longer to write than it would have taken to prompt. The sentences came slower, I second-guessed my phrasing more than I have in years, and that discomfort is exactly the point; it means the muscle is still there, just atrophied from disuse.

I'm not swearing off AI; that would be stupid, and frankly, hypocritical given what I do for a living. But I'm drawing a line between using AI as a tool and using it as a replacement for my own thinking. The former makes me faster; the latter makes me hollow.

The goal isn't to go back to doing everything manually. It's to make sure that when the AI isn't there, I'm still capable of doing the work myself. Because if I can't, the only thing standing between me and incompetence is my subscription to ChatGPT.
