When Everyone Sounds the Same
- Feb 24
Are We Actually Learning Anything in the Age of AI?
Although I'm a voracious reader, I admit that lately I've been tuning out. The ideas are good, and the writers are intelligent people I admire, but I just can't hear them. It's the rhythm, the symmetry, and the very obvious three bullets or short sentences followed by a punch line. I can hear the machine beneath each and every word. And for me, once I hear it and recognize the fingerprints that are all over AI-generated language, I stop reading and start skimming. It isn't that the writing is wrong; it's just boring, predictable, and completely without friction.
Being frictionless actually keeps me from learning. I no longer care or retain. And that's a BIG problem.

We are living in a moment where everyone sounds like they went to the same college prep school with the same AP English teacher. Everyone has a take, a framework, or a summary of a summary. It sounds smart, sure, but the question plaguing me is: is any of it actually wise? I've written before about the blank page as a portal and about prompting like a poet instead of a technician. I believe that AI can absolutely expand human creativity. I've seen it, and I've used it to provoke imagination. But this only works if you use it that way. What I'm seeing is not that. It's a room full of A+ students copying what's on the board, word for word.
Here's the interesting (and uncomfortable) thing.
Learning that lasts requires effort.
Cognitive psychologist Robert Bjork coined the term “desirable difficulty” to describe something counterintuitive: when learning feels harder, retention improves. Struggle forces deeper encoding. When ideas arrive perfectly structured and fully formed, the brain doesn’t have to wrestle, reconstruct or strain.
Neuroscience tells us the brain is constantly predicting what comes next. When we are surprised, when something violates our expectations, learning accelerates. It's called prediction error, and it's central to predictive processing theory. The surprise releases dopamine, which strengthens consolidation, and that consolidation builds memory.
When writing is optimized for coherence and predictability, when it flows so smoothly we never feel disoriented, we lose the very conditions that make learning stick. AI, as it's being used today, rarely disrupts us.
We remember what disrupts us.
There’s another psychological phenomenon at play here called the illusion of explanatory depth. Researchers found that people often believe they understand complex systems until they’re asked to explain them step-by-step. That’s when the illusion collapses. These fluent explanations trick the brain. They feel coherent, so we assume we comprehend. AI-generated prose is incredibly fluent, balanced and reasonable. The fluency is seductive but it isn't mastery.
Last but not least is one of the most robust findings in cognitive psychology: the generation effect. We remember what we generate better than what we simply read. When we struggle to produce an idea ourselves, neural encoding strengthens. So if AI generates the draft… if the person lightly edits… and the reader skims… where exactly is the generation happening?
This is where my curiosity turns into concern. When I worked on IBM Watson Education, I was obsessed with the difference between automation and insight. Our goal was never to replace human effort; it was to scaffold it. But scaffolding only works if someone climbs. If students are submitting AI-generated essays they haven't metabolized… if teachers are generating lesson plans they haven't deeply designed… if administrators are circulating strategic documents no one fully reads… then what are we practicing?
Maybe what I'm really reacting to is AI writing's sameness. It's not wrong per se; it's just interchangeable. It rarely carries scar tissue or contradicts itself mid-thought. It rarely risks being misunderstood. It sounds like elevator music. And that kind of passive consumption is never strong enough to alter someone's identity.
By now you know that I don't think the answer is banning AI. I think the answer is redesigning how we use it. What if we changed our curricula so that students had to submit the original AI output, a critique, a revision, and a reflection on what changed in their thinking? Instead of hiding AI use, what if we exposed human cognition?
It's ridiculous to think the danger is that machines can write. The real issue is that we stop wrestling with ideas. And without wrestling, there is no neural strengthening, which means no retention and, scariest of all, no wisdom.
Everyone will sound like an A+ student and no one will remember a thing.
