Spend ten minutes on LinkedIn after any big AI launch and you can watch the same opinion spread in real time. One person posts the original take, or what looks like one. Ten others rephrase it. Fifty more polish it. By the end of the day the whole feed has that eerie copy-of-a-copy feeling, like everyone raided the same mental wardrobe and showed up in slightly different jackets.
I have never been very good at that.
Not because I think I'm above anyone. Usually it feels more annoying than noble. I just always had this reflex against repeating the room. In school, in meetings, in random late-night founder conversations, the moment I noticed I was about to say the safe thing, I could feel myself pulling away from it. No, that's not quite what I think. Even when I could not fully articulate the better version yet.
And now AI is making that difference painfully visible.
The old trick was sounding original without being original
For years, a lot of people got away with simply repeating what they read somewhere, repeating what others told them, repeating what the news said, and still looked great on paper. That used to work because polished language was still a skill. If you could package the consensus nicely, people often mistook that for insight. You sounded sharp. You sounded informed. You sounded like you had a point of view, even if you were just laundering other people's thoughts through your own tone.
AI breaks that illusion.
In a randomized experiment published in Science, professionals using ChatGPT on midlevel writing tasks finished 40% faster and produced output rated 18% higher quality. Read that again and think about what it means. The competent first draft is getting cheaper fast. The polished paragraph is getting cheaper fast. The tidy summary of what everybody already believes is becoming almost free.
So if a machine can generate your opinion in a matter of seconds, what exactly was valuable about that opinion? Not the phrasing. Not the structure. Not the confident tone. The only thing left is whether the thought was yours.
This is the part I recognize in myself
I do not mean that I have always been right. Definitely not. I've had plenty of wrong ideas, half-baked instincts, and opinions I had to throw away two weeks later. But they were mine. That matters more than people admit. I was never built for intellectual crowd-following. I could copy formats when I had to, sure, but not beliefs. The moment something became popular too quickly, I wanted to inspect it from the side. What is everyone missing? What are they repeating without checking? Where does this logic break when it touches reality?
That instinct has saved me from a lot of bad consensus thinking over the years. It has also made me slower sometimes. Independent thinking is not always efficient. You take detours. You annoy people. You ask the question that makes the room slightly uncomfortable. But it gives you something most people still underestimate: an internal source of direction.
That is why this AI moment feels so clarifying to me.
If your whole professional identity was built on being able to restate the obvious in clean language, that identity is in trouble. If your value came from having actual judgment, taste, pattern recognition, and the nerve to say what you really think before the crowd approves it, AI is not reducing your value. It is exposing it.
Access is no longer the differentiator
This is not some distant future argument either. According to Stanford HAI's 2025 AI Index Report, 78% of organizations reported using AI in 2024, up from 55% the year before. At the same time, the inference cost for GPT-3.5-level performance fell by more than 280-fold between late 2022 and late 2024.
That combination changes the game completely. When almost everyone has access, and the cost keeps collapsing, the advantage shifts away from access itself. The edge moves to discernment. To taste. To the person in the room who can tell the difference between something that merely sounds plausible and something that is actually worth saying.
This is also where a lot of people get lazy. They think using AI is the advantage. It isn't. Using AI is table stakes now. The question is what happens after the model gives you the first answer. Do you challenge it? Do you push past the average response? Do you bring your own strange combination of experience, intuition, and standards to the output? Or do you accept the first polished thing because it looks good enough?
That fork in the road is where the gap opens.
The real divide is not AI versus humans
Microsoft researchers surveyed 319 knowledge workers in a CHI 2025 paper and found something that feels obvious once you've seen it in the wild: higher confidence in GenAI was associated with less critical thinking. They also found the human role shifting toward verification, response integration, and task stewardship.
Exactly.
That is the work now.
Not typing a prompt. Not generating a passable paragraph. Not posting the same sanitized take everyone else is posting with slightly better formatting.
The work is judgment.
The work is deciding what deserves to exist.
The work is noticing when the answer is technically fine and spiritually empty.
And that is where the divide gets uncomfortable very fast.
It will not be AI versus humans.
It will be independent thinkers versus everyone else.
One group will use AI to sharpen what they already see. The other group will use AI to avoid seeing anything for themselves. One group will get faster without becoming generic. The other will become a cleaner, more efficient version of average.
You can already feel this happening. Some people use AI and their voice gets stronger. Other people use AI and disappear inside the smoothness of the output.
Why I care about this so much
Maybe this hits me harder because I never wanted to build a company, a product, or even a body of writing that sounded like the approved consensus version of reality. I always cared more about the distinct angle, the sentence that risks a little friction, the product choice that reflects an actual belief. That is true for how I write. It is true for how I build. And honestly, it is part of why I care so much about software that preserves a team's real voice instead of flattening it into generic AI mush.
I do not think everybody needs to become some rebellious genius caricature. That is not the point. But I do think the age of getting rewarded for safe repetition is ending. Or at least shrinking fast.
When average language becomes free, original judgment becomes impossible to ignore.
So yes, AI is changing work. Everyone can see that. But the more interesting thing is what it reveals about people. Who has their own lens. Who has their own taste. Who can stand in a room full of borrowed opinions and still say something real.
That gap was always there. AI is just making it visible.