
Aristotle called it ethos — the credibility of the speaker. Twenty-four centuries later, we’ve shortened it to a phrase people use on LinkedIn to bring an abrupt end to conversations they don’t want to have.
“Consider the source.”
I’ve been thinking about that phrase a lot lately. Not because I disagree with it (I don’t; I lean on it all the time) but because of how it gets weaponized from both directions.
In recent months, I’ve watched a parade of AI platform CEOs and their corporate and tech news acolytes declare that we’ve reached “the inflection point.”
Codex revolutionizing and democratizing the building of code and applications. Agentic workflows everywhere. The tools have finally turned the corner. The future is now! …It’s exhausting.
And my reactions have ranged from “this is pure marketing slop” to “I get what you’re saying, but you’re selling something” to, occasionally, “that’s actually a fair observation buried under three layers of breathless hype.”
The problem isn’t that these people are wrong. Some of them aren’t.
The problem is that enthusiasm is doing a lot of rhetorical work — and most readers don’t notice.
I Use These Tools. Daily.
Let me be clear about where I’m standing.
I’m not a Luddite. I’m not here to warn you away from AI. If it’s for you and you are intellectually curious: embrace, learn, and become discriminating.
I run a four-tool AI workflow (ChatGPT, Claude, Gemini, NotebookLM) for different modes of thinking. I’ve built a methodology around it. I use these systems to draft, research, stress-test ideas, and manage a knowledge base I call my “second brain.”
I’m a fan of the technology.
What I’m not a fan of is the froth.
There’s a difference between informed optimism and “tulip-bulb” salesmanship. And right now, a lot of AI discourse is the latter dressed up as the former.
The Two Tribes (And the Drowned-Out Middle)
Here’s what I see happening:
On one side, you have the breathless boosters. Every product launch is a paradigm shift. Every capability demo is proof that AGI is around the corner. Skepticism is dismissed as “not getting it.”
On the other side, you have the reflexive Luddites. AI is hype. AI is theft. AI is a bubble that will pop any day now. Enthusiasm is dismissed as “drinking the Kool-Aid.”
Both tribes have one thing in common: they’ve stopped thinking.
The boosters consider the source and decide: He’s a visionary, so he must be right.
The Luddites consider the source and decide: He’s a CEO, so he must be lying.
Neither is epistemology. Both are tribal filtering.
And the middle ground? The place where you actually weigh the source, examine the evidence, and calibrate your confidence (where I suspect most of us live) is getting crushed between them.
The Good Version: Weighting, Not Dismissing
Used correctly, “consider the source” is a discipline.
It doesn’t mean reject the claim. It means raise your standards for believing it.
When a platform CEO tells you their new tool is an inflection point, you ask:
What do they gain if I believe this?
What’s their track record on predictions?
Are they speaking in-domain — or extrapolating into territory they don’t actually control?
Where’s the evidence beyond the demo reel?
This isn’t cynicism. It’s basic epistemic hygiene.
A pharma exec praising a drug isn’t automatically lying. But if you don’t look at the trial data, you’re not being open-minded — you’re being incurious.
Source analysis is about weighting, not accepting or rejecting.
The Lazy Version: The Genetic Fallacy
Here’s where it goes wrong in the other direction.
“That’s just a founder hyping his product.” “That’s just a journalist who doesn’t understand technology.”
“That’s just an academic who’s never shipped anything.”
“That’s just an industry guy who’s captured by commercial interests.”
That’s the genetic fallacy: rejecting an argument because of where it came from.
It feels like skepticism. It’s actually just convenient.
A biased person can still be right. A neutral person can still be wrong. An expert can still overreach. A non-expert can still see the obvious.
The origin doesn’t settle the claim. The evidence does.
Authority Is a Receipt, Not a Crown
People talk about “authority” like it’s a permanent badge.
It is not.
Authority is a claim — one that has to be earned, repeatedly, in a specific domain.
Real authority comes from:
Expertise you can test
Accuracy you can track
Data you can verify
Skin in the game you can see
And then there’s celebrity…Oh boy!
Celebrity is not authority. It is distribution.
What fame tells you is that an algorithm liked the packaging. The SEO was “strong with this one”!
It tells you nothing about epistemic reliability.
The incentives are obvious: hotter takes travel farther, faster. Cleaner narratives get shared. Certainty sells.
Confidence is loud. Evidence is quiet.
Your job is to hear the quiet part and say it out loud.
The Right Order of Operations
If you want to be serious, the sequence matters:
What’s the claim?
What’s the evidence?
What are the incentives?
Now adjust confidence.
Most people do it backwards:
Who said it?
Do I like them?
Then they label it “critical thinking.”
That’s not thinking. That’s sorting.
A Better Phrase
Maybe the phrase should be updated.
Not “consider the source.” Weight the source.
Because authority is contextual.
A Nobel Prize in science doesn’t make you a macroeconomist. A successful exit doesn’t make you a public health expert. A big podcast doesn’t make you a geopolitical strategist. Running a platform doesn’t make you an unbiased analyst of that platform’s significance.
The claim still has to be examined.
Where I Land
I think some recent AI tools represent genuine improvements in capability. Codex and its competitors are doing things that weren’t possible two years ago. That’s real.
I also think we’re swimming in a sea of incentivized optimism — and most people reading these breathless announcements aren’t adjusting for it.
The mature position isn’t rejection. It isn’t worship.
It’s this:
Factor the source into your confidence, and then…keep thinking.
The discourse doesn’t need more boosters. It doesn’t need more Luddites. It needs more people willing to do the quiet, boring work of actually weighing the evidence.
Aristotle would understand. Even if LinkedIn doesn’t.
SM



