Hugo
What's AI actually changing for tech companies day-to-day? Not just Big Tech, but agencies and SMBs?
I recently had the chance to discuss this with several CTOs during an informal meetup, and I wanted to share what came out of it.
AI is a hot topic for pretty much everyone because, whatever your opinion on it, you can't just sweep it under the rug. Clients talk about it, employees talk about it (positively or negatively), and some competitors are using it.
This led to several discussions, which I'll try to capture here. To preserve anonymity, all names are fictional.
I spoke with the head of a dev agency (let's call him Chris) for whom AI has been a game-changer, with a real company-wide adoption policy.
I'll keep it brief to avoid distorting what I heard.
This topic is rich, and I'd already seen it come up on Slack.
The question could be framed like this:
Some devs on the team don't use AI despite demo sessions, coaching, and so on. How do you convince them?
The discussion raised many points, but it boils down to two questions:
For the second question, the reasons can be summarized as:
On the first point, nobody really wanted to go there. I didn't respond in the discussion either, maybe for fear of coming across as the resident cynic. But with hindsight, I'll use this post to share my thoughts. Sure, it might not please everyone, but I'm afraid it might come true.
Let's start with this hypothesis: If AI does in 5 minutes what an engineer would have done in several hours/days, the environmental equation strongly favors AI.
To support this, I'll use numbers I calculated in another blog post:
The average time to produce a piece of concept art varies between artists, but is generally around 15 hours in Photoshop. At roughly 100 W of workstation draw, that's about 1,500 Wh, more than 300 times the energy cost of generating the same image with AI.
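The arithmetic behind that ratio can be sketched as follows. Note that the 100 W workstation draw and the ~5 Wh per AI generation are illustrative assumptions consistent with the figures above, not measured values:

```python
# Back-of-the-envelope energy comparison: human vs. AI for one piece
# of concept art. All inputs are illustrative assumptions.

HUMAN_HOURS = 15          # average time in Photoshop (figure from the post)
WORKSTATION_WATTS = 100   # assumed average draw of the artist's machine
AI_WH_PER_IMAGE = 5       # assumed energy cost of one AI image generation

human_wh = HUMAN_HOURS * WORKSTATION_WATTS   # 15 h * 100 W = 1500 Wh
ratio = human_wh / AI_WH_PER_IMAGE           # 1500 / 5 = 300

print(f"Human: {human_wh} Wh, AI: {AI_WH_PER_IMAGE} Wh, ratio: {ratio:.0f}x")
```

Swap in your own wattage and per-generation estimates; the conclusion is robust as long as the human task takes hours and the AI task takes seconds.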
For equivalent tasks, AI is more environmentally friendly than a human worker. Obviously, one could counter that going faster just means producing more: the developer still works a full day regardless, so over that day they'll consume more.
Yes, a dev will consume more, assuming coding is all they do all day, which is already optimistic.
But if headcount drops by 30 or 40%, the equation tips back in AI's favor.
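The headcount argument can be made concrete with a toy model. All the numbers here (per-dev daily energy, extra AI energy per dev) are hypothetical placeholders chosen only to show the shape of the trade-off:

```python
# Toy model: does total daily energy fall if AI raises per-dev
# consumption but headcount shrinks? All inputs are assumptions.

DEVS_BEFORE = 100
DEV_DAY_WH = 800          # assumed workstation energy per dev per day
AI_EXTRA_WH = 200         # assumed extra AI energy per dev per day

before = DEVS_BEFORE * DEV_DAY_WH                     # 100 * 800 = 80,000 Wh
for cut in (0.3, 0.4):
    devs_after = DEVS_BEFORE * (1 - cut)
    after = devs_after * (DEV_DAY_WH + AI_EXTRA_WH)   # fewer devs, each using more
    print(f"{cut:.0%} fewer devs: {after:,.0f} Wh vs {before:,.0f} Wh before")
```

In this model the break-even point is a headcount cut of AI_EXTRA_WH / (DEV_DAY_WH + AI_EXTRA_WH) = 20%; any deeper cut and total consumption falls, which is exactly the uncomfortable implication discussed below.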
And I'm afraid nobody says this too loudly because the implications are heavy.
There will be two scenarios:
I'm afraid that people who hold back for environmental reasons will end up getting their way... by losing their jobs.
And I want to be clear: I don't wish this on anyone. I'm following a line of reasoning that seems important to address. You can hate the messenger. But it's the message that should interest you.
Then come the other two topics:
From the various discussions I've had, everyone seemed to reach the same conclusions: yes, the job is changing. It's much more about design, with heavy emphasis on architecture and the ability to formalize it. Many people end up enjoying it. But not everyone does, and that's partly what creates the resisters.
Now we get to the thorny question: should using AI be mandatory?
Opinions diverged between two camps. Some think it should be a personal choice, like choosing your IDE:
"The reasoning is valid, but making it mandatory isn't justified in my view. Some tools are constraints: versioning, unit testing, etc. Others should be left to the dev's discretion: IDE, StackOverflow, Reddit, etc. AI, for me, falls into the second category."
Others think it can no longer be optional, any more than refusing to use version control:
"I understand the point, but I place AI at a different level than a simple tool. Today, if a dev uses VSCode, I don't force them to switch to IntelliJ (or anything else), but not using an IDE at all would seem pretty strange."
I didn't see any consensus emerge from the groups discussing this topic. But one phrase stands out for me as a conclusion:
Developers who don't want to use AI need to understand that their performance will be evaluated against people who do use it. If they're just as performant without it, fine. Otherwise, they're facing a problem they need to solve on their own, right?
From this discussion came a question: OK, but if we all switch to AI and the bubble bursts, what do we do?
I find this question very relevant. What happens in 2 or 3 years if the AI bubble bursts like the dot-com bubble in 2000? Will we be able to go back to our old work habits?
On the "bubble" aspect, the consensus seems widely shared, and the current period feels a lot like 2000.
In 2000, simply renaming your company with a ".com" could make stock valuations explode, even if the company had nothing to do with the web. To raise money, you had to find a link — real or not — to the internet. Today it's the same with AI. More than half of investments involve AI. I get contacted every week by companies that will "revolutionize sector X or Y" with AI.
Every. Single. Week.
So yes, the consensus is shared: many companies riding the AI wave will crash. And potentially not just small ones. But there's a second consensus: AI won't die with the bubble, just as the web didn't die in 2000.
Given the massive adoption of code assistance tools, there's little chance they'll disappear, especially since open-source models already exist. And some imagine it'll be possible to maintain specialized code models for less than today's big generalist models. A profitable economic equation will emerge, at least for this niche.
Nobody's a fortune teller, as one participant pointed out, but AI disappearing completely seems very unlikely to me.
I'm not even sure this chapter is necessary, it seems so obvious: everyone is intrigued by AI and, at the same time, suffering from indigestion over it. Sure, everyone uses it, but everyone uses a keyboard too, and that's not enough to make it the ONLY topic of conversation.
And yet it's paradoxical: AI is the topic of the moment, and at the same time, everyone's saturated.
Anyway, we talked about all this (not just this, don't worry) and I hope this little summary was interesting to you.