Dec 12, 2025

What's the Real Impact of AI on Tech Companies in 2025?

Hugo

What's AI actually changing for tech companies day-to-day? Not just Big Tech, but agencies and SMBs?

I recently had the chance to discuss this with several CTOs during an informal meetup, and I wanted to share what came out of it.

AI is a hot topic for pretty much everyone because, whatever your opinion on it, you can't just sweep it under the rug. Clients talk about it, employees talk about it (positively or negatively), and some competitors are using it.

This led to several discussions:

  • What's the concrete impact of AI for a dev agency in 2025?
  • Is it a bubble? And what if it bursts?
  • How do you get your teams to adopt AI? Is it an individual choice or a company mandate?
  • How can AI be necessary yet nauseating at the same time?

I'll try to capture these discussions here. To preserve anonymity, all names are fictional.

Impact of AI for a Dev Agency in 2025

I spoke with the head of a dev agency (let's call him Chris) for whom AI has been a game-changer, with a real company-wide adoption policy.

I'll keep it brief to avoid distorting what I heard:

  • AI has impacted every aspect of the business: contracts, development, support, etc.
  • The agency now does far more fixed-price projects and far fewer time-and-materials contracts. Billing by time spent no longer makes sense; they prefer billing based on impact and value delivered.
  • They can now commit to smaller fixed-price contracts because margins have increased and risk has decreased.
  • They're starting to see bottlenecks on the client side. Devs are now faster than clients can express their needs.
  • They increasingly assign a single dev per client, which isn't great for service continuity. But with two devs, there isn't enough work to go around.
  • The design phase has become more important, with solid documentation to capture requirements. That's what feeds the AI afterward (which also helps refine the docs).
  • Consequence of the above: it's the first time since the company was founded that this much documentation has been produced, with good quality.
  • He's had discussions with senior lead devs who've seen their job change and initially doubted whether they'd still enjoy their work. So far, these people have learned to love their new job.
  • It's hard to hire juniors because working with AI to build applications requires the hindsight to know how to code things yourself, read code, spot flaws, etc. There's a real "AI wall" for newcomers.
  • Overall, the agency is making better margins. But also because competitors haven't caught up yet. Chris thinks it'll rebalance once more competitors adopt the same working methods.
  • He has no problem involving clients and doesn't hide the use of AI. It's still a tool, but it's their methods and expertise that make it deliver satisfying results.

How to Get Your Teams to Adopt AI

This topic is rich, and I'd already seen it come up on Slack.

The question could be framed like this:

Some devs don't use AI on the team despite demo sessions, coaching, etc. How do you convince them?

The discussion raised many points, but it boils down to two questions:

  • Should you force people?
  • What reasons do these devs give?

For the second question, the reasons can be summarized as:

  • Environmental concerns
  • Feeling like the job has lost its meaning
  • Fear of losing skills

On the first point, nobody really wanted to go there. I didn't respond in the discussion either, maybe for fear of coming across as the resident cynic. But with hindsight, I'll use this post to share my thoughts. Sure, it might not please everyone, but I'm afraid it might come true.

Let's start with this hypothesis: If AI does in 5 minutes what an engineer would have done in several hours/days, the environmental equation strongly favors AI.

To support this, I'll use numbers I calculated in another blog post:

The average time to produce concept art varies between artists but is generally around 15 hours with Photoshop. That's 1500Wh — more than 300 times the cost with AI.
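The back-of-the-envelope math behind that comparison can be sketched as follows. The 100 W workstation draw and the 5 Wh per AI image are assumptions chosen to be consistent with the quoted figures, not measured values.

```python
# Back-of-the-envelope energy comparison behind the quote above.
# The 100 W workstation draw and the 5 Wh per AI image are hypothetical
# inputs chosen to match the quoted figures, not measurements.

def task_energy_wh(hours: float, device_watts: float) -> float:
    """Energy (in Wh) consumed by working on a task for `hours` at `device_watts`."""
    return hours * device_watts

# Human artist: ~15 h in Photoshop on a ~100 W workstation.
human_wh = task_energy_wh(hours=15, device_watts=100)  # 1500 Wh

# AI image generation: ~5 Wh per image (assumed).
ai_wh = 5

ratio = human_wh / ai_wh
print(f"human: {human_wh:.0f} Wh, AI: {ai_wh} Wh -> {ratio:.0f}x more energy")
```

Under these assumptions the human task costs 300 times more energy, which is the order of magnitude the quote refers to; change either input and the ratio shifts accordingly.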

For equivalent tasks, AI is more environmentally friendly than a human worker. Obviously, one could counter that going faster just means producing more: the developer still works a full day regardless, so they'll consume more over that day.

Yes, a dev will consume more over the day, assuming producing code is all they do, which is already optimistic.

But if headcount drops by 30 or 40%, the equation becomes positive again.

And I'm afraid nobody says this too loudly because the implications are heavy.

There will be two scenarios:

  1. Dysfunctional companies where people take 3 weeks for simple tasks and spend part of their time hiding at the coffee machine or in useless meetings. If these people use AI, they'll just spend more time on the rest. And that'll save energy. Yes, it's a bit provocative, but I know these companies exist, and I'm not being cynical here. I told you I'd be blunt...
  2. Companies that pursue productivity. These companies will keep their most effective people but will reduce headcount because demand won't keep up. It's hard to hear, I get it. But it's going to happen.

I'm afraid that people who hold back for environmental reasons will end up getting their way... by losing their jobs.

And I want to be clear: I don't wish this on anyone. I'm following a line of reasoning that seems important to address. You can hate the messenger. But it's the message that should interest you.

Then come the other two topics:

  • Loss of meaning
  • Loss of skills

From the various discussions I've had, everyone seemed to reach the same conclusions: yes, the job is changing. It's much more about design, with heavy emphasis on architecture and the ability to formalize it. Many people end up enjoying it. But not everyone does, and that's partly what creates the resisters.

Now we get to the thorny question:

Should you force people?

Opinions diverge quite a bit. On one side, those who think it should be a personal choice, like choosing your IDE:

The thinking is valid, but the "mandatory" aspect isn't justified in my view. Some tools are imposed constraints: version control, unit testing, etc. Others are left to the dev's discretion: IDE, Stack Overflow, Reddit, etc. For me, AI falls into the second category.

On the other side, those who think it can no longer be optional, any more than refusing to use version control could be:

I understand your points, but I place AI at a different level than a simple tool. Today, if a dev uses VSCode, I don't force them to switch to IntelliJ (or anything else), but not using an IDE at all would seem pretty strange.

I didn't see any consensus emerge from the groups discussing this topic. But one phrase stands out for me as a conclusion:

Developers who don't want to use AI need to understand that their performance will be evaluated against people who do use it. If they're just as performant without it, fine. Otherwise, they're facing a problem they need to solve on their own, right?

From this discussion came a question: OK, but if we all switch to AI and the bubble bursts, what do we do?

Is It a Bubble? And What If It Bursts?

I find this question very relevant. What happens in 2 or 3 years if the AI bubble bursts like the dot-com bubble in 2000? Will we be able to go back to our old work habits?

On the "bubble" question, the consensus was broad: the current period feels a lot like 2000.

In 2000, simply renaming your company with a ".com" could make stock valuations explode, even if the company had nothing to do with the web. To raise money, you had to find a link — real or not — to the internet. Today it's the same with AI. More than half of investments involve AI. I get contacted every week by companies that will "revolutionize sector X or Y" with AI.

Every. Single. Week.

So yes, the consensus is shared: many companies riding the AI wave will crash. And potentially not just small ones. But there's a second consensus: AI won't die with the bubble, just as the web didn't die in 2000.

Given the massive adoption of code assistance tools, there's little chance they'll disappear, especially since open-source models already exist. And some imagine it'll be possible to maintain specialized code models for less than today's big generalist models. A profitable economic equation will emerge, at least for this niche.

Nobody's a fortune teller, as one participant pointed out, but I tend to think AI completely disappearing is very unlikely.

How Can AI Be Necessary Yet Nauseating?

I'm not even sure this chapter is necessary since it seems so obvious, but everyone appears both fascinated by AI and sick of hearing about it. Sure, everyone uses it, but everyone uses a keyboard too, and that's not enough to make it the ONLY topic of conversation.

Yet:

  • It's hard to raise money without saying you're doing AI
  • There are tons of company pitches constantly talking about AI, but when you dig a little, there's nothing credible if you're being serious
  • It's annoying to see tech conferences where 95% of the talks are about AI, with a race to see who can produce the most hot air (beginner tutorials, far-fetched topics, etc.)

So yes, it's paradoxical: it's the topic of the moment, and at the same time, everyone's saturated.

Anyway, we talked about all this (not just this, don't worry) and I hope this little summary was interesting to you.
