The “Death” Of IT Services Is Greatly Exaggerated
We’re seeing a spate of disruptive headlines and predictions around the demise of IT services. With the progress made by AI, specifically in software development, it’s easy to opine that traditional IT services are on their way out. This narrative is fundamentally flawed. Yes, AI-driven automation deflates the unit cost of technology work, but it also inflates the volume and complexity of that work even faster. The net effect is more demand for IT services, not less.
We’ve seen this scenario play out before through waves of offshoring, cloud migration, and platform engineering. But not every IT services firm will capture this growth. The winners will be those that build proprietary AI platforms, invest in domain expertise, and shift to outcome-based delivery. The losers will be those still selling labor arbitrage by the hour. The key questions for CIOs are about which operating model will win and whether their current providers are on the right side of that divide.
Here is how I see it:
- Frontier AI doesn’t run itself. Tech maturity is diverging at speed. Big tech pushes toward frontier AI. Most enterprises just need their systems to run. The gap between AI’s technical capability and its enterprise adoption is enormous, and it’s a gap that requires human expertise to close. And the surface area keeps expanding: Technology is penetrating logistics, finance, and operational functions that historically ran on spreadsheets and institutional memory. Process redesign, multiagent orchestration, data pipeline engineering, governance frameworks, compliance controls, and workforce training aren’t tasks you hand to an AI agent — that’s precisely what IT services firms deliver. As the gap widens, so does their relevance. At the recent India AI Impact Summit in New Delhi, both Sam Altman and Dario Amodei acknowledged that systems integrators will be the key channel for taking their solutions to enterprises.
- Agents write code; humans engineer everything around it. AI is making code generation faster and cheaper. That matters less than most headlines suggest. Coding represents roughly one-third of any delivery effort. The rest — specifications, architecture, cross-team coordination, platform integration, testing, and project governance — still demands human judgment. Getting requirements right is a nuanced process. Orchestrating work packages for coding agents across multiple teams requires architectural thinking that humans excel at today and will for the foreseeable future. IT services firms do this work at scale, across complex enterprise environments, every day. AI makes their coders faster; it doesn’t replace the engineering discipline that surrounds the code.
- Agents don’t run on models — they run on context. Agents are only as strong as the context pipelines that feed them — from structured data and unstructured enterprise knowledge to business rules, ontologies, and real‑time operational signals. But they also depend on tacit knowledge: the domain judgment, accumulated problem‑solving instincts, and operational intuition living inside your people’s heads — the kind that rarely makes it into documentation. All of this must be engineered into a coherent layer that’s specific to each organization. This is context engineering, and it will be the highest-value discipline in enterprise AI. IT services firms already hold the raw material: deep familiarity with their clients’ processes, data architectures, and operational logic. Expect the leading IT services providers to develop sophisticated context engineering services, akin to Palantir’s forward-deployed engineer model.
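To make “a coherent layer” concrete, here is a minimal sketch of what a context layer might look like in code. All class, field, and data names are illustrative assumptions, not a reference to any real product: the point is simply that structured data, codified business rules, and captured tacit knowledge get merged into one organization-specific payload that an agent consumes.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a context layer. Each field maps to a source type
# named in the text: structured data, business rules, and tacit knowledge
# captured from practitioners. Names and contents are purely illustrative.

@dataclass
class ContextLayer:
    structured_data: dict = field(default_factory=dict)   # e.g., ERP or ticket records
    business_rules: list = field(default_factory=list)    # codified policies
    tacit_notes: list = field(default_factory=list)       # captured expert judgment

    def assemble(self) -> str:
        """Flatten all sources into one context block an agent prompt could use."""
        parts = ["## Data"]
        parts += [f"{k}: {v}" for k, v in self.structured_data.items()]
        parts.append("## Rules")
        parts += [f"- {r}" for r in self.business_rules]
        parts.append("## Operational notes")
        parts += [f"- {n}" for n in self.tacit_notes]
        return "\n".join(parts)

ctx = ContextLayer(
    structured_data={"open_invoices": 42},
    business_rules=["Invoices over 90 days escalate to finance ops"],
    tacit_notes=["Vendor X routinely disputes freight charges; check line items first"],
)
print(ctx.assemble())
```

The hard part is not this plumbing but filling the third field: eliciting the tacit notes from people’s heads is exactly the forward-deployed work the text describes.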
What This Means For CIOs
Your IT services portfolio was built for the arbitrage era. Restructure it for the AI era by:
- Testing for context engineering depth. Ask providers how they capture and embed tacit domain knowledge into AI agent pipelines. If the answer is vague, they haven’t started.
- Applying a “price × quantity” lens to vendor budgets. AI will deflate unit costs but inflate project volume. Model your vendor spend as a reallocation: less commodity coding and more integration, orchestration, and domain-specific AI work.
- Demanding outcome-based commercial models. Providers still pricing by full-time equivalent are optimizing for their margin, not your results. Shift contracts toward measurable delivery outcomes.
- Forcing the reskilling question. What percentage of their delivery staff are AI-skilled today? What’s the 12-month target? Providers without concrete numbers are hoping the transition happens slowly. It won’t.
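The price × quantity lens above can be sketched as a back-of-envelope model. The numbers below are purely illustrative assumptions, not forecasts; they show how spend can rise even as unit costs fall, because volume grows faster.

```python
# Back-of-envelope "price × quantity" model of vendor spend.
# Figures are hypothetical: a 30% unit-cost deflation from AI,
# offset by a 50% expansion in the volume of delivery work.

def vendor_spend(unit_cost: float, volume: float) -> float:
    """Total spend is simply price times quantity."""
    return unit_cost * volume

baseline = vendor_spend(unit_cost=100.0, volume=1_000)            # pre-AI baseline
ai_era = vendor_spend(unit_cost=100.0 * 0.70, volume=1_000 * 1.50)

print(f"baseline spend: {baseline:,.0f}")  # 100,000
print(f"AI-era spend:   {ai_era:,.0f}")    # 105,000, higher despite cheaper units
```

The CIO question is then about the mix inside that larger number: how much of it shifts from commodity coding to integration, orchestration, and domain-specific AI work.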
The Tocqueville Filter Is Coming
French political thinker Alexis de Tocqueville observed that a bad government is most endangered when it begins to reform. The AI-driven reform of IT services will follow the same logic: It won’t kill the industry, but it will expose and destroy the weakest operators, separating the architects from the order-takers. Providers built on labor arbitrage — with no proprietary AI assets, no context engineering capability, and no credible reskilling plan — are the bad government in this analogy.
While Sam Altman and Dario Amodei were gracious about the role systems integrators will play, Palantir’s partnership with SAP on ECC‑to‑S/4HANA migrations rattled the services industry. It showcased a capability the integrators could have built themselves. They didn’t then — and now the market is going to force that reform on them. CIOs who wait to see which of their providers survive the filter will pay for that passivity in delayed transformation and stranded contracts. Act now.