The end of inexpensive compute is nigh. Two-tier IT may force a rethink on cost and strategy
[AI tools were used for data gathering and research during the preparation of this article. The analysis, opinion and writing are entirely those of the author]
Since practically forever, the cost of compute power has been taken for granted to only go lower, and it has, barring occasional short-lived demand spikes, like the current memory and storage crunch, or supply-side shocks, like the 2011 Thailand floods that paralyzed hard drive factories.
This core assumption governed enterprise IT, allowing IT heads to scale up new workloads without bearing proportionately higher costs over time. The era of lower cost for more capability, and the enviable flexibility it gave CIOs, may well be ending. Here’s why.
The reliable decline in compute cost was a product of semiconductor innovation, but that predictability is now fracturing under multiple pressures. First, the economic engine of Moore’s Law is seizing up. At advanced nodes, the cost per transistor is no longer falling; multi-billion-dollar fabrication plants now yield diminishing returns, breaking the mechanism that delivered ever-cheaper computing. Barring a groundbreaking innovation, this is a structural change rather than a temporary disruption.
Second, this is happening at a rather inopportune time, just as AI drives a massive demand spike. Of the roughly $6.7 trillion in cumulative data center capital expenditure McKinsey projects by 2030, AI infrastructure alone accounts for $5.2 trillion. This is a reshaping of the supply chain driven by fundamentals, not transients.
Third, and perhaps most decisively, energy will become a primary constraint, introducing a new, escalating cost category into how IT spending and strategy are approached.
Two-tier IT
The silver lining is that not all compute and storage costs are the same, because the hardware varies considerably between traditional and AI needs. Frontier AI depends on a premium stack of cutting-edge semiconductors, high-bandwidth memory (HBM), and high-power-density infrastructure. Traditional workloads run on a commodity stack of mature, amortized hardware. These stacks have different economic trajectories: the premium tier faces escalating manufacturing costs and resource competition, while the commodity tier benefits from stable, high-volume production.
This split will create two tiers: a premium tier for frontier AI development, marked by high costs and concentrated ownership, and a “commodity” tier for traditional IT, which faces neither rapid inflation nor hype-cycle pricing but, with deflation over, a manageable rise driven by power costs. Understanding this division is critical for reassessing IT strategy.
Power, not just silicon, will define data center costs
Electricity represented roughly 20% of total data center costs in the early 2010s. In modern high-density data centers, estimates put power and cooling together at roughly 40-50% of total costs, with some studies claiming as much as 60%.
Improvements in Power Usage Effectiveness (PUE), a measure of how efficiently a data center uses electricity, have slowed in recent years. High-density AI clusters already run close to full capacity, leaving less room for efficiency gains. The result is greater overhead in running the data center, on top of any inflation in electricity prices. This cost will invariably be passed on to customers; for in-house data centers, it becomes a bottom-line pressure.
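To make that overhead concrete, here is a minimal sketch (all figures assumed for illustration, not drawn from any specific facility) of how PUE converts an IT load into an annual facility power bill:

```python
# Minimal sketch, assumed figures: how PUE turns an IT load into a facility
# power bill. PUE = total facility energy / IT equipment energy.

HOURS_PER_YEAR = 8760

def annual_power_cost(it_load_kw, pue, price_per_kwh):
    """Annual electricity cost for a given IT load at a given PUE."""
    facility_kw = it_load_kw * pue  # cooling and overhead scale with the IT load
    return facility_kw * HOURS_PER_YEAR * price_per_kwh

# A 1 MW IT load at an assumed flat tariff of $0.10/kWh:
print(f"PUE 1.2: ${annual_power_cost(1000, 1.2, 0.10):,.0f}/yr")  # $1,051,200
print(f"PUE 1.5: ${annual_power_cost(1000, 1.5, 0.10):,.0f}/yr")  # $1,314,000
```

A stalled PUE is the difference between those two lines, paid every year, on every megawatt.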
Global data center electricity consumption is projected by the International Energy Agency to more than double, from 415 TWh in 2024 to 945 TWh by 2030, growing four times faster than all other sectors. The surge is driven by the power density of AI hardware: a traditional data center rack consumes 10-15 kilowatts (kW), while an AI rack requires 50-150 kW.
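A back-of-the-envelope comparison, using the rack densities above and an assumed flat tariff, shows how starkly the annual energy bill diverges per rack:

```python
# Back-of-the-envelope sketch, assumed rates: annual energy and cost per rack
# for a traditional rack vs. an AI rack, using the densities cited above.

HOURS_PER_YEAR = 8760
PRICE_PER_KWH = 0.10  # assumed flat tariff, $/kWh

for label, kw in [("Traditional rack", 12), ("AI rack", 100)]:
    mwh_per_year = kw * HOURS_PER_YEAR / 1000
    cost = kw * HOURS_PER_YEAR * PRICE_PER_KWH
    print(f"{label}: {kw} kW -> {mwh_per_year:,.0f} MWh/yr, ~${cost:,.0f}/yr")
# Traditional rack: 12 kW -> 105 MWh/yr, ~$10,512/yr
# AI rack: 100 kW -> 876 MWh/yr, ~$87,600/yr
```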
But electricity generation cannot be ramped up on cue. A data center can be built in a couple of years; a power plant takes a decade to plan and build. This mismatch will invariably lead to supply deficits and force governments to regulate. In Ireland, where data centers accounted for over 22% of national electricity consumption in 2025, up from 5% in 2015, regulators are considering moratoriums on new connections. Developing countries with cheaper electricity may alleviate the cost burden for a while, but we do not yet have large-scale data center offshoring as a case study, and it would carry relatively greater grid risk and policy uncertainty.
This power constraint reinforces the bifurcation between traditional IT and AI workloads.
AI will get costlier
Frontier AI infrastructure requires enormous capital investment in GPUs, data centers, and electricity. As these costs are amortized, providers will increasingly need to charge prices that reflect the true cost of compute. Combined with rising silicon and data center costs and the imperative to deliver investor value, AI pricing will invariably go up, making it more expensive for everyone, individuals and enterprises alike.
A compounding factor is electricity cost, which weighs more heavily on AI than on traditional IT. AI compute is far more power-limited than silicon-limited, in contrast to traditional workloads, so higher electricity prices will raise AI prices more than they raise the cost of traditional CPU loads.
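A simple sensitivity sketch (cost shares assumed for illustration) shows the asymmetry: the larger the electricity share of a workload’s total cost, the more of a power price rise passes through:

```python
# Sensitivity sketch, assumed cost shares: how a 20% electricity price rise
# propagates into the total cost of a workload, depending on how power-heavy
# that workload is.

def cost_increase(electricity_share, power_price_rise):
    """Fractional rise in total cost if only the electricity component inflates."""
    return electricity_share * power_price_rise

for label, share in [("Traditional CPU workload", 0.15), ("AI workload", 0.45)]:
    rise = cost_increase(share, 0.20)
    print(f"{label} (electricity = {share:.0%} of cost): total cost up {rise:.1%}")
# Traditional CPU workload (electricity = 15% of cost): total cost up 3.0%
# AI workload (electricity = 45% of cost): total cost up 9.0%
```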
The “let’s try AI everywhere and see what sticks” approach will make less and less financial sense. AI will be reserved for tasks that are business-critical and have established ROI, while other areas will have to aggressively stake a claim for their use case to get those precious AI tools.
Implications for the IT decision-maker
For technology leaders, this landscape requires a strategic adjustment. Financial models based on perpetually declining compute costs should not be taken for granted. Budgets must assume flat or modestly rising costs for traditional IT, with a significant premium for cutting-edge AI.
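As a rough planning aid, here is a hypothetical sketch (growth rates assumed for illustration, not forecasts) contrasting a budget built on the old deflation habit with one built on the flat-to-rising regime argued here:

```python
# Planning sketch, assumed growth rates: the same 5-year budget for a fixed
# workload under the old "compute gets cheaper" assumption vs. a modest,
# power-driven rise.

def project(base, annual_change, years=5):
    """Yearly spend under a constant annual rate of change."""
    return [base * (1 + annual_change) ** y for y in range(years + 1)]

base_budget = 1_000_000  # assumed annual spend on a fixed workload, $
old_assumption = project(base_budget, -0.10)  # assumed ~10%/yr deflation habit
new_commodity = project(base_budget, +0.03)   # assumed modest power-driven rise
for y, (old, new) in enumerate(zip(old_assumption, new_commodity)):
    print(f"Year {y}: old model ${old:,.0f}  vs  new model ${new:,.0f}")
# By year 5 the gap is roughly 2x: $590,490 vs $1,159,274.
```

Even small sign changes in the assumed trajectory compound into a large planning gap within a single refresh cycle.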
Frank conversations with business and other stakeholders can help classify and prioritize workloads: which justify premium AI deployment, and which can run on traditional IT that ramps up as you go. A nuanced sourcing strategy that distinguishes commodity compute from premium AI capability can harness the best of both worlds while keeping spending prudent, especially in an economy that makes no promises.
