“The risk is not that society will need less compute. The risk is that we will build too much of the wrong compute in the wrong places for the wrong reasons.” – MJ Martin
Introduction
The global data centre market is expanding at extraordinary speed, driven by cloud computing, artificial intelligence, sovereign data requirements, digital services, and the growing electrification of almost everything. The central question is whether this expansion represents disciplined infrastructure planning or whether the industry is overbuilding capacity ahead of proven demand. The answer is not simple. Some capacity is undoubtedly speculative, but the larger risk may not be too many data centres. It may be too many data centres built in the wrong locations, with the wrong financing assumptions, and with insufficient access to power, water, transmission, and long-term customers.

Demand Is Real, but Uneven
Demand for compute capacity is real. AI training, AI inference, cloud migration, video, cybersecurity, industrial automation, and digital public services all require more data processing. JLL projects that global data centre capacity could reach 200 GW by 2030, with the sector nearly doubling between 2026 and 2030, largely because of hyperscale cloud and AI demand. BloombergNEF has reported that large data centre capital spending is approaching extraordinary levels, with capacity under construction exceeding 23 GW.
However, demand is not evenly distributed. Some facilities will be fully absorbed because they are close to power, fibre, enterprise customers, cloud zones, or sovereign data requirements. Others may struggle if they are built only on the assumption that AI demand will grow endlessly and uniformly. The market may not be overbuilt in total, but it can still be misallocated.

Costs and Financing Risk
The economics are becoming more difficult. Data centres are no longer simple real estate projects. They are power infrastructure projects, grid interconnection projects, cooling projects, semiconductor supply chain projects, and debt-financed technology bets. Morgan Stanley has noted that hyperscalers may spend more than $1 trillion across 2025 and 2026 and that financing, grid underinvestment, and supply chain disruption are becoming central constraints.
This creates risk. If interest rates remain high, if GPU utilization falls, if AI revenues disappoint, or if customers shift to more efficient models, some projects may face lower returns than expected. A data centre built for yesterday’s AI architecture may not match tomorrow’s workload economics. Overbuilding is therefore less about square footage and more about stranded capital.
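To make the stranded-capital point concrete, the sketch below runs a simplified net present value calculation for a hypothetical build under different discount rate and utilization assumptions. All figures (capex, revenue, operating cost) are invented for illustration and do not describe any specific project or cited forecast.

```python
# Illustrative only: hypothetical figures, not a model of any real project.
def npv(cash_flows, rate):
    """Net present value of a series of annual cash flows at a given discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

def project_npv(capex, annual_revenue, utilization, opex, rate, years=10):
    """NPV of a simplified data centre build: upfront capex, then
    utilization-scaled revenue minus fixed operating cost each year."""
    yearly_cash_flow = annual_revenue * utilization - opex
    return -capex + npv([yearly_cash_flow] * years, rate)

capex = 1_000           # $M upfront build cost (assumed)
annual_revenue = 220    # $M per year at full utilization (assumed)
opex = 60               # $M per year fixed operating cost (assumed)

for rate in (0.06, 0.09):
    for utilization in (0.9, 0.6):
        value = project_npv(capex, annual_revenue, utilization, opex, rate)
        print(f"rate={rate:.0%}  utilization={utilization:.0%}  NPV=${value:,.0f}M")
```

Under these assumed numbers, the project is marginally positive at 6 percent financing and 90 percent utilization, and deeply negative once rates rise or GPU utilization slips, which is the shape of the risk described above rather than a prediction about any operator.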

Power Is the Hard Constraint
The strongest argument against a classic overbuild thesis is power scarcity. In many regions, the limiting factor is not land or capital. It is electricity. The U.S. Energy Information Administration expects U.S. power consumption to reach record highs in 2026 and 2027, with data centres supporting AI and cryptocurrency among the major demand drivers. BlackRock estimates that approximately 148 GW of additional power capacity may be needed by the end of the decade to satisfy data centre demand.
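A back-of-envelope conversion shows why that capacity figure dominates the planning conversation. The sketch below turns the cited ~148 GW of additional capacity into annual energy terms; the 80 percent load factor is an assumption added here for illustration, not a figure from BlackRock or the EIA.

```python
# Back-of-envelope: convert ~148 GW of additional capacity into annual energy.
# The 80% load factor is an assumed average utilization, used only to illustrate scale.
additional_capacity_gw = 148
hours_per_year = 8760
assumed_load_factor = 0.80

annual_energy_twh = additional_capacity_gw * hours_per_year * assumed_load_factor / 1000
# Roughly 1,000 TWh per year, on the order of a quarter of current U.S.
# annual electricity consumption (about 4,000 TWh).
print(f"~{annual_energy_twh:,.0f} TWh per year")
```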
This means many announced projects may never be completed, or may proceed in phases. Power availability will separate viable projects from promotional announcements. In Canada, this issue is especially relevant because provinces with hydroelectric resources, cooler climates, and strong transmission planning may attract more interest. Yet even in Canada, local grid capacity, community acceptance, water use, and ratepayer impact will matter.

Massive Centralization Versus Distributed Infrastructure
The current data centre model favours massive centralization. Hyperscale campuses concentrate compute, capital, energy demand, and operational control. This creates economies of scale, but also creates systemic risk. A large centralized campus can strain the local grid, consume scarce water, require major transmission upgrades, and concentrate cyber and physical security exposure.
A more resilient model would combine hyperscale facilities with distributed infrastructure. Edge data centres, municipal compute nodes, utility-adjacent microgrids, and regional sovereign cloud facilities can reduce latency, improve resilience, and keep some compute closer to where data is created. This is especially important for smart grids, industrial IoT, transportation systems, public safety, and advanced metering infrastructure. Not every workload needs to travel to a massive remote campus.
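The latency argument can be illustrated with simple physics. The sketch below estimates best-case round-trip propagation delay over fibre for three hypothetical distances; the distances and labels are assumptions chosen only to contrast edge and remote compute, and real-world latency adds switching, queuing, and processing delays on top of propagation.

```python
# Fibre propagation only: light travels roughly 200,000 km/s in glass
# (c divided by a refractive index of ~1.5). Distances below are hypothetical.
FIBRE_KM_PER_MS = 200.0  # ~200,000 km/s expressed as km per millisecond

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip propagation delay over fibre, in milliseconds."""
    return 2 * distance_km / FIBRE_KM_PER_MS

for label, km in [("municipal edge node", 20),
                  ("regional facility", 400),
                  ("remote hyperscale campus", 2500)]:
    print(f"{label:<26} ~{round_trip_ms(km):.2f} ms round trip")
```

Even under these best-case assumptions, a control loop served from a distant campus pays tens of milliseconds before any processing happens, which is why smart grid, transportation, and public safety workloads benefit from compute placed closer to the data.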

Summary
Data centre capacity is not simply being overbuilt. It is being overcommitted, overpromised, and unevenly planned. The strongest projects will have secured power, credible customers, sound financing, efficient cooling, and a clear role in the digital economy. The weakest projects will depend on speculative AI demand, fragile debt structures, and optimistic grid assumptions. The future will not belong only to the largest data centres. It will belong to the best integrated infrastructure, where centralized scale and distributed intelligence work together.
About the Author:
Michael Martin is the Vice President of Technology with Metercor Inc., a Smart Meter, IoT, and Smart City systems integrator based in Canada. He has more than 40 years of experience in systems design for applications that use broadband networks, optical fibre, wireless, and digital communications technologies. He is a business and technology consultant. He was a senior executive consultant for 15 years with IBM, where he worked in the GBS Global Center of Competency for Energy and Utilities and the GTS Global Center of Excellence for Energy and Utilities. He is a founding partner and President of MICAN Communications and before that was President of Comlink Systems Limited and Ensat Broadcast Services, Inc., both divisions of Cygnal Technologies Corporation (CYN: TSX).
Martin served on the Board of Directors for TeraGo Inc (TGO: TSX) and on the Board of Directors for Avante Logixx Inc. (XX: TSX.V). He has served as a Member, SCC ISO-IEC JTC 1/SC-41 – Internet of Things and related technologies, ISO – International Organization for Standardization, and as a member of the NIST SP 500-325 Fog Computing Conceptual Model, National Institute of Standards and Technology. He served on the Board of Governors of the University of Ontario Institute of Technology (UOIT) [now Ontario Tech University] and on the Board of Advisers of five different Colleges in Ontario – Centennial College, Humber College, George Brown College, Durham College, Ryerson Polytechnic University [now Toronto Metropolitan University]. For 16 years he served on the Board of the Society of Motion Picture and Television Engineers (SMPTE), Toronto Section.
He holds three master’s degrees – in business (MBA), communication (MA), and education (MEd). As well, he has three undergraduate diplomas and seven major certifications in business, computer programming, internetworking, project management, media, photography, and communication technology. He has completed over 80 next generation MOOC (Massive Open Online Courses) [aka Micro Learning] continuous education programs in a wide variety of topics, including: Economics, Python Programming, Internet of Things, Cloud, Artificial Intelligence and Cognitive systems, Blockchain, Agile, Power BI, Big Data, Design Thinking, Security, Indigenous Canada awareness, and more.
Martin is a volunteer, a photographer, a learner, a technologist, a philosophizer, and a romantic optimist.