Artificial intelligence investment is entering a new and more selective phase as companies and investors move beyond early excitement and focus on the physical infrastructure required to support large-scale AI systems. According to recent analysis from Goldman Sachs, the next stage of the AI boom will be driven less by experimental software and more by data centres, computing hardware, and energy capacity.
The report suggests that the market is entering what analysts describe as a “flight to quality,” in which investors favour companies with strong infrastructure assets over firms offering limited AI tools or early-stage applications. As AI adoption expands across industries, the need for high-performance computing and reliable data centre networks is becoming the main factor shaping long-term growth.
AI Boom Moves From Hype to Infrastructure Reality
During the first wave of generative AI adoption, many companies gained attention simply by announcing AI-related projects. However, the current phase is different. Investors are now asking whether companies have the resources required to run large models at scale.
Goldman Sachs analysts believe that the most valuable companies in the next stage of AI development will be those that own or operate the infrastructure behind the technology. This includes hyperscale cloud providers, semiconductor manufacturers, and data centre operators.
Training and running modern AI systems requires enormous computing capacity. Large language models and advanced machine learning platforms must process huge amounts of data, often using thousands of chips working together for long periods. Because of this, companies are investing billions of dollars in new facilities designed specifically for AI workloads.
Hyperscale cloud providers are leading this expansion, building massive data centres equipped with high-performance processors, advanced networking equipment, and specialized cooling systems. These investments are happening at a pace not seen during earlier phases of cloud computing.
Data Centres Become the Core of the AI Economy
The rapid growth of artificial intelligence is reshaping the global data centre industry. Research from Goldman Sachs estimates that AI workloads could account for around 30 percent of total data centre capacity within the next two years. This would mark a major shift from traditional cloud services, which previously dominated computing demand.
AI workloads differ from conventional cloud applications. Training large models requires thousands of graphics processing units running simultaneously, often for weeks. Even after training, AI systems must continue consuming computing power to generate responses, make predictions, or process user requests.
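The scale involved can be made concrete with a back-of-the-envelope calculation. The figures below are illustrative assumptions chosen for the sketch, not numbers reported in the Goldman Sachs analysis:

```python
# Rough estimate of the energy a large training run consumes.
# All inputs are illustrative assumptions, not reported figures.

def training_energy_mwh(num_gpus: int, watts_per_gpu: float, days: float) -> float:
    """Return total energy in megawatt-hours for a sustained training run."""
    hours = days * 24
    total_watts = num_gpus * watts_per_gpu
    return total_watts * hours / 1_000_000  # watt-hours -> MWh

# Example: 10,000 accelerators drawing ~700 W each, running for 30 days.
energy = training_energy_mwh(10_000, 700, 30)
print(f"{energy:,.0f} MWh")  # 5,040 MWh
```

Even under these modest assumptions, a single month-long run consumes roughly as much electricity as several hundred households use in a year, which is why operators plan power supply before anything else.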
This constant demand means companies cannot rely on existing infrastructure alone. Instead, they must build new facilities with higher energy capacity, faster networking, and improved cooling technology.
The growth in AI demand is also increasing the importance of networking equipment, as data must move quickly between servers, storage systems, and users. As a result, the entire supply chain supporting data centres is expanding.
Energy Demand Emerges as a Major Challenge
One of the biggest issues facing the AI industry is power consumption. Running advanced AI models requires far more electricity than traditional software systems. According to estimates from Goldman Sachs, global electricity demand from data centres could rise by about 175 percent by 2030 compared with 2023 levels.
Analysts say this increase would be similar to adding the power consumption of another top-ten energy-using country to the global grid. Such growth is forcing governments, utilities, and technology companies to rethink energy planning.
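The headline figure can be translated into an annual rate. A 175 percent increase over 2023 levels by 2030 means demand grows 2.75x over seven years; the implied compound annual growth rate follows from the standard formula, sketched here:

```python
# Implied compound annual growth rate (CAGR) behind the projection:
# a +175% increase means a 2.75x multiple over the 2023-2030 window.

end_multiple = 2.75  # 1.0 (2023 baseline) + 1.75 (the projected increase)
years = 2030 - 2023  # seven years

cagr = end_multiple ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # roughly 15.5% per year
```

A sustained mid-teens annual growth rate in electricity demand is far beyond what most grids were built to absorb, which is why planning has become a bottleneck.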
New data centres require stable and long-term electricity supplies, which means companies must secure power agreements before construction begins. In some cases, projects are delayed because local grids cannot support the additional demand.
As AI expands, energy availability is becoming just as important as computing hardware. Companies that can secure reliable power sources may gain a significant advantage in the competition to build large AI systems.
Location and Cooling Now Shape Data Centre Strategy
The growing need for electricity and cooling is influencing where companies build new data centres. Large AI facilities are often located in regions with strong energy infrastructure, affordable land, and access to high-capacity fibre networks.
Some companies are choosing remote areas where power is easier to obtain, even if the location is far from major cities. Others are building near renewable energy sources to reduce environmental impact and meet regulatory requirements.
Cooling systems are also becoming more important. AI servers generate large amounts of heat, and keeping them at safe temperatures requires advanced cooling technology. Academic research on AI infrastructure shows that the efficiency of cooling systems and the climate of the data centre location can significantly affect total energy use.
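One widely used way to quantify this effect is Power Usage Effectiveness (PUE): the ratio of total facility energy to the energy consumed by the IT equipment alone. The PUE values below are illustrative assumptions for the sketch, not figures from the article:

```python
# Power Usage Effectiveness (PUE) = total facility energy / IT energy.
# A PUE of 1.0 would mean zero overhead; real facilities sit above that.
# The example PUE values are illustrative assumptions.

def facility_energy_mwh(it_energy_mwh: float, pue: float) -> float:
    """Total facility energy, including cooling and power-distribution overhead."""
    return it_energy_mwh * pue

it_load = 1_000  # MWh consumed by the servers themselves

# A cool-climate site with efficient liquid cooling vs. a hot-climate
# site relying on conventional air cooling.
print(f"{facility_energy_mwh(it_load, 1.1):.0f} MWh")  # 1100 MWh
print(f"{facility_energy_mwh(it_load, 1.6):.0f} MWh")  # 1600 MWh
```

For the same computing work, the difference between the two sites is hundreds of megawatt-hours of overhead, which is why climate and cooling technology now weigh heavily in siting decisions.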
Water consumption is another concern. Some cooling methods use large amounts of water, which can create challenges in regions where water supply is limited. Because of these factors, infrastructure decisions now play a major role in AI strategy.
Building AI Infrastructure Takes Years
Unlike software development, building data centres is a slow and complex process. Large facilities require land acquisition, environmental approvals, construction work, and connections to power grids and fibre networks. Each step can take months or even years.
Supply chain constraints are also affecting the pace of expansion. Data centre projects depend on specialized equipment such as transformers, power systems, and high-performance chips. Shortages of these components can delay construction.
Many projects also require long-term energy contracts before they can begin. Without guaranteed electricity supply, companies cannot operate large AI clusters safely.
Because of these challenges, investors are increasingly interested in companies that already own large data centre networks. Firms with existing infrastructure can expand more quickly than those starting from scratch.
Investors Become More Selective in the AI Market
The early stage of the AI boom was marked by rapid increases in company valuations, even for businesses with limited experience in the field. That phase is now slowing as investors examine which companies have sustainable business models.
According to Goldman Sachs, the market is shifting toward businesses that provide essential services for AI deployment. Data centre operators, semiconductor manufacturers, and cloud providers sit at the base of the AI ecosystem, meaning their products are needed regardless of which AI applications become popular.
Software companies, on the other hand, face more uncertainty. Some applications may grow quickly, while others could disappear as technology changes. This difference is encouraging investors to focus on infrastructure rather than experimental tools.
A similar pattern has appeared in previous technology cycles. During earlier waves of computing growth, companies that built the underlying hardware and networks often generated stable revenue, while software platforms changed more frequently.
Governments and Utilities Face New Pressure
The rapid expansion of AI infrastructure is not only a business issue but also a public policy challenge. Rising electricity demand means governments must invest in power generation, transmission lines, and grid upgrades.
Environmental impact is another concern. Large data centres consume significant amounts of energy and water, which can affect local communities. As a result, regulators are paying closer attention to where facilities are built and how they operate.
Some countries see AI infrastructure as a strategic priority and are offering incentives to attract data centre projects. Others are limiting expansion until energy supply can keep up with demand.
These decisions will influence which regions become major hubs for AI development in the future.
The Next Stage of the AI Race
The analysis from Goldman Sachs suggests that the future of artificial intelligence will depend as much on physical infrastructure as on software innovation. Algorithms alone are not enough. Companies must also build the computing systems, power networks, and cooling facilities needed to run them.
As the AI market matures, investment is likely to concentrate on businesses that control these essential resources. Data centres, energy supply, and networking capacity are becoming the foundation of the entire AI economy.
The shift toward infrastructure also means the pace of AI growth may be limited by real-world constraints such as construction time, equipment availability, and electricity supply. Unlike software, these factors cannot be scaled instantly.
In the coming years, the success of AI companies may depend not only on their models and applications but also on their ability to secure land, power, and hardware. The next phase of the AI boom will therefore be shaped as much by engineers and utility planners as by software developers.
As investors continue to reassess the market, one trend is clear: the race to build artificial intelligence is increasingly becoming a race to build the data centres that make it possible.