Welcome to the first blog in our new series, Building to Suit in an AI World. At EdgeConneX, building data centers to suit unique customer specifications is part of our pedigree; we’ve been doing it since 2009. Over those years, our customers’ specifications have changed – a lot. As cloud migration accelerated, providers required significantly larger facilities in strategic locations. As consumers turned to streaming, providers needed deployments at the edge. We’ve supported our customers through each of those shifts.
Artificial intelligence (AI) has once again upended requirements. Building to suit AI workloads is very different from building to suit cloud workloads or content streaming: location, timing, and density requirements all look different for AI. In this series, we’ll take each of those dynamics in turn, explaining what has changed with AI, how those changes affect data center development, and how EdgeConneX continues to support unique customer specifications.
Location Requirements for AI Model Training & Inference
The capability of an AI model is driven largely by the number of parameters used during training. Models are trained on clusters of connected graphics processing units (GPUs), and as the number of parameters increases, so does the size of the cluster. (Meta’s Llama 3.1, for example, has 405 billion parameters and was trained on a cluster of 16,000 NVIDIA H100 GPUs. The company’s next-generation model, Llama 4, has 2 trillion parameters and is being trained on a cluster of more than 100,000 H100 GPUs.)
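To make the relationship between parameter count and cluster size concrete, here is a rough back-of-envelope sketch. The token count, per-GPU throughput, and utilization figures are illustrative assumptions, not disclosed training details:

```python
# Rough sketch: why parameter counts push training onto huge GPU clusters.
# All figures below are illustrative assumptions, not actual Meta numbers.

params = 405e9        # Llama 3.1 parameter count
tokens = 15e12        # assumed training tokens (order of magnitude)
flops_needed = 6 * params * tokens   # standard ~6*N*D training-FLOPs rule of thumb

peak_flops_per_gpu = 1e15   # roughly 1 PFLOP/s bf16 peak for an H100 (approximate)
utilization = 0.4           # assumed sustained fraction of peak

def training_days(num_gpus: int) -> float:
    """Estimated wall-clock days to finish the run under the assumptions above."""
    sustained = num_gpus * peak_flops_per_gpu * utilization
    return flops_needed / sustained / 86_400

for gpus in (1_000, 16_000, 100_000):
    print(f"{gpus:>7,} GPUs -> ~{training_days(gpus):,.0f} days")
```

Under these assumptions, a 16,000-GPU cluster finishes the run in a little over two months, while 1,000 GPUs would need close to three years. Training-time targets, not just model size, are what push clusters into the tens of thousands of GPUs.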
Larger GPU clusters result in increased data center power and space requirements. Global demand for data center capacity today is approximately 60 GW; by 2030, it could be 3-5 times that, with a significant portion of the increase attributed to AI. Gigawatt data center campuses spanning millions of square feet are becoming increasingly common. That kind of power and space is increasingly hard to come by in primary markets, so the tech companies training AI models are looking to new markets for easier access to land and power.
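For a sense of scale, here is a back-of-envelope estimate of the facility power a single 100,000-GPU training cluster implies. The per-GPU and PUE figures are illustrative assumptions:

```python
# Back-of-envelope facility power for a 100,000-GPU training cluster.
# Per-GPU and PUE figures are assumptions for illustration only.

gpus = 100_000
watts_per_gpu_server = 1_300   # ~700 W H100 plus its share of CPU, memory, networking (assumed)
pue = 1.25                     # assumed overhead for cooling and power delivery

it_load_mw = gpus * watts_per_gpu_server / 1e6
facility_mw = it_load_mw * pue

print(f"IT load: ~{it_load_mw:.0f} MW, facility draw: ~{facility_mw:.0f} MW")
```

That works out to roughly 130 MW of IT load and more than 160 MW at the meter for one cluster; a campus hosting several clusters of that class quickly approaches the gigawatt scale referenced above.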
AI inference (using the trained model to process new inputs and generate outputs) has different requirements. For applications such as autonomous vehicles, high-frequency trading, and industrial automation, inference may require very low latency, which means processing data close to where it is generated. In those cases, edge deployments may be necessary.
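To see why proximity matters, here is a rough sketch, under illustrative assumptions, of how a round-trip latency budget bounds the distance between users and the site running inference:

```python
# Rough sketch: how a round-trip latency budget bounds the distance between
# an inference site and its users, counting propagation delay only.

SPEED_IN_FIBER_KM_PER_MS = 200   # roughly two-thirds the speed of light; an approximation

def max_one_way_km(rtt_budget_ms: float) -> float:
    """Greatest one-way fiber distance that fits within the round-trip budget."""
    return SPEED_IN_FIBER_KM_PER_MS * rtt_budget_ms / 2

for budget in (1, 5, 20):
    print(f"{budget:>2} ms budget -> users within ~{max_one_way_km(budget):,.0f} km")
```

Even before accounting for switching, queuing, and the model’s own compute time, a single-digit-millisecond budget confines users to within a few hundred kilometers of the site, which is why these workloads land at the edge.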
Building to Suit Location Requirements
Meeting the power and land requirements of AI model training requires a strong real estate team with experience developing in new markets. With deployments in 60+ unique markets around the world, we have the experience and expertise to succeed in a wide variety of locations. As our Chief Marketing and Product Officer Phillip Marangella explained in an article about delivering wherever customers need capacity, “It is important to recognize what disparate markets have in common and how they might differ…While there can be a lot of shared solutions across markets, there is tremendous value in knowing how to deliver results in diverse locations, cultures, and communities.”
Supporting deployments of varying sizes across diverse locations requires a dynamic supply chain. Our experience and expertise span a range of development sizes and types: we operate edge deployments in the ~2 MW range as well as hyperscale deployments in the 200+ MW range. We build greenfield facilities, repurpose existing ones, or partner where needed to accelerate entry into new markets worldwide. Our trusted supply chain enables rapid sourcing globally, eliminating logistical challenges commonly associated with hard-to-reach regions. We leverage a flexible Basis of Design approach to identify, evaluate, and execute on the best design practices, partners, and technologies worldwide.
Bottom Line
Whether for model training or inference, the unique requirements of AI workloads are driving significant development in new markets as well as new deployments at the edge. Success requires a developer with deep global experience across a range of deployment sizes and types, a dynamic supply chain, and a commitment to sustainability even where it is harder to achieve. A developer like EdgeConneX.
Up Next in the Building to Suit in an AI World Series:
- Building to Suit: Timing
- Building to Suit: Density