The era of supersized data centers is upon us. As artificial intelligence dominates the agendas of the tech giants, the need for bigger and more powerful data centers is accelerating, and it’s leading to a building boom that could reshape the American landscape.
“We aren’t seeing gigawatt buildings yet, but it’s really only a matter of time,” says Dan Drennan, data centers sector leader at Corgan, the top-ranking architecture firm on Building Design + Construction’s annual list of data center design revenue.
These rising demands are creating new challenges for the design of data centers, from the power generation needed to the infrastructure to the buildings that contain the servers that make AI work. Right now, and for the foreseeable future, everything is getting bigger. Meta recently announced plans to build data centers that use up to 5 gigawatts of power. OpenAI, Oracle, and SoftBank announced plans earlier this year to invest up to $500 billion in a vast data center building spree. These and other so-called hyperscale data users like Google, Microsoft, and Amazon are expected to drive most of the growth in data centers in the U.S. and globally, according to an analysis by the Boston Consulting Group.
While the average data center building uses 40 megawatts of power today, it’s not uncommon for the biggest companies to be relying on data centers that suck up 300 to 400 megawatts of power per building. And that number is only going up.
Mark Zuckerberg’s July 14 data center announcement on Facebook put these plans into somewhat menacing perspective. “We’re actually building several multi-GW clusters,” he wrote, pairing his post with a visualization of a massive rectilinear block smothering a large portion of New York City. “Just one of these covers a significant part of the footprint of Manhattan.”
Meta’s largest announced project—the Louisiana-based Hyperion data center—is expected to use 2 gigawatts of power by 2030, with the potential to grow to 5 gigawatts of capacity. Now in its very early stages of construction, it sits on 2,250 acres of a former agricultural site. For comparison, Manhattan’s total land area is more than 14,000 acres.
“From a logistical standpoint, it just makes sense to build these things under one roof,” says Gordon Dolven, director of CBRE Americas data center research. The dominant paradigm of AI today is the large language model, which pulls its intelligence out of deep pools of data and information stored in numerous servers stacked in long rows of 8-foot-tall cabinets, like the aisles of a grocery store filled with nothing but black boxes and blinking blue lights.
These servers connect and communicate with each other almost synaptically, so the closer they are to one another, the faster they can make those connections. The farther away they are, the slower the connections, and the more networking infrastructure and fiber optic cables required to keep them in communication.
That’s why the building size of data centers is increasing, and also why the companies pushing the development of AI are trying to have more of these large buildings constructed near each other.
For example, Meta’s Hyperion data center will be made up of 11 buildings covering more than 4 million square feet, according to a company spokesperson. Its Prometheus data center in Ohio is a vast campus that’s scaling up to run on 1 gigawatt of power by 2026, partly by gearing up servers in quickly built mega-tents.
More servers mean more equipment to keep them running efficiently, which results in data center buildings surrounded by large amounts of mechanical, cooling, and electrical infrastructure.
“The big thing for data centers is they always have to have backup power. Then you usually need an extra, so there’s a backup to the backup. And those take up a lot of space,” says Rob LoBuono, a critical facilities leader at Gensler, another of the top architecture firms designing data centers. Backups are also being used for the data itself. “We’re seeing more of a trend toward multiple buildings, multiple points of redundancy, separated across the campus.”
And because the server equipment is getting heavier, the buildings need more robust structures at the foundation, with more material-intensive construction. “Where we were planning for 200 or 250 pounds per square foot previously, we’re now talking about 400, 500 pounds per square foot of loading on these floor plates,” Drennan says. “The loading that you’re planning for on the building goes up.”
All these factors are combining to make the buildings enormous. It’s not uncommon for construction on the larger AI-focused campuses to cover 500,000 square feet or more, usually across a single story. And technically they can keep growing.
“If you’re talking about a new building, assuming the land is such that we’re able to shape the building in a way that we can get all the gear around the building that’s needed to serve that compute in an efficient way, then there’s really no limit to how big these can go,” Drennan says.