Expert Blog: When it Comes to Data Centers, One Size Does NOT Fit All

Eoin Byrne, MRICS, Vice President, Linesight

In the age of big data, we need the infrastructure to manage the petabytes and zettabytes of information that we create every year. It's easy to look at this digital wave and declare that the answer is simply to build the biggest and most powerful data centers possible – after all, more data would seem to demand more computing power and capacity. But whether hyperscale or scalable, the type of data center being delivered is ultimately determined by data center developers and what works for their business models.

There is no single version of the data center of the future, because there will actually be multiple versions running in parallel and in sync with each other.

The demand for data centers is truly astounding, with over $200 billion spent in 2021. Just about every major company in the world is building facilities as quickly as it can. Microsoft announced last year that it is on track to build between 50 and 100 data centers every year for the foreseeable future. Multiply that ambition across the hundreds or even thousands of other companies building facilities, and then remember that the scope of this expansion wouldn't have been on anyone's radar a decade ago. For many large companies, the preferred model is the hyperscale data center, defined by the International Data Corporation as housing at least 5,000 servers and 10,000 square feet of space. And this truly is a minimum – at Linesight, we regularly see facilities of hundreds of thousands of square feet or more.

But hyperscale data centers built by the tech giants aren't the only model. Other options offer an alternative to the expense and delivery time of hyperscale facilities. Some data center developers have moved away from building entire facilities as quickly as possible, opting instead for smaller, scalable deployments, or expanding into colocation facilities to align income with investment. In this context, bigger isn't always better.

Why rethink the data center?

Back in 1965, Intel co-founder Gordon Moore predicted that the number of transistors on a chip would double at a steady pace – a cadence later pegged at every two years. And while 'Moore's Law' hasn't held perfectly true, semiconductors and servers have gotten exponentially faster over the last few decades, driving the digital revolution as devices get smaller and more powerful. What Moore did not predict was the explosion in data this computing surge would create. It wasn't so long ago that we were talking in terms of kilobytes and megabytes, but today we need new terms, like petabytes and zettabytes, to describe the volume of information being created.

Traditional approaches to data management were never designed for this, but the advent of cloud computing, the Internet of Things (IoT) and billions of people shopping online have fundamentally rewritten the rules on how data is created and used. This is having a profound effect on data centers, which are being reimagined to support the new ways people use computers – and this will only expand in the future. New problems call for new solutions.

The new data center

As demand grows, the obvious solution seems to be creating larger and more efficient data centers. After all, more capacity and processing power help developers meet their clients' needs. But while hyperscale data centers are certainly an important part of the mix, they aren't the right choice for everyone. Having multiple locations with deployable space can be more attractive, helping developers appeal to a wider market. This approach comes with its own challenges – most notably the high upfront capital expense of building each facility – but it gives developers the ability to scale deployment as demand comes online. The model is more flexible, and can provide redundancy when one or more locations goes down for maintenance or in an emergency. Because of technological limitations, this approach wasn't a serious option even a decade ago, but it is rapidly becoming more popular. While hyperscale data centers boast incredible speeds and capacity, scalable and modular infrastructure is designed to meet ever-changing market demands.

It should be noted that this is not a binary choice: in practice, developers often rely on a hybrid approach to get the best of both worlds in power and flexibility. One major advantage of scalable data centers is financial: smaller facilities can be cheaper to build and operate, without the large upfront cost of fitting out an entire facility before demand is known.

In addition, smaller edge data centers are becoming increasingly common in urban areas, where space is limited and demand is high. As IoT, gaming and smart technologies take off in private homes, edge DCs provide reliability, speed and connectivity close to where users are located. The edge data center market is expected to double in the next five years, creating unprecedented demand for new facilities.

Infrastructure matters

Myriad factors go into determining what kind of data center to build, and some have little to do with demand and everything to do with available resources. Organizations considering a build must weigh access to reliable power, available land, proximity to end-users, availability of high-quality fiber-optic connections and nearby water for cooling. All of these play an important role in deciding what kind of data centers to build, and where.

Ten years ago, a cookie-cutter approach may have been the norm, but building a data center today is not a one-size-fits-all proposition. Developers need to pivot the conversation to figuring out exactly what kind of data center (or data centers) will meet their needs, driven by internal and external considerations ranging from cost to anticipated growth and existing infrastructure. Only when all of these questions are answered can organizations begin to effectively plan and build the data centers of the future.