The Rise of the Adaptive Data Center

Published by: Andrew Schaap – CEO at Aligned Data Centers

Addressing the evolving needs of the high-capacity buyer

The data center sector has changed dramatically over the last decade, with hyperscale and platform providers driving the most recent building boom. According to new data published by Synergy Research Group, the number of large data centers operated by hyperscale providers rose by 11% in 2018, reaching 430 by the end of the year; another 132 remain in the pipeline.

What’s driving the surge? The major public cloud service providers continue to attract the migration of ever-expanding enterprise workloads. The leading social networks have a business model founded on consumers eagerly providing them with troves of invaluable personal data. And the raison d’être of over-the-top (OTT) providers, content companies that distribute streaming media as a standalone product, is to fill the bulk of our free time with seemingly infinite libraries of entertainment.

Business at this scale appears to be not so much an upward trend as an irresistible gravitational force, but with success come challenges. Among the most prominent tests these digital giants face is managing compute, network, and storage capacity effectively and efficiently enough to keep up with surging demand.

Use of hyperscale platforms and services is critical to daily business operations. Consider, for example, how the multinational enterprise has evolved from siloed production processes to collaborative workflows built on new tools and technology platforms. With digital transformation, however, comes the on-demand consumer, who will not tolerate even a transient blip in service delivery, an issue that often results from resource and infrastructure constraints.

Equally significant, if not more so, has been the veritable tsunami of data that continues to engulf cloud and platform providers every day. As a reference point, IBM estimated that more than 2.5 quintillion bytes of data were created daily in 2016. By IDC’s estimates, the amount of data created has doubled roughly every two years since the dawn of the internet, and by 2025 the amount of data created and copied annually will reach 163 zettabytes (ZB), a zettabyte being one trillion gigabytes. That equates to ten times the 16.1 ZB generated in 2016.
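A quick back-of-the-envelope check shows how those cited figures fit together (this sketch assumes the standard SI definition of a zettabyte, 10^21 bytes):

\[
1\ \text{ZB} = 10^{21}\ \text{bytes} = 10^{12}\ \text{GB} \quad (\text{one trillion gigabytes}),
\qquad
\frac{163\ \text{ZB (2025, projected)}}{16.1\ \text{ZB (2016)}} \approx 10.1
\]

Hence the roughly tenfold growth IDC projects over that period.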

As the capacity needs of hyperscale giants, platform and cloud providers, and enterprises using high-density computing continue to swell, these organizations must continually roll out scalable infrastructure that supports strategic expansion while maintaining the quality of service expected of them by an increasingly large and geographically diverse user base.

Read the full article on Mission Critical Magazine here.