In a world dominated by internet-enabled technology, data centers are the brains keeping the digital ecosystem afloat. Responsible for processing, storing, and transmitting data, server rooms are of enormous value, and – by extension – so are the infrastructure and resources used to maintain them.
The lifeblood that makes all digital systems possible is electricity, and data center managers couldn’t be more aware of this. Beyond managing a fleet of servers and equipment, server administrators must treat the electricity bill and power allocation as essential considerations. Given the scarcity and environmental impact of fossil fuels, organizations continue to consider sustainable energy sources. And with innovation and specialization in server technology, numerous organizations choose virtualization or colocation data centers that offer more scalability.
Here we’ll look at server room power consumption: how organizations approach power needs, the components using power, and trends in power consumption by data centers.
Also Read: Gain Control of Enterprise’s Great Energy Hog: The Data Center
Distributing power
Data centers, like most realms of technology, require meticulous attention to detail. Without granular visibility into module and equipment inventory, estimating power needs can be inefficient, costly, or unmanageable.
Starting with equipment specs, administrators extract the amp requirements for the entire server room. With the total amperage in mind, the administrator can provision power appropriately and determine how many power distribution units (PDUs) are needed. PDUs distribute power to multiple racks or modules, but unlike a standard power strip, a PDU supports network connectivity, environmental monitoring, and remote access.
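To make the arithmetic concrete, here is a minimal sketch of that amp-totaling step. The equipment inventory, nameplate amp draws, PDU rating, and 80% derating factor are all illustrative assumptions, not figures from any particular vendor or facility.

```python
# Minimal sketch: total nameplate amps and estimate PDU count.
# All values below are hypothetical, for illustration only.
import math

# Hypothetical nameplate amp requirements pulled from equipment specs.
equipment_amps = {
    "1U web servers (x10)": 10 * 1.8,
    "2U database servers (x4)": 4 * 3.5,
    "network switches (x2)": 2 * 1.2,
    "storage array": 6.0,
}

PDU_RATING_AMPS = 30   # assumed per-PDU capacity
DERATE_FACTOR = 0.8    # common practice: load circuits to ~80% of rating

total_amps = sum(equipment_amps.values())
usable_amps_per_pdu = PDU_RATING_AMPS * DERATE_FACTOR
pdus_needed = math.ceil(total_amps / usable_amps_per_pdu)

print(f"Total nameplate load: {total_amps:.1f} A")
print(f"PDUs needed at {usable_amps_per_pdu:.0f} A usable each: {pdus_needed}")
```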
Because rack servers vary in power needs, matching server locations with appropriate power sources is essential. The more powerful the server, the greater its power density, which means more wattage and larger circuits are needed to handle the additional power and cooling.
Also Read: Top Rack Servers of 2021
High density = energy, cooling demands
Power density describes the amount of power that can be supplied per rack and, by extension, the capabilities of the entire data center. Once quantified in watts per square foot, that metric failed to account for the number of racks and cabinets, prompting the adoption of kilowatts per rack (kW/rack).
With low demand for data center resources before the 1980s, power density wasn’t a priority, and a rack drawing as little as 2 to 4 kW was considered high density. Today, depending on the data center, some racks can hit as much as 20 kW, while the average is closer to 7 kW.
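As a quick illustration of the shift from watts per square foot to kW/rack, the short sketch below runs both calculations for a hypothetical room; the IT load, rack count, and floor area are assumed values chosen to land near today’s roughly 7 kW/rack average.

```python
# Back-of-the-envelope comparison of the two density metrics.
# The rack count, floor area, and IT load are hypothetical examples.
total_it_load_kw = 140      # assumed total IT load for the room
rack_count = 20             # assumed number of racks
floor_area_sqft = 1000      # assumed raised-floor area

kw_per_rack = total_it_load_kw / rack_count
watts_per_sqft = (total_it_load_kw * 1000) / floor_area_sqft

print(f"Density: {kw_per_rack:.1f} kW/rack")        # 7.0 kW/rack
print(f"Legacy metric: {watts_per_sqft:.0f} W/sqft")
```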
As demand for data centers rose and continues to rise, a critical challenge for high-density server functionality is cooling and ventilation. Solutions in progress to meet this demand include liquid cooling and sustainable free cooling using green energy sources.
Estimating power demand
Starting with a needs assessment, administrators can estimate power needs from existing and forecasted equipment. This means knowing how much power is required to run critical loads, maintain cooling and ventilation, and support auxiliary systems.
Also Read: Best Load Balancers of 2021
Data center equipment includes servers, lighting, environment controls, fire suppression systems, security alarms, surveillance cameras, sensors, and air handling, cooling, and ventilation systems.
As an example, here is a breakdown of a standard data center’s electrical requirements:
| Source | Power | Description |
| --- | --- | --- |
| Cooling system | 50% | Cooling mechanism for maintaining adequate conditions |
| Critical loads | 36% | Using equipment nameplates to tabulate base loads and considering future loads based on additional capacity |
| UPS module(s) | 11% | UPS efficiency and battery charging for redundancy |
| Lighting | 3% | Lighting for the server room; typically 2 W/sqft |
Making up 86% of our example data center’s electrical requirements are the two most important indicators in calculating server room power consumption: total critical loads and total cooling loads. Smaller but vital to continued power, the uninterruptible power supply (UPS) module offers redundancy when the main power source fails.
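Given that breakdown, one rough way to turn a critical-load estimate into a total facility figure is to scale the critical loads up by their share of the whole. The sketch below assumes a hypothetical 180 kW critical load and a 6,000 sqft room purely for illustration.

```python
# Rough sketch: scale a critical-load estimate to total facility demand
# using the example breakdown above (critical loads = 36% of the total).
# The 180 kW load and 6,000 sqft figure are assumptions for illustration.
critical_load_kw = 180.0          # hypothetical nameplate + future critical load

total_facility_kw = critical_load_kw / 0.36

cooling_kw = total_facility_kw * 0.50
ups_kw = total_facility_kw * 0.11
lighting_kw = total_facility_kw * 0.03

print(f"Estimated total facility demand: {total_facility_kw:.0f} kW")
print(f"  Cooling:  {cooling_kw:.0f} kW")
print(f"  UPS:      {ups_kw:.0f} kW")
print(f"  Lighting: {lighting_kw:.0f} kW")

# Sanity check against the 2 W/sqft lighting rule of thumb,
# assuming a 6,000 sqft server room.
lighting_rule_of_thumb_kw = 2 * 6000 / 1000
print(f"  Lighting rule of thumb (2 W/sqft x 6,000 sqft): {lighting_rule_of_thumb_kw:.0f} kW")
```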
The future: Energy efficiency
Data centers are indispensable when considering their capacity for colocation services, cloud solutions, and compliance assurances. The concern remains whether the benefits justify the energy burned.
In 2017, global data center power consumption totaled 416 terawatt-hours, or 3% of global electricity, with American server rooms consuming 90 billion kilowatt-hours. Meeting those huge demands has prompted some of the biggest cloud providers to design custom servers and chips.
Advancements in data center management from the mid-2000s have made a big difference in efficiency. According to analysts, server room power consumption dropped as much as 80% with the adoption of low-power chips and SSDs.
Also Read: Greening Your Data Center You May Have No Choice
Data center and colocation vendor vXchnge offered the following tips to reduce data center energy consumption.
- Eliminate cooling dependencies by cleaning up workloads, eliminating unnecessary equipment
- Test different temperatures to find the most efficient setpoint and cut costs
- Orchestrate server capacities and loads to meet requests in real-time
- Detect and squash zombie servers that are running with no active workload (see the sketch after this list)
- Optimize or decrease space, knowing it can inflate cooling costs
- Develop stronger supplier partnerships for a mutually beneficial relationship
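For the zombie-server tip, a hedged sketch of one possible approach is shown below: flag machines whose average CPU utilization has stayed near zero over a sustained window. The get_avg_cpu_utilization hook, hostnames, and thresholds are hypothetical placeholders for whatever monitoring stack an organization already runs, not part of any specific product.

```python
# Hedged sketch: flag candidate "zombie" servers from utilization metrics.
# The metrics callback and thresholds are hypothetical, for illustration only.
from typing import Callable

CPU_THRESHOLD_PERCENT = 2.0   # assumed "effectively idle" cutoff
WINDOW_DAYS = 30              # assumed observation window

def find_zombie_candidates(
    hostnames: list[str],
    get_avg_cpu_utilization: Callable[[str, int], float],
) -> list[str]:
    """Return hosts whose average CPU stayed under the threshold for the window."""
    return [
        host
        for host in hostnames
        if get_avg_cpu_utilization(host, WINDOW_DAYS) < CPU_THRESHOLD_PERCENT
    ]

if __name__ == "__main__":
    # Stand-in metrics so the sketch runs on its own.
    fake_metrics = {"app-01": 35.2, "app-02": 0.4, "batch-07": 1.1}
    candidates = find_zombie_candidates(
        list(fake_metrics),
        lambda host, days: fake_metrics[host],
    )
    print("Review for decommissioning:", candidates)
```

Any host the sketch flags would still warrant a manual review before decommissioning, since low CPU alone does not prove a server is unused.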
Server rooms require immense power and a level of reliability and performance that keep data online and available for high-intensity uses. Adopting sustainable energy consumption practices can offer more visibility into a primary expense and reduce the overall energy burden.
The power consumption plateau
Despite accelerating digital information services in the last decade, global data center energy consumption only rose by 6% between 2010 and 2018.
Energy and environmental think tank Energy Innovation points to three factors for this plateau.
- The improved energy efficiency of IT devices (servers, hard drives)
- Server virtualization software, where multiple applications can run on one server
- Compute instances migrated to large cloud and hyperscale data centers, where ultra-efficient cooling is available
Most analysts are forced to use methods based on limited publicly available information to calculate the global server room power consumption. While some organizations like Google, Apple, and Facebook report such data, a lack of insight into the worldwide industry has generally led to estimates far higher than reality. Relative to driving, flying, and coal-sourced energy, data centers operate more efficiently.
Looking ahead, intensive workloads posed by AI and machine learning could erode these efficiency gains in the coming years. Investment in R&D for high-powered computing, storage, and cooling technologies, as well as energy-efficient power delivery, is an essential part of the puzzle.
Also Read: Data Center Power Consumption on the Rise
Colocation considerations
Less costly and time-intensive than building a private data center, colocation centers are shared facilities where multiple tenants split facility costs. While organizations using colocation for server hosting naturally have less physical control over their modules, data centers continue to improve remote control capabilities. From the home office or workplace, administrators can manage a suite of servers a continent away.
Colocation is a scalable solution that addresses the limitations of private data centers and provides higher levels of bandwidth than the average office server room. Colocation centers tend to be more reliable, considering they specialize in server management, including data backups and low-latency network options. An added benefit is the extent of physical and environmental protections: built for resilience, colocation facilities include CCTV monitoring, private suites, and fire suppression systems.
Colocation vs. Public Cloud
To be virtual, or not to be virtual – that is the question. Cloud-based solutions might offer a similar service, but at the cost of less control over the underlying server, storage, and network hardware. As with most cloud offerings, organizations are responsible for only a fraction of the setup that a private data center or colocation center would require. While trends of the last decade preach the power of the cloud, public cloud vulnerabilities can’t be ignored by IT professionals, and enterprises seem to be leaning toward maintaining at least some on-premises infrastructure.
Also Read: The Wild, Wild Cost of Data Centers