Article

Survey of Techniques and Architectures for Designing Energy-Efficient Data Centers

Journal

IEEE Systems Journal
Volume 10, Issue 2, Pages 507-519

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/JSYST.2014.2315823

Keywords

Controller design; data centers; energy efficiency

Abstract

Cloud computing has emerged as the leading paradigm for information technology businesses. It provides a platform for managing and delivering computing services around the world over the Internet, allowing businesses to use computing services on demand without upfront investment. The sustained growth of the cloud computing paradigm has led to an increase in the size and number of data centers. Data centers with thousands of computing devices are deployed as the back end that provides cloud services, and devices are deployed redundantly to ensure 24/7 availability. However, many studies have pointed out that data centers consume large amounts of electricity, calling for energy-efficiency measures. In this survey, we discuss research issues arising from the conflicting requirements of maximizing the quality of service (QoS) delivered by cloud services (availability, reliability, etc.) while minimizing the energy consumption of data center resources. We present the concept of a data center energy-efficiency controller that can consolidate data center resources with minimal effect on QoS requirements, and we discuss software- and hardware-based techniques and architectures for data center resources such as servers, memory, and network devices that the controller can manipulate to achieve energy efficiency.
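
The controller concept above can be illustrated with a small consolidation sketch. The following Python snippet is a hypothetical, simplified example and is not taken from the paper: it greedily packs virtual-machine loads onto as few servers as possible using first-fit-decreasing placement, treating a CPU-utilization cap as a crude QoS safeguard. The Server class, consolidate function, and qos_cap threshold are illustrative assumptions.

```python
# Hypothetical sketch of an energy-efficiency controller (not the authors' design):
# consolidates VM loads onto few servers via first-fit decreasing placement,
# keeping each server below a utilization cap as a simple QoS safeguard.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Server:
    capacity: float = 1.0                      # normalized CPU capacity
    vms: List[float] = field(default_factory=list)

    def utilization(self) -> float:
        return sum(self.vms) / self.capacity

def consolidate(vm_loads: List[float], num_servers: int,
                qos_cap: float = 0.8) -> Tuple[List[Server], int]:
    """Place VM loads on servers without exceeding qos_cap utilization;
    return the server list and the number of servers left idle."""
    servers = [Server() for _ in range(num_servers)]
    for load in sorted(vm_loads, reverse=True):        # first-fit decreasing
        for s in servers:
            if s.utilization() + load / s.capacity <= qos_cap:
                s.vms.append(load)
                break
        else:
            raise RuntimeError("insufficient capacity under the QoS cap")
    idle_count = sum(1 for s in servers if not s.vms)
    return servers, idle_count

if __name__ == "__main__":
    servers, idle_count = consolidate([0.3, 0.2, 0.25, 0.1, 0.15], num_servers=4)
    print(f"{idle_count} of {len(servers)} servers can enter a low-power state")
```

Servers left without any load become candidates for transition to a low-power state; a real controller would also account for memory, network devices, and migration costs, which are the kinds of resources the survey covers.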
