A Two-stage Strategy to Optimize Energy Consumption for Latency-critical Workload under QoS Constraint

Authors

  • Jingwei Li, Xi'an Jiaotong University
  • Duanyu Teng
  • Jinwei Lin

DOI:

https://doi.org/10.5755/j01.itc.49.4.25029

Keywords:

High energy cost, latency-critical workload, two-stage strategy, energy saving, QoS constraint

Abstract

Data centers incur huge energy costs, and saving energy while delivering efficient quality of service (QoS) is a goal they pursue, though it remains a challenging problem. To guarantee the QoS of latency-critical applications, data centers typically schedule processors to run at high frequencies, and this continuous high-frequency operation wastes a great deal of energy. Modern processors are equipped with dynamic voltage and frequency scaling (DVFS) technology, which allows a processor to run at any of the frequency levels it supports, so we focus on how to use DVFS to trade off energy against QoS. In this paper, we propose a two-stage strategy based on DVFS that dynamically scales the CPU frequency while a latency-critical workload executes, aiming to minimize the workload's energy consumption under a QoS constraint. The strategy consists of a static stage and a dynamic stage that work together to determine the optimal frequency for running the workload. The static stage uses a well-designed heuristic algorithm to determine frequency-load matches that satisfy the QoS constraint, while the dynamic stage uses a threshold method to decide whether to adjust the pre-set frequency at run time. We evaluate the two-stage strategy in terms of QoS and energy saving on the CloudSuite benchmark and compare both metrics with the state-of-the-art Ondemand governor. Results show that our strategy outperforms Ondemand in energy saving, with an improvement of more than 13%.
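To make the abstract's two-stage idea concrete, the sketch below shows one simplified way such a controller could be structured: a static lookup from load level to the lowest QoS-safe frequency, and a dynamic threshold check that overrides the pre-set frequency when observed tail latency approaches the QoS target. The frequency table, threshold values, latency target, and helper names are illustrative assumptions, not taken from the paper; the sysfs path assumes a Linux cpufreq userspace governor.

```python
# Illustrative sketch only: a minimal two-stage DVFS controller.
# All numeric values and function names are hypothetical assumptions.

import time

# Hypothetical output of the static stage: for each load bin (requests/s),
# the lowest frequency (kHz) the offline heuristic found to still meet QoS.
STATIC_FREQ_TABLE = [
    (100, 1_200_000),   # load <= 100 rps  -> 1.2 GHz
    (500, 1_800_000),   # load <= 500 rps  -> 1.8 GHz
    (1000, 2_400_000),  # load <= 1000 rps -> 2.4 GHz
]
MAX_FREQ = 3_000_000      # highest available frequency (kHz), assumed
QOS_TARGET_MS = 10.0      # assumed tail-latency target
HEADROOM = 0.9            # dynamic-stage threshold on observed latency


def static_frequency(load_rps: float) -> int:
    """Static stage: pick the pre-computed frequency for the current load."""
    for max_load, freq in STATIC_FREQ_TABLE:
        if load_rps <= max_load:
            return freq
    return MAX_FREQ


def dynamic_adjust(preset_freq: int, observed_p95_ms: float) -> int:
    """Dynamic stage: raise the frequency if tail latency nears the QoS
    limit, otherwise keep the statically chosen setting."""
    if observed_p95_ms > HEADROOM * QOS_TARGET_MS:
        return MAX_FREQ
    return preset_freq


def set_cpu_frequency(cpu: int, freq_khz: int) -> None:
    """Apply the frequency via the cpufreq userspace governor (needs root)."""
    path = f"/sys/devices/system/cpu/cpu{cpu}/cpufreq/scaling_setspeed"
    with open(path, "w") as f:
        f.write(str(freq_khz))


def control_loop(get_load, get_p95_latency, cpus, period_s=1.0):
    """Periodically combine both stages and program every core."""
    while True:
        freq = dynamic_adjust(static_frequency(get_load()), get_p95_latency())
        for cpu in cpus:
            set_cpu_frequency(cpu, freq)
        time.sleep(period_s)
```

In this sketch the static table stands in for the paper's offline heuristic, and the single latency threshold stands in for its threshold method; the actual algorithm and parameters are described in the full article.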

Published

2020-12-19

Section

Articles