Ph.D - Dynamic Power Management

Intelligent power management in portable, embedded devices

PhD Thesis Title: Online Learning of Timeout Policies for Dynamic Power Management

Thesis Defense Presentation [Download]


The work in this PhD thesis develops a mobile, embedded, heterogeneous (visual) sensing platform for traffic surveillance, called MobiTrick, and proposes a novel machine-learning-based approach to the Dynamic Power Management (DPM) of a computing system (and of MobiTrick in particular) that reduces its power consumption at run-time while keeping overall performance at an optimal level. The work focuses on two major aspects of dynamic power management: (i) obtaining different power/performance solutions according to a user-selected criterion, and (ii) dynamically re-configuring the system during operation so that a user-specified constraint (or level) of power consumption or performance is met. Since real environments are dynamic, this DPM technique uses a Reinforcement Learning (RL) approach to adapt to the environment and to adjust the DPM decisions online during the system's operation. The DPM decisions in this learning framework, referred to as Online Learning of Timeout Policies (OLTP), comprise the optimal selection of timeout values in the different device states. In contrast to the widely used static timeout policies, OLTP learns to change the timeout decisions dynamically across the device states, including the non-operational states.
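To illustrate the core idea of learning timeout values, the sketch below casts the choice among a few candidate timeouts as a simple bandit-style RL problem: the agent tracks the expected energy cost of each timeout under a random idle-period workload and converges on the cheapest one. All numbers (power draws, wake-up cost, the exponential workload model) are invented for illustration and are not taken from the thesis; OLTP itself operates over multiple device states rather than this single stateless decision.

```python
import random

# Hypothetical sketch: learn which timeout value minimises energy cost.
# Power levels, wake-up cost and workload model are invented for this example.

TIMEOUTS = [0.1, 0.5, 1.0, 2.0]   # candidate timeout values (seconds)
P_ACTIVE, P_SLEEP = 2.0, 0.1      # assumed power draw per state (W)
E_WAKE = 1.5                      # assumed energy cost of waking up (J)

def episode_cost(timeout, idle_period):
    """Energy consumed over one idle period under a given timeout."""
    if idle_period <= timeout:
        return P_ACTIVE * idle_period          # request arrived before sleeping
    # stayed active for `timeout`, slept for the rest, paid the wake-up cost
    return P_ACTIVE * timeout + P_SLEEP * (idle_period - timeout) + E_WAKE

def learn(n_steps=5000, eps=0.1, alpha=0.1, seed=0):
    rng = random.Random(seed)
    q = {t: 0.0 for t in TIMEOUTS}             # running cost estimate per timeout
    for _ in range(n_steps):
        idle = rng.expovariate(1.0 / 3.0)      # idle periods ~ Exp(mean 3 s)
        # epsilon-greedy: mostly pick the currently cheapest timeout
        t = rng.choice(TIMEOUTS) if rng.random() < eps else min(q, key=q.get)
        q[t] += alpha * (episode_cost(t, idle) - q[t])
    return min(q, key=q.get)                   # learned timeout (lowest cost)
```

With a mean idle period of 3 s and a modest wake-up cost, short timeouts win, so `learn()` settles on one of the smaller candidates; changing the workload distribution shifts the learned choice, which is the adaptivity OLTP exploits.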

The proposed DPM approach can further adapt to user-specified power/performance constraints online via an Online Adaptation of Power/Performance (OAPP) framework. In addition, this thesis demonstrates the compatibility and effectiveness of the proposed OLTP/OAPP framework for systems with a larger number of power/performance states. The proposed techniques have been implemented and evaluated on the embedded traffic surveillance platform MobiTrick.
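The constraint-adaptation idea behind OAPP can be sketched as a feedback loop: measure the average power a policy yields and nudge the timeout until a user-specified power budget is met. The power model below is a toy stand-in, not the thesis's model, and OAPP performs this adaptation inside the RL framework rather than with a bare proportional controller.

```python
# Hypothetical sketch of constraint adaptation: shrink or grow the timeout
# until the (modelled) average power meets a user-specified budget.

def avg_power(timeout):
    """Toy model: longer timeouts keep the device active longer -> more power."""
    return 0.3 + 1.7 * timeout / (timeout + 1.0)

def adapt_timeout(budget_w, timeout=1.0, gain=0.5, steps=200):
    for _ in range(steps):
        error = avg_power(timeout) - budget_w        # positive -> over budget
        timeout = max(0.01, timeout - gain * error)  # reduce timeout if over
    return timeout
```

For a 1.0 W budget the loop converges to the timeout whose modelled power matches the budget; the same feedback structure supports latency constraints by replacing the measured quantity.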

Keywords: reinforcement learning, dynamic power management, timeout policies, non-stationary workload, power-performance trade-off.

MobiTrick's Sensing Platform and OLTP/OAPP Performance Evaluation

High level view of MobiTrick's heterogeneous sensing platform under power management

Traffic surveillance with MobiTrick's power managed prototype

OAPP: Convergence to the user-specified latency and power constraints

OLTP: Power-performance Pareto-front for various workloads

Power profile for different latency constraints with changing workload

Relevant Publications:

  1. U. A. Khan and B. Rinner, "Online Learning of Timeout Policies for Dynamic Power Management", ACM Transactions on Embedded Computing Systems, Vol. 13, No. 4, Article 96, 25 pages, February 2014. [Download]
  2. U. A. Khan, F. A. Jokhio and I. H. Sadhayo, "Reinforcement Learning for Dynamic Power Management of Embedded Visual Sensor Nodes", Mehran University Research Journal of Engineering & Technology, Vol. 33, No. 2, 2014. [Download]
  3. U. A. Khan and B. Rinner, "A Reinforcement Learning Framework for Dynamic Power Management of a Portable, Multi-Camera Traffic Monitoring System", In Proceedings of the IEEE Conference on Green Computing and Communications (GreenCom), pp. 557-564, Besançon, France, 2012. [Download]
  4. U. A. Khan, M. Godec, M. Quaritsch, M. Hennecke, H. Bischof and B. Rinner, "MobiTrick – Mobile Traffic Checker", In Proceedings of the 19th Intelligent Transportation Systems (ITS) World Congress, pp. 1-10, Vienna, Austria, 2012. [Download]
  5. U. A. Khan and B. Rinner, "Dynamic Power Management for Portable, Multi-Camera Traffic Monitoring", In Proceedings of the 18th IEEE Real-Time and Embedded Technology and Applications Symposium (RTAS), pp. 37-40, Beijing, China, 2012. [Download]
  6. U. A. Khan, M. Quaritsch and B. Rinner, "Design of a Heterogeneous, Energy-Aware, Stereo-Vision Based Sensing Platform for Traffic Surveillance", In Proceedings of the 9th IEEE Workshop on Intelligent Solutions in Embedded Systems (WISES), pp. 47-52, Regensburg, Germany, 2011. [Download]