Deep Reinforcement Learning for Energy-Efficient Networking with Reconfigurable Intelligent Surfaces
When deployed as reflectors for existing wireless base stations (BSs), reconfigurable intelligent surfaces (RISs) are a promising approach for achieving high spectrum and energy efficiency. However, due to the large number of RIS elements, the joint optimization of the BS and reflector RIS configuration is challenging. In essence, the BS transmit power and the RIS's reflecting configuration must be jointly optimized to improve the users' data rates while reducing the BS power consumption. In this paper, the problem of energy efficiency optimization is studied in an RIS-assisted cellular network whose RIS reflector is powered via energy harvesting technologies. The goal of the proposed framework is to maximize the average energy efficiency by enabling the BS to determine the transmit power and RIS configuration under uncertainty about the wireless channel and the harvested energy of the RIS system. To solve this problem, a novel approach based on deep reinforcement learning is proposed, in which the BS receives the state information, consisting of the users' channel state information feedback and the available energy reported by the RIS. The BS then uses a neural network to determine its action, composed of the BS transmit power allocation and the RIS phase-shift configuration. Because the formulated problem is intractable under uncertainty, a case study is conducted to analyze the performance of the studied RIS-assisted downlink system by asymptotically deriving an upper bound on the energy efficiency. Simulation results show that the proposed framework improves energy efficiency by up to 77.3% when the number of RIS elements increases from 9 to 25.
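To make the energy-efficiency objective concrete, the following minimal sketch evaluates the reward a learning agent would maximize in a single-user RIS-assisted downlink: achievable rate divided by total consumed power. All channel models, power values, and the noise level are illustrative assumptions, not the paper's simulation setup; the closed-form co-phasing rule shown is the known single-user optimum and stands in for the phase configuration the DRL agent would learn.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-user downlink (all values assumed for illustration):
# h_d: direct BS->user link, h: BS->RIS links, g: RIS->user links.
N = 25  # number of RIS elements
h_d = (rng.normal() + 1j * rng.normal()) / np.sqrt(2)
h = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)
g = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)

P_tx = 1.0      # BS transmit power (W), assumed
P_static = 0.5  # static circuit power (W), assumed
noise = 1e-3    # noise power, assumed

def energy_efficiency(theta):
    """Spectral efficiency (bit/s/Hz) per watt for RIS phase shifts theta."""
    eff = h_d + np.sum(g * np.exp(1j * theta) * h)  # effective channel
    rate = np.log2(1 + P_tx * np.abs(eff) ** 2 / noise)
    return rate / (P_tx + P_static)

# Random phases vs. co-phasing with the direct link: for a single user,
# theta_i = angle(h_d) - angle(g_i * h_i) is the closed-form optimum.
theta_rand = rng.uniform(0, 2 * np.pi, N)
theta_opt = np.angle(h_d) - np.angle(g * h)
print(energy_efficiency(theta_rand), energy_efficiency(theta_opt))
```

In the paper's setting the phase configuration cannot be computed in closed form (multiple users, uncertain channels, and an energy-harvesting constraint), which is precisely why a DRL agent is used to map the observed state to the power and phase action.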