Journal
IEEE TRANSACTIONS ON COMMUNICATIONS
Volume 68, Issue 9, Pages 5504-5518
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TCOMM.2020.3003670
Keywords
Transceivers; Array signal processing; Hidden Markov models; Uplink; Training; Deep learning; Antenna arrays; mmWave; massive MIMO; 5G; neural networks; machine learning
Predicting millimeter wave (mmWave) beams and blockages using sub-6 GHz channels has the potential to enable mobility and reliability in scalable mmWave systems. Prior work has focused on extracting spatial channel characteristics at the sub-6 GHz band and then using them to reduce the mmWave beam training overhead. This approach still requires beam refinement at mmWave and does not normally account for the different dielectric properties at the different bands. In this paper, we first prove that under certain conditions, there exist mapping functions that can predict the optimal mmWave beam and blockage status directly from the sub-6 GHz channel. These mapping functions, however, are hard to characterize analytically, which motivates exploiting deep neural network models to learn them. To that end, we prove that a large enough neural network can predict mmWave beams and blockages with success probabilities that can be made arbitrarily close to one. Then, we develop a deep learning model and empirically evaluate its beam/blockage prediction performance using a publicly available dataset. The results show that the proposed solution can predict mmWave blockages with more than 90% success probability and can predict the optimal mmWave beams closely enough to approach the upper bounds, while requiring no beam training overhead.
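The mapping described in the abstract, from a sub-6 GHz channel vector to an optimal mmWave beam index and a blockage flag, can be illustrated with a minimal, untrained sketch. All layer sizes, names (`BeamBlockageNet`, `N_SUB6`, `N_BEAMS`), and the two-head MLP structure here are assumptions for illustration, not the authors' architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

N_SUB6 = 32    # number of sub-6 GHz antennas (assumed)
N_BEAMS = 64   # size of the mmWave beam codebook (assumed)

def features(h):
    """Stack real/imaginary parts of the complex sub-6 GHz channel."""
    return np.concatenate([h.real, h.imag])

def relu(x):
    return np.maximum(x, 0.0)

class BeamBlockageNet:
    """Hypothetical two-head MLP: beam classification + blockage detection.

    The paper's existence proof says such a mapping function exists and a
    large enough network can learn it; this sketch only shows the shape of
    the input/output interface, with random (untrained) weights.
    """
    def __init__(self, d_in, d_hidden, n_beams):
        self.W1 = rng.normal(0.0, 0.1, (d_hidden, d_in))
        self.b1 = np.zeros(d_hidden)
        self.Wb = rng.normal(0.0, 0.1, (n_beams, d_hidden))  # beam head
        self.Wk = rng.normal(0.0, 0.1, (1, d_hidden))        # blockage head

    def forward(self, x):
        z = relu(self.W1 @ x + self.b1)
        beam_idx = int(np.argmax(self.Wb @ z))               # codebook index
        p_blocked = 1.0 / (1.0 + np.exp(-(self.Wk @ z)[0]))  # sigmoid
        return beam_idx, p_blocked

# One synthetic sub-6 GHz channel realization (Rayleigh-like, assumed).
h_sub6 = (rng.normal(size=N_SUB6) + 1j * rng.normal(size=N_SUB6)) / np.sqrt(2)
net = BeamBlockageNet(2 * N_SUB6, 128, N_BEAMS)
beam_idx, p_blocked = net.forward(features(h_sub6))
print(beam_idx, p_blocked)
```

In a real pipeline the network would be trained with cross-entropy on (sub-6 GHz channel, optimal beam index) pairs, so that beam selection needs no mmWave beam training at inference time.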