Deep learning-based signal detection in OFDM systems
This example uses a long short-term memory (LSTM) network, built with the Deep Learning Toolbox, to perform symbol classification at the receiver, i.e., signal detection in OFDM systems.
The LSTM-based network is trained for a single subcarrier, and its symbol error rate (SER) is compared with that of the least-squares (LS) and minimum mean square error (MMSE) channel estimators.
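The two baseline estimators can be sketched per subcarrier as follows. This is a minimal NumPy illustration, not the submission's MATLAB code; the pilot symbol, the channel and noise variances, and the scalar-MMSE shrinkage form are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def ls_estimate(y, x):
    """Least-squares estimate on a pilot subcarrier: H_ls = Y / X."""
    return y / x

def mmse_estimate(h_ls, x, var_h, var_n):
    """Scalar MMSE shrinkage of the LS estimate, assuming the channel-gain
    variance var_h and noise variance var_n are known."""
    return (var_h / (var_h + var_n / np.abs(x) ** 2)) * h_ls

# --- toy single-subcarrier experiment (illustrative parameters) ---
n_trials = 20000
var_h, var_n = 1.0, 0.1
x = (1 + 1j) / np.sqrt(2)        # known QPSK pilot symbol (assumed)
h = np.sqrt(var_h / 2) * (rng.standard_normal(n_trials)
                          + 1j * rng.standard_normal(n_trials))
n = np.sqrt(var_n / 2) * (rng.standard_normal(n_trials)
                          + 1j * rng.standard_normal(n_trials))
y = h * x + n                    # received pilot observation

h_ls = ls_estimate(y, x)
h_mmse = mmse_estimate(h_ls, x, var_h, var_n)

mse_ls = np.mean(np.abs(h_ls - h) ** 2)
mse_mmse = np.mean(np.abs(h_mmse - h) ** 2)
print(f"LS MSE:   {mse_ls:.4f}")
print(f"MMSE MSE: {mse_mmse:.4f}")
```

As expected, the MMSE estimate trades a small bias for lower mean-squared error than LS, since it exploits the channel statistics.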
In this initial investigation, the wireless channel is assumed to be fixed across the offline training and online deployment stages. To test the robustness of the neural network, a random phase shift is applied to each transmitted OFDM packet.
The impacts of the number of pilot symbols and the cyclic prefix (CP) length are also examined.
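Under these assumptions, the per-packet processing can be sketched roughly as below (Python/NumPy for illustration only; the FFT size, CP length, channel taps, and noise level are made-up parameters, and the equalizer here uses the true channel and phase purely to show the packet structure, whereas the submission's baselines estimate the channel from pilots):

```python
import numpy as np

rng = np.random.default_rng(1)

n_fft, n_cp = 64, 16                  # FFT size and CP length (assumed values)
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
h_taps = np.array([0.9, 0.3j, 0.1])  # fixed multipath channel (made up), shorter than CP

def ofdm_tx(sym_idx):
    """Map symbol indices to QPSK, IFFT to the time domain, prepend the CP."""
    x_time = np.fft.ifft(qpsk[sym_idx])
    return np.concatenate([x_time[-n_cp:], x_time])

def ofdm_rx(y_time):
    """Strip the CP and return the frequency-domain symbols."""
    return np.fft.fft(y_time[n_cp:])

H = np.fft.fft(h_taps, n_fft)        # true channel frequency response
n_pkts, errors, total = 200, 0, 0
for _ in range(n_pkts):
    idx = rng.integers(0, 4, n_fft)  # one QPSK symbol per subcarrier
    tx = ofdm_tx(idx)
    phase = np.exp(2j * np.pi * rng.random())  # random phase shift per packet
    rx = phase * np.convolve(tx, h_taps)[: tx.size]
    rx += 0.01 * (rng.standard_normal(rx.size) + 1j * rng.standard_normal(rx.size))
    y = ofdm_rx(rx)
    # Genie-aided equalization (true channel and phase), just for this sketch
    x_hat = y / (phase * H)
    det = np.argmin(np.abs(x_hat[:, None] - qpsk[None, :]) ** 2, axis=1)
    errors += int(np.count_nonzero(det != idx))
    total += n_fft
print(f"SER over {n_pkts} packets: {errors / total:.4f}")
```

Because the CP is longer than the channel memory, stripping it turns the linear convolution into a circular one, so a single complex gain per subcarrier suffices for equalization; the LSTM detector replaces the explicit estimate-then-equalize steps with direct classification.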
To reproduce the simulation results, load the corresponding MAT-file and run the script Testing.m.
This code is inspired by the following paper:
H. Ye, G. Y. Li and B. Juang, "Power of Deep Learning for Channel Estimation and Signal Detection in OFDM Systems," in IEEE Wireless Communications Letters, vol. 7, no. 1, pp. 114-117, Feb. 2018.
Cite As
- Narengerile (2023). Deep learning-based signal detection in OFDM systems (https://www.mathworks.com/matlabcentral/fileexchange/72321-deep-learning-based-signal-detection-in-ofdm-systems), MATLAB Central File Exchange.
SignalDetection_DNN_SU_OFDM

Version | Published | Release Notes
---|---|---
1.0.0 | |