By Alpaslan Demir, InterDigital Communications, LLC
Increasing use of mobile devices for video streaming and other data-intensive applications means that network demand will soon exceed capacity. The Federal Communications Commission (FCC) estimates that by 2014, mobile device traffic will have increased 35-fold over 2009 levels. While the Long Term Evolution (LTE) standard and other advances may provide up to a fivefold increase in capacity, this increase still falls far short of projected demand.
InterDigital is developing dynamic spectrum management (DSM) technologies as part of a comprehensive approach to addressing the growing demand for bandwidth. Dynamic spectrum management will help reduce the anticipated strain on the network by taking advantage of underutilized spectrum, such as TV white space (TVWS). TV signals rarely occupy the full spectrum allocated to them. In many regions, the resulting white space between channels could be used by other communications devices.
Using MATLAB® and Simulink®, we developed and demonstrated real-time algorithms that detect underutilized spectrum that can safely be used for mobile communications. We proved our theoretical concepts by developing algorithms in MATLAB, performing system-level simulations to verify the design in Simulink, and then implementing the design on an FPGA.
Regulatory agencies, including the FCC in the United States and Ofcom in the United Kingdom, endorse dynamic spectrum access and dynamic spectrum management. These agencies have defined policies and rules to ensure that usage of underutilized spectrum does not create problems for other devices on the network. The FCC, for example, requires device manufacturers to provide a database that identifies usable and occupied spectrum. The FCC recommends the use of sensing technology as a complement to the database.
To be commercially viable, the spectrum-sensing systems we develop using MATLAB and Simulink must go beyond basic compliance with government regulations. They must meet three additional requirements: incorporate sophisticated power-management schemes to ensure that they do not drain the battery of the device on which they are deployed, meet service providers' quality-of-service requirements, and support multimode operation that includes mobile communications, machine-to-machine (M2M), and Wi-Fi operations.
DSM is part of a wider effort within InterDigital to address the imminent bandwidth crunch. This effort involves three initiatives:
Spectrum optimization – Advancing current technologies to improve cell edge performance, direct terminal-to-terminal communications, and joint transceiver designs.
Intelligent data delivery – Applying video compression algorithms, local routing and caching, and other techniques to make the most of existing bandwidth.
Connectivity and mobility – Focusing on bandwidth aggregation and dynamic spectrum management. These technologies will support networks of heterogeneous networks in which different types of devices share bandwidth and exchange data.
We based our initial algorithms for detecting unused spectrum on the Blackman-Tukey method. This method comprises four principal parts for calculating power spectral density: autocorrelation estimation, windowing, fast Fourier transform, and averaging. In its simplest form, the algorithm scans a range of radio frequencies and correlates the input signal on each frequency with a time-shifted version of the same signal. A frequency carrying only noise will show low correlation, whereas a frequency being used to broadcast man-made signals will correlate highly.
We coded and tested each of the four main operations in the process independently using MATLAB. By running simulations on each part as we developed it and generating plots in MATLAB, we could visualize intermediate results and verify that each part functioned correctly. We then combined the individual parts as MATLAB Function blocks in a Simulink system model and simulated the complete system.
The system output is an averaged power spectral density (PSD) estimate (Figure 1). By analyzing the PSD, we can identify a threshold for distinguishing a transmitted signal from noise. Any measured PSD below the threshold is the result of noise; any measured PSD above it is a legitimate signal. Analysis of the PSD also yields clues as to the signal's source. For example, a narrowband signal that spans 50 to 100 kHz could be from a wireless microphone, and a wideband signal that spans 6 MHz with a peak at the expected pilot tone location could be a digital television signal. It is also possible to distinguish a spurious emission from a genuine wireless microphone signal.
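The thresholding step can be sketched as a simple rule that flags PSD bins sitting a fixed margin above an estimated noise floor. The median-based floor estimate and the 8 dB margin below are our illustrative assumptions; the article does not specify how the actual threshold is derived.

```python
def occupied_channels(psd, margin_db=8.0):
    """Flag PSD bins whose level exceeds an estimated noise floor by
    margin_db. The floor is taken as the median bin power, assuming most
    of the scanned band is idle (an illustrative rule, not the system's)."""
    floor = sorted(psd)[len(psd) // 2]          # median as noise-floor estimate
    threshold = floor * 10 ** (margin_db / 10)  # linear margin above the floor
    return [p > threshold for p in psd]
```

A band with two strong bins against a flat floor yields exactly two flags; a noise-only band yields none, which is the false-alarm behavior a sensing system must control.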
After verifying the Simulink model via simulation, we implemented it on a Virtex-4 FPGA. At Mobile World Congress in Barcelona, we demonstrated this prototype as part of a DSM system (Figure 2), which scanned TVWS and industrial, scientific, and medical (ISM) radio bands and dynamically changed transmission channels based on the available spectrum.
To better study the receiver operating characteristic (ROC) of our sensing algorithms, we also built an autocorrelation-based fine sensing algorithm test bench using Simulink (Figure 3).
The test bench consists of four principal components.
The test bench lets us evaluate the performance of various sensing algorithms by quantifying the probability of detecting a signal across a range of signal-to-noise ratios (SNRs), bandwidth frequency offsets, and number of antennas.
For example, we developed a bit-accurate model of the autocorrelation-based fine sensing algorithm using Simulink and Xilinx® System Generator for DSP. This algorithm performed autocorrelation on in-phase and quadrature samples of the input signal produced by a 16-tap delay line (Figure 4).
Using the test bench, we simulated this enhanced algorithm to discover how it would perform in the presence of noise for various transmission bandwidths. We plotted the probability of detection (Pd) as a function of SNR for three different bandwidths (Figure 5). The results confirmed that a signal with a narrower bandwidth is generally easier to detect than one with a wider bandwidth. We then modified the algorithm to support two antennas, essentially providing the system with another perspective on the input signal. Since the signal is correlated between the two antennas but the noise is uncorrelated, a two-antenna system can outperform a one-antenna system. The simulations both confirmed this and enabled us to quantify the performance improvement.
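A Monte Carlo sweep of this kind can be mimicked with a short sketch. The detector below applies a lag-1 autocorrelation statistic to a complex tone in white noise and sums the per-antenna correlations, so signal terms add coherently across antennas while noise terms do not. The statistic, threshold, tone frequency, and trial counts are our assumptions for illustration, not the parameters of InterDigital's test bench.

```python
import cmath
import math
import random

def detect_prob(snr_db, antennas, trials=300, n=128, threshold=0.25, seed=1):
    """Monte Carlo probability of detection for a normalized lag-1
    autocorrelation detector observing a complex tone in white noise
    (illustrative sketch of a Pd-vs-SNR test-bench run)."""
    rng = random.Random(seed)
    amp = 10 ** (snr_db / 20)   # tone amplitude relative to unit noise power
    sigma = math.sqrt(0.5)      # per-component noise std -> E|noise|^2 = 1
    hits = 0
    for _ in range(trials):
        num = 0j
        den = 0.0
        for _ in range(antennas):
            # Each antenna sees the same tone plus independent noise.
            x = [amp * cmath.exp(2j * math.pi * 0.1 * k)
                 + complex(rng.gauss(0, sigma), rng.gauss(0, sigma))
                 for k in range(n)]
            # Lag-1 autocorrelation: signal contributions share the same
            # phase on every antenna, so they add coherently in num,
            # while the noise contributions add incoherently.
            num += sum(x[k] * x[k - 1].conjugate() for k in range(1, n))
            den += sum(abs(s) ** 2 for s in x)
        if abs(num) / den > threshold:
            hits += 1
    return hits / trials
```

Sweeping `snr_db` for `antennas=1` and `antennas=2` reproduces the qualitative picture in the text: detection probability rises with SNR, and the coherent two-antenna combination lowers the effective noise floor of the statistic.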
Because the demand for bandwidth will soon outstrip the industry's ability to supply it, we must work quickly to develop new solutions. MATLAB and Simulink have enabled us to streamline our development process, starting with the assessment of new ideas and proceeding through algorithm development, simulation, testing, and deployment to our target FPGA. We can evaluate different algorithms, select the best one for a particular application, and rapidly develop a prototype for demonstrations and real-world tests.
Published 2011 - 91980v00
Federal Communications Commission, Second Memorandum Opinion and Order, FCC 10-174.