Stuck on GSM Network Simulation

Georgi on 21 Nov 2011
Hi everybody!
I am trying to simulate a GSM network so that I can implement an algorithm for LS channel estimation. Unfortunately, I am stuck on a few points:
1. How do I make the 148-bit burst last 0.577 ms?
2. For the GMSK modulator, how many samples per symbol should I use?
3. What should the sample time of the Rayleigh channel be for GSM?
4. Why does the GMSK signal look exactly like PSK?
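For reference, the standard GSM timing arithmetic behind questions 1 and 3 can be sketched as follows (the numbers are the standard GSM figures; the variable names are mine):

Rb    = 270.833e3;     % GSM gross bit rate, bit/s (= 13 MHz / 48)
Tb    = 1/Rb;          % bit duration, about 3.69 us
Tslot = 156.25*Tb;     % one time slot, about 0.577 ms
Tburst = 148*Tb;       % the 148 modulated bits occupy about 0.546 ms;
                       % the remaining 8.25 bit periods are guard time
sps   = 4;             % samples per symbol (matching the GMSK blocks below)
Ts    = Tb/sps;        % channel sample time, about 0.923 us

So a burst "lasts" 0.577 ms only in the sense that it sits in a 156.25-bit time slot; the duration is fixed by the bit rate, and the Rayleigh channel sample time should be the bit duration divided by the samples per symbol.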
Here is the code so far. Please help; it's really important.
clear all;
clc
% AWGN channel (currently bypassed below; SNR given in dB)
AWGN = comm.AWGNChannel('NoiseMethod','Signal to noise ratio (SNR)','SNR', 100);
% GSM timing: gross bit rate 270.833 kb/s -> Tb ~ 3.69 us.
% With 4 samples per symbol, the channel sample time is Tb/4 ~ 0.923 us.
Tb = 1/270.833e3;
delays = [0 1e-6 2e-6 3e-6];                    % path delays in seconds
Multipath = rayleighchan(Tb/4, 0, delays, [0 0 0 0]); % 0 Hz Doppler: static channel
H  = comm.GMSKModulator('BitInput', true, 'SamplesPerSymbol', 4);
dH = comm.GMSKDemodulator('BitOutput', true, 'SamplesPerSymbol', 4);
% PulseLength = 1 means no Gaussian partial response, i.e. plain MSK.
% GSM GMSK uses BT = 0.3 (a pulse several symbols long), which is also
% why the signal here looks just like a PSK constellation.
H.PulseLength = 1;
dH.PulseLength = 1;
hError = comm.ErrorRate('ReceiveDelay', dH.TracebackDepth); % align demod delay
Multipath.StoreHistory = false;
Multipath.ResetBeforeFiltering = false;
% Training sequence: random here; real GSM uses one of 8 fixed 26-bit TSCs
training = randi([0 1],26,1);
%---------------Assembling Matrix for LS Estimator--------------------%
% Each row is a length-6 window of the training sequence, so the received
% training samples can be modelled as y ~ A*h for a 6-tap channel.
A = zeros(21,6);
for n = 1:21
    A(n,:) = training(n:n+5);
end
%----------------------GSM NETWORK SIMULATION-------------------------%
for counter = 1:1                                  % single burst for now
    %-------------------Assembling GSM Burst--------------------------%
    data1 = randi([0 1],61,1);                     % pre-training data
    data2 = randi([0 1],61,1);                     % post-training data
    Burstcol = [data1; training; data2];           % 148-bit burst as a column vector
    %-------------------Transmitting Signal---------------------------%
    modSignal = step(H, Burstcol);                 % GMSK modulation
    multichannel = filter(Multipath, modSignal);   % through the multipath channel
    %noisySignal = step(AWGN, multichannel);       % (noise currently bypassed)
    received = step(dH, multichannel);             % demodulated bits
    errorStats = step(hError, Burstcol, received); % BER statistics
    vector = received(78:98);                      % received bits over the training window
end
B = zeros(26,2);
B(:,1) = Burstcol(62:87);   % transmitted training bits
B(:,2) = received(78:103);  % corresponding received bits
B'                          % display them side by side
hLS = (A'*A)\(A'*vector)                       % LS estimate; backslash avoids inv()
minError = (vector - A*hLS)'*(vector - A*hLS)  % residual sum of squares
%-------------------------Plotting Results---------------------------%
fprintf('Error rate = %f\nNumber of errors = %d\n', ...
errorStats(1), errorStats(2))
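As a sanity check of the estimator itself, here is a minimal self-contained LS sketch on synthetic data (the 6-tap channel and all names here are illustrative, not part of the simulation above):

% Standalone LS sanity check on synthetic data (illustrative only)
h_true = randn(6,1);        % hypothetical 6-tap channel
x = randi([0 1],26,1);      % stand-in 26-bit training sequence
M = zeros(21,6);
for n = 1:21
    M(n,:) = x(n:n+5);      % same windowing as the matrix A above
end
y = M*h_true;               % noiseless "received" training samples
h_est = (M'*M)\(M'*y);      % normal equations, as in the hLS line
max(abs(h_est - h_true))    % essentially zero without noise

If this recovers h_true but the full simulation does not, the problem is alignment (the receive delay and the 78:98 indexing), not the estimator.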
