Does scaling matter?
In most situations, scaling is not all that important: the overall shape of the spectrum matters much more than the absolute scale.
What are the conventions?
But if you really are worried about it, there are several different conventions from which you can choose (see definitions below):
- Scale by dt for the FFT, and by Fs for the IFFT
- Scale by 1/M for the FFT, and by M for the IFFT
- Scale by 1 for the FFT, and by 1 for the IFFT
- Scale by 1/sqrt(M) for the FFT, and by sqrt(M) for the IFFT
- and so on.
I generally use either option #1 or option #2 depending on my mood and whether it's raining outside.
Here I am assuming that I have a discrete-time signal x represented as an M x N matrix, where M is the number of samples and N is the number of channels.
[M,N] = size(x);
Furthermore, I am assuming that the sampling rate is Fs and that I have defined the time increment as
dt = 1/Fs;
and the frequency increment as:
dF = Fs/M;
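As a concrete illustration of conventions #1 and #2, here is a sketch translated to Python/NumPy (whose fft/ifft follow the same scaling convention as MATLAB's). The signal x, the sampling rate Fs, and the sizes M and N are made-up example values; only the names match the definitions above.

```python
import numpy as np

# Assumed example values, matching the names defined in the text.
Fs = 1000.0          # sampling rate in Hz (made up for illustration)
M, N = 256, 2        # M samples, N channels
dt = 1 / Fs          # time increment
dF = Fs / M          # frequency increment

rng = np.random.default_rng(0)
x = rng.standard_normal((M, N))   # discrete-time signal, one channel per column

# Unscaled transform along the sample (time) axis, like MATLAB's fft(x).
X_raw = np.fft.fft(x, axis=0)

# Convention #1: scale the FFT by dt, and the IFFT by Fs.
X1 = dt * X_raw

# Convention #2: scale the FFT by 1/M, and the IFFT by M.
X2 = X_raw / M

# Both round-trip back to x, because each pair of scale factors multiplies to 1.
# (np.fft.ifft, like MATLAB's ifft, already divides by M internally.)
x1 = Fs * np.fft.ifft(X1, axis=0)
x2 = M * np.fft.ifft(X2, axis=0)
```

Either way you get the same spectrum up to an overall constant, which is the point: pick a convention, apply its partner factor on the way back, and the signal is recovered exactly.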
What do all these conventions have in common?
All of these conventions have one thing in common: the product of the two scaling factors is always 1 (for example, dt*Fs = 1 and (1/M)*M = 1). Please note that the ifft function in MATLAB includes a factor of 1/M as part of the computation, while fft applies no scaling at all; the built-in factors therefore multiply to 1/M, which is exactly what is needed for ifft(fft(x)) to return x.
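This built-in behavior can be checked directly. The sketch below (in NumPy, whose fft/ifft use the same convention as MATLAB's) compares ifft against the raw inverse-DFT sum written out by hand; the signal and M are arbitrary example values.

```python
import numpy as np

M = 8
rng = np.random.default_rng(1)
x = rng.standard_normal(M)        # arbitrary example signal

X = np.fft.fft(x)                 # forward transform: no built-in scaling
x_back = np.fft.ifft(X)           # inverse transform: includes the 1/M factor

# The raw inverse-DFT sum, with no scaling applied:
#   raw[n] = sum_k X[k] * exp(+2*pi*j*k*n/M)
k = np.arange(M)
raw = np.array([np.sum(X * np.exp(2j * np.pi * k * n / M)) for n in range(M)])

# ifft equals the raw sum divided by M, and the round trip recovers x exactly.
```

So if you add your own factors on top (dt and Fs, or 1/M and M, etc.), the round trip still comes out to exactly 1, since your pair of factors also multiplies to 1.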