This code computes a time-frequency representation using a signal-dependent, radially Gaussian kernel that adapts over time. It provides an excellent ambiguity-domain filter for time-frequency analysis.
Just run the script main_AOK.m to see it in action. The test signal includes three impulses, two simultaneous sinusoids, a Gaussian pulse, and two parallel linear chirps.
The algorithms are described in detail in the paper:
"An Adaptive Optimal-Kernel Time-Frequency Representation" by D. L. Jones and R. G. Baraniuk, IEEE Transactions on Signal Processing, Vol. 43, No. 10, pp. 2361--2371, October 1995.
Special thanks to Richard Baraniuk for making the original C code available.
This software works well and can handle large signals, but the documentation could be improved. If you are familiar with the Jones and Baraniuk paper, however, you can piece together what it is doing, in particular the form of the output matrix. I am a fan of the AOK-TFR method for adaptive time-frequency estimation.
There seem to be a few minor glitches in the figure produced by main_AOK.m:
The plot it produces has an inverted frequency axis, so that positive frequency components are shown as negative frequencies and vice versa. Also, the final plot is not trimmed, so the time axis does not correspond to the time interval of the original signal, and the time units on this final plot are observation points (samples) rather than seconds (or whatever the original time units of the signal are).
Consequently the time axis is of questionable value; however, this is easily fixed by applying something like the following to the output matrix ofp:
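A minimal sketch of such a fix, assuming ofp holds the TFR with frequency along the rows and time along the columns, and that the variables fs (sampling rate in Hz) and N (length of the original signal in samples) are available; the names ofp_fixed, fs, and N beyond ofp itself are illustrative assumptions, not part of the original script:

```matlab
% Undo the inverted frequency axis (positive/negative swapped).
ofp_fixed = flipud(ofp);

% Trim the time axis to the span of the original signal.
ofp_fixed = ofp_fixed(:, 1:N);

% Build physical axes: time in seconds, frequency in Hz.
t = (0:N-1) / fs;
f = linspace(-fs/2, fs/2, size(ofp_fixed, 1));

% Replot with correct orientation and units.
imagesc(t, f, ofp_fixed);
axis xy;                         % low frequencies at the bottom
xlabel('Time (s)');
ylabel('Frequency (Hz)');
```

The exact indexing may need adjusting depending on how main_AOK.m zero-pads the signal and orders the frequency bins, so check the output dimensions against the paper's description of the output matrix before relying on this.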
There was also an error in one of the lines of main_AOK.m.