What determines the time axis in spectrogram using normalized frequencies?
I'm attempting to make a spectrogram using normalized frequencies, and the resulting spectrogram looks as expected with the exception of the time axis. Specifically, while the true duration of the data is ~8 minutes, when calculating the spectrogram the time axis ranges to ~21.5 hours!
I've tried manipulating each of the input arguments, and the only thing that seems to affect the time axis is the length of the input data vector - which obviously doesn't make the time axis any more accurate. Any help would be greatly appreciated.
PS: I calculate the normalized frequencies as follows:
FreqsInHz = [0.5:0.1:50]; %The range of frequencies I'm interested in.
SamplingRate = 1000; %Sampling rate in Hz.
normFreqs = (2*pi).*FreqsInHz./SamplingRate; %Normalized frequencies in rad/sample.
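In case it's relevant, the spectrogram call looks roughly like this (the window length and overlap below are placeholders, not my exact settings):
data = randn(1, 8*60*SamplingRate); %Stand-in for my ~8 minutes of data at 1000 Hz.
spectrogram(data, 512, 256, normFreqs); %Frequency vector in rad/sample, no fs argument.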
Answers (1)
Sean de Wolski on 1 Aug 2016
The time axis returned by spectrogram is determined by the sample rate, fs, that you pass in; when fs is supplied in Hz, the time vector comes back in seconds. Compare:
spectrogram(rand(1,100000),64,0,64,8000)
spectrogram(rand(1,100000),64,0,64,44100)
For details on how to use fs, see:
>> doc spectrogram
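A minimal sketch of one way to get a time axis in seconds: specify the frequencies of interest in Hz and pass the sample rate as the fifth argument. The variable name data and the window/overlap values (512/256) here are placeholders, not values from the original question.
Fs = 1000; %Sample rate in Hz.
FreqsInHz = 0.5:0.1:50; %Frequencies of interest, in Hz rather than rad/sample.
[s, f, t] = spectrogram(data, 512, 256, FreqsInHz, Fs); %t is returned in seconds.
imagesc(t, f, 10*log10(abs(s).^2)); %Power in dB.
axis xy; xlabel('Time (s)'); ylabel('Frequency (Hz)');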