Fractional delay using the time-shift property of the FFT: why doesn't it work? #266
Replies: 1 comment 6 replies
@jgaeddert What do you think? Does this have to do with the fact that the FFT assumes the input is an infinitely repeating sequence with period equal to the FFT size, and that, for whatever reason, you don't get any nasty effects when the time delay is an integer? Also, it looks like the code is multiplying the FFT of the input by an impulse response, so it's effectively doing an FFT convolution with a filter.
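One way to check this periodicity hypothesis numerically (a sketch assuming NumPy; `fft_delay` is my own name, not from the thread): an integer delay comes out as an exact *circular* shift even for a non-periodic signal, a fractional delay is exact for a signal that is genuinely periodic in the FFT window, and a fractional delay of a signal with a wrap-around discontinuity rings.

```python
import numpy as np

def fft_delay(x, d):
    """Delay x by d samples via the FFT time-shift property:
    multiply the bin with frequency f (cycles/sample) by exp(-2j*pi*f*d)."""
    f = np.fft.fftfreq(len(x))
    return np.fft.ifft(np.fft.fft(x) * np.exp(-2j * np.pi * f * d))

N = 64
n = np.arange(N)

# 1) Integer delay: an exact *circular* shift, even for a non-periodic ramp.
ramp = n.astype(float)
print(np.allclose(fft_delay(ramp, 3).real, np.roll(ramp, 3)))   # True

# 2) Fractional delay of a signal truly periodic in the window:
#    exact to machine precision -- the property itself is fine.
tone = np.cos(2 * np.pi * 5 * n / N)
shifted = np.cos(2 * np.pi * 5 * (n - 0.5) / N)
print(np.allclose(fft_delay(tone, 0.5).real, shifted))          # True

# 3) Fractional delay of the ramp: the wrap-around discontinuity rings
#    (Gibbs-like oscillation), so the error blows up near the edges.
err = np.abs(fft_delay(ramp, 0.5).real - (ramp - 0.5))
print(err.max() > 1.0)                                          # True
```

In other words, the fractional shift is equivalent to band-limited (periodic-sinc) interpolation of the circularly extended signal, so any discontinuity at the wrap point leaks ringing across the whole block.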
I'm experimenting with applying a fractional delay using the time-shift property of the FFT. This kind of thing is done inside the DLLs (delay-locked loops) of OFDM demodulators, particularly in 5G.
I wanted to see whether it could be done generally, rather than convolving with an appropriately delayed FIR filter or using a polyphase filter bank.
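For reference, the conventional FIR alternative mentioned above can be sketched as a windowed-sinc fractional-delay filter. This is a hypothetical helper assuming NumPy, not anything from the thread; the bulk delay of `(num_taps - 1)/2` samples is the price of a causal design.

```python
import numpy as np

def frac_delay_fir(d, num_taps=21):
    """Windowed-sinc FIR approximating a delay of (num_taps - 1)/2 + d
    samples, where d in [0, 1) is the fractional part.
    (Hypothetical helper, not from the thread.)"""
    n = np.arange(num_taps)
    center = (num_taps - 1) / 2 + d        # bulk integer delay + fractional part
    h = np.sinc(n - center) * np.hamming(num_taps)
    return h / h.sum()                     # normalize to unity gain at DC

# Usage: delay a slow cosine by 10.5 samples (10 from causality, 0.5 fractional).
h = frac_delay_fir(0.5)
x = np.cos(2 * np.pi * 0.02 * np.arange(400))
y = np.convolve(x, h)                      # y[m] ~= x(m - 10.5) in the interior
```

Unlike the FFT approach, this filter only looks at a finite neighborhood of each sample, so there is no dependence on what happens at the block boundaries.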
Here is some code (the snippet itself is not reproduced in this text):
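Since the original snippet is missing, here is a minimal, hypothetical reconstruction of the experiment as described: apply a delay by multiplying the FFT of a test signal by `exp(-2j*pi*f*delay)`. The choice of test signal and all names are my assumptions, not the author's code.

```python
import numpy as np

delay = 1          # set to 0.5 to reproduce the problem described below
N = 256

# Hypothetical test signal: a rectangular pulse. Any signal with sharp
# edges makes the fractional-delay artifacts obvious.
x = np.zeros(N)
x[N // 4 : N // 2] = 1.0

# Time-shift property: delaying by `delay` samples multiplies the bin at
# frequency f (cycles/sample) by exp(-2j*pi*f*delay).
f = np.fft.fftfreq(N)
y = np.fft.ifft(np.fft.fft(x) * np.exp(-2j * np.pi * f * delay))

# With delay = 1 this is an exact circular shift of x; with delay = 0.5
# the pulse edges ring.
print(np.max(np.abs(y.real - np.roll(x, 1))))
```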
Now if I set `delay = 1` as above, I get the following graphs [plots not reproduced here]. This all looks correct!
Now if I set `delay = 0.5`, I get the following [plots not reproduced here]. What happened? Why is the output gibberish?
Does the time-shift property of the FFT only work for integer offsets?