There were several challenges we had to consider when implementing our autotune function in MATLAB.
We need to divide the time-domain signal into windows short enough that each contains no more than one note. Once the signal is windowed in the time domain, we take the FFT of each window to obtain the isolated spectrum of the sung note. We then compare its pitch to the nearest note on the chromatic scale to determine the amount of shift needed.
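The per-window step above can be sketched as follows. This is our own illustrative Python/NumPy version (the original project was in MATLAB, and the function name and the crude FFT-peak pitch estimate are our assumptions, not the authors' code): take the FFT of one window, estimate the sung pitch from the spectral peak, and snap it to the nearest note on the equal-tempered chromatic scale.

```python
import numpy as np

def nearest_note_shift(window, fs):
    """Hypothetical sketch: estimate the pitch of one window from its FFT
    peak, then compute the ratio needed to shift it to the nearest
    chromatic-scale note (equal temperament, A4 = 440 Hz)."""
    spectrum = np.abs(np.fft.rfft(window))
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    f0 = freqs[np.argmax(spectrum)]            # crude pitch estimate: FFT peak
    # Distance from A4 in semitones; rounding snaps to the chromatic scale.
    semitones = 12.0 * np.log2(f0 / 440.0)
    target = 440.0 * 2.0 ** (round(semitones) / 12.0)
    return f0, target, target / f0             # ratio by which to shift pitch

# Usage: a slightly flat A4 (435 Hz) should snap to 440 Hz.
fs = 8000
t = np.arange(0, 0.5, 1.0 / fs)
f0, target, ratio = nearest_note_shift(np.sin(2 * np.pi * 435.0 * t), fs)
```

A real implementation would use a more robust pitch estimator (e.g. autocorrelation) than the raw FFT peak, which can lock onto a harmonic for voiced signals.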
Windowing the signal in the time domain amounts to applying a filter: multiplying by a window convolves the signal's spectrum with the window's frequency response. All practical windows distort the frequency domain because their frequency response is non-ideal in the stopband, leaking energy into neighboring frequencies.
In addition to these magnitude distortions, windows can have nonlinear phase, which changes the relative phase between the tones in each note. For any note composed of more than a single tone, this phase change creates audible distortion.
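The stopband distortion described above can be quantified by comparing window sidelobe levels. This sketch (our own illustration, not from the original project) measures the peak sidelobe of a rectangular window against a Hann window; the well-known values are about -13 dB and -31 dB respectively, which is why tapered windows leak far less energy between notes.

```python
import numpy as np

def peak_sidelobe_db(w, pad=4096):
    """Peak stopband (sidelobe) level of a window, in dB below the mainlobe.
    Zero-padding the FFT reveals the fine sidelobe structure."""
    spec = np.abs(np.fft.rfft(w, pad))
    spec = spec / spec.max()
    # The spectrum falls monotonically to the first null; the first upturn
    # marks the end of the mainlobe, after which we take the largest sidelobe.
    first_upturn = np.argmax(np.diff(spec) > 0)
    return 20.0 * np.log10(spec[first_upturn:].max())

N = 64
rect_db = peak_sidelobe_db(np.ones(N))     # about -13 dB
hann_db = peak_sidelobe_db(np.hanning(N))  # about -31 dB
```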
Time Duration
We change the pitch of our signals via the chipmunk effect: notes are shifted up or down by resampling, effectively playing the signal back at a lower or higher sampling rate. This method introduces minimal phase distortion but changes the time duration of the signal. To counteract this effect, we must stretch or compress each portion of the song so that it retains its original duration after the resampling that changes its pitch. We can then achieve the desired note shifting by playing the entire song back at its original sampling frequency.
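The chipmunk step can be sketched as below: our own Python/NumPy illustration (the original was MATLAB, and the linear-interpolation resampler is our simplification). Resampling by a ratio r raises the pitch by a factor of r when played back at the original rate, but shortens the segment to len/r samples, which is exactly why the text's time-stretch step is needed afterwards.

```python
import numpy as np

def chipmunk_shift(x, ratio):
    """Hypothetical sketch: resample x by `ratio` via linear interpolation.
    Played back at the original sampling rate, the pitch rises by `ratio`
    but the duration shrinks by the same factor."""
    idx = np.arange(0, len(x) - 1, ratio)      # read positions in the original
    return np.interp(idx, np.arange(len(x)), x)

# Usage: shift a flat 435 Hz tone up to 440 Hz.
fs = 8000
t = np.arange(0, 0.25, 1.0 / fs)
x = np.sin(2 * np.pi * 435.0 * t)
ratio = 440.0 / 435.0
y = chipmunk_shift(x, ratio)   # sounds at ~440 Hz, but is slightly shorter
```

To restore the original duration, each shifted segment would then be time-stretched by the same ratio, e.g. with an overlap-add or phase-vocoder method, before playback at the original sampling frequency.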