## ThinkDSP
This notebook contains solutions to exercises in Chapter 8: Filtering and Convolution.
Copyright 2015 Allen Downey
Exercise: In this chapter I claimed that the Fourier transform of a Gaussian curve is also a Gaussian curve. For discrete Fourier transforms, this relationship is approximately true.

Try it out for a few examples. What happens to the Fourier transform as you vary `std`?
Solution: I'll start with a Gaussian similar to the example in the book.
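The notebook's code cells are missing from this export; as a sketch of the step described here, one way to build such a Gaussian window with NumPy/SciPy (the length and `std` are assumptions, not necessarily the book's exact values):

```python
import numpy as np
from scipy import signal

# Gaussian window similar to the book's example (length and std assumed)
M = 32
std = 2
gaussian = signal.windows.gaussian(M, std=std)

# magnitude of its DFT; the negative frequencies come out on the right half
fft_gaussian = np.abs(np.fft.fft(gaussian))
print(fft_gaussian[:4])
```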
Here's what the FFT looks like:
If we roll the negative frequencies around to the left, we can see more clearly that it is Gaussian, at least approximately.
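The roll can be done with `np.fft.fftshift`; a minimal sketch, using the same assumed window as above:

```python
import numpy as np
from scipy import signal

gaussian = signal.windows.gaussian(32, std=2)   # assumed window
fft_mag = np.abs(np.fft.fft(gaussian))

# fftshift rolls the negative frequencies around to the left,
# putting the zero-frequency component in the middle
centered = np.fft.fftshift(fft_mag)
```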
This function plots the Gaussian window and its FFT side-by-side.
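The original helper is not shown in this export; a sketch of what such a function might look like (the name `plot_gaussian` and its signature are my assumptions):

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')   # non-interactive backend so this sketch runs outside a notebook
import matplotlib.pyplot as plt
from scipy import signal

def plot_gaussian(std, M=32):
    """Plot a Gaussian window and the magnitude of its FFT side by side."""
    gaussian = signal.windows.gaussian(M, std=std)
    fft_mag = np.fft.fftshift(np.abs(np.fft.fft(gaussian)))

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
    ax1.plot(gaussian)
    ax1.set_title('Gaussian window, std=%g' % std)
    ax2.plot(fft_mag)
    ax2.set_title('FFT magnitude')
    return fig

fig = plot_gaussian(std=2)
```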
Now we can make an interaction that shows what happens as `std` varies.
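The interactive widget itself doesn't survive a static export; as a stand-in, a sketch that sweeps `std` over a few values and prints a crude width measure for the window and its FFT (the half-peak counting heuristic is mine, not the book's):

```python
import numpy as np
from scipy import signal

# Sweep std; a non-interactive stand-in for the notebook's slider.
M = 32
widths = []
for std in [1, 2, 4, 8]:
    gaussian = signal.windows.gaussian(M, std=std)
    fft_mag = np.fft.fftshift(np.abs(np.fft.fft(gaussian)))
    # count samples above half the peak as a rough width measure
    width_time = int(np.sum(gaussian > gaussian.max() / 2))
    width_freq = int(np.sum(fft_mag > fft_mag.max() / 2))
    widths.append((std, width_time, width_freq))
    print('std=%d  window width=%d  FFT width=%d' % (std, width_time, width_freq))
```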
As `std` increases, the Gaussian gets wider and its FFT gets narrower.
In terms of continuous mathematics, if

$$f(x) = e^{-x^2 / (2 \sigma^2)},$$

which is a Gaussian with mean 0 and standard deviation $\sigma$, its Fourier transform is

$$\hat{f}(k) = \sigma \sqrt{2 \pi} \, e^{-2 \pi^2 \sigma^2 k^2},$$

which is a Gaussian with standard deviation $1 / (2 \pi \sigma)$. So there is an inverse relationship between the standard deviations of $f$ and $\hat{f}$.
For the proof, see http://mathworld.wolfram.com/FourierTransformGaussian.html
Exercise: If you did the exercises in Chapter 3, you saw the effect of the Hamming window, and some of the other windows provided by NumPy, on spectral leakage. We can get some insight into the effect of these windows by looking at their DFTs.
In addition to the Gaussian window we used in this chapter, create a Hamming window with the same size. Zero-pad the windows and plot their DFTs. Which window acts as a better low-pass filter? You might find it useful to plot the DFTs on a log-$y$ scale.
Experiment with a few different windows and a few different sizes.
Solution: Following the examples from the chapter, I'll create a 1-second wave sampled at 44.1 kHz.
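The book builds this wave with the `thinkdsp` library; as a self-contained stand-in (the particular signal is an assumption, chosen only so the sampling setup is concrete):

```python
import numpy as np

framerate = 44100                  # samples per second
duration = 1.0                     # seconds
ts = np.arange(int(duration * framerate)) / framerate

# hypothetical test signal: a 440 Hz square wave
ys = np.sign(np.sin(2 * np.pi * 440 * ts))
```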
And I'll create a few windows. I chose the standard deviation of the Gaussian window to make it similar to the others.
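A sketch of how the windows might be created (the length, and the `std` chosen to make the Gaussian comparable to the others, are assumptions):

```python
import numpy as np
from scipy import signal

M = 441   # window length (assumed)
windows = {
    'gaussian': signal.windows.gaussian(M, std=M / 8),  # std picked to roughly match the others
    'hamming':  np.hamming(M),
    'hanning':  np.hanning(M),
    'blackman': np.blackman(M),
}
```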
Let's see what the windows look like.
They are pretty similar. Let's see what their DFTs look like:
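The zero-padding step can be folded into the FFT call itself; a sketch (padded length is an assumption):

```python
import numpy as np
from scipy import signal

M = 441          # window length (assumed)
N = 4096         # zero-padded length; padding interpolates the DFT (assumed)

windows = {
    'gaussian': signal.windows.gaussian(M, std=M / 8),
    'hamming':  np.hamming(M),
    'hanning':  np.hanning(M),
    'blackman': np.blackman(M),
}

dfts = {}
for name, w in windows.items():
    # np.fft.fft zero-pads automatically when n > len(w)
    dfts[name] = np.abs(np.fft.fft(w, n=N))
    print(name, 'DC magnitude = %.1f' % dfts[name][0])
```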
Also pretty similar, but it looks like Hamming drops off the fastest, Blackman the slowest, and Hanning has the most visible sidelobes.
On a log scale we can see that the Hamming and Hanning windows drop off faster than the other two at first, and that the Hamming and Gaussian windows have the most persistent sidelobes. The Hanning window seems to have the best combination of fast drop-off and minimal sidelobes.
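One way to make the visual comparison quantitative is to measure each window's highest sidelobe relative to its mainlobe, in dB. A sketch (the mainlobe-edge heuristic of walking to the first local minimum is my own, not from the book; note this measures the highest sidelobe, which complements rather than replaces the drop-off comparison above):

```python
import numpy as np
from scipy import signal

def highest_sidelobe_db(window, nfft=4096):
    """Return the highest sidelobe level relative to the mainlobe, in dB."""
    mag = np.abs(np.fft.fft(window, n=nfft))[:nfft // 2]
    mag /= mag.max()
    # walk down the mainlobe to its first local minimum
    i = 1
    while i < len(mag) - 1 and mag[i + 1] < mag[i]:
        i += 1
    return 20 * np.log10(mag[i:].max())

M = 441   # window length (assumed)
levels = {
    'hamming':  highest_sidelobe_db(np.hamming(M)),
    'hanning':  highest_sidelobe_db(np.hanning(M)),
    'blackman': highest_sidelobe_db(np.blackman(M)),
    'gaussian': highest_sidelobe_db(signal.windows.gaussian(M, std=M / 8)),
}
for name, db in levels.items():
    print('%-9s %6.1f dB' % (name, db))
```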