Fourier Transform of the Probability Density Function of the Normal Distribution (Gaussian Function)
Proposition
The probability density function of the normal distribution $N(\mu, \sigma^2)$
$$ x(t) = \frac{1}{\sqrt{2\pi\sigma^2}}\exp{\left[-\frac{(t - \mu)^2}{2 \sigma^2}\right]} $$
has the Fourier transform
$$ X(\omega) = \exp{\left[ -\frac{\sigma^2 \omega^2}{2} - i\mu\omega \right]} $$
In particular, the probability density function of the standard normal distribution $N(0, 1)$
$$ x(t) = \frac{1}{\sqrt{2\pi}}\exp{\left[-\frac{t^2}{2}\right]} $$
has the Fourier transform
$$ X(\omega) = \exp{\left[ -\frac{\omega^2}{2} \right]} $$
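Before the proof, here is a quick numerical sanity check of the proposition, written as a minimal Python sketch under the $e^{-i\omega t}$ convention used throughout this article; the values of $\mu$, $\sigma$, and the $\omega$ grid are arbitrary choices for illustration.

```python
import numpy as np
from scipy.integrate import quad

mu, sigma = 1.5, 0.8  # arbitrary illustrative parameters

def x(t):
    # probability density function of N(mu, sigma^2)
    return np.exp(-(t - mu)**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

def fourier_transform(omega):
    # X(omega) = ∫ x(t) e^{-iωt} dt, integrated as real and imaginary parts
    re, _ = quad(lambda t: x(t) * np.cos(omega * t), -np.inf, np.inf)
    im, _ = quad(lambda t: -x(t) * np.sin(omega * t), -np.inf, np.inf)
    return re + 1j * im

for omega in [0.0, 0.5, 1.0, 2.0]:
    closed_form = np.exp(-sigma**2 * omega**2 / 2 - 1j * mu * omega)
    print(omega, abs(fourier_transform(omega) - closed_form))  # ~1e-12 or smaller
```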
Proof
While it is possible to compute directly from the definition, we first establish the following lemma:
Time Shift Property
If the Fourier transform of $x(t)$ is $X(\omega)$,
then the Fourier transform of $x(t - \tau)$ is $e^{-i \omega \tau} X(\omega)$.
In fact, applying the definition of the Fourier transform,
$$ \int_{-\infty}^{\infty}x(t - \tau)e^{-i \omega t}dt $$
by substituting $s = t - \tau$, we have
$$ \int_{-\infty}^{\infty}x(s)e^{-i \omega (s + \tau)}ds = e^{-i \omega \tau}\int_{-\infty}^{\infty}x(s)e^{-i \omega s}ds = e^{-i \omega \tau} X(\omega) $$
This shows that a time shift in the time domain affects only the phase spectrum linearly (linear phase characteristic), leaving the amplitude spectrum unchanged.
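The lemma itself can also be checked numerically; the following sketch uses an unnormalized Gaussian as the test signal, with $\tau$ and $\omega$ chosen arbitrarily.

```python
import numpy as np
from scipy.integrate import quad

def ft(f, omega):
    # Fourier transform ∫ f(t) e^{-iωt} dt via real/imaginary quadrature
    re, _ = quad(lambda t: f(t) * np.cos(omega * t), -np.inf, np.inf)
    im, _ = quad(lambda t: -f(t) * np.sin(omega * t), -np.inf, np.inf)
    return re + 1j * im

x = lambda t: np.exp(-t**2 / 2)  # arbitrary rapidly decaying test signal
tau, omega = 0.7, 1.3

lhs = ft(lambda t: x(t - tau), omega)           # transform of the shifted signal
rhs = np.exp(-1j * omega * tau) * ft(x, omega)  # e^{-iωτ} X(ω)
print(abs(lhs - rhs))  # ~1e-12: the shift only rotates the phase
```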
Returning to the original problem, let
$$ y(t) = \frac{1}{\sqrt{2\pi\sigma^2}}\exp{\left[-\frac{t^2}{2 \sigma^2}\right]} $$
Then,
$$ x(t) = y(t - \mu) $$
so by the lemma above,
$$ X(\omega) = e^{-i \omega \mu} Y(\omega) $$
Thus, it remains to show that
$$ Y(\omega) = \exp{\left[ -\frac{\sigma^2 \omega^2}{2} \right]} $$
Calculating $Y(\omega)$,
$$ Y(\omega) = \int_{-\infty}^{\infty}{y(t) \cdot e^{-i \omega t}dt} = \frac{1}{\sqrt{2 \pi \sigma^2}} \int_{-\infty}^{\infty} \exp{\left[ -\frac{t^2}{2\sigma^2} - i \omega t \right]}dt $$
Completing the square in the exponent,
$$ Y(\omega) = \frac{1}{\sqrt{2 \pi \sigma^2}} \int_{-\infty}^{\infty} \exp{\left[ -\frac{(t + i\sigma^2\omega)^2}{2\sigma^2} - \frac{\sigma^2 \omega^2}{2} \right]}dt = \exp{\left[- \frac{\sigma^2 \omega^2}{2} \right]} \cdot \frac{1}{\sqrt{2 \pi \sigma^2}} \int_{-\infty}^{\infty} \exp{\left[ -\frac{(t + i\sigma^2\omega)^2}{2\sigma^2} \right]}dt $$
Let the integral part be $I$. Substituting $u=\frac{t}{\sigma}+i\sigma\omega$, so that $t + i\sigma^2\omega = \sigma u$ and $dt = \sigma\, du$,
$$ I = \sigma \int_{-\infty + i\sigma\omega}^{\infty + i\sigma\omega}{\exp{\left[ -\frac{u^2}{2} \right]}}du $$
This integral has been previously evaluated in the article Expected Value of the Cosine of a Random Variable Following the Standard Normal Distribution, but let’s review it.
Consider a rectangular contour with vertices at $-R, -R + i\sigma\omega, R + i\sigma\omega, R$.
$$ C_1 : -R \to -R+i\sigma\omega $$
$$ C_2 : -R+i\sigma\omega \to R+i\sigma\omega $$
$$ C_3 : R+i\sigma\omega \to R $$
$$ C_4 : R \to -R $$
$$ C : C_1 + C_2 + C_3 + C_4 $$
Since $C$ is a simple closed curve and $\exp{[-\frac{u^2}{2}]}$ is entire (analytic on the whole complex plane), by Cauchy's integral theorem,
$$ \int_{C}\exp{[-\frac{u^2}{2}]}du = 0 $$
Moreover, parametrizing $C_1$ as $u = -R + i\sigma\omega t$ with $t \in [0, 1]$, so that $du = i\sigma\omega\, dt$,
$$ \left|\int_{C_1}\exp{\left[-\frac{u^2}{2}\right]}du\right| = \left|\int_{0}^{1}\exp{\left[-\frac{(-R+i\sigma\omega t)^2}{2}\right]} i\sigma\omega\, dt\right| \le |\sigma\omega|\int_{0}^{1}\left|\exp{\left[-\frac{(-R+i\sigma\omega t)^2}{2}\right]}\right|dt = |\sigma\omega|\int_{0}^{1}\exp{\left[-\frac{R^2 - (\sigma\omega t)^2}{2}\right]}dt \le |\sigma\omega|\exp{\left[-\frac{R^2 - \sigma^2 \omega^2}{2}\right]} $$
Taking the limit as $R \to \infty$,
$$ \int_{C_1}\exp{[-\frac{u^2}{2}]}du \to 0 $$
Similarly,
$$ \int_{C_3}\exp{[-\frac{u^2}{2}]}du \to 0 $$
Thus, as $R \to \infty$,
$$ \int_{C_2}\exp{[-\frac{u^2}{2}]}du + \int_{C_4}\exp{[-\frac{u^2}{2}]}du \to 0 $$
The path $C_4$ runs along the real axis from $R$ to $-R$, so by the Gaussian integral $\int_{-\infty}^{\infty}\exp{[-\frac{u^2}{2}]}du = \sqrt{2\pi}$,
$$ \int_{C_4}\exp{[-\frac{u^2}{2}]}du \to -\sqrt{2 \pi} $$
Therefore,
$$ \int_{C_2}\exp{[-\frac{u^2}{2}]}du \to \sqrt{2 \pi} $$
In conclusion,
$$ I = \sigma \sqrt{2 \pi} $$
Thus,
$$ Y(\omega) = \exp{\left[- \frac{\sigma^2 \omega^2}{2} \right]} \cdot \frac{1}{\sqrt{2 \pi \sigma^2}} \cdot \sigma \sqrt{2 \pi} = \exp{\left[- \frac{\sigma^2 \omega^2}{2} \right]} $$
Thus, the result is established.
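As a sanity check on the contour-shift step, one can verify numerically that $\int_{-\infty}^{\infty}\exp{[-\frac{(t+i\sigma^2\omega)^2}{2\sigma^2}]}dt = \sigma\sqrt{2\pi}$ for any real $\omega$; a minimal sketch, with $\sigma$ and the $\omega$ values picked arbitrarily:

```python
import numpy as np
from scipy.integrate import quad

sigma = 1.2  # arbitrary illustrative value

for omega in [0.0, 0.5, 2.0]:
    # integrand exp[-(t + iσ²ω)² / (2σ²)] evaluated along the real t-axis
    f = lambda t: np.exp(-(t + 1j * sigma**2 * omega)**2 / (2 * sigma**2))
    re, _ = quad(lambda t: f(t).real, -np.inf, np.inf)
    im, _ = quad(lambda t: f(t).imag, -np.inf, np.inf)
    print(omega, abs(re + 1j * im - sigma * np.sqrt(2 * np.pi)))  # ~0 for every ω
```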
Additional Notes
Up to the constant coefficient, the probability density function of the standard normal distribution keeps the same form before and after the Fourier transform, so it can be regarded as a fixed point of the transform.
Additionally, there is a variant of the Fourier transform definition (the unitary convention) that includes a factor of $\frac{1}{\sqrt{2 \pi}}$ in front of the integral, in which case the result matches exactly, coefficient included.
Such functions that remain invariant under transformations are known as self-reciprocal functions.
Other examples of self-reciprocal functions under Fourier transform include $\frac{1}{\sqrt{|t|}}$ (An Example of a Self-Reciprocal Function in Fourier Transform).
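Under the unitary convention mentioned above, the fixed-point claim can be observed directly; the sketch below evaluates $\frac{1}{\sqrt{2\pi}}\int x(t)e^{-i\omega t}dt$ by quadrature and compares it with $x(\omega)$ itself (the $\omega$ values are arbitrary).

```python
import numpy as np
from scipy.integrate import quad

x = lambda t: np.exp(-t**2 / 2) / np.sqrt(2 * np.pi)  # standard normal density

def unitary_ft(omega):
    # (1/√(2π)) ∫ x(t) e^{-iωt} dt; the sine part vanishes since x is even
    re, _ = quad(lambda t: x(t) * np.cos(omega * t), -np.inf, np.inf)
    return re / np.sqrt(2 * np.pi)

for omega in [0.0, 1.0, 2.5]:
    print(omega, abs(unitary_ft(omega) - x(omega)))  # ~0: x maps to itself
```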
Previously, the expected value of the cosine of a random variable following the standard normal distribution was computed in Expected Value of the Cosine of a Random Variable Following the Standard Normal Distribution.
In light of the present proposition, that computation is equivalent to substituting $\omega = 1$ into the Fourier transform of the standard normal distribution's probability density function.
Indeed, when applied to a probability density function, the Fourier transform computes the expectation of $e^{-i\omega t}$.
Therefore, the Fourier transform of the standard normal distribution’s probability density function
$$ x(t) = \frac{1}{\sqrt{2\pi}}\exp{\left[-\frac{t^2}{2}\right]} $$
is
$$ X(\omega) = E[e^{-i \omega t}] $$
By Euler’s formula,
$$ e^{-i \omega t} = \cos{\omega t} - i \sin{\omega t} $$
substituting this in,
$$ X(\omega) = E[\cos{\omega t} - i \sin{\omega t}] = E[\cos{\omega t}] - i E[\sin{\omega t}] $$
Since $x(t)$ is an even function and $\sin{\omega t}$ is an odd function of $t$,
$$ E[\sin{\omega t}] = 0 $$
Thus,
$$ X(\omega) = E[\cos \omega t] $$
Using the result obtained,
$$ E[\cos t] = X(1) = \exp{\left[-\frac{1^2}{2}\right]} = \frac{1}{\sqrt{e}} $$
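This value is easy to corroborate by Monte Carlo; a minimal sketch (sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
samples = rng.standard_normal(10**7)  # draws from N(0, 1)
print(np.cos(samples).mean())  # ≈ 0.6065, matching 1/√e
print(1 / np.sqrt(np.e))       # 0.6065306597...
```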
The complex conjugate of the Fourier transform of a probability density function is generally referred to as the characteristic function:
$$ \phi(\omega) = E[e^{i \omega t}] = \overline{E[e^{-i \omega t}]} $$
In this context, it is common to use $X$ for the random variable and $t$ for the argument of the characteristic function, so it is usually written as
$$ \phi(t) = E[e^{i t X}] $$
Thus, if we express the proposition proven here in terms of probability theory, it would be as follows:
The probability density function of a normal distribution $N(\mu, \sigma^2)$ is
$$ f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\exp{\left[-\frac{(x - \mu)^2}{2 \sigma^2}\right]} $$
The characteristic function is given by
$$ \phi(t) = \exp{\left[ -\frac{\sigma^2 t^2}{2} + i\mu t \right]} $$
In particular, the probability density function of the standard normal distribution $N(0, 1)$ is
$$ f(x) = \frac{1}{\sqrt{2\pi}}\exp{\left[-\frac{x^2}{2}\right]} $$
The characteristic function is given by
$$ \phi(t) = \exp{\left[ -\frac{t^2}{2} \right]} $$
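As a final illustration, the characteristic-function form of the result can also be checked by Monte Carlo; in this sketch $\mu$, $\sigma$, the $t$ values, sample size, and seed are all arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 2.0, 0.5
X = mu + sigma * rng.standard_normal(10**6)  # draws from N(mu, sigma^2)

for t in [0.5, 1.0, 3.0]:
    estimate = np.exp(1j * t * X).mean()  # Monte Carlo estimate of E[e^{itX}]
    closed_form = np.exp(-sigma**2 * t**2 / 2 + 1j * mu * t)
    print(t, abs(estimate - closed_form))  # small; shrinks as the sample grows
```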