53 Method of Moments
53.1 Rationale
Suppose that \(X_1, X_2, \ldots, X_n {\; \stackrel{\text{iid}}{\sim}\;}F\). By the strong law of large numbers we have, as \(n \rightarrow \infty\)
\[ \frac{\sum_{i=1}^n X_i^k}{n} \stackrel{\text{a.s.}}{\longrightarrow} {\operatorname{E}}_{F}\left[X^k\right] \]
when \({\operatorname{E}}_{F}\left[X^k\right]\) exists.
This means that we can nonparametrically estimate the moments of a distribution. In a parametric setting, these same moments can also be written in terms of the parameters, so they can be used to form parameter estimates.
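Below is a minimal sketch of this idea in Python (the Exponential distribution and all variable names are illustrative choices, not from the text): the empirical first and second moments of a large sample are compared to the corresponding population moments.

```python
import numpy as np

# A minimal sketch: estimate the first two moments of an Exponential(rate = 2)
# distribution nonparametrically. The true values are E[X] = 1/2 and
# E[X^2] = Var[X] + E[X]^2 = 1/4 + 1/4 = 1/2.
rng = np.random.default_rng(0)
x = rng.exponential(scale=0.5, size=100_000)

emp_m1 = np.mean(x)       # empirical first moment, sum(x_i) / n
emp_m2 = np.mean(x**2)    # empirical second moment, sum(x_i^2) / n

print(emp_m1, emp_m2)     # both should be close to 0.5
```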
53.2 Definition
Suppose that \(X_1, X_2, \ldots, X_n {\; \stackrel{\text{iid}}{\sim}\;}F_{{\boldsymbol{\theta}}}\) where \({\boldsymbol{\theta}}\) is \(d\)-dimensional.
Calculate moments \({\operatorname{E}}\left[X^k\right]\) for \(k = 1, 2, \ldots, d'\) where \(d' \geq d\).
For each parameter \(j = 1, 2, \ldots, d\), solve for \(\theta_j\) in terms of \({\operatorname{E}}\left[X^k\right]\) for \(k = 1, 2, \ldots, d'\).
The method of moments estimator \(\tilde{\theta}_j\) is formed by taking the expression for \(\theta_j\) in terms of the moments \({\operatorname{E}}\left[X^k\right]\) and replacing each moment with its empirical counterpart \(\sum_{i=1}^n X_i^k / n\).
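As a sketch of this recipe in Python, consider a Gamma distribution with shape \(a\) and rate \(b\) (an illustrative choice, not from the text; the function name `gamma_mom` is hypothetical). Its first two moments are \({\operatorname{E}}[X] = a/b\) and \({\operatorname{E}}\left[X^2\right] = a(a+1)/b^2\), which solve to \(a = {\operatorname{E}}[X]^2 / ({\operatorname{E}}[X^2] - {\operatorname{E}}[X]^2)\) and \(b = {\operatorname{E}}[X] / ({\operatorname{E}}[X^2] - {\operatorname{E}}[X]^2)\).

```python
import numpy as np

def gamma_mom(x):
    """Method of moments estimates (a_tilde, b_tilde) for a Gamma(shape a, rate b)."""
    m1 = np.mean(x)        # empirical first moment
    m2 = np.mean(x**2)     # empirical second moment
    var = m2 - m1**2       # plug-in estimate of the variance
    return m1**2 / var, m1 / var

rng = np.random.default_rng(1)
x = rng.gamma(shape=3.0, scale=1 / 2.0, size=50_000)  # true a = 3, rate b = 2
print(gamma_mom(x))  # should be close to (3, 2)
```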
53.3 Example: Normal
For a \(\mbox{Normal}(\mu, \sigma^2)\) distribution, we have
\[ {\operatorname{E}}[X] = \mu \]
\[ {\operatorname{E}}\left[X^2\right] = \sigma^2 + \mu^2 \]
Solving for \(\mu\) and \(\sigma^2\), we have \(\mu = {\operatorname{E}}[X]\) and \(\sigma^2 = {\operatorname{E}}[X^2] - {\operatorname{E}}[X]^2\). This yields method of moments estimators
\[ \tilde{\mu} = \frac{\sum_{i=1}^n X_i}{n}, \ \ \ \tilde{\sigma}^2 = \frac{\sum_{i=1}^n X_i^2}{n} - \left[\frac{\sum_{i=1}^n X_i}{n}\right]^2. \]
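A direct translation of these two estimators into Python (the helper name `normal_mom` and the simulated data are illustrative, not from the text):

```python
import numpy as np

def normal_mom(x):
    """Method of moments estimates (mu_tilde, sigma2_tilde) for a Normal(mu, sigma^2)."""
    mu_tilde = np.mean(x)                         # sum(x_i) / n
    sigma2_tilde = np.mean(x**2) - mu_tilde**2    # sum(x_i^2)/n - (sum(x_i)/n)^2
    return mu_tilde, sigma2_tilde

rng = np.random.default_rng(2)
x = rng.normal(loc=5.0, scale=2.0, size=100_000)  # mu = 5, sigma^2 = 4
print(normal_mom(x))  # approximately (5, 4)
```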
53.4 Exploring Goodness of Fit
As mentioned above, moments can be nonparametrically estimated. At the same time, for a given parametric distribution, these moments can also be written in terms of the parameters.
For example, consider a single-parameter exponential family distribution, whose variance is a function of that one parameter (for instance, a Poisson(\(\lambda\)) random variable has variance \(\lambda\)). At the same time, we can estimate the variance nonparametrically through the empirical moments
\[ \frac{\sum_{i=1}^n X_i^2}{n} - \left[\frac{\sum_{i=1}^n X_i}{n}\right]^2. \]
In the scenario where many variables are measured, the variance implied by the MLE of the single parameter can be compared, variable by variable, to the moment estimate of the variance; systematic disagreement between the two suggests a poor fit of that distribution.
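A hedged sketch of this check in Python, using simulated Poisson data as the single-parameter exponential family (the simulation setup and variable names are assumptions for illustration, not from the text):

```python
import numpy as np

# For Poisson data, Var[X] = lambda, so the MLE-implied variance of each variable
# is its sample mean, while the moment estimate of variance is the empirical
# variance. Comparing the two across variables is an informal goodness-of-fit check.
rng = np.random.default_rng(3)
n_obs, n_vars = 500, 200
lam = rng.uniform(1.0, 10.0, size=n_vars)
X = rng.poisson(lam=lam, size=(n_obs, n_vars))    # each column is one variable

mle_var = X.mean(axis=0)                           # lambda_hat per variable
mom_var = (X**2).mean(axis=0) - X.mean(axis=0)**2  # empirical variance per variable

# Under a good Poisson fit, the ratio of the two estimates scatters around 1.
print(np.median(mom_var / mle_var))
```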