Abstract
Let $S$ represent the usual unbiased estimator of a covariance matrix $\Sigma_0$ whose elements are functions of a parameter vector $\gamma_0$: $\Sigma_0 = \Sigma(\gamma_0)$. A generalized least squares (G.L.S.) estimate, $\hat{\gamma}$, of $\gamma_0$ may be obtained by minimizing $\operatorname{tr}[\{S - \Sigma(\gamma)\}V]^2$, where $V$ is some positive definite matrix.
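A minimal numerical sketch of this fitting procedure, assuming an illustrative one-factor structure $\Sigma(\gamma) = \lambda\lambda' + \operatorname{diag}(\psi)$ and the common weight choice $V = S^{-1}$; the structure, the weight, and the names in the code (`gls_discrepancy`, `sigma_fn`) are assumptions made for illustration, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def gls_discrepancy(gamma, S, V, sigma_fn):
    """G.L.S. fit function: tr([{S - Sigma(gamma)} V]^2)."""
    R = (S - sigma_fn(gamma)) @ V
    return np.trace(R @ R)

def sigma_fn(gamma, p=4):
    """Illustrative structure: Sigma(gamma) = lam lam' + diag(psi)."""
    lam, psi = gamma[:p], gamma[p:]
    return np.outer(lam, lam) + np.diag(psi)

# Simulated data from the assumed one-factor model
rng = np.random.default_rng(0)
lam_true, psi_true = np.array([0.8, 0.7, 0.6, 0.5]), np.full(4, 0.4)
X = (rng.standard_normal((500, 1)) @ lam_true[None, :]
     + rng.standard_normal((500, 4)) * np.sqrt(psi_true))

S = np.cov(X, rowvar=False)   # usual unbiased estimator of Sigma_0
V = np.linalg.inv(S)          # one common positive definite weight choice

res = minimize(gls_discrepancy,
               x0=np.r_[np.full(4, 0.5), np.full(4, 0.5)],
               args=(S, V, sigma_fn), method="L-BFGS-B",
               bounds=[(None, None)] * 4 + [(1e-6, None)] * 4)
gamma_hat = res.x             # G.L.S. estimate of gamma_0
```

Any other positive definite $V$ could be passed in place of $S^{-1}$; the minimizer and starting values are likewise only one reasonable choice.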
Related Publications
On a Complete Class of Linear Unbiased Estimators for Randomized Factorial Experiments
Consider a factorial system of order $N = p^m$, which consists of $m$ factors each at $p$ levels. The factorial model relates the expected yield to the various treatment combina...
A Heteroskedasticity-Consistent Covariance Matrix Estimator and a Direct Test for Heteroskedasticity
This paper presents a parameter covariance matrix estimator which is consistent even when the disturbances of a linear regression model are heteroskedastic. This estimator does ...
A Simple, Positive Semi-Definite, Heteroskedasticity and Autocorrelation Consistent Covariance Matrix
This paper describes a simple method of calculating a heteroskedasticity and autocorrelation consistent covariance matrix that is positive semi-definite by construction. It also...
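A hedged sketch of the kind of estimator this entry describes: a Bartlett-kernel weighting of lagged score covariances, which is what makes the middle matrix positive semi-definite by construction. The function name, the lag default, and the assumption of OLS residuals `u` are illustrative, not taken from that paper.

```python
import numpy as np

def hac_covariance(X, u, lags=4):
    """Heteroskedasticity- and autocorrelation-consistent covariance of OLS
    coefficients, using Bartlett weights so the middle matrix is positive
    semi-definite by construction (Newey-West style)."""
    n, k = X.shape
    g = X * u[:, None]                  # score contributions x_t * u_t
    S = g.T @ g / n                     # lag-0 term
    for L in range(1, lags + 1):
        w = 1.0 - L / (lags + 1.0)      # Bartlett weight
        G = g[L:].T @ g[:-L] / n        # lag-L autocovariance of the scores
        S += w * (G + G.T)
    XtX_inv = np.linalg.inv(X.T @ X / n)
    return XtX_inv @ S @ XtX_inv / n    # sandwich estimator of Var(beta_hat)
```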
Classical F-Tests and Confidence Regions for Ridge Regression
For testing general linear hypotheses in multiple regression models, it is shown that non-stochastically shrunken ridge estimators yield the same central F-ratios and t-statisti...
Publication Info
- Year: 1974
- Type: article
- Volume: 8
- Issue: 1
- Pages: 1-24
- Citations: 336
- Access: Closed