Shrinkage estimation in parametric families of distributions based on divergence measures

Dick, Artur; Kamps, Udo (Thesis advisor); Kateri, Maria (Thesis advisor)

Aachen (2018)
Dissertation / PhD Thesis

Dissertation, RWTH Aachen University, 2018

Abstract

In this doctoral thesis, we establish a method which aims to improve the maximum likelihood estimator (MLE) in situations where a pre-estimate of the underlying parameter of a probability distribution is available. Especially when the sample size is small, the MLE may have high variance and lead to poor estimates. The main idea is to incorporate the pre-estimate in order to mitigate this drawback. This procedure offers an alternative to the classical Bayesian paradigm for constructing an estimator that incorporates prior knowledge. The construction is based on a minimum divergence approach: within the set of all estimates having the same distance to both the ML estimate and the pre-estimate, we look for the parameter that best approximates the two. The resulting estimator is called the Equal Distance Estimator (EDE). Such a procedure defines a shrinkage estimator, since, viewed geometrically, the ML estimate is dragged closer to the pre-estimate. Explicit forms of the EDE are given for different divergences and a wide class of probability distributions. Moreover, conditions for the existence and uniqueness of the EDE are established. Many properties of the MLE carry over to the EDE, the most important being invariance with respect to parametrizations. Given multinomial data, a slight extension of the EDE leads to a new approach to dealing with sparsity. As a performance criterion, we use Pitman's measure of closeness to compare the EDE with the MLE. Finally, we introduce an update procedure for the EDE when new samples from a multivariate normal distribution become available.
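One plausible formalization of the construction described above, consistent with the abstract (the thesis itself gives the precise definition and the existence and uniqueness conditions): writing D for the chosen divergence, \vartheta_0 for the pre-estimate and \hat{\vartheta}_{ML} for the ML estimate,

    \[
    \hat{\vartheta}_{\mathrm{EDE}} \in
    \operatorname*{arg\,min}_{\vartheta \,:\, D(\vartheta, \hat{\vartheta}_{\mathrm{ML}})
    \,=\, D(\vartheta, \vartheta_0)} D(\vartheta, \hat{\vartheta}_{\mathrm{ML}}),
    \]

i.e. among all parameters equidistant from the ML estimate and the pre-estimate, the one closest to both.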
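The following minimal Python sketch illustrates the equal-distance idea for a single normal mean with known variance, where the Kullback-Leibler divergence between N(a, sigma^2) and N(b, sigma^2) reduces to (a - b)^2 / (2 sigma^2); the function names (kl_normal, ede) are illustrative and not taken from the thesis. In this one-parameter case the equidistance set is the single root of the distance difference between the two estimates.

    from scipy.optimize import brentq

    def kl_normal(a, b, sigma=1.0):
        # KL divergence between N(a, sigma^2) and N(b, sigma^2)
        return (a - b) ** 2 / (2 * sigma ** 2)

    def ede(theta_ml, theta_pre, divergence=kl_normal):
        # Root of D(theta, theta_ml) - D(theta, theta_pre) between the two
        # estimates; assumes theta_ml != theta_pre so the bracket contains
        # a sign change.
        g = lambda t: divergence(t, theta_ml) - divergence(t, theta_pre)
        lo, hi = sorted((theta_ml, theta_pre))
        return brentq(g, lo, hi)

    print(ede(2.0, 0.0))  # 1.0: the ML estimate is pulled halfway toward the pre-estimate

The midpoint behaviour is specific to this symmetric example; in general, the equal-distance point shrinks the ML estimate toward the pre-estimate by an amount depending on the divergence and the family.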