PHI Learning. Pages: 808. Format: Paperback. Published: 2014 (30.08.2014). Language: English
Intended for postgraduate students of statistics, this sequel to Statistical Inference: Testing of Hypotheses introduces the problem of estimation in the light of the foundations laid down by Sir R. A. Fisher (1922), and follows both classical and Bayesian approaches to the problem.
The book starts by discussing increasing levels of data summarization and connects this with sufficient and minimal sufficient statistics. It provides a complete account of theorems and results on uniformly minimum variance unbiased estimators (UMVUE), including the famous Rao–Blackwell theorem, which improves an estimator by conditioning on a sufficient statistic, and the Lehmann–Scheffé theorem, which yields a UMVUE. It discusses the Cramér–Rao and Bhattacharyya variance lower bounds for regular models, introducing Fisher information, and the Chapman–Robbins–Kiefer variance lower bound for Pitman models. The book also introduces different methods of estimation, including the method of maximum likelihood, and discusses large sample properties such as consistency, consistent asymptotic normality (CAN), and best asymptotic normality (BAN) of different estimators.
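For orientation, the Cramér–Rao bound mentioned above can be stated as follows (in standard notation, which may differ from the book's own). For a regular model with density $f(x;\theta)$, an i.i.d. sample of size $n$, and an unbiased estimator $T$ of $\theta$,
\[
\operatorname{Var}_\theta(T) \;\ge\; \frac{1}{n\,I(\theta)},
\qquad
I(\theta) \;=\; \mathbb{E}_\theta\!\left[\left(\frac{\partial}{\partial\theta}\,\log f(X;\theta)\right)^{\!2}\right],
\]
where $I(\theta)$ is the Fisher information per observation; an unbiased estimator attaining this bound is automatically a UMVUE.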
Separate chapters are devoted to finding the Pitman estimator among equivariant estimators for location and scale models, by exploiting the symmetry structure present in the model, and to Bayes, empirical Bayes, and hierarchical Bayes estimators in different statistical models. A systematic exposition of the theory and results in different statistical situations and models is included. Each chapter concludes with solved examples in a number of statistical models, reinforcing the theorems and results presented.
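As a brief sketch of the equivariance idea (a standard result, not quoted from the book): for a location model with density $f(x-\theta)$ under squared error loss, the Pitman estimator is the minimum risk equivariant estimator
\[
\hat{\theta}_P(x_1,\dots,x_n) \;=\;
\frac{\displaystyle\int_{-\infty}^{\infty} \theta \prod_{i=1}^{n} f(x_i-\theta)\,d\theta}
     {\displaystyle\int_{-\infty}^{\infty} \prod_{i=1}^{n} f(x_i-\theta)\,d\theta},
\]
which coincides with the Bayes estimator (posterior mean) under the improper uniform prior on $\theta$, linking the equivariant and Bayesian treatments.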
Key features:
• Provides clarifications of theorems and related results.
• Includes numerous solved examples to improve analytical insight on the subject.
• Incorporates chapter-end exercises to test students' comprehension of the subject.
• Discusses detailed theory on data summarization, unbiased estimation with large sample properties, and Bayes and minimax estimation in separate chapters.