New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
Pardo, Leandro (Author) · MDPI AG · Paperback
Origin: U.S.A.
(Import costs included in the price)
It will be shipped from our warehouse between Monday, July 01 and Thursday, July 11.
You will receive it anywhere in the United Kingdom between 1 and 3 business days after shipment.
Synopsis
This book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from both theoretical and applied points of view, for a range of statistical problems with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, as well as Wald's statistic, the likelihood ratio statistic and Rao's score statistic, share several optimal asymptotic properties but are highly non-robust under model misspecification or in the presence of outlying observations. It is well known that a small deviation from the assumed model can have a drastic effect on the performance of these classical tests. Specifically, this book presents a robust version of the classical Wald test, for testing simple and composite null hypotheses in general parametric models, based on minimum divergence estimators.
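To illustrate the robustness theme the synopsis describes, here is a minimal sketch (not taken from the book) of one well-known minimum divergence estimator: the minimum density power divergence estimator of Basu et al. (1998), applied to the mean of a normal model with unit variance. The function name, the tuning value `alpha=0.5`, and the grid-search minimization are all illustrative choices, not the book's method; the point is only that a single gross outlier drags the maximum likelihood estimate (the sample mean) while the divergence-based estimate stays near the bulk of the data.

```python
import numpy as np

def mdpde_normal_mean(x, alpha=0.5, grid=None):
    # Minimum density power divergence estimator for the mean of N(mu, 1).
    # alpha > 0 trades a little efficiency for robustness; alpha -> 0
    # recovers the (non-robust) maximum likelihood estimator.
    if grid is None:
        grid = np.linspace(x.min(), x.max(), 2001)
    # Closed form of the integral term  ∫ f_mu(x)^(1+alpha) dx  for N(mu, 1);
    # it does not depend on mu.
    const = (2 * np.pi) ** (-alpha / 2) / np.sqrt(1 + alpha)
    best_mu, best_obj = grid[0], np.inf
    for mu in grid:
        f = np.exp(-(x - mu) ** 2 / 2) / np.sqrt(2 * np.pi)
        # Empirical density power divergence objective (up to a constant).
        obj = const - (1 + alpha) / alpha * np.mean(f ** alpha)
        if obj < best_obj:
            best_mu, best_obj = mu, obj
    return best_mu

rng = np.random.default_rng(0)
# 99 clean observations from N(0, 1) plus one gross outlier at 10.
x = np.concatenate([rng.normal(0.0, 1.0, 99), [10.0]])
print(np.mean(x))             # sample mean (MLE), pulled toward the outlier
print(mdpde_normal_mean(x))   # robust divergence-based estimate, near 0
```

The outlier shifts the sample mean by roughly +0.1, while the divergence-based estimate effectively down-weights it; robust Wald-type tests of the kind the book develops replace the maximum likelihood estimator inside the test statistic with such an estimator.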
All books in our catalog are Original.
The book is written in English.
The binding of this edition is Paperback.