Maximum Penalized Likelihood Estimation (eBook)

Volume II: Regression
eBook Download: PDF
2009
XX, 572 pages
Springer New York (publisher)
978-0-387-68902-9 (ISBN)


Maximum Penalized Likelihood Estimation – Paul P. Eggermont, Vincent N. LaRiccia
128.39 € incl. VAT

Unique blend of asymptotic theory and small sample practice through simulation experiments and data analysis.

Novel reproducing kernel Hilbert space methods for the analysis of smoothing splines and local polynomials, leading to uniform error bounds and honest confidence bands for the mean function using smoothing splines.

Exhaustive exposition of algorithms, including the Kalman filter, for the computation of smoothing splines of arbitrary order.


This is the second volume of a text on the theory and practice of maximum penalized likelihood estimation. It is intended for graduate students in statistics, operations research, and applied mathematics, as well as researchers and practitioners in the field. The present volume was supposed to have a short chapter on nonparametric regression but was intended to deal mainly with inverse problems. However, the chapter on nonparametric regression kept growing to the point where it is now the only topic covered. Perhaps there will be a Volume III. It might even deal with inverse problems. But for now we are happy to have finished Volume II.

The emphasis in this volume is on smoothing splines of arbitrary order, but other estimators (kernels, local and global polynomials) pass review as well. We study smoothing splines and local polynomials in the context of reproducing kernel Hilbert spaces. The connection between smoothing splines and reproducing kernels is of course well known. The new twist is that letting the inner product depend on the smoothing parameter opens up new possibilities: it leads to asymptotically equivalent reproducing kernel estimators (without qualifications) and thence, via uniform error bounds for kernel estimators, to uniform error bounds for smoothing splines and, via strong approximations, to confidence bands for the unknown regression function. It came as somewhat of a surprise that reproducing kernel Hilbert space ideas also proved useful in the study of local polynomial estimators.
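The penalized-likelihood idea the preface refers to can be sketched in a few lines: for Gaussian noise, maximizing the penalized likelihood reduces to penalized least squares, trading fidelity to the data against a roughness penalty controlled by a smoothing parameter. The following is a minimal illustrative sketch only — a discrete (Whittaker-style) analogue of a smoothing spline on a uniform grid with synthetic data — and is not one of the book's algorithms; the function name and the choice `lam=50.0` are ours.

```python
# Minimal sketch of penalized least squares smoothing on a uniform grid:
# minimize ||y - f||^2 + lam * ||D f||^2, with D the order-th difference
# operator.  This discretizes the roughness penalty of a smoothing spline;
# the data below are synthetic, and the parameter values are illustrative.
import numpy as np

def discrete_smoother(y, lam, order=2):
    """Solve (I + lam * D'D) f = y for the penalized fit f."""
    n = len(y)
    # Rows of D are order-th finite differences: shape (n - order, n).
    D = np.diff(np.eye(n), n=order, axis=0)
    return np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
truth = np.sin(2 * np.pi * x)
y = truth + 0.3 * rng.standard_normal(x.size)
f_hat = discrete_smoother(y, lam=50.0)
```

Larger `lam` forces the fit toward a polynomial of degree `order - 1`; `lam -> 0` reproduces the raw data. Choosing `lam` well is the smoothing-parameter-selection problem treated at length in the book.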

Preface
Contents
Contents of Volume I
Notations, Acronyms and Conventions
Nonparametric Regression
1. What and why?
2. Maximum penalized likelihood estimation
3. Measuring the accuracy and convergence rates
4. Smoothing splines and reproducing kernels
5. The local error in local polynomial estimation
6. Computation and the Bayesian view of splines
7. Smoothing parameter selection
8. Strong approximation and confidence bands
9. Additional notes and comments
Smoothing Splines
1. Introduction
2. Reproducing kernel Hilbert spaces
3. Existence and uniqueness of the smoothing spline
4. Mean integrated squared error
5. Boundary corrections
6. Relaxed boundary splines
7. Existence, uniqueness, and rates
8. Partially linear models
9. Estimating derivatives
10. Additional notes and comments
Kernel Estimators
1. Introduction
2. Mean integrated squared error
3. Boundary kernels
4. Asymptotic boundary behavior
5. Uniform error bounds for kernel estimators
6. Random designs and smoothing parameters
7. Uniform error bounds for smoothing splines
8. Additional notes and comments
Sieves
1. Introduction
2. Polynomials
3. Estimating derivatives
4. Trigonometric polynomials
5. Natural splines
6. Piecewise polynomials and locally adaptive designs
7. Additional notes and comments
Local Polynomial Estimators
1. Introduction
2. Pointwise versus local error
3. Decoupling the two sources of randomness
4. The local bias and variance after decoupling
5. Expected pointwise and global error bounds
6. The asymptotic behavior of the error
7. Refined asymptotic behavior of the bias
8. Uniform error bounds for local polynomials
9. Estimating derivatives
10. Nadaraya-Watson estimators
11. Additional notes and comments
Other Nonparametric Regression Problems
1. Introduction
2. Functions of bounded variation
3. Total-variation roughness penalization
4. Least-absolute-deviations splines: Generalities
5. Least-absolute-deviations splines: Error bounds
6. Reproducing kernel Hilbert space tricks
7. Heteroscedastic errors and binary regression
8. Additional notes and comments
Smoothing Parameter Selection
1. Notions of optimality
2. Mallows’ estimator and zero-trace estimators
3. Leave-one-out estimators and cross-validation
4. Coordinate-free cross-validation (GCV)
5. Derivatives and smooth estimation
6. Akaike’s optimality criterion
7. Heterogeneity
8. Local polynomials
9. Pointwise versus local error, again
10. Additional notes and comments
Computing Nonparametric Estimators
1. Introduction
2. Cubic splines
3. Cubic smoothing splines
4. Relaxed boundary cubic splines
5. Higher-order smoothing splines
6. Other spline estimators
7. Active constraint set methods
8. Polynomials and local polynomials
9. Additional notes and comments
Kalman Filtering for Spline Smoothing
1. And now, something completely different
2. A simple example
3. Stochastic processes and reproducing kernels
4. Autoregressive models
5. State-space models
6. Kalman filtering for state-space models
7. Cholesky factorization via the Kalman filter
8. Diffuse initial states
9. Spline smoothing with the Kalman filter
10. Notes and comments
Equivalent Kernels for Smoothing Splines
1. Random designs
2. The reproducing kernels
3. Reproducing kernel density estimation
4. L2 error bounds
5. Equivalent kernels and uniform error bounds
6. The reproducing kernels are convolution-like
7. Convolution-like operators on Lp spaces
8. Boundary behavior and interior equivalence
9. The equivalent Nadaraya-Watson estimator
10. Additional notes and comments
Strong Approximation and Confidence Bands
1. Introduction
2. Normal approximation of iid noise
3. Confidence bands for smoothing splines
4. Normal approximation in the general case
5. Asymptotic distribution theory for uniform designs
6. Proofs of the various steps
7. Asymptotic 100% confidence bands
8. Additional notes and comments
Nonparametric Regression in Action
1. Introduction
2. Smoothing splines
3. Local polynomials
4. Smoothing splines versus local polynomials
5. Confidence bands
6. The Wood Thrush Data Set
7. The Wastewater Data Set
8. Additional notes and comments
Bernstein’s Inequality
The TVDUAL implementation
Solutions to Some Critical Exercises
1. Solutions to Chapter 13: Smoothing Splines
2. Solutions to Chapter 14: Kernel Estimators
3. Solutions to Chapter 17: Other Estimators
4. Solutions to Chapter 18: Smoothing Parameters
5. Solutions to Chapter 19: Computing
6. Solutions to Chapter 20: Kalman Filtering
7. Solutions to Chapter 21: Equivalent Kernels
References
Author Index
Subject Index

Published (per publisher): June 2, 2009
Series: Springer Series in Statistics
Additional info: XX, 572 p.
Place of publication: New York
Language: English
Keywords: Confidence bands • Data analysis • Estimator • Expectation–maximization algorithm • Kalman filter for smoothing splines • Likelihood • Local polynomials • Nonparametric regression • Reproducing kernel Hilbert spaces • Smoothing splines • Uniform error bounds
ISBN-10 0-387-68902-8 / 0387689028
ISBN-13 978-0-387-68902-9 / 9780387689029
PDF (watermarked)
Size: 10.0 MB

DRM: digital watermark
This eBook contains a digital watermark and is thereby personalized for you. If the eBook is improperly passed on to third parties, it can be traced back to its source.

File format: PDF (Portable Document Format)
With its fixed page layout, PDF is especially suited to reference books with columns, tables, and figures. A PDF can be displayed on almost all devices, but is only of limited use on small displays (smartphone, eReader).

System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You will need a PDF viewer, e.g. Adobe Reader or Adobe Digital Editions.
eReader: This eBook can be read with (almost) all eBook readers. It is not compatible with the Amazon Kindle, however.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You will need a PDF viewer, e.g. the free Adobe Digital Editions app.

Additional feature: online reading
In addition to downloading it, you can also read this eBook online in your web browser.

Buying eBooks from abroad
For tax law reasons we can sell eBooks only within Germany and Switzerland. Regrettably, we cannot fulfill eBook orders from other countries.
