Semester.ly

Johns Hopkins University | PH.140.732

Statistical Theory II

4.0 credits

Average Course Rating: none yet

Introduces modern statistical theory. Sets out principles of inference based on decision theory and likelihood (evidence) theory; derives the likelihood function from design and model assumptions; derives the complete class theorem relating Bayes and admissible estimators; derives minimal sufficient statistics as a necessary and sufficient reduction of the data for accurate inference in parametric models; derives the minimal sufficient statistics of exponential families; introduces maximum likelihood and unbiased estimators; defines information and derives the Cramér-Rao variance bounds in parametric models; and introduces empirical Bayes (shrinkage) estimators, comparing them to maximum likelihood in small-sample problems.

No Course Evaluations found

Lecture Sections

(01)
Instructor: C. Frangakis
Time: 10:30 - 11:50
Location: not listed
Seats: 499 open / 500 total