Table of Contents

Review of Probability and Distribution Theory
1.1 Introduction
1.2 Univariate Discrete Distributions
1.2.1 The Bernoulli and Binomial Distributions
1.2.2 The Poisson Distribution
1.2.3 Binomial Distribution: Normal Approximation
1.3 Univariate Continuous Distributions
1.4 Multivariate Probability Distributions
1.4.1 The Multinomial Distribution
1.4.2 The Dirichlet Distribution
1.4.3 The d-Dimensional Uniform Distribution
1.4.4 The Multivariate Normal Distribution
1.4.6 The Wishart and Inverse Wishart Distributions
1.4.7 The Multivariate-t Distribution
1.5 Distributions with Constrained Sample Space
1.6 Iterated Expectations
2.1 Introduction
2.2 Functions of a Single Random Variable
2.2.1 Discrete Random Variables
2.2.2 Continuous Random Variables
2.2.3 Approximating the Mean and Variance
2.2.4 Delta Method
2.3.1 Linear Transformations
2.3.2 Approximating the Mean and Covariance Matrix
Methods of Inference
An Introduction to Likelihood Inference
3.1 Introduction
3.2 The Likelihood Function
3.3 The Maximum Likelihood Estimator
3.4 Likelihood Inference in a Gaussian Model
3.5 Fisher’s Information Measure
3.5.1 Single Parameter Case
3.5.2 Alternative Representation of Information
3.5.3 Mean and Variance of the Score Function
3.5.4 Multiparameter Case
3.5.5 Cramér–Rao Lower Bound
3.6 Sufficiency
3.7 Asymptotic Properties: Single Parameter Models
3.7.2 Consistency
3.7.3 Asymptotic Normality and Efficiency
3.8 Asymptotic Properties: Multiparameter Models
3.9.1 Illustration of Functional Invariance
3.9.2 Invariance in a Single Parameter Model
3.9.3 Invariance in a Multiparameter Model
Further Topics in Likelihood Inference
4.1 Introduction
4.2 Computation of Maximum Likelihood Estimates
4.3 Evaluation of Hypotheses
4.3.1 Likelihood Ratio Tests
4.3.2 Confidence Regions
4.3.3 Wald’s Test
4.3.4 Score Test
4.4 Nuisance Parameters
4.4.1 Loss of Efficiency Due to Nuisance Parameters
4.4.2 Marginal Likelihoods
4.4.3 Profile Likelihoods
4.5 Analysis of a Multinomial Distribution
4.5.1 Amount of Information per Observation
4.6 Analysis of Linear Logistic Models
4.6.1 The Logistic Distribution
4.6.2 Likelihood Function under Bernoulli Sampling
4.6.3 Mixed Effects Linear Logistic Model
An Introduction to Bayesian Inference
5.1 Introduction
5.2 Bayes Theorem: Discrete Case
5.3 Bayes Theorem: Continuous Case
5.4 Posterior Distributions
5.5 Bayesian Updating
5.6 Features of Posterior Distributions
5.6.1 Posterior Probabilities
5.6.2 Posterior Quantiles
5.6.3 Posterior Modes
5.6.4 Posterior Mean Vector and Covariance Matrix
Bayesian Analysis of Linear Models
6.1 Introduction
6.2 The Linear Regression Model
6.2.1 Inference under Uniform Improper Priors
6.2.2 Inference under Conjugate Priors
6.3.2 Joint and Conditional Posterior Distributions
6.3.3 Marginal Distribution of Variance Components
7.4.2 Entropy of a Discrete Distribution
7.4.3 Entropy of a Joint and Conditional Distribution
7.4.4 Entropy of a Continuous Distribution
7.4.5 Information about a Parameter
7.4.6 Fisher’s Information Revisited
7.5.3 Maximum Entropy Prior Distributions
7.5.4 Reference Prior Distributions
Bayesian Assessment of Hypotheses and Models
8.1 Introduction
8.2 Bayes Factors
8.2.1 Definition
8.2.2 Interpretation
8.2.3 The Bayes Factor and Hypothesis Testing
8.2.4 Influence of the Prior Distribution
8.2.5 Nested Models
8.2.6 Approximations to the Bayes Factor
8.2.7 Partial and Intrinsic Bayes Factors
8.4 Goodness of Fit and Model Complexity
8.5 Goodness of Fit and Predictive Ability of a Model
8.5.1 Analysis of Residuals
8.5.2 Predictive Ability and Predictive Cross-Validation
8.6 Bayesian Model Averaging
8.6.1 General
8.6.2 Definitions
8.6.3 Predictive Ability of BMA
Approximate Inference Via the EM Algorithm
9.1 Introduction
9.2 Complete and Incomplete Data
9.3 The EM Algorithm
9.3.1 Form of the Algorithm
9.3.2 Derivation
9.5 The Missing Information Principle
9.5.1 Complete, Observed and Missing Information
9.5.2 Rate of Convergence of the EM Algorithm
9.6 EM Theory for Exponential Families
9.7 Standard Errors and Posterior Standard Deviations
9.7.1 The Method of Louis
9.7.2 Supplemented EM Algorithm (SEM)
9.7.3 The Method of Oakes
9.8 Examples
Markov Chain Monte Carlo Methods
An Overview of Discrete Markov Chains
10.1 Introduction
10.2 Definitions
10.3 State of the System after n-Steps
10.4 Long-Term Behavior of the Markov Chain
10.5 Stationary Distribution
10.6 Aperiodicity and Irreducibility
10.7 Reversible Markov Chains
Markov Chain Monte Carlo
11.1 Introduction
11.2 Preliminaries
11.2.1 Notation
11.2.2 Transition Kernels
11.2.3 Varying Dimensionality
11.3 An Overview of Markov Chain Monte Carlo
11.4 The Metropolis–Hastings Algorithm
11.4.1 An Informal Derivation
11.4.2 A More Formal Derivation
11.5 The Gibbs Sampler
11.5.1 Fully Conditional Posterior Distributions
11.5.2 The Gibbs Sampling Algorithm
11.6 Langevin–Hastings Algorithm
11.7 Reversible Jump MCMC
11.7.1 The Invariant Distribution
11.7.2 Generating the Proposal
11.7.3 Specifying the Reversibility Condition
11.7.4 Derivation of the Acceptance Probability
11.7.5 Deterministic Proposals
11.7.6 Generating Proposals via the Identity Mapping
11.8 Data Augmentation
Implementation and Analysis of MCMC Samples
12.1 Introduction
12.2 A Single Long Chain or Several Short Chains?
12.3 Convergence Issues
12.3.1 Effect of Posterior Correlation on Convergence
12.3.2 Monitoring Convergence
12.4 Inferences from the MCMC Output
12.4.1 Estimators of Posterior Quantities
12.4.2 Monte Carlo Variance
12.5 Sensitivity Analysis
Applications in Quantitative Genetics
Gaussian and Thick-Tailed Linear Models
13.1 Introduction
13.2 The Univariate Linear Additive Genetic Model
13.2.1 A Gibbs Sampling Algorithm
13.3 Additive Genetic Model with Maternal Effects
13.3.1 Fully Conditional Posterior Distributions
13.4 The Multivariate Linear Additive Genetic Model
13.4.1 Fully Conditional Posterior Distributions
13.5 A Blocked Gibbs Sampler for Gaussian Linear Models
13.6 Linear Models with Thick-Tailed Distributions
13.6.1 Motivation
13.6.2 A Student-t Mixed Effects Model
13.6.3 Model with Clustered Random Effects
13.7 Parameterizations and the Gibbs Sampler
Threshold Models for Categorical Responses
14.1 Introduction
14.2 Analysis of a Single Polychotomous Trait
14.2.1 Sampling Model
14.2.2 Prior Distribution and Joint Posterior Density
14.2.3 Fully Conditional Posterior Distributions
14.2.4 The Gibbs Sampler
14.3.1 Sampling Model
14.3.2 Prior Distribution and Joint Posterior Density
14.3.3 Fully Conditional Posterior Distributions
14.3.4 The Gibbs Sampler
14.3.5 Implementation with Binary Traits
Bayesian Analysis of Longitudinal Data
15.1 Introduction
15.2 Hierarchical or Multistage Models
15.2.1 First Stage
15.2.2 Second Stage
15.2.3 Third Stage
15.2.4 Joint Posterior Distribution
15.3 Two-Step Approximate Bayesian Analysis
15.3.3 Special Case: Linear First Stage
15.4 Computation via Markov Chain Monte Carlo
15.5 Analysis with Thick-Tailed Distributions
16.1 Introduction
Likelihood, Bayesian and MCMC Methods in Quantitative Genetics
Published by: fukunaga_smc on Jan 02, 2011
Copyright: Attribution Non-commercial