TITLE: Introduction to Bayesian Statistics with MCMC

TAUGHT BY: Agustín Blasco

LANGUAGE: Taught in Spanish, with slides and lecture notes in English

DATES: 7 to 10 May 2018

VENUE: Animal Science Department building, Universidad Politécnica de Valencia. Room 1 (ground floor)

SCHEDULE: 9:00 to 14:00 and 15:30 to 18:00

REGISTRATION FEE: 50 euros. Registration link:

https://www.cfp.upv.es/formacion-permanente/cursos/introduccion-a-la-estadistica-bayesiana-con-mcmc_idiomaes-cid60986.html

LECTURE NOTES:

        Chapters of the book Bayesian Data Analysis for Animal Scientists (Agustín Blasco, Springer, 2017) will be provided as PDF files.

COMPUTER PROGRAMS:

        These programs have been developed specifically for the course. They are not yet publicly available, but they will be made available to course participants.

POWERPOINT:

        They will be available at the start of the course.

 

RECOMMENDED ARTICLES:

BLASCO A. 2001. The Bayesian controversy in animal breeding. J. Anim. Sci. 79:2023-2046.

BLASCO A. 2005. The use of Bayesian statistics in meat quality analyses. Meat Sci. 69:115-122.

HERNÁNDEZ P. et al. 2005. A Bayesian approach to the effect of selection for growth rate on sensory meat quality of rabbit. Meat Sci. 69:123-127.

PEIRÓ R. et al. 2007. Identification of single-nucleotide polymorphism in the progesterone receptor gene and its association with reproductive traits in rabbits. Genetics 180:1699-1705.

ZOMEÑO C., HERNÁNDEZ P., BLASCO A. 2013. Divergent selection for intramuscular fat content in rabbits. II. Correlated responses in meat characteristics. J. Anim. Sci. 91:4532-4539.

MARTÍNEZ-ÁLVARO M., HERNÁNDEZ P., BLASCO A. 2016. Divergent selection on intramuscular fat in rabbits: Responses to selection and genetic parameters. J. Anim. Sci. 94:4993-5003.

 

            PROGRAM

 

MONDAY

  1. Do we understand classical statistics?

  2. The Bayesian choice

 

TUESDAY

  3. Posterior distributions

  4. MCMC

  5. The baby model 

  Lab: Software for MCMC analyses

 

WEDNESDAY

  6. The Linear model. I. The "fixed effects" model

  7. The Linear model. II. The "mixed" model

  8. An overview of the possibilities of Bayesian inference + MCMC

  Lab: Software for MCMC analyses

 

THURSDAY

  9. Prior information

 10. Model choice

  Lab: Software for MCMC analyses

 

 

 

                        DETAILED PROGRAM

 

1. Do we understand classical statistics?

1.1. Historical introduction

1.2. Test of hypotheses

1.2.1. The procedure

1.2.2. Common misinterpretations

1.3. Standard errors and Confidence intervals

1.3.1. Definition of standard error and confidence interval

1.3.2. Common misinterpretations

1.4. Bias and Risk of an estimator

1.4.1. Unbiased estimators

1.4.2. Common misinterpretations

1.5. Fixed and random effects

            1.5.1. Definition of “fixed” and “random” effects

1.5.2. Bias, variance and Risk of an estimator for fixed or random effects

1.5.3. Common misinterpretations

1.6. Likelihood

1.6.1. Definition of likelihood

1.6.2. The method of maximum likelihood

1.6.3. Common misinterpretations
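
As a small numerical companion to 1.6.2, a sketch in Python/NumPy of maximum likelihood estimation for a Normal sample (the data and variable names are illustrative, not part of the course software): for a Normal likelihood the ML estimates are the sample mean and the sample variance with divisor n.

import numpy as np

# Illustrative data: 100 observations from a Normal distribution
rng = np.random.default_rng(0)
y = rng.normal(loc=2.0, scale=1.5, size=100)

# For a Normal likelihood the maximum likelihood estimates have closed forms:
# the sample mean, and the sample variance with divisor n (not n - 1)
mu_ml = y.mean()
var_ml = ((y - mu_ml) ** 2).mean()

print("ML estimate of the mean:    ", mu_ml)
print("ML estimate of the variance:", var_ml)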

 

2. The Bayesian choice

2.1. Bayesian inference

            2.1.1. The foundations of Bayesian inference

            2.1.2. Bayes theorem

            2.1.3. Prior information

2.2. Features of Bayesian inference

2.2.1. Point estimates: Mean, median, mode

2.2.2. Credibility intervals

2.2.3. Marginalisation

2.3. Test of hypotheses

2.3.1. Model choice

2.3.2. Bayes factors

2.3.3. Model averaging

2.4. Common misinterpretations

2.5. Bayesian Inference in practice

2.6. Advantages of Bayesian inference
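
For reference while reading 2.1.2, Bayes theorem written for a parameter vector theta and data y (standard notation, not necessarily that of the lecture notes):

\[
p(\theta \mid \mathbf{y}) \;=\; \frac{p(\mathbf{y} \mid \theta)\, p(\theta)}{p(\mathbf{y})} \;\propto\; p(\mathbf{y} \mid \theta)\, p(\theta),
\]

that is, posterior is proportional to likelihood times prior; the denominator p(y) does not depend on theta and only rescales the posterior so that it integrates to one.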

 

3. Posterior distributions

3.1. Notation

3.2. Cumulative distribution

3.3. Density distribution

            3.3.1. Definition

            3.3.2. Transformed densities

3.4. Features of a density distribution

3.4.1. Mean

3.4.2. Median

3.4.3. Mode

3.4.4. Credibility intervals

3.5. Conditional distribution

            3.5.1. Definition

            3.5.2. Bayes Theorem

            3.5.3. Conditional distribution of the sample of a Normal distribution

            3.5.4. Conditional distribution of the variance of a Normal distribution

            3.5.5. Conditional distribution of the mean of a Normal distribution

3.6. Marginal distribution

            3.6.1. Definition

            3.6.2. Marginal distribution of the variance of a normal distribution

            3.6.3. Marginal distribution of the mean of a normal distribution
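
As a pointer for 3.5.4 and 3.5.5 (used again in the Gibbs sampling sketch after chapter 4), the two conditional posterior distributions for a Normal sample, written in standard notation and assuming flat priors:

\[
\mu \mid \sigma^2, \mathbf{y} \;\sim\; N\!\left(\bar{y}, \tfrac{\sigma^2}{n}\right),
\qquad
p(\sigma^2 \mid \mu, \mathbf{y}) \;\propto\; (\sigma^2)^{-n/2}\exp\!\left(-\frac{\sum_{i=1}^{n}(y_i-\mu)^2}{2\sigma^2}\right),
\]

the latter being an inverted-gamma (scaled inverted chi-square) distribution.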

 

4. MCMC

4.1. Samples of Marginal Posterior distributions

            4.1.1. Taking samples of Marginal Posterior distributions

4.1.2. Making inferences from samples of Marginal Posterior distributions

4.2. Gibbs sampling

4.2.1. How it works

4.2.2. Why it works

4.2.3. When it works

4.2.4. Gibbs sampling features

4.2.5. Example

4.3. Other MCMC methods

4.3.1. Acceptance-Rejection

4.3.2. Metropolis
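
To make 4.2.1 ("How it works") concrete, a minimal Gibbs sampler for the mean and variance of a Normal sample with flat priors, written as a sketch in Python/NumPy. It alternates draws from the two conditional posteriors given after chapter 3; it is only an illustration, not the course software (which will be distributed to participants).

import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(loc=2.0, scale=1.5, size=100)   # illustrative data
n = len(y)
ybar = y.mean()

n_iter, burn_in = 20000, 2000
mu_chain = np.empty(n_iter)
var_chain = np.empty(n_iter)

mu, var = ybar, y.var()                        # arbitrary starting values

for i in range(n_iter):
    # Draw the mean from its conditional posterior: Normal(ybar, var / n)
    mu = rng.normal(ybar, np.sqrt(var / n))
    # Draw the variance from its conditional posterior: an inverted gamma
    # with shape n/2 - 1 and scale sum((y - mu)^2) / 2 (flat prior on the variance)
    ss = np.sum((y - mu) ** 2)
    var = 1.0 / rng.gamma(n / 2.0 - 1.0, 2.0 / ss)
    mu_chain[i] = mu
    var_chain[i] = var

# After discarding the burn-in, the chains are samples of the marginal posteriors
mu_samples = mu_chain[burn_in:]
var_samples = var_chain[burn_in:]
print("Posterior mean of mu:          ", mu_samples.mean())
print("Posterior mean of the variance:", var_samples.mean())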

 

5. The “baby” model

5.1. The model

5.2. Analytical solutions

5.2.1. Marginal posterior distribution of the mean and variance

5.2.2. Joint posterior distribution of the mean and variance

5.2.3. Inferences

5.3. Working with MCMC

5.3.1. Using Flat priors

5.3.2. Using vague informative priors

5.3.3. Common misinterpretations
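
Related to 5.2.3 and 5.3.1, a sketch of how inferences are read directly from the MCMC samples (posterior mean, median, credibility interval, and probabilities of the type P(mu > k)); the chain below is only a stand-in for the one produced by the Gibbs sampler sketched after chapter 4.

import numpy as np

def summarize(samples, prob=0.95):
    """Posterior mean, median and an equal-tailed credibility interval."""
    alpha = (1.0 - prob) / 2.0
    low, high = np.quantile(samples, [alpha, 1.0 - alpha])
    return samples.mean(), np.median(samples), (low, high)

# Stand-in for the Gibbs chain of the mean from the previous sketch
rng = np.random.default_rng(2)
mu_samples = rng.normal(2.0, 0.15, size=18000)

post_mean, post_median, ci95 = summarize(mu_samples)
print("Posterior mean:           ", post_mean)
print("Posterior median:         ", post_median)
print("95% credibility interval: ", ci95)

# Probability that the mean exceeds a relevant value, e.g. 1.8
print("P(mu > 1.8 | data):       ", np.mean(mu_samples > 1.8))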

 

6. The linear model. I. The “fixed” effects model

6.1. The model

            6.2. Marginal posterior distributions via MCMC using Flat priors

            6.3. Marginal posterior distributions via MCMC using vague informative priors

            6.4. Least Squares as a Bayesian estimator
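
A one-line reminder related to 6.4 (a standard result, assuming a full-rank X, known residual variance and flat priors for b): the posterior of b is Normal, and its mode and mean coincide with the least-squares estimator,

\[
\mathbf{y} = \mathbf{X}\mathbf{b} + \mathbf{e}, \quad \mathbf{e} \sim N(\mathbf{0}, \mathbf{I}\sigma^2)
\;\Rightarrow\;
\hat{\mathbf{b}} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{y}.
\]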

 

7. The linear model. II. The “mixed” model

7.1. The “mixed” model

            7.1.1. The model

            7.1.2. Marginal posterior distributions via MCMC

            7.1.3. BLUP as a Bayesian estimator

            7.1.4. REML as a Bayesian estimator

7.2. The multivariate model

            7.2.1. The model

            7.2.2. Data augmentation
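
For orientation in 7.1, the usual matrix form of the "mixed" model in animal breeding (standard notation, not necessarily that of the lecture notes); in the Bayesian reading the distribution of u acts as a prior, and BLUP appears as the posterior mean (and mode) of u given the variance components:

\[
\mathbf{y} = \mathbf{X}\mathbf{b} + \mathbf{Z}\mathbf{u} + \mathbf{e},
\qquad \mathbf{u} \sim N(\mathbf{0}, \mathbf{A}\sigma_u^2),
\qquad \mathbf{e} \sim N(\mathbf{0}, \mathbf{I}\sigma_e^2).
\]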

 

8. An overview of the possibilities of Bayesian inference + MCMC

8.1. Comparison between treatments: Examples in Meat quality analysis

8.2. Longitudinal models: Examples in growth curves

8.3. Genetic trends: Examples using selected data. Ignoring selection

8.4. Modelling residuals: Examples in canalizing selection

8.5. Modelling priors: Examples in genomic selection: Bayes A, Bayes B, Bayes C…

 

9. Prior information

9.1. Exact prior information

9.1.1. Prior information

9.1.2. Posterior probabilities with exact prior information

9.1.3. Influence of prior information in posterior probabilities

9.2. Vague prior information

            9.2.1. A vague definition of vague prior information

9.2.2. Examples of the use of vague prior information

9.3. No prior information

9.3.1. Flat priors

9.3.2. Jeffreys priors

9.3.3. Bernardo’s “Reference” priors

9.4. Improper priors

9.5. The Achilles heel of Bayesian inference
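
Two common examples connecting 9.3 and 9.4, for the Normal model (standard results): a flat prior for the mean and the Jeffreys prior for the variance; both are improper because they do not integrate to a finite value,

\[
p(\mu) \propto \text{constant},
\qquad
p(\sigma^2) \propto \frac{1}{\sigma^2}.
\]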

 

10. Model choice

10.1. Information

            10.1.1. Fisher Information

            10.1.2. Shannon information and Entropy

            10.1.3. Kullback information and Divergence

10.2. Model choice

            10.2.1. Akaike Information Criterion (AIC)

            10.2.2. Bayes factors

            10.2.3. Bayesian Information Criterion (BIC)

            10.2.4. Deviance Information Criterion (DIC)
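
To illustrate 10.2.4, a sketch of how the Deviance Information Criterion can be computed from MCMC output, using the usual definition DIC = Dbar + pD with pD = Dbar - D(posterior mean of theta); the log-likelihood and the chains below are illustrative placeholders, not the course software.

import numpy as np
from scipy.stats import norm

def dic(loglik, theta_samples):
    """DIC from MCMC samples; loglik(theta) is the log-likelihood of the data
    for one parameter draw, theta_samples has one row per draw."""
    deviances = np.array([-2.0 * loglik(t) for t in theta_samples])
    d_bar = deviances.mean()                            # posterior mean deviance
    d_hat = -2.0 * loglik(theta_samples.mean(axis=0))   # deviance at the posterior mean
    p_d = d_bar - d_hat                                 # effective number of parameters
    return d_bar + p_d, p_d

# Illustrative use with the Normal ("baby") model, theta = (mu, variance)
rng = np.random.default_rng(3)
y = rng.normal(2.0, 1.5, size=100)

def loglik(theta):
    mu, var = theta
    return norm.logpdf(y, loc=mu, scale=np.sqrt(var)).sum()

# Stand-ins for the MCMC chains of mu and of the variance
theta_samples = np.column_stack([rng.normal(2.0, 0.15, size=5000),
                                 rng.gamma(50.0, 0.045, size=5000)])
print(dic(loglik, theta_samples))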