Bayesian statistics
Bayesian statistics is a theory in the field of statistics based on the Bayesian interpretation of probability, in which probability expresses a degree of belief in an event.[1][2] The degree of belief may be based on prior knowledge about the event, such as the results of previous experiments, or on personal beliefs about the event. This differs from a number of other interpretations of probability, such as the frequentist interpretation, which views probability as the limit of the relative frequency of an event after many trials.
Bayesian statistical methods use Bayes' theorem to compute and update probabilities after obtaining new data. Bayes' theorem describes the conditional probability of an event based on data as well as prior information or beliefs about the event or conditions related to the event.[3][4] For example, in Bayesian inference, Bayes' theorem can be used to estimate the parameters of a probability distribution or statistical model. Since Bayesian statistics treats probability as a degree of belief, Bayes' theorem can directly assign a probability distribution that quantifies the belief to the parameter or set of parameters.[2][3]
Bayesian statistics is named after Thomas Bayes, who formulated a specific case of Bayes' theorem in a paper published in 1763. In several papers spanning from the late 18th to the early 19th centuries, Pierre-Simon Laplace developed the Bayesian interpretation of probability.[5] Laplace used methods that would now be considered Bayesian to solve a number of statistical problems. Many Bayesian methods were developed by later authors, but the term was not commonly used to describe such methods until the 1950s. During much of the 20th century, Bayesian methods were viewed unfavorably by many statisticians due to philosophical and practical considerations. Many Bayesian methods required much computation to complete, and most methods that were widely used during the century were based on the frequentist interpretation. However, with the advent of powerful computers and new algorithms like Markov chain Monte Carlo, Bayesian methods have seen increasing use within statistics in the 21st century.[2][6]
Bayes' theorem
Bayes' theorem is used in Bayesian methods to update probabilities, which are degrees of belief, after obtaining new data. Given two events A and B, the conditional probability of A given that B is true is expressed as follows:[7]

P(A | B) = P(B | A) P(A) / P(B)
where P(B) ≠ 0. Although Bayes' theorem is a fundamental result of probability theory, it has a specific interpretation in Bayesian statistics. In the above equation, A usually represents a proposition (such as the statement that a coin lands on heads fifty percent of the time) and B represents the evidence, or new data that is to be taken into account (such as the result of a series of coin flips). P(A) is the prior probability of A, which expresses one's beliefs about A before evidence is taken into account. The prior probability may also quantify prior knowledge or information about A. P(B | A) is the likelihood function, which can be interpreted as the probability of the evidence B given that A is true. The likelihood quantifies the extent to which the evidence B supports the proposition A. P(A | B) is the posterior probability, the probability of the proposition A after taking the evidence B into account. Essentially, Bayes' theorem updates one's prior beliefs P(A) after considering the new evidence B.[2]
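The coin example above can be worked numerically. The numbers below are invented for the sketch: the proposition A is "the coin is fair", the only alternative considered is a hypothetical coin that lands heads 80% of the time, and the evidence B is 8 heads in 10 flips.

```python
# Numerical illustration of Bayes' theorem with made-up hypotheses and data.
from math import comb

def binomial_likelihood(p_heads, heads, flips):
    """P(B | A): probability of the observed flips given a heads rate."""
    return comb(flips, heads) * p_heads**heads * (1 - p_heads)**(flips - heads)

prior_fair = 0.5                       # P(A): prior belief that the coin is fair
prior_biased = 1 - prior_fair          # prior on the 80%-heads alternative

like_fair = binomial_likelihood(0.5, heads=8, flips=10)     # P(B | fair)
like_biased = binomial_likelihood(0.8, heads=8, flips=10)   # P(B | biased)

# P(B), via the law of total probability over the two hypotheses
evidence = like_fair * prior_fair + like_biased * prior_biased

posterior_fair = like_fair * prior_fair / evidence          # P(A | B)
print(f"P(fair | 8 heads in 10 flips) = {posterior_fair:.3f}")
```

Eight heads in ten flips is far more probable under the biased hypothesis, so the posterior belief in a fair coin drops well below the 0.5 prior.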
The probability of the evidence P(B) can be calculated using the law of total probability. If {A_1, A_2, …, A_n} is a partition of the sample space, which is the set of all outcomes of an experiment, then,[2][7]

P(B) = P(B | A_1) P(A_1) + P(B | A_2) P(A_2) + … + P(B | A_n) P(A_n)
When there are an infinite number of outcomes, it is necessary to integrate over all outcomes to calculate P(B) using the law of total probability. Often, P(B) is difficult to calculate, as it would involve sums or integrals that are time-consuming to evaluate, so often only the product of the prior and likelihood is considered; since the evidence does not change within the same analysis, the posterior is proportional to this product:[2]

P(A | B) ∝ P(B | A) P(A)
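When P(B) is impractical to evaluate analytically, the unnormalized product prior × likelihood can be normalized numerically instead. A minimal grid-approximation sketch, under assumed inputs (a coin's heads probability theta with a flat prior, and 8 heads observed in 10 flips):

```python
# Grid approximation: evaluate prior × likelihood on a grid of candidate
# parameter values, then divide by their sum, which stands in for P(B).
heads, flips = 8, 10
grid = [i / 100 for i in range(101)]           # candidate values of theta
prior = [1.0 for _ in grid]                    # flat prior over the grid
likelihood = [t**heads * (1 - t)**(flips - heads) for t in grid]

unnormalized = [p * l for p, l in zip(prior, likelihood)]
evidence = sum(unnormalized)                   # plays the role of P(B)
posterior = [u / evidence for u in unnormalized]

# The posterior concentrates near the observed frequency 8/10.
theta_map = grid[posterior.index(max(posterior))]
print(f"Posterior mode: {theta_map:.2f}")
```

Because the grid sum normalizes the product, the result is a proper probability distribution even though P(B) was never computed in closed form.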
The maximum a posteriori, which is the mode of the posterior and is often computed using mathematical optimization methods, remains the same under this proportionality. The posterior can also be approximated without computing the exact value of P(B), using methods such as Markov chain Monte Carlo or variational Bayesian methods.[2]
Bayesian methods
The general set of statistical techniques can be divided into a number of activities, many of which have special Bayesian versions.
Bayesian inference
Bayesian inference refers to statistical inference where uncertainty in inferences is quantified using probability.[8] In classical frequentist inference, model parameters and hypotheses are considered to be fixed. Probabilities are not assigned to parameters or hypotheses in frequentist inference. For example, it would not make sense in frequentist inference to directly assign a probability to an event that can only happen once, such as the result of the next flip of a fair coin. However, it would make sense to state that the proportion of heads approaches one-half as the number of coin flips increases.[9]
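In contrast to the frequentist restriction described above, a Bayesian analysis can assign a probability to a single future event. A minimal sketch under an assumed Beta-Binomial model: with a uniform Beta(1, 1) prior on the coin's heads probability, observing h heads in n flips gives a Beta(1 + h, 1 + n − h) posterior, and the probability that the next flip lands heads is the posterior mean (Laplace's rule of succession).

```python
# Posterior predictive probability of heads on the next flip, under a
# Beta(a, b) prior and h observed heads in n flips (conjugate update).
def prob_next_heads(h, n, a=1.0, b=1.0):
    """Posterior mean of the heads probability: (a + h) / (a + b + n)."""
    return (a + h) / (a + b + n)

# Hypothetical data: 8 heads in 10 flips, uniform prior.
print(prob_next_heads(h=8, n=10))
```

The answer is a genuine probability statement about one unrepeatable event, something the frequentist framing declines to make.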
Statistical modeling
The formulation of statistical models using Bayesian statistics has the identifying feature of requiring the specification of prior distributions for any unknown parameters. Indeed, parameters of prior distributions may themselves have prior distributions, leading to Bayesian hierarchical modeling, also known as multilevel modeling. A special case is Bayesian networks.
For conducting a Bayesian statistical analysis, best practices are discussed by van de Schoot et al.[14]
For reporting the results of a Bayesian statistical analysis, Bayesian analysis reporting guidelines (BARG) are provided in an open-access article by John K. Kruschke.[15]
Design of experiments
The Bayesian design of experiments includes a concept called the influence of prior beliefs. This approach uses sequential analysis techniques to include the outcome of earlier experiments in the design of the next experiment. It is achieved by updating beliefs through the use of the prior and posterior distributions, which allows the design of experiments to make good use of resources of all types. An example of this is the multi-armed bandit problem.
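Sequential Bayesian experiment design can be sketched with Thompson sampling on a two-armed Bernoulli bandit; the arm probabilities and trial count below are invented for the example. Each arm keeps a Beta posterior over its success rate, and the posterior from each round becomes the prior for the next, steering trials toward the better arm.

```python
# Thompson sampling sketch: sample a plausible success rate for each arm
# from its current Beta posterior, try the best-looking arm, and update
# that arm's posterior with the observed outcome.
import random

rng = random.Random(1)
true_rates = [0.3, 0.6]              # unknown to the experimenter
wins = [1, 1]                        # Beta(1, 1) priors for each arm
losses = [1, 1]

for _ in range(2000):
    samples = [rng.betavariate(wins[i], losses[i]) for i in range(2)]
    arm = samples.index(max(samples))            # test the best-looking arm
    if rng.random() < true_rates[arm]:
        wins[arm] += 1                           # posterior update: success
    else:
        losses[arm] += 1                         # posterior update: failure

trials = [wins[i] + losses[i] - 2 for i in range(2)]
print(trials)  # most trials go to the better arm
```

As evidence accumulates, the posterior for the weaker arm rarely produces the highest sample, so resources concentrate on the better treatment.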
Exploratory analysis of Bayesian models
Exploratory analysis of Bayesian models is an adaptation or extension of the exploratory data analysis approach to the needs and peculiarities of Bayesian modeling. In the words of Persi Diaconis:[16]
Exploratory data analysis seeks to reveal structure, or simple descriptions, in data. We look at numbers or graphs and try to find patterns. We pursue leads suggested by background information, imagination, patterns perceived, and experience with other data analyses.
The inference process generates a posterior distribution, which has a central role in Bayesian statistics, together with other distributions like the posterior predictive distribution and the prior predictive distribution. The correct visualization, analysis, and interpretation of these distributions is key to properly answer the questions that motivate the inference process.[17]
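The prior and posterior predictive distributions mentioned above can be sketched by simulation: draw a parameter from the prior (or posterior), then simulate data from it. The Beta-Binomial coin setup below is an assumed example, not tied to any particular analysis in the text.

```python
# Predictive simulation: sample theta from a Beta(a, b) belief, then
# simulate the number of heads in a fixed number of flips.
import random

def predictive_samples(a, b, flips, draws=10_000, rng=random.Random(0)):
    """Simulate heads counts from a Beta(a, b) belief about theta."""
    counts = []
    for _ in range(draws):
        theta = rng.betavariate(a, b)                  # draw a parameter
        counts.append(sum(rng.random() < theta for _ in range(flips)))
    return counts

prior_pred = predictive_samples(1, 1, flips=10)              # Beta(1, 1) prior
posterior_pred = predictive_samples(1 + 8, 1 + 2, flips=10)  # after 8 heads in 10

# Posterior predictive mass shifts toward high heads counts.
print(sum(posterior_pred) / len(posterior_pred))
```

Comparing the two simulated distributions, before and after conditioning on data, is exactly the kind of visualization-ready summary this section describes.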
When working with Bayesian models there are a series of related tasks that need to be addressed besides inference itself:
- Diagnosis of the quality of the inference; this is needed when using numerical methods such as Markov chain Monte Carlo techniques
- Model criticism, including evaluations of both model assumptions and model predictions
- Comparison of models, including model selection or model averaging
- Preparation of the results for a particular audience
All these tasks are part of the exploratory analysis of Bayesian models approach, and successfully performing them is central to the iterative and interactive modeling process. These tasks require both numerical and visual summaries.[18][19][20]
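One widely used numerical summary for the inference-diagnosis task above is the potential scale reduction factor (R-hat), which compares several Markov chain Monte Carlo chains; values near 1 suggest the chains agree. A simplified sketch with toy "chains" of numbers (real implementations also split each chain in half and rank-normalize):

```python
# Simplified Gelman-Rubin R-hat for equal-length chains: compare the
# between-chain variance of the chain means with the within-chain variance.
from statistics import mean, variance

def r_hat(chains):
    n = len(chains[0])                              # draws per chain
    chain_means = [mean(c) for c in chains]
    w = mean(variance(c) for c in chains)           # within-chain variance
    b = n * variance(chain_means)                   # between-chain variance
    var_plus = (n - 1) / n * w + b / n              # pooled variance estimate
    return (var_plus / w) ** 0.5

good = [[0.1, 0.2, 0.15, 0.05], [0.12, 0.18, 0.09, 0.16]]   # overlapping chains
bad = [[0.1, 0.2, 0.15, 0.05], [5.1, 5.2, 5.15, 5.05]]      # disagreeing chains
print(r_hat(good), r_hat(bad))
```

When chains explore different regions, the between-chain variance dominates and R-hat grows far above 1, flagging that the numerical inference cannot yet be trusted.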
See also
- Bayesian epistemology
References
- ^ "Bayesian". Merriam-Webster.com Dictionary.
- ^ ISBN 978-1-4398-4095-5.
- ^ ISBN 978-0-367-13991-9.
- ISBN 978-0-12-405888-0.
- ISBN 978-0-3001-8822-6.
- doi:10.1214/06-BA101.
- ^ ISBN 978-0-8218-9414-9.
- S2CID 220935477.
- ISBN 978-1-4419-0924-4.
- ISBN 978-1119951513.
- ^ Kruschke, J K; Vanpaemel, W (2015). "Bayesian Estimation in Hierarchical Models". In Busemeyer, J R; Wang, Z; Townsend, J T; Eidels, A (eds.). The Oxford Handbook of Computational and Mathematical Psychology (PDF). Oxford University Press. pp. 279–299.
- arXiv:1810.09433
- S2CID 234108684.
- PMID 34400814.
- hdl:11336/114615.
- S2CID 26590874.
- S2CID 88522683.
- ISBN 9781789341652.
Further reading
- ISBN 0-471-92416-4.
- Bolstad, William M.; Curran, James M. (2016). Introduction to Bayesian Statistics (3rd ed.). Wiley. ISBN 978-1-118-09156-2.
- ISBN 978-1-4920-8946-9.
- Hoff, Peter D. (2009). A First Course in Bayesian Statistical Methods (2nd ed.). New York: Springer. ISBN 978-1-4419-2828-3.
- Lee, Peter M. (2012). Bayesian Statistics: An Introduction (4th ed.). Wiley. ISBN 978-1-118-33257-3.
- ISBN 978-0-387-71598-8.
- Johnson, Alicia A.; Ott, Miles Q.; Dogucu, Mine (2022). Bayes Rules! An Introduction to Applied Bayesian Modeling. Chapman and Hall. ISBN 9780367255398.
External links
- Theo Kypraios. "A Gentle Tutorial in Bayesian Statistics" (PDF). Retrieved 2013-11-03.
- Jordi Vallverdu. Bayesians Versus Frequentists: A Philosophical Debate on Statistical Reasoning.
- David Spiegelhalter, Kenneth Rice. "Bayesian statistics". Scholarpedia 4(8):5230. doi:10.4249/scholarpedia.5230
- Bayesian modeling book and examples available for downloading.
- Rens van de Schoot. "A Gentle Introduction to Bayesian Analysis" (PDF).
- Bayesian A/B Testing Calculator Dynamic Yield