Bayesian Inference: Grid Approximation
This interactive demo illustrates Bayesian inference using grid approximation. Imagine we're surveying people about whether they drink coffee. We want to estimate the true proportion of coffee drinkers in the population.
How It Works
- Prior: Choose your prior belief (uniform or centered on a specific value)
- Data: Each "Yes" or "No" click represents a survey response
- Posterior: The distribution updates based on Bayes' theorem
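To make the update loop concrete, here is a minimal sketch in Python of how a grid-based posterior could be recomputed after each click. It is illustrative only: the grid size, the simulated responses, and the `update` helper are assumptions, not the demo's actual code.

```python
import numpy as np

theta = np.linspace(0, 1, 201)        # grid of candidate proportions
posterior = np.ones_like(theta)       # start from a uniform prior
posterior /= posterior.sum()

def update(posterior, response):
    """Multiply in the likelihood of one Yes/No response, then renormalize."""
    likelihood = theta if response == "yes" else 1 - theta
    posterior = posterior * likelihood
    return posterior / posterior.sum()

# Simulate a few button clicks (made-up responses)
for response in ["yes", "yes", "no", "yes"]:
    posterior = update(posterior, response)

print("posterior mean:", np.sum(theta * posterior))
```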
Original concept from Probabilistic Machine Learning course (Summer Term 2025)
University of Tübingen, taught by Professor Philipp Hennig
Controls
Data Collection
Responses: running count of the "Yes"/"No" answers collected so far
Prior Selection
Visualization
Posterior Distribution
- Blue area: Posterior distribution (our updated belief)
- Red dashed line: Posterior mean ($\mathbb{E}[\theta \mid D]$)
- Blue dashed lines: Credible interval
- Gray dashed line: Prior distribution ($p(\theta)$)
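The summary statistics behind the dashed lines can be read directly off the grid posterior. A sketch, assuming the posterior is stored as normalized weights over a $\theta$ grid; the 95% level is used here only as an example and may differ from the demo's setting.

```python
import numpy as np

theta = np.linspace(0, 1, 201)
posterior = np.ones_like(theta) / len(theta)   # placeholder: use the actual grid posterior

# Posterior mean (red dashed line)
mean = np.sum(theta * posterior)

# Equal-tailed credible interval from the discrete CDF (blue dashed lines)
cdf = np.cumsum(posterior)
lower = theta[np.searchsorted(cdf, 0.025)]     # 95% level chosen only for illustration
upper = theta[np.searchsorted(cdf, 0.975)]
print(mean, lower, upper)
```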
Understanding the Math
Bayes' Theorem
$$p(\theta \mid D) = \frac{p(D \mid \theta)\, p(\theta)}{p(D)}$$

Where:
- $p(\theta)$ is the prior: our initial belief about $\theta$
- $p(D \mid \theta)$ is the likelihood: the probability of seeing this data given $\theta$
- $p(\theta \mid D)$ is the posterior: our updated belief after seeing the data

The denominator $p(D)$ is a normalizing constant that makes the posterior sum to 1.
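For one candidate value of $\theta$, the numerator of Bayes' theorem is just prior times likelihood. A toy calculation with made-up numbers (7 "Yes" out of 10 responses, uniform prior):

```python
theta = 0.6                              # one candidate proportion
prior = 1.0                              # uniform prior density on [0, 1]
likelihood = theta**7 * (1 - theta)**3   # 7 "Yes" and 3 "No" responses
unnormalized_posterior = likelihood * prior
# Dividing by p(D), i.e. this product summed over all candidate theta,
# turns the numbers into a proper posterior distribution.
```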
Likelihood Function
For binomial data (yes/no responses), the likelihood is:

$$p(D \mid \theta) = \binom{n}{k}\, \theta^{k} (1 - \theta)^{\,n-k}$$

where $k$ is the number of "Yes" responses, $n$ is the total number of responses, and $\theta$ is the proportion we want to estimate.
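As a sketch, the likelihood can be evaluated across a whole grid of $\theta$ values at once; `scipy.stats.binom` includes the binomial coefficient, which does not depend on $\theta$ and cancels when the posterior is normalized. The counts below are made-up example data.

```python
import numpy as np
from scipy.stats import binom

theta = np.linspace(0, 1, 201)   # candidate proportions
k, n = 7, 10                     # example data: 7 "Yes" out of 10 responses

likelihood = binom.pmf(k, n, theta)           # includes the n-choose-k factor
# proportional alternative: theta**k * (1 - theta)**(n - k)
```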
Prior Distributions
Uniform Prior: All values of $\theta$ between 0 and 1 are treated as equally plausible before seeing any data.
Gaussian Prior: Centered on a value you choose, expressing an initial belief that $\theta$ lies near that value.
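On a grid, both priors become vectors of weights. The sketch below assumes the Gaussian prior is simply evaluated on the [0, 1] grid and renormalized (effectively truncating it); the center and width values are illustrative, not the demo's defaults.

```python
import numpy as np
from scipy.stats import norm

theta = np.linspace(0, 1, 201)

# Uniform prior: every value of theta equally plausible
uniform_prior = np.ones_like(theta)
uniform_prior /= uniform_prior.sum()

# Gaussian prior centered on a chosen value (mu, sigma chosen for illustration)
mu, sigma = 0.5, 0.1
gaussian_prior = norm.pdf(theta, loc=mu, scale=sigma)
gaussian_prior /= gaussian_prior.sum()   # renormalize on the grid
```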
Grid Approximation
Instead of solving analytically, we:
- Create a fine grid of possible $\theta$ values from 0 to 1
- Compute the likelihood $p(D \mid \theta)$ at each grid point
- Multiply by the prior $p(\theta)$
- Normalize so the total probability sums to 1
This gives us a discrete approximation of the true posterior distribution!
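Putting the four steps together, here is a minimal end-to-end sketch. Grid size, example counts, and the choice of prior are assumptions for illustration, not the demo's source code.

```python
import numpy as np
from scipy.stats import norm

theta = np.linspace(0, 1, 201)                 # 1. fine grid over [0, 1]

k, n = 7, 10                                   # example data: 7 "Yes" out of 10
likelihood = theta**k * (1 - theta)**(n - k)   # 2. likelihood at each grid point

prior = norm.pdf(theta, loc=0.5, scale=0.2)    # 3. e.g. a Gaussian prior...
# prior = np.ones_like(theta)                  #    ...or a uniform one

unnormalized = likelihood * prior
posterior = unnormalized / unnormalized.sum()  # 4. normalize so it sums to 1

print("posterior mean:", np.sum(theta * posterior))
```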