I recently posted an article where I used Bayesian inference and Markov chain Monte Carlo (MCMC) to predict the CL round of 16 winners. There, I tried to explain Bayesian statistics in relative depth, but I didn't say much about MCMC to avoid making the post excessively long.
So I decided to dedicate a full post to introducing Markov Chain Monte Carlo methods for anyone interested in learning how they work mathematically and when they prove useful.
To tackle this post, I'll adopt a divide-and-conquer strategy: break the concept down into its simplest building blocks, explain each one individually, and then assemble the big picture. So this is what we'll go through:
- Monte Carlo methods
- Stochastic processes
- Markov Chain
- MCMC
Monte Carlo Methods
A Monte Carlo method or simulation is a type of computational algorithm that relies on repeated random sampling to obtain numerical results, typically in the form of the likelihood of a range of possible outcomes.
In other words, a Monte Carlo simulation is used to estimate or approximate the possible outcomes or distribution of an uncertain event.
A simple way to illustrate this is by rolling two dice and adding their values. We could easily compute the probability of each outcome analytically, but we could also use a Monte Carlo method to simulate 5,000 dice rolls (or more) and recover the underlying distribution, as sketched below.
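Here is a minimal sketch of that dice example in Python using NumPy (the code and the seed are illustrative choices of mine, not from the original post): it simulates 5,000 rolls of two dice and estimates the probability of each possible sum.

```python
import numpy as np

# Simulate 5,000 rolls of two dice and estimate the distribution of their sum.
rng = np.random.default_rng(seed=42)  # arbitrary seed for reproducibility

n_simulations = 5_000
die_1 = rng.integers(1, 7, size=n_simulations)  # uniform integers in [1, 6]
die_2 = rng.integers(1, 7, size=n_simulations)
sums = die_1 + die_2

# Empirical probability of each outcome (2 through 12)
values, counts = np.unique(sums, return_counts=True)
for value, count in zip(values, counts):
    print(f"Sum {value:2d}: estimated probability {count / n_simulations:.3f}")
```

With 5,000 samples the estimated probabilities already sit close to the exact values (for example, about 1/6 for a sum of 7), and increasing the number of simulations tightens the approximation further.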
Stochastic Processes
Wikipedia’s definition is “A stochastic or random process can be defined as a collection of random variables that is indexed by some mathematical set”[1].
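To make the definition concrete, here is a minimal sketch (my own illustrative example, not from the original post) of one of the simplest stochastic processes, a one-dimensional random walk, where the random variables are indexed by discrete time steps:

```python
import numpy as np

# A simple stochastic process: a one-dimensional random walk.
# The collection of random variables {X_0, X_1, ..., X_n} is indexed by the
# time steps 0..n, which play the role of the "mathematical set" in the definition.
rng = np.random.default_rng(seed=0)  # arbitrary seed for reproducibility

n_steps = 100
steps = rng.choice([-1, 1], size=n_steps)  # each step is +1 or -1 with equal probability
walk = np.concatenate(([0], np.cumsum(steps)))  # X_0 = 0, X_t = X_{t-1} + step_t

print(walk[:10])  # the first few states of the process
```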