Tag

MCMC


Ctrip Technology
Sep 12, 2023 · Artificial Intelligence

Using BSTS and CausalImpact for Causal Effect Estimation in Structured Time‑Series Data

The article explains how Bayesian Structural Time Series (BSTS) models, combined with the CausalImpact library, can estimate the causal effect of policies or marketing interventions when traditional A/B experiments are infeasible, covering model theory, Bayesian inference, MCMC estimation, code implementation, and a real‑world holiday‑push case study.

BSTS · Bayesian Modeling · CausalImpact
0 likes · 20 min read
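As a toy illustration of the counterfactual idea behind CausalImpact (not the BSTS model itself), the sketch below fits a plain least-squares regression on a synthetic pre‑intervention period, predicts the post‑period counterfactual, and measures the lift; the data and the true effect of 5 are made up for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example: a control series x predicts the target y in the
# pre-intervention period; the intervention adds a constant lift of 5.
n_pre, n_post = 80, 20
x = rng.normal(100, 10, n_pre + n_post)
y = 2.0 * x + rng.normal(0, 1, n_pre + n_post)
y[n_pre:] += 5.0  # true causal effect in the post period

# Fit on the pre period only (ordinary least squares as a stand-in for BSTS).
X_pre = np.column_stack([np.ones(n_pre), x[:n_pre]])
beta, *_ = np.linalg.lstsq(X_pre, y[:n_pre], rcond=None)

# Counterfactual: what y would have been post-intervention without treatment.
X_post = np.column_stack([np.ones(n_post), x[n_pre:]])
counterfactual = X_post @ beta

pointwise_effect = y[n_pre:] - counterfactual
print("estimated average effect:", pointwise_effect.mean())  # roughly 5
```

BSTS replaces the static regression with a state-space model and yields credible intervals around the counterfactual, but the subtraction step is the same.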
Model Perspective
Nov 29, 2022 · Artificial Intelligence

MCMC Demystified: Monte Carlo Basics, Metropolis-Hastings & Gibbs Sampling

Markov Chain Monte Carlo (MCMC) extends classic Monte Carlo by generating dependent samples via a Markov chain. The article covers the concepts that make this work for Bayesian inference, including the plug‑in principle, burn‑in, and asymptotic independence, walks through the Metropolis‑Hastings and Gibbs sampling algorithms, and addresses convergence diagnostics and effective sample size.

Bayesian inference · Gibbs sampling · MCMC
0 likes · 13 min read
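Effective sample size, one of the diagnostics mentioned in the summary, can be estimated from a chain's autocorrelations. A minimal NumPy sketch (the truncation rule and the 0.05 cutoff are simplifications of production estimators):

```python
import numpy as np

def effective_sample_size(chain, max_lag=200):
    """Crude ESS estimate: n / (1 + 2 * sum of early positive autocorrelations)."""
    chain = np.asarray(chain, dtype=float)
    n = len(chain)
    c = chain - chain.mean()
    var = c @ c / n
    rho_sum = 0.0
    for lag in range(1, min(max_lag, n - 1)):
        rho = (c[:-lag] @ c[lag:]) / (n * var)
        if rho < 0.05:          # truncate once correlation has died out
            break
        rho_sum += rho
    return n / (1.0 + 2.0 * rho_sum)

rng = np.random.default_rng(1)
iid = rng.normal(size=5000)          # independent draws
ar = np.zeros(5000)                  # strongly autocorrelated AR(1) chain
for t in range(1, 5000):
    ar[t] = 0.9 * ar[t - 1] + rng.normal()

print(effective_sample_size(iid))    # close to n for independent draws
print(effective_sample_size(ar))     # far smaller for the correlated chain
```

The AR(1) chain illustrates the point: thousands of stored samples can carry only a few hundred samples' worth of independent information.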
Model Perspective
Oct 22, 2022 · Fundamentals

Unlocking Bayesian Sampling: How MCMC and Hamiltonian Monte Carlo Work

This article explains the principles behind Markov Chain Monte Carlo methods, including Monte Carlo sampling, the Metropolis‑Hastings algorithm, and the Hamiltonian Monte Carlo (HMC) approach, illustrating how they efficiently approximate posterior distributions in Bayesian analysis.

Bayesian inference · Hamiltonian Monte Carlo · MCMC
0 likes · 11 min read
Model Perspective
Oct 20, 2022 · Artificial Intelligence

Unlocking Bayesian Inference: How Probabilistic Programming Simplifies Complex Models

This article explains Bayesian statistics as a probabilistic framework, describes how modern numerical methods and probabilistic programming languages automate inference, and reviews both Markov and non‑Markov techniques such as MCMC, grid computation, Laplace approximation, and variational inference for building complex models.

Bayesian inference · MCMC · probabilistic programming
0 likes · 7 min read
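Grid computation, the simplest of the non‑Markov techniques listed, fits in a few lines: approximate a posterior by evaluating prior × likelihood on a grid of parameter values and normalising. A minimal sketch with made-up data (7 heads in 10 coin flips, uniform prior):

```python
import numpy as np

# Grid approximation of a posterior: coin with unknown heads probability p.
grid = np.linspace(0, 1, 1001)
prior = np.ones_like(grid)                  # uniform prior over p
likelihood = grid**7 * (1 - grid)**3        # Bernoulli likelihood of 7/10 heads
unnorm = prior * likelihood
posterior = unnorm / unnorm.sum()

posterior_mean = (grid * posterior).sum()
print(posterior_mean)  # ≈ 8/12, the exact Beta(8, 4) posterior mean
```

Grid methods are exact in the limit of a fine grid but scale exponentially with dimension, which is why MCMC and variational inference take over for complex models.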
Model Perspective
Oct 15, 2022 · Fundamentals

Unlock Faster Bayesian Sampling: How Hamiltonian Monte Carlo Works

Hamiltonian Monte Carlo (HMC) is a fast sampling technique that improves on traditional MCMC by leveraging Hamiltonian dynamics: position defines potential energy and momentum defines kinetic energy. Each iteration follows a fixed series of steps, including momentum sampling, leapfrog integration, and a Metropolis acceptance test, to explore complex probability distributions efficiently.

Bayesian Sampling · Hamiltonian Monte Carlo · MCMC
0 likes · 4 min read
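The steps listed in the summary can be sketched for a one-dimensional standard normal target, where the potential energy is U(q) = q²/2. This is a minimal illustration, not a production sampler (the step size, path length, and toy target are arbitrary choices):

```python
import numpy as np

# Minimal HMC for a 1-D standard normal target: U(q) = q**2 / 2.
def grad_U(q):
    return q

def hmc_step(q, rng, eps=0.1, L=20):
    p = rng.normal()                      # 1. sample a fresh momentum
    q_new, p_new = q, p
    p_new -= 0.5 * eps * grad_U(q_new)    # 2. leapfrog integration
    for _ in range(L - 1):
        q_new += eps * p_new
        p_new -= eps * grad_U(q_new)
    q_new += eps * p_new
    p_new -= 0.5 * eps * grad_U(q_new)
    # 3. Metropolis acceptance on the total energy H = U + K
    h_old = 0.5 * q**2 + 0.5 * p**2
    h_new = 0.5 * q_new**2 + 0.5 * p_new**2
    if rng.random() < np.exp(h_old - h_new):
        return q_new
    return q

rng = np.random.default_rng(2)
q, samples = 0.0, []
for _ in range(5000):
    q = hmc_step(q, rng)
    samples.append(q)
samples = np.array(samples)
print(samples.mean(), samples.std())   # roughly 0 and 1
```

Because the leapfrog integrator nearly conserves H, almost every proposal is accepted even though each trajectory moves far from its starting point, which is exactly the speedup over random-walk MCMC.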
Model Perspective
Oct 4, 2022 · Artificial Intelligence

How Metropolis-Hastings Improves MCMC Sampling Efficiency

This article explains the detailed‑balance condition for Markov chains, shows why finding a transition matrix for a given stationary distribution is hard, and demonstrates how Metropolis‑Hastings modifies MCMC to achieve higher acceptance rates with a concrete Python example.

MCMC · Markov Chain · Metropolis-Hastings
0 likes · 9 min read
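A random-walk Metropolis-Hastings sampler in the same spirit can be sketched in a few lines, targeting a standard normal and tracking the acceptance rate (the proposal scale of 1.0 is an arbitrary illustration choice, not a tuned value):

```python
import numpy as np

# Random-walk Metropolis-Hastings targeting a standard normal density.
def target(x):
    return np.exp(-0.5 * x**2)   # unnormalised N(0, 1) density

rng = np.random.default_rng(3)
x, samples, accepted = 0.0, [], 0
for _ in range(20000):
    proposal = x + rng.normal(0, 1.0)          # symmetric random-walk proposal
    # With a symmetric proposal the Hastings ratio reduces to the target ratio.
    if rng.random() < target(proposal) / target(x):
        x, accepted = proposal, accepted + 1
    samples.append(x)

samples = np.array(samples[2000:])             # discard burn-in
print("acceptance rate:", accepted / 20000)
print("mean, std:", samples.mean(), samples.std())
```

The acceptance rate is the tuning signal: shrinking the proposal scale pushes it toward 1 but slows exploration, while widening it rejects most moves.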
Model Perspective
Sep 28, 2022 · Artificial Intelligence

How Monte Carlo Sampling Powers AI: From Basics to Acceptance-Rejection

This article introduces Monte Carlo methods, explains how random sampling approximates integrals, discusses uniform and non‑uniform probability distributions, and details acceptance‑rejection sampling as a technique for generating samples from complex distributions, laying the groundwork for understanding Markov Chain Monte Carlo in AI.

Acceptance-Rejection · Artificial Intelligence · MCMC
0 likes · 8 min read
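The integral-approximation idea can be shown in a few lines: estimate ∫₀¹ x² dx = 1/3 by averaging x² over uniform random draws, the plain Monte Carlo estimator:

```python
import numpy as np

# Monte Carlo integration: the average of f(x) over uniform draws on [0, 1]
# converges to the integral of f over that interval.
rng = np.random.default_rng(4)
x = rng.uniform(0, 1, 100_000)
estimate = (x**2).mean()
print(estimate)   # close to 1/3
```

The error shrinks like 1/√n regardless of dimension, which is the property that makes Monte Carlo attractive for the high-dimensional integrals of Bayesian inference.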
Model Perspective
Sep 23, 2022 · Fundamentals

Mastering Monte Carlo: From Acceptance-Rejection to Gibbs Sampling in Python

This article explains the motivations behind Monte Carlo methods, introduces acceptance-rejection sampling, details Markov Chain Monte Carlo concepts, and walks through Metropolis-Hastings and Gibbs sampling algorithms with Python implementations, highlighting their use in high‑dimensional probability distribution sampling.

Algorithms · MCMC · Python
0 likes · 18 min read
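A Gibbs sampler in the spirit described can be sketched for a bivariate normal with correlation ρ, where both full conditionals are known exactly (the target and ρ = 0.8 are arbitrary illustration choices):

```python
import numpy as np

# Gibbs sampling for a standard bivariate normal with correlation rho:
# each step draws one coordinate from its exact conditional given the other.
rho = 0.8
rng = np.random.default_rng(5)
x, y = 0.0, 0.0
samples = []
for _ in range(20000):
    x = rng.normal(rho * y, np.sqrt(1 - rho**2))   # x | y ~ N(rho*y, 1-rho^2)
    y = rng.normal(rho * x, np.sqrt(1 - rho**2))   # y | x ~ N(rho*x, 1-rho^2)
    samples.append((x, y))

samples = np.array(samples[2000:])                 # discard burn-in
print("empirical correlation:", np.corrcoef(samples.T)[0, 1])  # near 0.8
```

Gibbs needs no accept/reject step because each conditional draw is exact; the price is that every full conditional must be available in sampleable form.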
Model Perspective
Sep 21, 2022 · Fundamentals

Unlocking Monte Carlo Sampling: From Basics to Acceptance‑Rejection in AI

Monte Carlo methods, originally a gambling-inspired random simulation technique, provide a versatile way to approximate integrals and sums, and by using acceptance‑rejection sampling they enable drawing samples from complex probability distributions, a key step toward effective Markov Chain Monte Carlo algorithms in machine learning and AI.

Acceptance-Rejection · MCMC · Machine Learning
0 likes · 7 min read
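Acceptance-rejection sampling can be sketched for a Beta(2, 2) target with a uniform envelope; the envelope constant M = 1.5 is the density's maximum, so each proposal is accepted with probability f(x)/M (the target is an arbitrary illustration choice):

```python
import numpy as np

# Acceptance-rejection sampling from Beta(2, 2), density f(x) = 6x(1 - x),
# using Uniform(0, 1) proposals and envelope constant M = 1.5 (the peak of f).
rng = np.random.default_rng(6)
M = 1.5
samples = []
while len(samples) < 10000:
    x = rng.uniform()                    # propose from the uniform envelope
    u = rng.uniform()
    if u <= 6 * x * (1 - x) / M:         # accept with probability f(x) / (M * g(x))
        samples.append(x)

samples = np.array(samples)
print(samples.mean())   # close to the Beta(2, 2) mean of 0.5
```

The overall acceptance rate is 1/M, so the method is efficient only when a tight envelope exists; in high dimensions M blows up, which motivates MCMC.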
Model Perspective
Sep 19, 2022 · Artificial Intelligence

Master Bayesian Linear Regression with PyMC3: A Hands‑On Guide

This tutorial explains how to use PyMC3 for Bayesian linear regression, covering model definition, data simulation, MAP estimation, advanced MCMC sampling with NUTS, and posterior analysis, all illustrated with complete Python code examples.

Bayesian inference · MCMC · PyMC3
0 likes · 11 min read
Model Perspective
Sep 18, 2022 · Artificial Intelligence

How Bayesian Linear Regression Reveals Uncertainty in Model Parameters

This article explains Bayesian linear regression, describing its probabilistic treatment of weights, prior and posterior computation, MAP and numerical solutions, and how it enables uncertainty quantification, online learning, and model comparison through Bayes factors.

Bayesian inference · MAP estimation · MCMC
0 likes · 9 min read
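With a Gaussian prior on the weights and known noise variance, the posterior over the weights is available in closed form, which is the simplest way to see the uncertainty quantification described in the summary. A minimal NumPy sketch with synthetic data (the true weights, noise level, and prior scale are all made up):

```python
import numpy as np

# Conjugate Bayesian linear regression with known noise variance sigma^2:
# prior w ~ N(0, tau^2 I) gives a Gaussian posterior over the weights.
rng = np.random.default_rng(7)
n = 50
X = np.column_stack([np.ones(n), rng.uniform(-1, 1, n)])   # intercept + slope
true_w = np.array([1.0, 2.0])
sigma2, tau2 = 0.25, 10.0
y = X @ true_w + rng.normal(0, np.sqrt(sigma2), n)

# Posterior covariance S = (X^T X / sigma^2 + I / tau^2)^-1
# Posterior mean       m = S X^T y / sigma^2
S = np.linalg.inv(X.T @ X / sigma2 + np.eye(2) / tau2)
m = S @ X.T @ y / sigma2

print("posterior mean:", m)                      # near the true [1, 2]
print("posterior std:", np.sqrt(np.diag(S)))     # per-weight uncertainty
```

The posterior standard deviations are the quantity point estimates hide: they shrink as data accumulates, which is also what makes the online-learning and Bayes-factor comparisons in the article possible.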