## Tagged “Bayesian”

Aug 26, 2014

## Joint, conditional and marginal probabilities

In this post I will discuss a topic that seems very dry at first but turns out to have many cool applications. While I will not discuss Bayesian inference in this post, understanding the relationship between joint, conditional and marginal probabilities is essential for the application of Bayesian thinking. As a result, I'll often refer back to this discussion in future posts.
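The relationship between the three kinds of probability can be sketched with a toy joint distribution in plain Python (the variables and numbers below are made up for illustration, not taken from the post):

```python
# A small joint distribution P(weather, activity), stored as a dict.
# The probabilities are illustrative and sum to 1.
joint = {
    ("sunny", "walk"): 0.30,
    ("sunny", "read"): 0.10,
    ("rainy", "walk"): 0.05,
    ("rainy", "read"): 0.55,
}

# Marginal probability: sum the joint over the other variable.
# P(sunny) = P(sunny, walk) + P(sunny, read)
p_sunny = sum(p for (w, a), p in joint.items() if w == "sunny")

# Conditional probability: P(walk | sunny) = P(sunny, walk) / P(sunny)
p_walk_given_sunny = joint[("sunny", "walk")] / p_sunny

print(p_sunny)             # marginal, ≈ 0.40
print(p_walk_given_sunny)  # conditional, ≈ 0.75
```

Summing a joint distribution over the variables you don't care about gives a marginal; dividing a joint entry by a marginal gives a conditional. Those two operations are all the machinery the later Bayesian posts need.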

Sep 11, 2014

## Medical tests, a first example of Bayesian calculations

In this post I will discuss a first example of a Bayesian calculation using a well-known example of testing for breast cancer.
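The shape of the calculation can be sketched with Bayes' theorem in a few lines. The numbers below are the classic textbook version of the screening example (1% prevalence, 80% sensitivity, 9.6% false-positive rate); they are assumptions for illustration, not figures quoted from the post:

```python
# Illustrative numbers for the classic breast-cancer screening example.
p_cancer = 0.01              # prior: P(cancer)
p_pos_given_cancer = 0.80    # sensitivity: P(+ | cancer)
p_pos_given_healthy = 0.096  # false-positive rate: P(+ | no cancer)

# Law of total probability: P(+) over both hypotheses.
p_pos = (p_pos_given_cancer * p_cancer
         + p_pos_given_healthy * (1 - p_cancer))

# Bayes' theorem: P(cancer | +) = P(+ | cancer) P(cancer) / P(+)
p_cancer_given_pos = p_pos_given_cancer * p_cancer / p_pos

print(round(p_cancer_given_pos, 3))  # ≈ 0.078
```

With these numbers, a positive test raises the probability of cancer from 1% to only about 8%, because false positives from the large healthy population dominate true positives from the small sick one.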

Oct 24, 2014

## Inferring probabilities, a second example of Bayesian calculations

In this post I will focus on an example of inferring probabilities given a short data series. I will start by tackling the theory of how to do the desired inference in a Bayesian way and will end by implementing the theory in Python so that we can play around with the ideas. In an attempt to keep the post more accessible, I will only consider a small set of candidate probabilities. This restriction allows me to minimize the mathematical difficulty of the inference and still obtain really cool results, including nice plots of the **prior**, **likelihood** and **posterior**.
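A minimal version of that inference over a small discrete candidate set can be sketched as follows; the candidate values, uniform prior, and data series below are illustrative assumptions, not the ones used in the post:

```python
# Infer a coin's heads-probability from a short data series,
# considering only a small discrete set of candidate values.
candidates = [0.2, 0.4, 0.6, 0.8]
prior = [0.25, 0.25, 0.25, 0.25]   # uniform prior over the candidates

data = [1, 0, 1, 1, 0, 1]          # 1 = heads, 0 = tails (made up)
heads = sum(data)
tails = len(data) - heads

# Likelihood of the data under each candidate probability p:
# p^heads * (1 - p)^tails
likelihood = [p**heads * (1 - p)**tails for p in candidates]

# Posterior ∝ prior × likelihood, normalized to sum to 1.
unnorm = [pr * lk for pr, lk in zip(prior, likelihood)]
posterior = [u / sum(unnorm) for u in unnorm]

for p, post in zip(candidates, posterior):
    print(f"P(p = {p} | data) = {post:.3f}")
```

Restricting to a handful of candidates keeps everything as simple sums and products, which is exactly what makes the prior, likelihood and posterior easy to plot side by side.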

Nov 13, 2014

## Getting started with Latent Dirichlet Allocation in Python

In this post I will go over installation and basic usage of the **lda** Python package for Latent Dirichlet Allocation (LDA). I *will not* go through the theoretical foundations of the method in this post. However, the main reference for this model, *Blei et al. (2003)*, is freely available online. The main idea of assigning documents in a corpus (set of documents) to latent (hidden) topics based on a vector of words is fairly simple to understand, and the example will help to solidify our understanding of the LDA model.


Dec 11, 2014

## Inferring probabilities with a Beta prior, a third example of Bayesian calculations

In this post I will expand on a previous example of inferring probabilities from a data series. In particular, instead of considering a discrete set of candidate probabilities, I'll consider all (continuous) values between \( 0 \) and \( 1 \). This means our prior (and posterior) will now be a probability density function (pdf) instead of a probability mass function (pmf). More specifically, I'll use the Beta distribution for this example.
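The key property that makes the Beta distribution convenient here is conjugacy: a Beta(a, b) prior combined with k successes in n binary trials gives a Beta(a + k, b + n − k) posterior. A minimal sketch, with illustrative prior parameters and data (not the ones from the post):

```python
# Beta-Binomial conjugate update: Beta(a, b) prior, binary data,
# posterior is Beta(a + k, b + n - k). Numbers are illustrative.
a, b = 2.0, 2.0              # Beta prior parameters
data = [1, 0, 1, 1, 0, 1]    # 1 = success, 0 = failure (made up)
k, n = sum(data), len(data)

# Posterior parameters after observing k successes in n trials.
a_post = a + k
b_post = b + (n - k)

# Posterior summaries (the mode formula requires a_post, b_post > 1).
post_mean = a_post / (a_post + b_post)
post_mode = (a_post - 1) / (a_post + b_post - 2)

print(a_post, b_post)  # 6.0 4.0
print(post_mean)       # 0.6
print(post_mode)       # 0.625
```

Because the update is just parameter addition, no numerical integration is needed; the continuous posterior is available in closed form, which is what makes this the natural next step after the discrete-candidate example.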

May 4, 2015

## Probabilistic programming with Python and Lea

In this post I will cover installation of a probabilistic programming package for Python called Lea and provide some simple examples of using the package to do calculations with joint, conditional and marginal distributions. These examples follow the by-hand calculations done in a previous post.

May 27, 2015

## Revisiting the medical tests example with Python and Lea

In this post I will use Python, and the probabilistic programming package Lea, to re-analyze an example of Bayes' Theorem covered in an earlier post. The focus will be on translating the by-hand calculations into Python code.