Writing

Part 4: Monte Carlo Methods

DJ Rich

Monte Carlo methods answer the inference task with a set of samples, drawn approximately from the target distribution. In total, they provide a supremely general toolset. However, using them requires skill in managing the complexities of distributional convergence and autocorrelation.
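As a minimal illustration of the core idea (not drawn from the article itself): once we have samples from a distribution, an expectation is approximated by a simple average over them.

```python
import random

random.seed(0)

# Estimate E[f(X)] for X ~ Uniform(0, 1) and f(x) = x^2.
# The exact answer is 1/3; the sample average converges to it.
samples = [random.random() for _ in range(100_000)]
estimate = sum(x * x for x in samples) / len(samples)
print(estimate)  # close to 0.333...
```

The harder part, which the article addresses, is producing approximately-correct samples in the first place and knowing when the average has converged.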

Part 3: Variational Inference

DJ Rich

Variational Inference, a category of approximate inference algorithms, achieves efficiency by restricting inference to a computationally friendly set of distributions. Using tools from information theory, we may find the distribution that best approximates the results of exact inference.
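A minimal sketch of that idea, under illustrative assumptions not taken from the article: restrict the search to a simple family (here Bernoulli(theta) over two outcomes) and pick the member with the smallest KL divergence to a fixed target p.

```python
import math

# Illustrative target distribution over {0, 1}.
p = [0.25, 0.75]

def kl(q, p):
    """KL(q || p) for discrete distributions over the same support."""
    return sum(qi * math.log(qi / pi) for qi, pi in zip(q, p) if qi > 0)

# Grid search over the restricted family Bernoulli(theta), theta in (0, 1),
# for the member closest to p in KL divergence.
best_theta = min(
    (t / 1000 for t in range(1, 1000)),
    key=lambda t: kl([1 - t, t], p),
)
print(best_theta)  # 0.75: here the family contains p itself
```

Real variational methods replace the grid search with gradient-based optimization over far richer families, but the objective is the same in spirit.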

Part 2: Exact Inference

DJ Rich

Given a Probabilistic Graphical Model, exact inference algorithms exploit factorization and caching to answer questions about the system it represents.

Part 1: An Introduction to Probabilistic Graphical Models

DJ Rich

Probabilistic Graphical Models are born from a remarkable synthesis of probability theory and graph theory. They are among our most powerful tools for managing nature's baffling mixture of uncertainty and complexity.

Bias-Variance Trade-Off

DJ Rich

The bias-variance trade-off is a rare insight into the challenge of generalization.

Information Theory and Entropy

DJ Rich

Entropy and its related concepts quantify the otherwise abstract notion of information. A tour reveals entropy's relationship to information, binary encodings and uncertainty. Most intuitively, we're left with a simple analogy to 2D areas.
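For a discrete distribution, entropy is H(p) = -Σᵢ pᵢ log₂ pᵢ, measured in bits. A short illustration (the coin examples are my own, not from the article):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(p) = -sum(p_i * log2(p_i))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of information per flip.
print(entropy([0.5, 0.5]))  # 1.0

# A biased coin is more predictable, so each flip is less informative.
print(entropy([0.9, 0.1]))  # about 0.469
```

This matches the encoding view: outcomes of a fair coin need a full bit each, while a predictable source can be compressed below one bit per outcome on average.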

Generalized Linear Models

DJ Rich

A Generalized Linear Model, if viewed without knowledge of its motivation, can be a confusing tool. It's easier to understand if seen as a two-knob generalization of linear regression.
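The two knobs are the response distribution and the link function; turning them recovers familiar models. A hypothetical sketch of the prediction step (the function names here are my own):

```python
import math

def glm_predict(x, beta, inverse_link):
    """A GLM prediction: the linear predictor passed through an inverse link."""
    eta = sum(b * xi for b, xi in zip(beta, x))  # the linear part never changes
    return inverse_link(eta)

# Different inverse links yield different members of the family:
identity = lambda eta: eta                        # linear regression
logistic = lambda eta: 1 / (1 + math.exp(-eta))   # logistic regression (logit link)
exp_mean = lambda eta: math.exp(eta)              # Poisson regression (log link)

print(glm_predict([1.0, 2.0], [0.5, 0.25], identity))  # 1.0
print(glm_predict([1.0], [0.0], logistic))             # 0.5
```

The second knob, the response distribution (Gaussian, Bernoulli, Poisson, ...), matters for fitting: it determines the likelihood being maximized.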

Jensen's Inequality

DJ Rich

A visual makes Jensen's Inequality obvious and intuitive.
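The inequality states that for a convex function f, f(E[X]) ≤ E[f(X)]. A quick numerical check, with f(x) = x² and standard normal samples as arbitrary illustrative choices:

```python
import random

random.seed(0)

f = lambda x: x * x  # a convex function
xs = [random.gauss(0, 1) for _ in range(100_000)]

f_of_mean = f(sum(xs) / len(xs))             # f(E[X]), near f(0) = 0
mean_of_f = sum(f(x) for x in xs) / len(xs)  # E[f(X)], near Var(X) = 1
assert f_of_mean <= mean_of_f                # Jensen's inequality
```

Here the gap between the two sides is exactly the variance of X, which hints at why the inequality is strict whenever f is strictly convex and X is non-degenerate.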

The Fisher Information

DJ Rich

The Fisher Information quantifies the information an observation carries for a parameter. The quantification becomes intuitive once we see it measuring a certain geometric quality.
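One standard characterization: the Fisher information is the variance of the score, the derivative of the log-likelihood with respect to the parameter. A rough Monte Carlo check for a Bernoulli(p) model, where I(p) = 1/(p(1-p)) (the specific p and sample size here are arbitrary choices of mine):

```python
import random

random.seed(1)

p = 0.3

# Score of one Bernoulli draw x: d/dp log(p^x * (1-p)^(1-x)) = x/p - (1-x)/(1-p)
score = lambda x: x / p - (1 - x) / (1 - p)

xs = [1 if random.random() < p else 0 for _ in range(200_000)]
scores = [score(x) for x in xs]

mean = sum(scores) / len(scores)                       # should be near 0
var = sum((s - mean) ** 2 for s in scores) / len(scores)
print(var)  # close to 1 / (p * (1 - p)) = 4.76...
```

Small p makes each observation highly informative about p, which the diverging 1/(p(1-p)) reflects.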