# Writing

### Part 2: Exact Inference

Given a Probabilistic Graphical Model, exact inference algorithms exploit factorization and caching to answer questions about the system it represents.
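As a taste of the factorization being exploited (a standard textbook identity, not specific to this essay), a Bayesian network's joint distribution factors into local conditionals:

$$
p(x_1, \dots, x_n) = \prod_{i=1}^{n} p\big(x_i \mid \mathrm{pa}(x_i)\big)
$$

where $\mathrm{pa}(x_i)$ denotes the parents of $x_i$ in the graph. Exact inference algorithms push sums inside this product and cache intermediate results, so marginals can be computed without ever materializing the full joint table.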

### Part 1: An Introduction to Probabilistic Graphical Models

Probabilistic Graphical Models are born from a remarkable synthesis of probability theory and graph theory. They are among our most powerful tools for managing nature's baffling mixture of uncertainty and complexity.

### Bias-Variance Trade-Off

The bias-variance trade-off offers a rare insight into the challenge of generalization.
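The trade-off is usually summarized by the standard decomposition of expected squared error (stated here under the usual assumption of additive noise with variance $\sigma^2$, with expectations taken over training sets):

$$
\mathbb{E}\big[(y - \hat{f}(x))^2\big]
= \underbrace{\big(\mathbb{E}[\hat{f}(x)] - f(x)\big)^2}_{\text{bias}^2}
+ \underbrace{\mathrm{Var}\big[\hat{f}(x)\big]}_{\text{variance}}
+ \sigma^2
$$

Flexible models shrink the bias term but inflate the variance term; the irreducible $\sigma^2$ is untouchable either way.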

### Information Theory and Entropy

Entropy and its related concepts quantify the otherwise abstract notion of information. A tour through entropy reveals its relationship to binary encodings and uncertainty. Most intuitively, we're left with a simple analogy to 2D areas.
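The central quantity is Shannon's entropy of a discrete random variable $X$:

$$
H(X) = -\sum_{x} p(x) \log_2 p(x)
$$

Measured in bits, $H(X)$ is the expected code length of an optimal binary encoding of $X$'s outcomes, which is what ties information, encodings, and uncertainty together.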

### Generalized Linear Models

A Generalized Linear Model, viewed without knowledge of its motivation, can be a confusing tool. It's easier to understand when seen as a two-knob generalization of linear regression.
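Concretely, the two knobs are the response distribution (an exponential family) and the link function $g$ connecting the mean response to the linear predictor:

$$
g\big(\mathbb{E}[y \mid x]\big) = x^\top \beta
$$

Choosing a Gaussian response with the identity link recovers ordinary linear regression; a Bernoulli response with the logit link gives logistic regression.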

### The Fisher Information

The Fisher Information quantifies the information an observation carries about a parameter. The quantification becomes intuitive once we see it as measuring a certain geometric quality.
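For a scalar parameter $\theta$, the standard definition (with the usual regularity conditions) is:

$$
\mathcal{I}(\theta)
= \mathbb{E}\left[\left(\frac{\partial}{\partial \theta} \log p(x;\theta)\right)^2\right]
= -\,\mathbb{E}\left[\frac{\partial^2}{\partial \theta^2} \log p(x;\theta)\right]
$$

The second form hints at the geometry: the Fisher Information is the expected curvature of the log-likelihood around $\theta$, so sharper peaks mean more informative observations.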

### The Copula and 2008

The copula provides a clever means for mixing and matching a set of marginal distributions with the dependence structure of a joint distribution. However, its elegance and utility have been a dangerous lure.
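The mix-and-match claim is made precise by Sklar's theorem: any joint CDF $F$ with marginals $F_1, \dots, F_d$ can be written as

$$
F(x_1, \dots, x_d) = C\big(F_1(x_1), \dots, F_d(x_d)\big)
$$

where the copula $C$ is itself a joint CDF on $[0,1]^d$ with uniform marginals. The marginals and the dependence structure can thus be specified, and swapped, independently.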

### The Matrix Inversion Lemma

The Matrix Inversion Lemma looks intimidating, but it's easy to recognize when it applies--and applying it offers considerable computational speed-ups.
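For reference, the lemma (also known as the Sherman-Morrison-Woodbury identity) states, for conformable matrices with the relevant inverses existing:

$$
(A + UCV)^{-1} = A^{-1} - A^{-1} U \big(C^{-1} + V A^{-1} U\big)^{-1} V A^{-1}
$$

The speed-up comes when $A^{-1}$ is cheap (say, $A$ is diagonal) and $C$ is small: a full $n \times n$ inversion is replaced by a $k \times k$ one, with $k \ll n$.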

### Bayesian Optimization

When optimizing a slow-to-evaluate and non-differentiable function, one might think random sampling is the only option--a naive approach likely to disappoint. However, Bayesian optimization, which cleverly exploits the function's assumed smoothness, disconfirms this intuition.
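One common formulation (expected improvement under a Gaussian process surrogate; other acquisition functions exist) picks the next evaluation point by maximizing

$$
\mathrm{EI}(x) = \mathbb{E}\big[\max\big(f(x) - f^{*},\, 0\big)\big]
= \big(\mu(x) - f^{*}\big)\,\Phi(z) + \sigma(x)\,\phi(z),
\qquad z = \frac{\mu(x) - f^{*}}{\sigma(x)}
$$

where $\mu(x)$ and $\sigma(x)$ are the surrogate's posterior mean and standard deviation, $f^{*}$ is the best value observed so far, and $\Phi$, $\phi$ are the standard normal CDF and PDF. Each expensive evaluation thus goes where the model expects the biggest payoff, balancing exploration against exploitation.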