Machine Learning and Other Topics

Bias-Variance Trade-Off

DJ Rich

The bias-variance trade-off offers a rare insight into the challenge of generalization.
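As a minimal illustration of the trade-off (the sine target, noise level, and polynomial degrees below are arbitrary choices for the sketch, not from the post): refitting a simple and a flexible model to many noisy resamples shows the simple model's error dominated by bias and the flexible model's by variance.

```python
import numpy as np

rng = np.random.default_rng(0)
x_train = np.linspace(0, 3, 15)
x_test = 1.5
f = np.sin  # stand-in "true" function

def fit_predict(degree):
    """Fit a polynomial of the given degree to a fresh noisy sample,
    then predict at x_test."""
    y = f(x_train) + rng.normal(0, 0.3, x_train.size)
    coefs = np.polyfit(x_train, y, degree)
    return np.polyval(coefs, x_test)

results = {}
for degree in (1, 9):  # a rigid model vs. a flexible one
    preds = np.array([fit_predict(degree) for _ in range(500)])
    bias_sq = (preds.mean() - f(x_test)) ** 2  # squared bias at x_test
    variance = preds.var()                     # variance across refits
    results[degree] = (bias_sq, variance)
```

The degree-1 fit shows higher squared bias and lower variance than the degree-9 fit, which is the trade-off in miniature.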

Information Theory and Entropy

DJ Rich

Entropy and its related concepts quantify the otherwise abstract concept of information. A tour reveals its relationship to binary encodings and uncertainty. Most intuitively, we're left with a simple analogy to 2D areas.
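A small sketch of the encoding connection (the example distributions are illustrative): entropy in bits matches the average length of an optimal binary encoding, so eight equally likely outcomes cost exactly three bits, while a predictable coin costs well under one.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits: the average surprise -log2 p(x)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * log 0 = 0
    return float(-(p * np.log2(p)).sum())

h_coin = entropy([0.5, 0.5])     # fair coin: 1 bit
h_uniform8 = entropy([1/8] * 8)  # 8 equally likely outcomes: 3 bits
h_skewed = entropy([0.9, 0.1])   # predictable coin: under half a bit
```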

Generalized Linear Models

DJ Rich

A Generalized Linear Model, if viewed without knowledge of its motivation, can be a confusing tool. It's easier to understand when seen as a two-knob generalization of linear regression.
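To make the two knobs concrete (the data and constants below are an illustrative sketch, not from the post): a GLM keeps the linear predictor `X @ beta` and turns two dials, the response distribution and the link function. Setting them to Bernoulli and logit gives logistic regression, fit here by Newton's method; Gaussian and identity would recover ordinary least squares.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
true_beta = np.array([-0.5, 2.0])
y = rng.random(200) < 1 / (1 + np.exp(-X @ true_beta))  # Bernoulli draws

beta = np.zeros(2)
for _ in range(25):                   # Newton-Raphson iterations
    mu = 1 / (1 + np.exp(-X @ beta))  # inverse logit link: mean response
    W = mu * (1 - mu)                 # Bernoulli variance function
    grad = X.T @ (y - mu)
    hess = X.T @ (X * W[:, None])
    beta = beta + np.linalg.solve(hess, grad)
```

Swapping the variance function and inverse link is all it takes to move between members of the family.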

Jensen's Inequality

DJ Rich

A visual makes Jensen's Inequality obvious and intuitive.
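A quick numerical check of the inequality itself (the choice of a normal variable and the exponential as the convex function is arbitrary): for convex f, E[f(X)] ≥ f(E[X]).

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)  # samples of a random variable X

f = np.exp        # a convex function
lhs = f(x).mean() # Monte Carlo estimate of E[f(X)]
rhs = f(x.mean()) # f(E[X])
gap = lhs - rhs   # Jensen: nonnegative for convex f
```

For a standard normal, E[e^X] = e^{1/2} ≈ 1.65 while e^{E[X]} = 1, so the gap is clearly positive.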

The Fisher Information

DJ Rich

The Fisher Information quantifies the information an observation carries about a parameter. The quantification becomes intuitive once we see it measuring a certain geometric quality.
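A small check using the Bernoulli distribution (chosen here for illustration): the Fisher information is the expected squared score, and for Bernoulli(p) it works out to 1/(p(1-p)).

```python
import numpy as np

p = 0.3  # Bernoulli parameter
# Score: derivative of the log-likelihood of one observation x,
#   d/dp [x log p + (1 - x) log(1 - p)] = x/p - (1 - x)/(1 - p)
score = np.array([-1 / (1 - p),  # x = 0
                  1 / p])        # x = 1
probs = np.array([1 - p, p])

fisher = float((probs * score**2).sum())  # E[score^2]
closed_form = 1 / (p * (1 - p))           # known Bernoulli result
```

The information blows up as p nears 0 or 1, where a single observation pins the parameter down sharply.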

Bayesian Optimization

DJ Rich

When optimizing a slow-to-evaluate and non-differentiable function, one may think random sampling is the only option, a naive approach likely to disappoint. However, Bayesian optimization, which cleverly exploits the function's assumed smoothness, disconfirms this intuition.
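A minimal sketch of the loop (the objective, RBF kernel length scale, and upper-confidence-bound acquisition below are illustrative choices, not the post's setup): fit a Gaussian-process surrogate to the points evaluated so far, then spend the next expensive evaluation wherever the surrogate's optimism is greatest.

```python
import numpy as np

def objective(x):  # stands in for an expensive black-box function
    return -(x - 0.6) ** 2

def rbf(a, b, scale=0.2):
    """Squared-exponential kernel encoding the smoothness assumption."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / scale**2)

grid = np.linspace(0, 1, 200)
x_seen = np.array([0.0, 1.0])  # two initial evaluations
y_seen = objective(x_seen)

for _ in range(8):
    K = rbf(x_seen, x_seen) + 1e-8 * np.eye(x_seen.size)  # jitter for stability
    k = rbf(grid, x_seen)
    mean = k @ np.linalg.solve(K, y_seen)                       # GP posterior mean
    var = 1.0 - np.sum(k * np.linalg.solve(K, k.T).T, axis=1)   # GP posterior variance
    ucb = mean + 2.0 * np.sqrt(np.clip(var, 0, None))  # upper confidence bound
    x_next = grid[np.argmax(ucb)]                      # most promising point
    x_seen = np.append(x_seen, x_next)
    y_seen = np.append(y_seen, objective(x_next))

best_x = x_seen[np.argmax(y_seen)]
```

With a handful of evaluations the surrogate concentrates near the maximum at 0.6, far better than the same budget spent on blind random samples.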

The Exponential Family

DJ Rich

The exponential family is a generalization of distributions, inclusive of many familiar ones plus a universe of others. The general form brings elegant properties, illuminating all distributions within. In this post, we discuss what it is, how it applies and some of its properties.
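One familiar member, written in the family's general form as a sketch (the Bernoulli choice is illustrative): p(x) = h(x) exp(θ·x − A(θ)) with natural parameter θ = log(p/(1−p)) and log-partition A(θ) = log(1 + e^θ). One of the elegant properties is that A′(θ) recovers the mean.

```python
import numpy as np

p = 0.3
theta = np.log(p / (1 - p))  # natural parameter of Bernoulli(p)

def A(t):
    """Log-partition function of the Bernoulli family."""
    return np.log1p(np.exp(t))

# The exponential-family form reproduces the familiar pmf:
pmf = [float(np.exp(theta * x - A(theta))) for x in (0, 1)]

# The derivative of A at theta gives E[x] (checked numerically):
eps = 1e-6
mean = (A(theta + eps) - A(theta - eps)) / (2 * eps)
```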

The Trace as a Measure of Complexity

DJ Rich

For a class of models, the trace provides a measure of model complexity that's useful for managing the bias-variance trade-off.
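A concrete instance, using ridge regression as the sketch (the data shape and penalty value are arbitrary): the trace of the hat matrix X(XᵀX + λI)⁻¹Xᵀ gives the model's effective degrees of freedom, which equals the parameter count at λ = 0 and shrinks toward zero as the penalty grows.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))  # 50 observations, 5 features

def effective_df(lam):
    """Trace of the ridge hat matrix X (X'X + lam I)^{-1} X'."""
    H = X @ np.linalg.solve(X.T @ X + lam * np.eye(5), X.T)
    return float(np.trace(H))

df_ols = effective_df(0.0)     # no shrinkage: trace equals 5 parameters
df_ridge = effective_df(10.0)  # shrinkage lowers effective complexity
```

Turning λ up trades variance for bias, and the trace tracks exactly how much fitting capacity remains.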