Machine Learning and Other Topics
When does the Delta Method approximation work?
I explore when the Delta Method approximation works and when it fails.
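For context, the approximation in question is the standard first-order statement (my summary of the textbook result, not the post's own phrasing): if $g$ is differentiable at $\mu$ with $g'(\mu) \neq 0$, then

$$\sqrt{n}\,(\bar{X}_n - \mu) \xrightarrow{d} \mathcal{N}(0, \sigma^2) \quad\Longrightarrow\quad \sqrt{n}\,\bigl(g(\bar{X}_n) - g(\mu)\bigr) \xrightarrow{d} \mathcal{N}\bigl(0, \sigma^2\, g'(\mu)^2\bigr)$$

The post probes when this first-order picture is and isn't adequate.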
Bias-Variance Trade-Off
The bias-variance trade-off is a rare insight into the challenge of generalization.
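For reference, the decomposition that names the trade-off, under squared-error loss (a standard identity, included only as context):

$$\mathbb{E}\bigl[(y - \hat{f}(x))^2\bigr] = \underbrace{\bigl(\mathbb{E}[\hat{f}(x)] - f(x)\bigr)^2}_{\text{bias}^2} + \underbrace{\operatorname{Var}\bigl(\hat{f}(x)\bigr)}_{\text{variance}} + \underbrace{\sigma^2}_{\text{irreducible noise}}$$

More flexible models shrink the first term while inflating the second; generalization hinges on the balance.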
Information Theory and Entropy
Entropy and its related concepts quantify the otherwise abstract notion of information. A tour reveals entropy's relationship to binary encodings and uncertainty. Most intuitively, we're left with a simple analogy to 2D areas.
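The central quantity is Shannon entropy (the standard definition, stated here only as a teaser):

$$H(X) = -\sum_x p(x)\,\log_2 p(x)$$

A fair coin yields $H = 1$ bit, while a two-headed coin yields $H = 0$: a certain outcome carries no information.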
Generalized Linear Models
A Generalized Linear Model, if viewed without knowledge of its motivation, can be a confusing tool. It's easier to understand when seen as a two-knob generalization of linear regression.
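Concretely, the two knobs are the response distribution and the link function $g$ (the standard formulation, summarized here rather than quoted from the post):

$$g\bigl(\mathbb{E}[Y \mid X]\bigr) = X\beta, \qquad Y \mid X \sim \text{an exponential-family distribution}$$

The identity link with a Gaussian response recovers ordinary linear regression; a log link with a Poisson response gives Poisson regression.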
The Fisher Information
The Fisher Information quantifies the information an observation carries about a parameter. The quantification becomes intuitive once we see it as measuring a certain geometric quantity.
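For reference, the standard definition (the post supplies the geometric reading):

$$I(\theta) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{2}\right] = -\,\mathbb{E}\!\left[\frac{\partial^2}{\partial\theta^2}\log f(X;\theta)\right]$$

with the second equality holding under the usual regularity conditions.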
Bayesian Optimization
When optimizing a slow-to-evaluate, non-differentiable function, one may think random sampling is the only option, a naive approach likely to disappoint. However, Bayesian optimization, which cleverly exploits the function's assumed smoothness, proves these intuitions wrong.
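As a sketch of how the smoothness assumption gets exploited, here is a minimal expected-improvement loop built on scikit-learn's Gaussian process regressor; `expensive_f` is a hypothetical stand-in for the slow objective, and none of this is the post's own implementation:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expensive_f(x):
    # Hypothetical stand-in for a slow, non-differentiable objective.
    return np.sin(3 * x) + 0.5 * np.abs(x - 1)

rng = np.random.default_rng(0)
X = rng.uniform(0, 2, size=(3, 1))   # a few initial evaluations
y = expensive_f(X).ravel()

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
grid = np.linspace(0, 2, 500).reshape(-1, 1)   # candidate points

for _ in range(10):
    gp.fit(X, y)
    mu, sigma = gp.predict(grid, return_std=True)
    best = y.min()
    # Expected improvement below the best value seen so far (minimization).
    z = (best - mu) / np.maximum(sigma, 1e-9)
    ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    x_next = grid[[np.argmax(ei)]]   # keep the (1, 1) shape
    X = np.vstack([X, x_next])
    y = np.append(y, expensive_f(x_next).ravel())

print("best x:", X[np.argmin(y)].item(), "best f:", y.min())
```

Each round, the surrogate's posterior steers the next evaluation toward points that are either promising or uncertain, which is exactly where smoothness makes a handful of samples informative.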
The Exponential Family
The exponential family is a broad family of distributions, inclusive of many familiar ones plus a universe of others. Its general form brings elegant properties that illuminate every distribution within it. In this post, we discuss what it is, how it's applied, and some of its properties.
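The general form in question is the canonical one (standard notation, not specific to the post):

$$f(x \mid \eta) = h(x)\,\exp\bigl(\eta^{\top} T(x) - A(\eta)\bigr)$$

For instance, the Bernoulli distribution fits with $T(x) = x$, natural parameter $\eta = \log\frac{p}{1-p}$, and $A(\eta) = \log(1 + e^{\eta})$.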
Motivating the Gini Impurity Metric
We reveal the Gini impurity metric as the destination of a few natural steps.
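For orientation, the destination itself is the probability that two independent draws from a node disagree in class; a minimal sketch (the natural steps are the post's subject, not repeated here):

```python
import numpy as np

def gini(labels):
    # Gini impurity: the chance two independent draws from the node disagree.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

print(gini(["a", "a", "b", "b"]))  # 0.5, the maximum for two classes
```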
The Trace as a Measure of Complexity
For a class of models, the trace provides a measure of model complexity that's useful for managing the bias-variance trade-off.
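For linear smoothers, where predictions take the form $\hat{y} = Hy$, the measure is $\operatorname{tr}(H)$, the effective degrees of freedom. A quick numeric illustration with ridge regression on made-up data (my example, not the post's):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))

for lam in [0.0, 1.0, 10.0, 100.0]:
    # Ridge hat matrix H maps observed y to fitted values: y_hat = H @ y.
    H = X @ np.linalg.solve(X.T @ X + lam * np.eye(5), X.T)
    print(f"lambda={lam:6.1f}  trace(H)={np.trace(H):.3f}")

# trace(H) starts at 5 (plain least squares on 5 columns) and shrinks
# toward 0 as lambda grows, i.e. the model gets effectively simpler.
```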
A Brief Explanation and Application of Gaussian Processes
A clever and useful technique for inferring distributions over an infinite space of functions using only finite observations.
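A minimal illustration of that inference using scikit-learn (an illustrative stand-in, not the post's application):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)
X_train = rng.uniform(0, 5, size=(6, 1))   # six finite observations
y_train = np.sin(X_train).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0))
gp.fit(X_train, y_train)

X_test = np.linspace(0, 5, 100).reshape(-1, 1)
mean, std = gp.predict(X_test, return_std=True)   # posterior over functions
# std is near 0 at the observations and widens between them: finite data
# pins down a distribution over infinitely many candidate functions.
print(round(std.min(), 3), round(std.max(), 3))
```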