Linear Digressions

K Nearest Neighbors

K Nearest Neighbors is an algorithm with secrets. On one hand, the algorithm itself is as straightforward as possible: find the labeled points nearest the point that you need to predict, and make a prediction that’s the average of their answers. On the other hand, what does “nearest” mean when you’re dealing with complex data? How do you decide whether a man and a woman of the same age are “nearer” to each other than two women several years apart? What if you convert all your monetary columns from dollars to cents, your distances from miles to nanometers, your weights from pounds to kilograms? Can your definition of “nearest” hold up under these types of transformations? We’re discussing all this, and more, in this week’s episode.
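To make the mechanics concrete, here is a minimal sketch in Python (assuming NumPy; the toy dataset, the knn_predict helper, and the miles-to-meters rescaling are all illustrative, not from the episode). It averages the labels of the k nearest points under Euclidean distance, then rescales one column to show how a simple unit change can reshuffle which points count as "nearest."

```python
import numpy as np

def knn_predict(X_train, y_train, x_query, k=3):
    """Average the labels of the k training points nearest to x_query."""
    # "Nearest" here means plain Euclidean distance in feature space.
    distances = np.linalg.norm(X_train - x_query, axis=1)
    nearest = np.argsort(distances)[:k]  # indices of the k closest points
    return y_train[nearest].mean()       # the neighbors' average answer

# Toy data (invented for illustration): columns are age in years
# and commute distance in miles; y is some numeric outcome.
X = np.array([[25, 10.0],
              [30, 12.0],
              [28, 11.0],
              [60, 10.5]])
y = np.array([1.0, 2.0, 1.5, 5.0])
query = np.array([27, 10.8])

print(knn_predict(X, y, query, k=2))  # neighbors are the two similar-aged points

# Now convert the second column from miles to meters. Nothing about the
# data has changed, but that column's differences are ~1600x larger, so
# it dominates the Euclidean metric and the neighbor set reshuffles.
scale = np.array([1.0, 1609.34])
print(knn_predict(X * scale, y, query * scale, k=2))
```

In this toy example the rescaled query picks up a different pair of neighbors, and the prediction changes with it; standardizing each column before measuring distances is one common way to guard against exactly this.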

Next Episodes


The Assumptions of Ordinary Least Squares @ Linear Digressions

📆 2019-02-04 00:24 / 00:25:07


Quantile Regression @ Linear Digressions

📆 2019-01-28 02:27 / 00:21:46


Heterogeneous Treatment Effects @ Linear Digressions

📆 2019-01-21 00:57 / 00:17:24