Linear Digressions

Ensemble Algorithms

If one machine learning model is good, are two models better? In a lot of cases, the answer is yes. If you build many okay models and then bring them all together, using them in combination to make your final predictions, you've just created an ensemble model. It feels a little like cheating, as if you got something for nothing, but the results don't lie: algorithms like Random Forests and Gradient Boosted Trees (two types of ensemble algorithms) are some of the strongest out-of-the-box algorithms for classic supervised classification problems. What makes a Random Forest random, and what does it mean to gradient boost a tree? Have a listen and find out.
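As a rough illustration of the idea (not from the episode itself), here is a minimal sketch using scikit-learn, which provides both of the ensemble algorithms mentioned above. It compares a single decision tree against a Random Forest and Gradient Boosting on a synthetic classification dataset; the dataset parameters and model settings are arbitrary choices for demonstration.

```python
# A hypothetical comparison: one decision tree vs. two ensembles of trees.
# Dataset and hyperparameters are illustrative assumptions, not tuned values.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary classification problem.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

models = {
    "single decision tree": DecisionTreeClassifier(random_state=0),
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "gradient boosting": GradientBoostingClassifier(n_estimators=100, random_state=0),
}

# 5-fold cross-validated accuracy for each model.
for name, model in models.items():
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: {score:.3f}")
```

On most datasets of this kind, the two ensembles will outscore the single tree: the forest averages many decorrelated trees to reduce variance, while boosting fits each new tree to the errors of the ones before it.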

Next Episodes

How to evaluate a translation: BLEU scores @ Linear Digressions

📆 2017-01-16 02:59 / 00:17:06

Zero Shot Translation @ Linear Digressions

📆 2017-01-09 04:20 / 00:25:32

Google Neural Machine Translation @ Linear Digressions

📆 2017-01-02 02:44 / 00:18:12