Linear Digressions

Backpropagation

The reason neural nets are taking over the world right now is that they can be trained efficiently with the backpropagation algorithm. In short, backprop adjusts the weights of a neural net based on how well it is classifying the training examples, so the net gets better and better at making predictions. In this episode we talk backpropagation, and how it makes it possible to train the neural nets we know and love.
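To make the idea concrete, here is a minimal sketch of backprop-style weight updates for a single sigmoid neuron on a toy dataset. This is not code from the episode: the dataset, learning rate, and loss (squared error) are all illustrative assumptions, and a real neural net would chain this same gradient computation backward through many layers.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy training set (assumption for illustration): learn OR from two binary inputs.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

w = [0.0, 0.0]  # weights
b = 0.0         # bias
lr = 1.0        # learning rate (chosen for the toy example)

for epoch in range(1000):
    for x, target in data:
        # Forward pass: compute the neuron's prediction.
        z = w[0] * x[0] + w[1] * x[1] + b
        y = sigmoid(z)
        # Backward pass: for squared-error loss L = (y - t)^2 / 2,
        # the chain rule gives dL/dz = (y - t) * y * (1 - y).
        dz = (y - target) * y * (1 - y)
        # Gradient-descent step: adjust each weight by its share of the error.
        w[0] -= lr * dz * x[0]
        w[1] -= lr * dz * x[1]
        b -= lr * dz

# After training, the rounded predictions should match the targets.
preds = [round(sigmoid(w[0] * x[0] + w[1] * x[1] + b)) for x, _ in data]
print(preds)  # → [0, 1, 1, 1]
```

The "backward" part here is just the chain rule applied once; in a multi-layer net, backprop reuses each layer's `dz` to compute the gradients of the layer before it, which is what makes training deep networks efficient.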

Next Episodes

Text Analysis on the State Of The Union @ Linear Digressions

📆 2016-02-26 04:51 / 00:22:22


Paradigms in Artificial Intelligence @ Linear Digressions

📆 2016-02-22 05:32 / 00:17:20


Survival Analysis @ Linear Digressions

📆 2016-02-19 04:44 / 00:15:21


Gravitational Waves @ Linear Digressions

📆 2016-02-15 03:46 / 00:20:26


The Turing Test @ Linear Digressions

📆 2016-02-12 05:11 / 00:15:15