Linear Digressions

Google Neural Machine Translation

Recently, Google swapped out the backend for Google Translate, moving from a statistical phrase-based method to a recurrent neural network. This marks a big change in methodology: the tried-and-true statistical translation methods that have been in use for decades are giving way to a neural net that, across the board, appears to give more fluent and natural-sounding translations. This episode recaps statistical phrase-based methods, digs into the RNN architecture a little bit, and recaps the impressive results that are making us all sound a little better in our non-native languages.
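To make the RNN idea concrete, here is a minimal sketch of a vanilla recurrent cell, the basic building block that architectures like Google's translation model elaborate on (the real system uses stacked LSTM layers with attention; this toy version, with made-up weights, just shows how a hidden state carries information across a sequence):

```python
import math

def rnn_step(x, h_prev, W_xh, W_hh, b):
    """One vanilla RNN step: h_t = tanh(W_xh @ x_t + W_hh @ h_{t-1} + b)."""
    h = []
    for i in range(len(b)):
        s = b[i]
        s += sum(W_xh[i][j] * x[j] for j in range(len(x)))
        s += sum(W_hh[i][j] * h_prev[j] for j in range(len(h_prev)))
        h.append(math.tanh(s))
    return h

# Toy 2-unit RNN run over a 3-step input sequence (weights are arbitrary).
W_xh = [[0.5], [-0.3]]           # input-to-hidden weights (2 hidden units, 1 input)
W_hh = [[0.1, 0.0], [0.0, 0.1]]  # hidden-to-hidden (recurrent) weights
b = [0.0, 0.0]
h = [0.0, 0.0]                   # initial hidden state
for x in ([1.0], [0.5], [-1.0]):
    h = rnn_step(x, h, W_xh, W_hh, b)
```

The key point is the recurrence: each step's hidden state feeds into the next, so the network can condition its output on everything it has read so far, rather than on a fixed window of phrases.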

Next Episodes



How to Lose at Kaggle @ Linear Digressions

📆 2016-12-12 05:28 / 00:17:16


Attacking Discrimination in Machine Learning @ Linear Digressions

📆 2016-12-05 04:38 / 00:23:20


Recurrent Neural Nets @ Linear Digressions

📆 2016-11-28 03:47 / 00:12:36