Bias Variance Tradeoff

Data Skeptic

A discussion of the expected number of cars at a stoplight frames today's discussion of the bias variance tradeoff. The central idea of this concept relates to model complexity. A very simple model will tend to have high bias: its simplicity can prevent it from capturing the relationship between the covariates and the output, so it underfits, even though its predictions stay fairly stable from one training set to another. As a model grows more and more complex, it can capture more of the underlying structure, but the risk that it overfits the training data and therefore does not generalize (high variance) increases. The tradeoff between minimizing bias and minimizing variance is an ongoing challenge for data scientists, and an important discussion for skeptics around how much we should trust models.
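
To make the tradeoff concrete, here is a minimal sketch (not from the episode; the synthetic data and polynomial degrees are illustrative assumptions) that fits polynomials of increasing degree to a noisy curve and compares training and test error. Typically the low-degree fit shows high error on both sets (bias dominates), while the high-degree fit drives training error down but test error up (variance dominates).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: a noisy sine wave stands in for the true relationship.
x_train = np.sort(rng.uniform(0, 2 * np.pi, 30))
y_train = np.sin(x_train) + rng.normal(0, 0.3, x_train.size)
x_test = np.sort(rng.uniform(0, 2 * np.pi, 200))
y_test = np.sin(x_test) + rng.normal(0, 0.3, x_test.size)

for degree in (1, 3, 10):
    # Fit a polynomial of the given degree to the training data only.
    coeffs = np.polyfit(x_train, y_train, degree)
    # Mean squared error on training data vs. held-out test data.
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train MSE {train_err:.3f}, test MSE {test_err:.3f}")
```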

Next Episodes

Big Data Doesn't Exist @ Data Skeptic

📆 2015-11-06 01:00

Covariance and Correlation @ Data Skeptic

📆 2015-10-30 01:00

Bayesian A/B Testing @ Data Skeptic

📆 2015-10-23 02:00

The Central Limit Theorem @ Data Skeptic

📆 2015-10-16 02:00

Accessible Technology @ Data Skeptic

📆 2015-10-09 02:00