Predictive modeling is one of the figureheads of big data. Machine learning theory asserts that more data is better, and empirical observations suggest that more granular data improves performance (provided you have modern algorithms and big data). But the paradox of predictive modeling is that when you need models the most, even all the data is not enough: there are only so many people buying luxury cars online. So even in this day and age of big data, there remains an art to predictive modeling in situations where the right data is scarce. This talk will present a number of cases where enough of the right data is simply not obtainable. In those instances we discuss some tricks of the trade, including transfer learning and quantile estimation.
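To make the transfer-learning idea concrete, here is a minimal sketch (not from the talk itself; the data and shrinkage setup are illustrative assumptions): when the target domain has only a handful of rows, fit coefficients on an abundant related source domain, then fit the target model while shrinking toward the source coefficients instead of toward zero.

```python
import numpy as np

rng = np.random.default_rng(0)

# Abundant source domain (synthetic stand-in for, say, mainstream-car shoppers)
X_src = rng.normal(size=(5000, 8))
w_true = rng.normal(size=8)
y_src = X_src @ w_true + rng.normal(scale=0.5, size=5000)

# Scarce target domain (e.g. luxury-car buyers): related but shifted
# coefficients, and only 30 observations
w_shift = w_true + rng.normal(scale=0.2, size=8)
X_tgt = rng.normal(size=(30, 8))
y_tgt = X_tgt @ w_shift + rng.normal(scale=0.5, size=30)

# Step 1: ordinary least squares on the big source sample
w_src, *_ = np.linalg.lstsq(X_src, y_src, rcond=None)

# Step 2: transfer -- fit the target, shrinking toward the source solution:
#   minimize ||y_tgt - X_tgt w||^2 + lam * ||w - w_src||^2
# which has the closed form below
lam = 5.0
A = X_tgt.T @ X_tgt + lam * np.eye(8)
b = X_tgt.T @ y_tgt + lam * w_src
w_transfer = np.linalg.solve(A, b)
```

With `lam = 0` this reduces to a plain fit on the scarce target data; as `lam` grows, the target model leans more heavily on what the source domain already learned.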
