ICML 2011 Workshop on Unsupervised and Transfer Learning
Data Dependent Loss Functions for Focused Generalization and Transfer Learning
Farzaneh Mirzazadeh and Dale Schuurmans
Department of Computing Science, University of Alberta
Edmonton, AB, Canada T6G 2E8
We investigate a method for using data dependent loss functions to
focus generalization and transfer learning. The idea is to construct
loss functions that encourage more accurate predictions in the densest
regions of the output space. In particular, we use the inverse
cumulative distribution function (cdf, estimated from the data) over
the targets to define a transfer function that maps linear
pre-predictions to nonlinear post-predictions. By composing this
inverse cdf with a target cumulative distribution function on the
pre-prediction space, any desired distribution can be induced on the
pre-prediction values via the induced transfer function. We demonstrate the utility of this
approach by applying it to an image reconstruction problem, showing
that the resulting regressors achieve smaller test error than existing
regressors in the presence of noise. We furthermore demonstrate that data
dependent loss functions provide a promising technique for transfer
between related problems.
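The inverse-cdf transfer described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the empirical quantile estimator, the logistic choice of pre-prediction cdf, and the helper name `make_transfer` are all assumptions introduced here for concreteness.

```python
import numpy as np

def make_transfer(targets, pre_cdf):
    """Compose the empirical inverse cdf of the observed targets with a
    chosen cdf over the pre-prediction space, yielding a transfer function
    that maps linear pre-predictions to nonlinear post-predictions."""
    sorted_t = np.sort(np.asarray(targets, dtype=float))

    def transfer(z):
        # map pre-predictions into (0, 1) via the chosen cdf
        u = pre_cdf(np.asarray(z, dtype=float))
        # empirical inverse cdf (quantile function) of the targets
        return np.quantile(sorted_t, np.clip(u, 0.0, 1.0))

    return transfer

# Illustrative choice: a standard logistic cdf on the pre-prediction space.
logistic_cdf = lambda z: 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
targets = rng.exponential(scale=2.0, size=1000)  # a skewed target sample
transfer = make_transfer(targets, logistic_cdf)

# Post-predictions are pushed toward the empirical target distribution,
# so dense regions of the output space receive finer prediction resolution.
post = transfer(rng.normal(size=1000))
```

Because the composed map is monotone, it preserves the ordering of the pre-predictions while reshaping their distribution to match the targets.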