Abstract: 

The minimization of the logistic loss is a popular approach to batch supervised learning. It turns out that, when fitting linear (or kernelized) classifiers, minimizing the logistic loss is equivalent to minimizing an exponential rado-loss computed (i) over transformed data that we call Rademacher observations (rados), and (ii) over the same classifier as the one used with the logistic loss. Thus, a classifier learnt from rados can be used directly to classify observations. In this talk, I will present Rademacher observations and the aforementioned properties. I will also present a formal boosting algorithm that learns with rados given as input, and one application of learning with Rademacher observations where rados bring significant leverage over conventional examples: learning from privacy-compliant data. This is joint work with Arik Friedman and Giorgio Patrini. The seminar will be followed by drinks and finger food with the speaker. All attendees are welcome!
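The equivalence mentioned above can be illustrated numerically. The sketch below is not taken from the talk: it assumes the rado construction pi_sigma = (1/2) * sum_i (sigma_i + y_i) * x_i used in the speaker's related published work, enumerates all 2^m rados of a toy dataset, and checks that the logistic loss over the examples coincides with the log of the exponential rado-loss over the rados, for the same linear classifier theta. The toy data and variable names are purely illustrative.

```python
# Illustrative sketch (an assumption, not code from the talk):
# rados defined as pi_sigma = 0.5 * sum_i (sigma_i + y_i) * x_i,
# followed by a numeric check that the logistic loss over examples
# equals the log of the exponential rado-loss summed over all 2^m rados.
import itertools
import numpy as np

rng = np.random.default_rng(0)
m, d = 6, 3                       # tiny problem so all 2^m rados can be enumerated
X = rng.normal(size=(m, d))       # examples
y = rng.choice([-1.0, 1.0], m)    # binary labels in {-1, +1}
theta = rng.normal(size=d)        # an arbitrary linear classifier

# Build every Rademacher observation pi_sigma, one per sign vector sigma.
rados = np.array([
    0.5 * ((np.array(sigma) + y) @ X)
    for sigma in itertools.product([-1.0, 1.0], repeat=m)
])

logistic_loss = np.sum(np.log1p(np.exp(-y * (X @ theta))))
exp_rado_loss = np.log(np.sum(np.exp(-rados @ theta)))

print(logistic_loss, exp_rado_loss)   # the two quantities coincide
```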

Speaker

Prof Richard Nock

Affiliation

NICTA & Université des Antilles et de la Guyane

Date

Thu, 27/08/2015 - 4:00pm

Venue

RC-M032, The Red Centre, UNSW