Abstract

Group Lasso regression is a widely used method for performing model selection on groups of predictors with a natural structure. When each group is orthogonalized, the optimization yields a simple block-coordinate descent (BCD) algorithm. Despite its simplicity, however, the BCD algorithm tends to converge slowly when the feature matrix is poorly conditioned. This talk introduces a novel iterative algorithm for the Group Lasso based on the majorization-minimization (MM) principle. I will present comparative numerical studies that highlight the scenarios in which the classical BCD algorithm struggles, in contrast to the MM algorithm, which converges up to an order of magnitude faster. Additionally, I will demonstrate how the MM algorithm can solve non-group Lasso problems. As an illustrative example, we will look at the Fused Lasso problem. This is joint work with my PhD supervisors.
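To make the classical baseline concrete, here is a minimal sketch of the block-coordinate descent update for the Group Lasso with orthonormalized groups (so that each block update reduces to a closed-form soft-thresholding of the group's least-squares fit). All names here are hypothetical; this sketches the standard BCD scheme the talk compares against, not the speaker's MM algorithm.

```python
import numpy as np

def group_lasso_bcd(X, y, groups, lam, n_iter=200):
    """Block-coordinate descent for the Group Lasso objective
        (1/2)||y - X beta||^2 + lam * sum_g ||beta_g||_2,
    assuming each group's columns are orthonormal (X_g^T X_g = I).
    `groups` is a list of column-index arrays. Hypothetical sketch,
    not the MM algorithm from the talk."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        for g in groups:
            # Partial residual with group g's contribution added back
            r = y - X @ beta + X[:, g] @ beta[g]
            s = X[:, g].T @ r  # least-squares fit for group g (orthonormal case)
            norm_s = np.linalg.norm(s)
            # Block-wise soft-thresholding: shrink the whole group toward zero,
            # setting it exactly to zero when ||s|| <= lam
            beta[g] = max(0.0, 1.0 - lam / norm_s) * s if norm_s > 0 else 0.0
    return beta
```

Each sweep updates one group at a time; because the groups are orthonormalized, every block update is exact, which is what makes the scheme simple but also slow when the columns of X across groups are highly correlated (a poorly conditioned feature matrix).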

Speaker

Anant Mathur

 

Research Area

Statistics seminar

Affiliation

UNSW, Sydney

Date

Friday, 27 Sep 2024, 4:00 pm

Venue

Rm 3085, Anita B. Lawrence Centre