I am currently a Machine Learning researcher at Technicolor. Previously, I was a Postdoctoral Fellow in the Department of Computer Science at UT Austin, where I worked with Prof. Inderjit Dhillon.
I received my PhD in Electrical and Computer Engineering from UW-Madison, under Prof. Robert Nowak.
My research interests are large-scale data modeling, optimization, and statistical machine learning algorithms.
CONTACT:
email: <firstname><lastname>86@gmail-dot-com
SOFTWARE
Code for COnditional Gradient with ENhancement and Truncation (CoGEnT). It is highly general purpose and includes routines for atomic norm regularized least squares regression. You can also code up your own atom selection routine and plug it into the CoGEnT framework.
[LINK]
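For intuition, here is a minimal sketch of the plain conditional-gradient (Frank-Wolfe) iteration that CoGEnT builds on, using the L1 ball as the atomic set (atoms = signed, scaled coordinate vectors) for least squares. This is an illustrative pure-Python sketch, not the toolbox's interface; the function and variable names are assumptions, and CoGEnT's enhancement and truncation steps are omitted.

```python
def frank_wolfe_l1(A, b, tau, iters):
    """Minimize ||A x - b||^2 over the L1 ball of radius tau
    via conditional gradient. A is a list of rows; b a list."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for k in range(iters):
        # residual r = A x - b and gradient g = 2 A^T r
        r = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(m)]
        g = [2.0 * sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        # atom selection: the vertex of the L1 ball most aligned with -g
        j_star = max(range(n), key=lambda j: abs(g[j]))
        s = [0.0] * n
        s[j_star] = -tau if g[j_star] > 0 else tau
        # standard step size 2/(k+2), then convex combination
        gamma = 2.0 / (k + 2)
        x = [(1 - gamma) * x[j] + gamma * s[j] for j in range(n)]
    return x
```

The atom-selection line is the piece CoGEnT lets you swap out: replacing it with a maximizer over a different atomic set (e.g. rank-one matrices) yields the corresponding atomic norm regularized problem.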
GRALS (Graph Regularized Alternating Least Squares). C++ with MATLAB wrapper.
[LINK]
MATLAB code for the sparse overlapping group lasso. The file contains routines for both linear and logistic regression.
[LINK]
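As a rough illustration of the penalty involved, here is the proximal operator of the sparse group lasso in the simpler non-overlapping case (soft-threshold each entry, then shrink each group as a block). This is a hedged sketch with made-up names, not the MATLAB package's API; the overlapping-group case the package handles requires additional machinery (e.g. variable duplication) beyond this.

```python
import math

def sparse_group_prox(v, groups, lam1, lam2):
    """Prox of lam1*||x||_1 + lam2*sum_g ||x_g||_2 at v,
    for non-overlapping groups (lists of indices)."""
    # elementwise soft-thresholding for the L1 part
    u = [math.copysign(max(abs(x) - lam1, 0.0), x) for x in v]
    out = list(u)
    for g in groups:
        # block shrinkage for the group-norm part
        norm = math.sqrt(sum(u[i] ** 2 for i in g))
        scale = max(0.0, 1.0 - lam2 / norm) if norm > 0 else 0.0
        for i in g:
            out[i] = scale * u[i]
    return out
```

The composition (L1 soft-threshold first, group shrinkage second) produces solutions that are sparse both at the group level and within surviving groups.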