Seminars
Topic: Supervised Learning Incorporating Graphical Structure among Predictors
Date: 16/02/2016
Time: 2:30 p.m. - 3:30 p.m.
Venue: Lady Shaw Building (LSB) Room C2
Category: Seminar
Speaker: Mr. Guan Yu
Details:

Abstract

With the abundance of high-dimensional data in various disciplines, regularization techniques have become very popular. Despite their success, some challenges remain. One challenge is the development of efficient methods that incorporate structural information among predictors. Typically, this structure can be modeled by the connectivity of an undirected graph with all predictors as nodes. In this talk, I will introduce an efficient regularization technique that incorporates graphical structure information among predictors. Specifically, according to the undirected graph, we use a latent group lasso penalty to utilize the graph node by node, so that predictors connected in the graph are encouraged to be selected jointly. This new regularization technique can be used for many supervised learning problems. For sparse regression, our new method using the proposed regularization technique includes the adaptive Lasso, group Lasso, and ridge regression as special cases. Theoretical studies show that it enjoys model selection consistency and attains tight finite-sample bounds for estimation and prediction. For the multi-task learning problem, our proposed graph-guided multi-task method includes the popular ℓ2,1-norm regularized multi-task learning method as a special case. Numerical studies using simulated datasets and the Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset also demonstrate the effectiveness of the proposed methods.
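As a rough illustration of the node-by-node construction (not necessarily the exact formulation used in the talk), suppose the p predictors form the nodes of an undirected graph G = (V, E), and let N(j) = {j} ∪ {k : (j, k) ∈ E} denote the closed neighborhood of node j. A latent (overlapping) group lasso penalty built from these neighborhoods, with hypothetical node weights w_j, can be sketched for sparse regression as

\Omega(\beta) \;=\; \min_{\substack{v^{(1)},\dots,v^{(p)}:\ \sum_{j=1}^{p} v^{(j)} = \beta \\ \operatorname{supp}(v^{(j)}) \subseteq N(j)}} \;\sum_{j=1}^{p} w_j \bigl\| v^{(j)} \bigr\|_2,
\qquad
\hat{\beta} \;=\; \arg\min_{\beta}\; \frac{1}{2n}\bigl\| y - X\beta \bigr\|_2^2 \;+\; \lambda\,\Omega(\beta).

Under this sketch, each predictor appears in the latent groups of its neighbors, so selecting one predictor lowers the marginal cost of selecting the predictors connected to it, which is how the graph encourages joint selection; with an empty edge set each N(j) = {j} and the penalty reduces to a weighted Lasso, consistent with the special cases mentioned above.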

PDF: 201600216_YU.pdf