ML Lunch (Oct 28, 2013): Training nested functions using auxiliary coordinates

Published on Nov 8, 2013

Speaker: Miguel Á. Carreira-Perpinan
University of California, Merced

Many models in machine learning, computer vision or speech processing have the form of a sequence of nested, parameterized functions, such as a multilayer neural net, an object recognition pipeline, or a "wrapper" for feature selection. Jointly estimating the parameters of all the layers and selecting an optimal architecture is widely considered a difficult nonconvex optimization problem: it is hard to parallelize in a distributed computation environment and requires significant human expert effort, which leads to suboptimal systems in practice. We describe a general mathematical strategy, the method of auxiliary coordinates (MAC), to learn the parameters and, to some extent, the architecture of nested systems. MAC has provable convergence, is easy to implement by reusing existing algorithms for single layers, can be parallelized trivially and massively, applies even when parameter derivatives are unavailable or undesirable (so that gradients cannot be computed with the chain rule), and is competitive with state-of-the-art nonlinear optimizers even in the serial computation setting, often producing reasonable models within a few iterations.
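To make the idea concrete, here is a minimal sketch of the MAC pattern for a two-layer model: one auxiliary coordinate vector z_n is introduced per training point to break the nesting, a quadratic penalty enforces z_n ≈ f1(x_n), and optimization alternates between a W-step (each layer fit independently) and a Z-step (each sample solved independently). All names, shapes, and the choice of linear layers are illustrative assumptions, not the speaker's implementation; linear layers are used only so that every step has a closed form.

```python
import numpy as np

# Illustrative sketch (not the authors' code): two linear layers,
# y ≈ f2(f1(x)) with f1(x) = x @ W1.T and f2(z) = z @ W2.T.
rng = np.random.default_rng(0)
N, d_in, h, d_out = 200, 10, 5, 3
X = rng.standard_normal((N, d_in))
Y = rng.standard_normal((N, d_out))

W1 = rng.standard_normal((h, d_in))    # layer 1 parameters
W2 = rng.standard_normal((d_out, h))   # layer 2 parameters
Z = X @ W1.T                           # auxiliary coordinates, one row per sample
mu = 1.0                               # quadratic-penalty weight

def penalized_objective(W1, W2, Z):
    fit = np.sum((Y - Z @ W2.T) ** 2)      # output error of layer 2
    cons = np.sum((Z - X @ W1.T) ** 2)     # violation of z_n = f1(x_n)
    return fit + mu * cons

objs = [penalized_objective(W1, W2, Z)]
for it in range(20):
    # W-step: with Z fixed, each layer is an independent least-squares
    # fit (layers could be trained in parallel, reusing single-layer code).
    W2 = np.linalg.lstsq(Z, Y, rcond=None)[0].T
    W1 = np.linalg.lstsq(X, Z, rcond=None)[0].T
    # Z-step: with the weights fixed, each sample's z_n has a closed
    # form and is independent of the others (parallel over data points).
    A = W2.T @ W2 + mu * np.eye(h)
    Z = (Y @ W2 + mu * X @ W1.T) @ np.linalg.inv(A)
    objs.append(penalized_objective(W1, W2, Z))
```

Because each alternating step exactly minimizes the penalized objective over its own variables, the objective is monotonically nonincreasing, which is the intuition behind the convergence guarantee mentioned in the abstract.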

If time permits, I will illustrate how to use MAC to derive training algorithms for a range of problems, such as deep nets, best-subset feature selection, joint dictionary and classifier learning, supervised dimensionality reduction, and others.

This is joint work with Weiran Wang.

BIOGRAPHY: Miguel Á. Carreira-Perpinan is an associate professor in Electrical Engineering and Computer Science at the University of California, Merced. He received the degree of "licenciado en informática" (MSc in computer science) from the Technical University of Madrid in 1995 and a PhD in computer science from the University of Sheffield in 2001. Prior to joining UC Merced, he did postdoctoral work at Georgetown University (in computational neuroscience) and the University of Toronto (in machine learning), and was an assistant professor at the Oregon Graduate Institute (Oregon Health and Science University). He is the recipient of an NSF CAREER award, a Google Faculty Research Award and a best student paper award at Interspeech. He is an associate editor for the IEEE Transactions on Pattern Analysis and Machine Intelligence and an area chair for NIPS. His research interests lie in machine learning, in particular unsupervised learning problems such as dimensionality reduction, clustering and denoising, with an emphasis on optimization aspects, and with applications to speech processing (e.g. articulatory inversion and model adaptation), computer vision, sensor networks and other areas.

For more ML Lunch talks, visit http://www.cs.cmu.edu/~learning/

