Defense: Hybrid-Parallel Parameter Estimation for Frequentist and Bayesian Models

Speaker Name: 
Parameswaran Raman
Speaker Title: 
Ph.D. Candidate
Start Time: 
Monday, December 2, 2019 - 8:45am
Location: 
Engineering 2, Room 280

Abstract:
Distributed algorithms in machine learning come in two main flavors: data parallel, where the data is distributed across multiple workers, and model parallel, where the model parameters are partitioned across multiple workers. The main limitation of the former approach is that the model parameters must be replicated on every machine, which is problematic when the parameters are too numerous to fit on a single machine. The drawback of the latter approach is that the data must be replicated on each machine.
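
As a purely illustrative picture of this trade-off (not anything from the talk itself), the sketch below tallies what a single worker would have to hold under each scheme; all sizes and variable names are invented for the example.

```python
import numpy as np

# Invented sizes: n examples, d parameters, m workers.
n, d, m = 1_000_000, 500_000, 8

# Data parallelism: each worker stores ~n/m examples but a full copy
# of all d parameters.
data_shard = np.array_split(np.arange(n), m)[0]
print(f"data parallel : {len(data_shard)} examples, {d} parameters per worker")

# Model parallelism: each worker stores ~d/m parameters but a full copy
# of all n examples.
param_shard = np.array_split(np.arange(d), m)[0]
print(f"model parallel: {n} examples, {len(param_shard)} parameters per worker")
```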

In this defense talk, I will present Hybrid-Parallelism, an approach that allows us to partition both the data and the model parameters simultaneously. As a result, each worker needs access to only a subset of the data and a subset of the parameters while performing parameter updates. I will present case studies of two popular models: (1) Multinomial Logistic Regression, and (2) Mixture of Exponential Families. In both cases, I will show how to exploit the access pattern of parameter updates to derive Hybrid-Parallel asynchronous algorithms.
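
The following sketch gives a rough sense of the hybrid-parallel access pattern, not the algorithm from the thesis: it partitions a toy multi-output least-squares problem over a P x Q grid of simulated workers, so each update reads only its own block of examples and its own block of parameters. All sizes and names are invented, and squared loss is used because it decomposes per output; the Multinomial Logistic Regression case in the talk additionally has to handle the softmax coupling all classes together.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes; all values here are invented for illustration.
n, d, K = 120, 20, 6          # examples, features, outputs
P, Q = 3, 2                   # P data blocks x Q parameter blocks

X = rng.normal(size=(n, d))
Y = rng.normal(size=(n, K))
W = np.zeros((d, K))          # parameter matrix, partitioned by columns

data_blocks = np.array_split(np.arange(n), P)    # rows of X and Y
param_blocks = np.array_split(np.arange(K), Q)   # columns of W

def local_update(rows, cols, lr=0.01):
    """One step that reads only X[rows], Y[rows, cols], and W[:, cols]."""
    Xb = X[rows]
    resid = Xb @ W[:, cols] - Y[np.ix_(rows, cols)]
    W[:, cols] -= lr * Xb.T @ resid / len(rows)

# One simulated round: every (data block, parameter block) pair takes a
# local step. On a real cluster these P * Q updates would run on P * Q
# workers and be applied asynchronously rather than in this sequential loop.
for rows in data_blocks:
    for cols in param_blocks:
        local_update(rows, cols)
```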

Event Type: 
Advancement/Defense
Advisor: 
Professor S.V.N. Vishwanathan
Graduate Program: 
Computer Science Ph.D.