Defense: Tempered Bregman Divergence for Continuous and Discrete Time Mirror Descent and Robust Classification

Speaker Name: Ehsan Amid
Speaker Title: Ph.D. Candidate
Speaker Organization: Computer Engineering PhD
Start Time: Friday, May 29, 2020 - 3:00pm
End Time: Friday, May 29, 2020 - 4:00pm
Location: Zoom - https://ucsc.zoom.us/j/93123621950

Abstract: Bregman divergences are an important class of divergence functions in machine learning. Many well-known updates, including gradient descent and the (un)normalized exponentiated gradient update, are motivated by using a Bregman divergence as the inertia term. Bregman divergences also serve as measures of progress for online algorithms and as training losses for classification models. In this thesis, we introduce a class of tempered Bregman divergences that includes, as special cases, many well-known distance measures such as the squared Euclidean distance and relative entropy. We explore the tempered updates motivated by the new tempered Bregman divergence and develop theorems that unify these updates as reparameterized gradient descent. We demonstrate the reparameterized updates by proving regret bounds for the special case of the reparameterized exponentiated gradient update. Finally, we extend the notion of a matching loss to the new tempered Bregman divergence and develop bounded classification loss functions that are significantly more robust to noise.
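
Background (not part of the announcement itself): the standard, untempered Bregman divergence generated by a strictly convex function F is given below, together with the two special cases named in the abstract. The tempered generalization developed in the thesis is not reproduced here.

\[
D_F(x \,\|\, y) \;=\; F(x) - F(y) - \nabla F(y)^\top (x - y)
\]
\[
F(x) = \tfrac{1}{2}\|x\|_2^2 \;\Rightarrow\; D_F(x \,\|\, y) = \tfrac{1}{2}\|x - y\|_2^2
\qquad\text{(squared Euclidean distance)}
\]
\[
F(x) = \sum_i \bigl(x_i \log x_i - x_i\bigr) \;\Rightarrow\; D_F(x \,\|\, y) = \sum_i \Bigl(x_i \log \tfrac{x_i}{y_i} - x_i + y_i\Bigr)
\qquad\text{(relative entropy)}
\]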

Event Type: Advancement/Defense
Advisor: Manfred Warmuth
Graduate Program: Computer Science & Engineering