Artificial Intelligence Seminar

Tuesday, November 28, 2017 - 12:00pm to 1:00pm

3305 Newell-Simon Hall
Modern Convex Optimization within Deep Learning

This talk discusses a new paradigm for deep learning that integrates the solution of optimization problems "into the loop." We highlight two challenges in today's deep learning landscape that involve adding structure to the input or latent space of a model. We discuss how to overcome some of these challenges with learnable optimization sub-problems that subsume standard architectures and layers. These architectures obtain state-of-the-art empirical results in domains such as continuous-action reinforcement learning and tasks that involve learning hard constraints, like the game Sudoku.
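The "structure in the input space" idea can be made concrete with a small sketch of an input-convex network: layers that act on previous activations use non-negative weights, and activations are convex and non-decreasing, so the scalar output is convex in the input. The layer sizes and random weights below are illustrative assumptions, not details from the talk; the sketch just checks midpoint convexity numerically.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(v):
    return np.maximum(v, 0.0)

# Illustrative layer sizes (assumptions, not from the talk).
d, h = 3, 8
Wx0 = rng.normal(size=(h, d)); b0 = rng.normal(size=h)
Wz1 = np.abs(rng.normal(size=(1, h)))   # non-negative weights preserve convexity
Wx1 = rng.normal(size=(1, d)); b1 = rng.normal(size=1)

def f(x):
    """Scalar input-convex network: relu(affine) composed with
    non-negative weights on the hidden layer, plus a direct affine term."""
    z1 = relu(Wx0 @ x + b0)             # convex in x (relu of affine)
    return float(Wz1 @ z1 + Wx1 @ x + b1)

# Midpoint convexity check along random pairs of points:
# a convex f satisfies f((a+b)/2) <= (f(a)+f(b))/2.
max_gap = max(
    f((a + b) / 2) - (f(a) + f(b)) / 2
    for a, b in (rng.normal(size=(2, d)) for _ in range(200))
)
assert max_gap <= 1e-9
```

Because the output is convex in the input, inference-time queries like "find the input minimizing the network's output" become convex optimization problems.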

We will cover topics from these two papers:

1. Input Convex Neural Networks. Brandon Amos, Lei Xu, J. Zico Kolter. ICML 2017.

2. OptNet: Differentiable Optimization as a Layer in Neural Networks. Brandon Amos, J. Zico Kolter. ICML 2017.
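The second paper treats an optimization problem's solution as a network layer and backpropagates through it via its KKT conditions. As a hedged, toy-scale sketch (not the authors' implementation), consider the simplest constrained problem with a closed-form solution: z*(q) = argmin_z ½‖z − q‖² subject to z ≥ 0, i.e., projection onto the non-negative orthant. Its implicit Jacobian is diagonal, which we can verify against finite differences:

```python
import numpy as np

def layer_forward(q):
    # argmin_z 0.5*||z - q||^2 s.t. z >= 0  ==  elementwise projection
    return np.maximum(q, 0.0)

def layer_backward(q, grad_out):
    # Implicit gradient: away from q = 0, the KKT conditions give
    # dz*/dq = diag(1[q > 0]), so the backward pass is a mask.
    return grad_out * (q > 0).astype(float)

q = np.array([1.5, -0.3, 0.7])
g = np.ones_like(q)
implicit = layer_backward(q, g)

# Finite-difference check of the implicit gradient of sum(z*(q)).
eps = 1e-6
fd = np.array([
    (layer_forward(q + eps * e).sum() - layer_forward(q - eps * e).sum()) / (2 * eps)
    for e in np.eye(len(q))
])
assert np.allclose(implicit, fd, atol=1e-6)
```

OptNet applies the same implicit-differentiation idea to general quadratic programs, where the Jacobian comes from solving a linear system in the KKT matrix rather than a simple mask.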

Joint work with J. Zico Kolter.

The AI Seminar is generously sponsored by Apple.
