Max-Planck-Institut für Informatik


What and Who
Title: Stochastic Optimization Methods for Deep Learning
Speaker: Mahesh Chandra
Coming from: International Max Planck Research School for Computer Science (IMPRS)
Speaker's Bio: MSc student
Event Type: IMPRS Research Seminar
Visibility: D1, D2, D3, D4, D5, SWS, RG1, MMCI
Level: Public Audience
Language: English
Date, Time and Location
Date: Monday, 27 June 2016
Time: 12:00
Duration: 30 minutes
Location: Saarbrücken
Building: E1 4
Room: R024
Abstract
Recent advances in Deep Learning have made it possible to train deep architectures efficiently. However, training can be further improved by using intelligent optimization schemes. A key ingredient in training any large-scale machine learning model is Stochastic Gradient Descent (SGD). SGD has several drawbacks, such as the need for a decreasing step size and noisy gradient estimates. This has motivated a line of work on improving SGD, which has resulted in many efficient algorithms such as Adagrad, RMSProp and Adam. In this talk, we introduce these schemes and provide insights into the design of such algorithms.
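As a rough illustration of the contrast between plain SGD and the adaptive schemes named in the abstract, the sketch below compares an SGD step with an Adam-style update in Python/NumPy. The function names and hyperparameter defaults are illustrative assumptions, not material from the talk itself.

    import numpy as np

    def sgd_step(w, grad, lr=0.01):
        # Plain SGD: move against the (noisy) gradient with a fixed step size.
        return w - lr * grad

    def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
        # Adam keeps exponential moving averages of the gradient (m) and of its
        # elementwise square (v), then scales each coordinate's step by an
        # estimate of its typical gradient magnitude. t is the 1-based step count.
        m = beta1 * m + (1 - beta1) * grad
        v = beta2 * v + (1 - beta2) * grad**2
        m_hat = m / (1 - beta1**t)   # bias correction for the first moment
        v_hat = v / (1 - beta2**t)   # bias correction for the second moment
        w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
        return w, m, v

A caller would initialize m and v to zero arrays of the same shape as w and increment t after each call, e.g. w, m, v = adam_step(w, grad, m, v, t).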
Contact
Name(s): IMPRS Office Team
Phone: 0681 93251802
E-Mail: --email address not disclosed on the web
Video Broadcast
Video Broadcast: No
Tags, Category, Keywords and additional notes
Note:
Attachments, File(s):
Created by: Stephanie Jörg/MPI-INF, 06/24/2016 12:31 PM
Last modified by: Uwe Brahm/MPII/DE, 11/24/2016 04:13 PM