Max-Planck-Institut für Informatik
What and Who
Title:DNNs for Sparse Coding and Dictionary Learning
Speaker:Debabrata Mahapatra
Coming From:Indian Institute of Science, Bangalore
Speaker's Bio:MSc graduate
Event Type:PhD Application Talk
Visibility:D1, D2, D3, INET, D4, D5, SWS, RG1, MMCI
Level:Public Audience
Language:English
Date, Time and Location
Date:Monday, 8 October 2018
Time:11:00
Duration:60 Minutes
Location:Saarbrücken
Building:E1 5
Room:R029
Abstract
Interpreting the iterative algorithm ISTA as an unfolded Deep Neural Network (DNN), a novel architecture was proposed in which the activation functions are analogous to proximal operators. Unlike in standard DNNs, the weights and biases of this architecture were kept fixed using prior knowledge, while the activation functions were learned from the data. This led to a rich variety of proximal operators that are particularly suitable for sparse coding. Consequently, the proposed network outperformed state-of-the-art sparse coding algorithms by a margin of 4 to 6 dB. The architecture was further extended to dictionary learning by borrowing ideas from autoencoders, with the encoding performed by the proposed model. The extended model learns, in an unsupervised manner, dictionaries with which data can be represented sparsely.
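To make the idea concrete, the following is a minimal, illustrative PyTorch sketch (not the speaker's actual model): several ISTA iterations, z <- prox(z - step * D^T (D z - y)), are unrolled into network layers whose matrices are fixed from a given dictionary D, while the pointwise activation that replaces the fixed soft-threshold proximal operator is learned. A learnable linear decoder then turns the encoder into an autoencoder for dictionary learning. All class names, the tanh-mixture parameterization of the activation, and the hyperparameters are assumptions made purely for illustration.

import torch
import torch.nn as nn

class LearnableProx(nn.Module):
    # Learnable pointwise activation: a weighted mixture of tanh units at
    # different scales, standing in for the fixed soft-threshold prox of ISTA.
    def __init__(self, num_knots=8):
        super().__init__()
        self.scales = nn.Parameter(torch.linspace(0.5, 4.0, num_knots))
        self.weights = nn.Parameter(0.1 * torch.randn(num_knots))

    def forward(self, x):
        return (self.weights * torch.tanh(x.unsqueeze(-1) * self.scales)).sum(-1)

class UnfoldedISTA(nn.Module):
    # num_layers ISTA iterations unrolled into num_layers network layers.
    # The layer matrices are fixed from the dictionary D (prior knowledge);
    # only the activation (the prox surrogate) has trainable parameters.
    def __init__(self, D, num_layers=10, step=0.1):
        super().__init__()
        n = D.shape[1]
        self.register_buffer("Wd", step * D.t())                    # fixed, n x m
        self.register_buffer("S", torch.eye(n) - step * D.t() @ D)  # fixed, n x n
        self.prox = LearnableProx()
        self.num_layers = num_layers

    def forward(self, y):
        # y: batch x m measurements; z: batch x n sparse codes.
        z = self.prox(y @ self.Wd.t())
        for _ in range(self.num_layers - 1):
            z = self.prox(z @ self.S.t() + y @ self.Wd.t())
        return z

# Autoencoder-style dictionary learning: the unfolded encoder produces sparse
# codes, and a learnable linear decoder (the dictionary) reconstructs the input.
m, n = 64, 128
D0 = torch.randn(m, n) / m ** 0.5   # initial dictionary (illustrative)
encoder = UnfoldedISTA(D0, num_layers=10)
decoder = nn.Linear(n, m, bias=False)
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
y = torch.randn(256, m)             # stand-in data batch
for _ in range(100):
    z = encoder(y)
    loss = ((decoder(z) - y) ** 2).mean() + 1e-3 * z.abs().mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

In a stricter dictionary-learning setup, the encoder's fixed matrices would be refreshed from the current decoder weights after each update; the sketch keeps them tied to the initial dictionary for brevity.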
Contact
Name(s):Stephanie Jörg
Phone:0681 9325 1800
EMail:--email address not disclosed on the web
Video Broadcast
Video Broadcast:No
Created:
Stephanie Jörg/MPI-INF, 10/05/2018 12:12 PM
Last modified:
Stephanie Jörg/MPI-INF, 10/08/2018 09:41 AM