Parameterized Structured Pruning for Deep Neural Networks

Günther Schindler*, Wolfgang Roth, Franz Pernkopf, Holger Fröning

*Corresponding author for this work

Publication: Chapter in book/report/conference proceedings › Conference contribution › Peer-reviewed


As a result of the growing size of Deep Neural Networks (DNNs), the gap to hardware capabilities in terms of memory and compute increases. To effectively compress DNNs, quantization and pruning are usually considered. However, unconstrained pruning usually leads to unstructured parallelism, which maps poorly to massively parallel processors and substantially reduces the efficiency of general-purpose processors. The same applies to quantization, which often requires dedicated hardware. We propose Parameterized Structured Pruning (PSP), a novel technique to dynamically learn the shape of DNNs through structured sparsity. PSP parameterizes structures (e.g. channel- or layer-wise) in a weight tensor and leverages weight decay to learn a clear distinction between important and unimportant structures. As a result, PSP maintains prediction performance and creates a substantial amount of sparsity that is structured and, thus, easy and efficient to map to a variety of massively parallel processors, which are mandatory for utmost compute power and energy efficiency.
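The core idea of the abstract can be illustrated with a minimal sketch: attach one learnable scale per output channel of a weight tensor, let plain L2 weight decay pull the scales of channels that receive no useful task gradient toward zero, and then remove those whole channels. This is a hypothetical toy simulation in NumPy, not the paper's implementation; the variable names (`alpha`, `task_grad`, `threshold`) and the synthetic gradient setup are illustrative assumptions.

```python
import numpy as np

# Toy sketch of channel-wise parameterized structured pruning (assumption:
# one scalar structure parameter "alpha" per output channel, trained with
# SGD plus L2 weight decay; this is NOT the authors' actual code).
rng = np.random.default_rng(0)

out_ch, in_ch, k = 8, 4, 3
weights = rng.normal(size=(out_ch, in_ch, k, k))  # conv-style weight tensor
alpha = np.ones(out_ch)                            # structure parameters

# Synthetic task gradients: channels 0-3 are "important" (the task gradient
# opposes decay), channels 4-7 get no task signal and only decay.
task_grad = np.zeros(out_ch)
task_grad[:4] = -0.01

lr, weight_decay = 0.1, 0.05
for _ in range(1000):
    grad = task_grad + weight_decay * alpha        # L2 decay on alpha
    alpha -= lr * grad

# Unimportant channels decay toward zero; prune them as whole structures.
threshold = 0.05
keep = np.abs(alpha) > threshold
pruned_weights = weights[keep]                     # whole channels removed

print(keep.sum())             # surviving channels
print(pruned_weights.shape)   # structured, dense remaining tensor
```

Because entire channels are removed, the surviving tensor stays dense, which is what makes the resulting sparsity cheap to exploit on massively parallel hardware, in contrast to unstructured element-wise pruning.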

Title: Machine Learning, Optimization, and Data Science - 6th International Conference, LOD 2020, Revised Selected Papers
Editors: Giuseppe Nicosia, Varun Ojha, Emanuele La Malfa, Giorgio Jansen, Vincenzo Sciacca, Panos Pardalos, Giovanni Giuffrida, Renato Umeton
Publisher: Springer, Cham
ISBN (electronic): 978-3-030-64580-9
ISBN (print): 978-3-030-64579-3
Publication status: Published - 1 Jan 2020
Event: 6th International Conference on Machine Learning, Optimization, and Data Science: LOD 2020 - Virtual, Siena, Italy
Duration: 19 July 2020 - 23 July 2020


Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12566 LNCS
ISSN (print): 0302-9743
ISSN (electronic): 1611-3349


Conference: 6th International Conference on Machine Learning, Optimization, and Data Science
Short title: LOD 2020
Location: Virtual, Siena

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)

