DC Field | Value | Language |
---|---|---|
dc.contributor.author | LEE, SUNG GU | - |
dc.contributor.author | HA, MINHO | - |
dc.date.accessioned | 2021-06-01T08:52:36Z | - |
dc.date.available | 2021-06-01T08:52:36Z | - |
dc.date.created | 2021-01-04 | - |
dc.date.issued | 2020-07-23 | - |
dc.identifier.uri | https://oasis.postech.ac.kr/handle/2014.oak/106097 | - |
dc.description.abstract | Hardware-efficient CNN model design can be divided into two stages: training of a large baseline network to achieve high accuracy and applying model compression to create a smaller network, at the possible expense of a slight reduction in accuracy. This paper proposes a new differentiable model compression (DMC) method based on bilevel optimization to find the importance of channels in a pretrained CNN. Experimental results show that, for model compression for an image classification task, DMC requires only 12 GPU minutes to achieve a similar compression ratio, but with increased image classification accuracy, when compared to the previous best method. | - |
dc.language | English | - |
dc.publisher | ACM SIGDA | - |
dc.relation.isPartOf | Design Automation Conference | - |
dc.relation.isPartOf | Proceedings of the 57th Design Automation Conference | - |
dc.title | DMC: Differentiable Model Compression for Hardware-Efficient Convolutional Neural Network | - |
dc.type | Conference | - |
dc.type.rims | CONF | - |
dc.identifier.bibliographicCitation | Design Automation Conference | - |
dc.citation.conferenceDate | 2020-07-19 | - |
dc.citation.conferencePlace | US | - |
dc.citation.conferencePlace | virtual | - |
dc.citation.title | Design Automation Conference | - |
dc.contributor.affiliatedAuthor | LEE, SUNG GU | - |
dc.contributor.affiliatedAuthor | HA, MINHO | - |
dc.description.journalClass | 1 | - |
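The abstract describes scoring the importance of channels in a pretrained CNN and pruning accordingly. Below is a minimal sketch of importance-based channel pruning; it uses an L1-magnitude heuristic as a stand-in for DMC's bilevel-optimized importances, and the function name and tensor shapes are illustrative, not taken from the paper.

```python
import numpy as np

def prune_channels(weight, keep_ratio):
    """Keep the most 'important' output channels of a conv weight tensor
    shaped (out_ch, in_ch, kH, kW). Importance here is the per-channel
    L1 norm, a simple magnitude heuristic (not the paper's learned scores)."""
    out_ch = weight.shape[0]
    # L1 norm of each output channel's weights
    importance = np.abs(weight).reshape(out_ch, -1).sum(axis=1)
    # Number of channels to keep at the requested compression ratio
    n_keep = max(1, int(round(out_ch * keep_ratio)))
    # Indices of the n_keep highest-importance channels, in original order
    kept = np.sort(np.argsort(importance)[-n_keep:])
    return weight[kept], kept

# Example: halve the output channels of a random 8-channel 3x3 conv layer
rng = np.random.default_rng(0)
w = rng.normal(size=(8, 3, 3, 3))
pruned, kept = prune_channels(w, 0.5)
print(pruned.shape)  # (4, 3, 3, 3)
```

In a full pipeline, the retained channel indices would also be used to slice the input dimension of the following layer, and the compressed network would then be fine-tuned to recover accuracy.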