Open Access System for Information Sharing

A Channel Merging Approach to Control Sparsity in Neural Networks

Authors
권혁준
Date Issued
2021
Publisher
포항공과대학교 (Pohang University of Science and Technology)
Abstract
To improve the efficiency of weight scheduling in neural networks with low weight density, this thesis 1) builds a network with higher weight density through weight channel merging, thereby improving scheduling efficiency, and 2) proposes and evaluates a hardware accelerator that can handle the merged weight channels.
In this thesis, a channel-merging offline scheduling scheme is presented to improve the efficiency of a previous offline scheduler on highly pruned convolutional neural networks (CNNs). In the channel-merging step, two channels in the same layer are merged lane-wise to increase the network's channel-level sparsity. A modified hardware architecture is also presented to handle the merged and scheduled weights. Combined with the zero-skip and outlier-aware scheduling schemes of the previous accelerator, the proposed merging and scheduling method achieves higher lane utilization and speedup. Despite a small area overhead in the proposed hardware, faster computation and reduced memory access make its energy consumption lower than that of the previous hardware.
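
As a rough illustration of the channel-merging idea, the Python sketch below is not the scheduler or accelerator proposed in the thesis: the flat channel layout, the lane width, and all function names are assumptions made for this example. It merges two pruned weight channels whose nonzero weights never collide, so the surviving channel becomes denser while the emptied channel adds to channel-level sparsity and can be skipped outright.

import numpy as np

LANE_WIDTH = 4  # hypothetical number of weights a lane processes at once

def lanes(ch):
    # View a flat weight channel as rows of LANE_WIDTH weights (lanes).
    return ch.reshape(-1, LANE_WIDTH)

def can_merge(ch_a, ch_b):
    # Two pruned channels of the same layer can be merged here only if
    # their nonzero weights never occupy the same position.
    return not np.any((ch_a != 0) & (ch_b != 0))

def merge_channels(ch_a, ch_b):
    # Fold ch_b's nonzeros into ch_a; ch_b becomes an all-zero channel
    # that a zero-skipping scheduler can drop entirely.
    return ch_a + ch_b, np.zeros_like(ch_b)

def lane_utilization(ch):
    # Fraction of occupied weight slots among lanes that still hold work.
    lane_view = lanes(ch)
    busy = lane_view[np.any(lane_view != 0, axis=1)]
    return 0.0 if busy.size == 0 else np.count_nonzero(busy) / busy.size

# Two highly pruned channels of one layer (8 weights = 2 lanes each).
a = np.array([0.5, 0.0, 0.0, 0.0, 0.0, 0.0, 1.2, 0.0])
b = np.array([0.0, 0.3, 0.0, 0.0, 0.0, 0.0, 0.0, -0.7])

if can_merge(a, b):
    merged, emptied = merge_channels(a, b)
    print(lane_utilization(a), lane_utilization(merged))  # prints 0.25 and 0.5

The conflict check here is a simple position-wise test; the lane-wise merging criterion and the outlier-aware handling in the thesis are more involved. The example only conveys why a denser merged channel plus an all-zero, skippable channel can raise lane utilization.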
URI
http://postech.dcollection.net/common/orgView/200000366536
https://oasis.postech.ac.kr/handle/2014.oak/111052
Article Type
Thesis
Files in This Item:
There are no files associated with this item.
