Strategic Freezing
Date
2022-07-28
Authors
Seligman, Zachary
Patrick, David
Fernandez, Amanda
Abstract
Convolutional neural networks (CNNs) are notoriously data-intensive, requiring large datasets to train accurately within a reasonable runtime. Recent approaches to reducing this requirement focus on removing low-quality samples from the data or pruning unimportant filters, leaving the vast majority of the training set and model intact. We propose Strategic Freezing, a new training strategy that strategically freezes features in order to maintain class retention. Preliminary results of our approach are demonstrated on the Imagenette dataset using ResNet34.
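The abstract describes freezing selected features during training. As a minimal sketch of the mechanical step involved — with a small stand-in CNN rather than the paper's ResNet34, and an explicit-name selection criterion standing in for the paper's strategic, class-retention-driven choice, both of which are assumptions here — selective freezing in PyTorch looks like:

```python
import torch
import torch.nn as nn

# Small CNN as a hypothetical stand-in for ResNet34 (the paper trains
# ResNet34 on Imagenette; this model is only for illustration).
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 10),
)

def freeze_selected(model, layer_names):
    """Freeze all parameters of the named child modules.

    Returns the number of parameter tensors frozen. The selection
    criterion (explicit names) is a placeholder for whatever strategic,
    class-retention-driven rule the paper actually uses.
    """
    frozen = 0
    for name, module in model.named_children():
        if name in layer_names:
            for p in module.parameters():
                p.requires_grad = False  # excluded from gradient updates
                frozen += 1
    return frozen

# Freeze the first conv block ("0"): its weight and bias stop updating
# while the rest of the network continues to train.
n_frozen = freeze_selected(model, {"0"})

# Hand only the still-trainable parameters to the optimizer.
optimizer = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad), lr=0.01
)
```

Frozen layers still participate in the forward pass, so their learned features are preserved while later layers adapt around them.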
Description
This work was supported by the National Nuclear Security Administration, Minority Serving Institutions Partnership Program, DE-NA0003948.
Keywords
undergraduate student works
Department
Computer Science