Strategic Freezing

dc.contributor.author: Seligman, Zachary
dc.contributor.author: Patrick, David
dc.contributor.author: Fernandez, Amanda
dc.date.accessioned: 2022-08-04T21:53:55Z
dc.date.available: 2022-08-04T21:53:55Z
dc.date.issued: 2022-07-28
dc.description: This work was supported by the National Nuclear Security Administration, Minority Serving Institutions Partnership Program DE-NA0003948.
dc.description.abstract: Convolutional neural networks (CNNs) are notoriously data-intensive, requiring large datasets to train accurately in reasonable time. Recent approaches to reducing this requirement focus on removing low-quality samples from the data or unimportant filters from the model, leaving the vast majority of the training set and model intact. We propose Strategic Freezing, a new training strategy that strategically freezes features in order to maintain class retention. Preliminary results of our approach are demonstrated on the Imagenette dataset using ResNet34.
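The abstract describes freezing selected features during training so that the class information they encode is retained. The record does not include the authors' actual freezing criterion, so the sketch below is only an illustration of the general idea: parameters judged important (by a made-up importance score) are excluded from gradient updates, while the rest continue to train. All names, scores, and the threshold are hypothetical.

```python
# Illustrative sketch of selective (strategic) freezing: parameters in the
# "frozen" set are skipped during the update, so their values are retained.
# The importance scores and threshold below are invented for demonstration;
# they are NOT the criterion used in the poster.

def select_frozen(importance, threshold=0.5):
    """Freeze features whose (assumed) retention importance meets the threshold."""
    return {name for name, score in importance.items() if score >= threshold}

def sgd_step(params, grads, frozen, lr=0.1):
    """One SGD update that leaves frozen parameters unchanged."""
    return {
        name: w if name in frozen else w - lr * grads[name]
        for name, w in params.items()
    }

# Toy "model": three feature weights with made-up importance scores.
params = {"conv1": 1.0, "conv2": 2.0, "fc": 3.0}
grads = {"conv1": 0.5, "conv2": 0.5, "fc": 0.5}
importance = {"conv1": 0.9, "conv2": 0.2, "fc": 0.1}

frozen = select_frozen(importance)   # only "conv1" is frozen here
params = sgd_step(params, grads, frozen)
print(params)  # conv1 retains its value; conv2 and fc are updated
```

In a real CNN setting (e.g. the ResNet34 mentioned in the abstract), the same effect is typically achieved by disabling gradients on the chosen layers rather than filtering updates by hand.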
dc.description.department: Computer Science
dc.identifier.uri: https://hdl.handle.net/20.500.12588/1076
dc.language.iso: en_US
dc.subject: undergraduate student works
dc.title: Strategic Freezing
dc.type: Poster

Files

Original bundle

Name: CONNECT Poster - Zachary.pdf
Size: 549.49 KB
Format: Adobe Portable Document Format
Description: Poster

License bundle

Name: license.txt
Size: 1.86 KB
Description: Item-specific license agreed upon at submission