
dc.contributor.author: Seligman, Zachary
dc.contributor.author: Patrick, David
dc.contributor.author: Fernandez, Amanda
dc.date.accessioned: 2022-08-04T21:53:55Z
dc.date.available: 2022-08-04T21:53:55Z
dc.date.issued: 2022-07-28
dc.identifier.uri: https://hdl.handle.net/20.500.12588/1076
dc.description: This work was supported by the National Nuclear Security Administration, Minority Serving Institutions Partnership Program DE-NA0003948.
dc.description.abstract: Convolutional neural networks (CNNs) are notoriously data-intensive, requiring very large datasets to train accurately within a reasonable runtime. Recent approaches to reducing this requirement focus on removing low-quality samples from the data or unimportant filters from the model, leaving the vast majority of the training set and model intact. We propose Strategic Freezing, a new training strategy that strategically freezes features in order to maintain class retention. Preliminary results of our approach are demonstrated on the Imagenette dataset using ResNet34. [An illustrative sketch of layer freezing follows this record.]
dc.language.iso: en_US
dc.subject: undergraduate student works
dc.title: Strategic Freezing
dc.type: Poster
dc.description.department: Computer Science
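
The abstract describes freezing features during training but does not state the selection criterion, so the sketch below only illustrates the general mechanism: disabling gradients for chosen layers of a ResNet34 in PyTorch. The choice to freeze the stem and first residual stage is hypothetical and stands in for the authors' unstated Strategic Freezing rule.

    import torch
    import torchvision

    # Build ResNet34 (torchvision >= 0.13 API; older versions use
    # pretrained=False instead of weights=None).
    model = torchvision.models.resnet34(weights=None)

    # Hypothetical freezing choice: fix the stem and the first residual
    # stage so their filters stop updating, while later stages keep training.
    for module in (model.conv1, model.bn1, model.layer1):
        for param in module.parameters():
            param.requires_grad = False

    # BatchNorm running statistics still update in train mode even when
    # requires_grad is False; put the frozen modules in eval mode to fix them.
    model.bn1.eval()
    model.layer1.eval()

    # Optimize only the parameters that remain trainable.
    optimizer = torch.optim.SGD(
        (p for p in model.parameters() if p.requires_grad),
        lr=0.01,
        momentum=0.9,
    )

Freezing parameters this way keeps whatever class-discriminative features the frozen layers encode fixed while the rest of the network continues to adapt, which is one plausible reading of the abstract's stated goal of class retention.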

