Strategic Freezing
dc.contributor.author | Seligman, Zachary | |
dc.contributor.author | Patrick, David | |
dc.contributor.author | Fernandez, Amanda | |
dc.date.accessioned | 2022-08-04T21:53:55Z | |
dc.date.available | 2022-08-04T21:53:55Z | |
dc.date.issued | 2022-07-28 | |
dc.description | This work was supported by the National Nuclear Security Administration, Minority Serving Institutions Partnership Program DE-NA0003948. | en_US |
dc.description.abstract | Convolutional neural networks (CNNs) are notoriously data-intensive, requiring very large datasets to train accurately within a reasonable runtime. Recent approaches to reducing this requirement focus on removing low-quality samples from the data or pruning unimportant filters, leaving the vast majority of the training set and model intact. We propose Strategic Freezing, a new training strategy that strategically freezes features in order to maintain class retention. Preliminary results of our approach are demonstrated on the Imagenette dataset using ResNet34. | en_US |
dc.description.department | Computer Science | en_US |
dc.identifier.uri | https://hdl.handle.net/20.500.12588/1076 | |
dc.language.iso | en_US | en_US |
dc.subject | undergraduate student works | |
dc.title | Strategic Freezing | en_US |
dc.type | Poster | en_US |
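The abstract describes freezing features in a ResNet34 trained on Imagenette. Below is a minimal sketch of what layer freezing can look like in PyTorch; the freezing criterion used here (fixing the earliest residual stages) is purely an illustrative assumption, since the poster's actual strategic selection of features is not specified in this record.

```python
# Minimal layer-freezing sketch for ResNet34 (PyTorch).
# NOTE: freezing the earliest residual stages is a placeholder assumption;
# the poster's actual criterion for which features to freeze is not given here.
import torch
import torch.nn as nn
from torchvision import models


def freeze_stages(model: nn.Module, stages_to_freeze=("layer1", "layer2")) -> nn.Module:
    """Disable gradient updates for the named ResNet stages."""
    for name, module in model.named_children():
        if name in stages_to_freeze:
            for param in module.parameters():
                param.requires_grad = False
    return model


# Imagenette is a 10-class subset of ImageNet, so num_classes=10.
model = models.resnet34(weights=None, num_classes=10)
model = freeze_stages(model)

# Only the still-trainable parameters are handed to the optimizer.
optimizer = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad),
    lr=0.01,
    momentum=0.9,
)
```

In this sketch, frozen stages keep their current weights for the rest of training, so the features they compute are retained while later layers continue to adapt; a strategic variant would replace the fixed stage list with a data- or class-driven choice of which features to freeze.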