Authors: Seligman, Zachary; Patrick, David; Fernandez, Amanda
Date accessioned: 2022-08-04
Date available: 2022-08-04
Date issued: 2022-07-28
URI: https://hdl.handle.net/20.500.12588/1076
Sponsorship: This work was supported by the National Nuclear Security Administration, Minority Serving Institutions Partnership Program DE-NA0003948.
Abstract: Convolutional neural networks (CNNs) are notoriously data-intensive, requiring large datasets to train accurately within a reasonable runtime. Recent approaches to reducing this requirement focus on removing low-quality samples from the data or pruning unimportant filters, leaving the vast majority of the training set and model intact. We propose Strategic Freezing, a new training strategy that strategically freezes features in order to maintain class retention. Preliminary results of our approach are demonstrated on the Imagenette dataset using ResNet34.
Language: en-US
Subject: undergraduate student works
Title: Strategic Freezing
Type: Poster
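The abstract does not specify how features are selected or frozen; the sketch below only illustrates the general mechanism that any freezing strategy relies on: parameters marked as frozen are skipped during the gradient update, so the features they encode are retained. This is a toy, stdlib-only illustration under assumed details (the freezing mask here is an arbitrary placeholder), not the authors' Strategic Freezing method.

```python
def sgd_step(params, grads, frozen, lr=0.1):
    """Apply one SGD update, leaving frozen parameters untouched.

    params, grads, frozen are parallel lists; entries of `frozen` that are
    True mark parameters whose learned features should be preserved.
    """
    return [p if is_frozen else p - lr * g
            for p, g, is_frozen in zip(params, grads, frozen)]

params = [1.0, 2.0, 3.0, 4.0]
grads = [0.5, 0.5, 0.5, 0.5]
frozen = [True, True, False, False]   # placeholder mask, not the paper's criterion

updated = sgd_step(params, grads, frozen)
print(updated)  # frozen entries unchanged: [1.0, 2.0, 2.95, 3.95]
```

In a deep-learning framework the same effect is typically achieved by disabling gradient tracking on the chosen parameters rather than masking the update by hand; the choice of *which* features to freeze is the substance of the proposed strategy.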