Serverless Implementation of Convolutional Neural Networks to Analyze Complex Workout Activities
Physical exercise is a critically important aspect of a healthy lifestyle, and more and more people are using technology to track their workouts. Commonly available devices, such as smartwatches, provide users with all the sensors needed to implement a gesture recognition algorithm. In this paper, we present a methodology to detect and recognize strength training workouts using a single inertial measurement unit within a smartwatch. The goal is to create a tool that scales easily to a larger market and utilizes commonly available devices. We classify five different activities (curls, sit-ups, squats, burpees, and box jumps). The technical approach uses a machine learning algorithm based upon a convolutional neural network architecture. Data is collected from 2 individuals wearing a smartwatch. These individuals perform a standardized workout that includes three sets of varying repetitions of each activity. A convolutional neural network model is trained for each activity. Subsequently, two approaches are used to deploy the models. The first approach, the serverless implementation, sends data from the smartphone application into the cloud; the data is then processed in Amazon Web Services, and the results of the analysis are sent back to the smartphone. The second approach also sends data into the cloud; however, the data analysis is done on a local server before the results are sent back to the smartphone. Once the models are deployed, they are put to the test. Each activity is tested 49 times. In each of these tests, 10 repetitions are performed, and the algorithm identifies the activity, separates the data into the correct number of sets, and counts the repetitions. After the conclusion of these tests, the models prove more reliable for the squat and curl activities and show mixed results for burpees, box jumps, and sit-ups.
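To make the classification idea concrete, the following is a minimal, hypothetical sketch (not the authors' code, whose architecture and hyperparameters are not given in the abstract) of the core operation such a model applies to a window of IMU samples: a 1D convolution per activity filter, a ReLU, and global max pooling, with the activity chosen by the highest score. The kernel values here are random stand-ins for trained filters.

```python
import numpy as np

# Activity labels from the paper; everything else below is an illustrative assumption.
ACTIVITIES = ["curl", "sit-up", "squat", "burpee", "box jump"]

def conv1d(signal, kernel):
    """Valid-mode 1D convolution (cross-correlation) over one sensor axis."""
    n, k = len(signal), len(kernel)
    return np.array([np.dot(signal[i:i + k], kernel) for i in range(n - k + 1)])

def score_window(window, kernels):
    """Score a single-axis IMU window against one kernel per activity:
    convolve, apply ReLU, then take the global max (pooling)."""
    return [float(np.maximum(conv1d(window, k), 0.0).max()) for k in kernels]

# Toy usage: 5 random kernels stand in for filters a trained CNN would learn.
rng = np.random.default_rng(0)
kernels = [rng.standard_normal(8) for _ in ACTIVITIES]
window = rng.standard_normal(50)  # e.g. 50 accelerometer samples from the watch
predicted = ACTIVITIES[int(np.argmax(score_window(window, kernels)))]
```

A real deployment would use multi-axis input, several stacked convolutional layers, and learned weights; this sketch only illustrates the convolution-and-pool mechanism the abstract refers to.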