A Scalable and Distributed AI-Assisted Human Behavior Analytics Platform Integrating Visual IoT Devices on the Edge

Date

2020

Authors

Guzman, Herb

Abstract

Computer vision and the Internet of Things (IoT) have brought about marked improvements in visual IoT devices, which are becoming an increasingly powerful means of analyzing human behavior through their embedded deep-learning or artificial intelligence (AI) models. Amid the COVID-19 pandemic, these devices facilitate AI-assisted human behavior platforms for remote in-home health monitoring, remote assistance to physicians in ICU rooms, and remote monitoring of frail older adults at senior centers. This work presents a feasibility study of a prototype Platform-as-a-Service (PaaS) IT ecosystem for human behavior analytics, designed to scale with the number of visual IoT devices attached to edge devices distributed across multiple locations. The service is customizable so that it can be deployed in environments where patients need close clinical monitoring because of the risk of abnormal activity. Once deployed, the platform automatically detects, acquires, and pipes its raw data to the cloud; these data include human skeleton-joint tracking and RGB color, infrared, and depth imaging. The edge device pushes the data to cloud services such as Elasticsearch, Amazon Web Services, and Microsoft Azure. Skeleton (body) tracking is performed by the deep-neural-network model embedded in the Microsoft Kinect IV Azure development kit, and the body-tracking, image, and optional audio signals are all synchronized in time. The body-tracking data from the Kinects can be viewed in real time with Kibana or extracted from the cloud in pseudo-real time for complex data analytics and modeling by developers at the other end of the ecosystem. We show that our platform tracks human behavior while effectively and accurately piping this information to various cloud services in pseudo-real time. We profiled the performance of our custom Kinect software through a parametric study across four edge-device benchmarks and multiple configurations of our software. We also demonstrate very high correlations between the data acquired at the edge, with two Kinect IV Azure visual IoT devices, and the data fetched from the cloud at other endpoints of the infrastructure. In addition, we provide examples of how raw body-tracking data can be fetched from the cloud, numerically fused, and visualized in various forms. Finally, we tested our platform with skeleton-joint tracking and with hand-keypoint tracking using a custom deep-learning model.
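
For illustration, the sketch below shows how one frame of skeleton-joint data might be indexed into Elasticsearch by an edge device and then fetched back in pseudo-real time at another endpoint, following the pipeline the abstract describes. This is a minimal sketch, not the thesis's implementation: it assumes the Elasticsearch 8.x Python client, and the host, index name, field layout, and device label are hypothetical placeholders.

    # Minimal sketch of the edge-to-cloud path described in the abstract.
    # Assumes the Elasticsearch 8.x Python client; the host, index name,
    # field layout, and device label below are hypothetical.
    from datetime import datetime, timezone
    from elasticsearch import Elasticsearch

    es = Elasticsearch("http://localhost:9200")  # assumed cluster endpoint

    # One document per captured frame: a timestamp, a device label, a body
    # id, and the 3-D position of each tracked joint reported by the SDK.
    frame_doc = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "device_id": "kinect-01",  # hypothetical edge-device label
        "body_id": 0,
        "joints": {
            "pelvis": {"x": 0.01, "y": 0.42, "z": 2.10},
            "head": {"x": 0.02, "y": 1.05, "z": 2.08},
            # remaining joints omitted for brevity
        },
    }
    es.index(index="body-tracking", document=frame_doc)

    # Pseudo-real-time retrieval at another endpoint of the ecosystem:
    # fetch the most recent frames from one device for downstream analytics.
    resp = es.search(
        index="body-tracking",
        query={"match": {"device_id": "kinect-01"}},
        sort=[{"timestamp": {"order": "desc"}}],
        size=100,
    )
    frames = [hit["_source"] for hit in resp["hits"]["hits"]]

Storing one document per synchronized frame keeps the data directly queryable from Kibana, which matches the real-time viewing path described above.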

Description

This item is available only to currently enrolled UTSA students, faculty, or staff.

Keywords

behavior monitoring, Elasticsearch, hand keypoints, Kinect Azure, skeleton tracking, Unity

Department

Electrical and Computer Engineering