A Cloud-based Mobile Health Monitoring and Real-time Guidance System

CWC News
Tuesday, June 28, 2016
San Diego, CA

Currently, most physical therapy patients receive only verbal and written instructions for their at-home care, a strategy that lacks the direct feedback a physical therapist provides in the clinic. Not surprisingly, this approach yields very low rates of compliance and accuracy. The Center for Wireless Communications' mHealth Initiative aims to provide detailed guidance to physical therapy patients so that the exercise regimen they follow at home can be as effective as what they do in the clinic. The researchers' goal is a system tailored to individuals and their particular exercise programs that records and directs force and movement with high accuracy.

Collaborating on this mobile health monitoring and real-time guidance system are CWC affiliate Prof. Pam Cosman (a leading authority in image/video processing and compression), CWC Director Prof. Sujit Dey (an expert in cloud computing and embedded system-on-chips), and Electrical and Computer Engineering Chair Prof. Truong Nguyen (a leading figure in image/video processing). Other experts recruited for the project include Bioengineering Prof. Todd Coleman; Dr. Catherine Printz, a physical therapist at Thornton Hospital; and Associate Prof. Sri Kurniawan, an electrical engineer and human factors expert at UC Santa Cruz.

The team's system comprises three types of sensors: (1) Microsoft Kinect sensors, which provide color and depth measurements; (2) FingerTPS force sensors; and (3) Epidermal Electronics Systems (EES), which can measure EMG signals from the muscles, heart rate, sweat, and other physiological signals. While the Kinect sensors are fixed in place in the clinic and in the patient's home, the FingerTPS sensors and Epidermal Electronics are worn on the patient's body: FingerTPS sensors are placed on the fingers and palm, while the EES functions more like a flexible skin tattoo that washes off with water. Together these sensors collect precise data capable of giving the patient detailed feedback on their exercise regimen.
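To make the combined data stream concrete, here is a minimal sketch of how one time-stamped sample fusing the three sensor types might be represented. The class and field names (SensorFrame, joints, finger_forces, emg, heart_rate) are illustrative assumptions, not the project's actual schema.

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class SensorFrame:
    """One time-stamped sample combining the three sensor streams.
    Field names are hypothetical, for illustration only."""
    timestamp: float                                     # seconds since exercise start
    joints: Dict[str, Tuple[float, float, float]]        # Kinect: joint name -> (x, y, z) in meters
    finger_forces: Dict[str, float]                      # FingerTPS: sensor site -> force in newtons
    emg: Dict[str, float] = field(default_factory=dict)  # EES: muscle -> EMG amplitude
    heart_rate: float = 0.0                              # EES: beats per minute

# Example: a single frame captured ~33 ms into an exercise
frame = SensorFrame(
    timestamp=0.033,
    joints={"right_elbow": (0.41, 1.02, 1.80)},
    finger_forces={"index_tip": 2.3, "palm": 5.1},
    emg={"biceps": 0.12},
    heart_rate=72.0,
)
print(frame.joints["right_elbow"])
```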

The system's workflow begins in the clinic, with a physical therapist guiding a patient through a particular exercise. The clinic is instrumented with a camera and other sensors so that a 3-dimensional, spatio-temporal model of the exercise can be built; this data is recorded and sent to the cloud. Next, the patient attempts to repeat the exercise at home, either alone or with an untrained caregiver. Both camera and tattoo sensors collect measurements during the exercise, and the data is encoded and sent to the cloud, where it is compared against what was done in the clinical setting. Finally, guidance rendered from this comparison is sent back to the patient so they can correct their home exercises and match the clinical model more closely.
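The article does not specify how the home recording is compared against the clinic reference, but dynamic time warping is a standard way to align two motion traces performed at different speeds, and data alignment is among the team's listed publication topics. The sketch below, with hypothetical function names (dtw_align, render_guidance) and a single joint angle standing in for the full 3-D model, illustrates how such a comparison could drive a simple guidance cue.

```python
import math

def dtw_align(reference, attempt):
    """Align two motion traces with dynamic time warping and return the
    minimal cumulative alignment cost. `reference` is the clinic recording,
    `attempt` the at-home repetition; both are sequences of floats
    (e.g., one joint angle per frame, in degrees)."""
    n, m = len(reference), len(attempt)
    # cost[i][j] = minimal cost of aligning reference[:i] with attempt[:j]
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(reference[i - 1] - attempt[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],       # stretch the attempt
                                 cost[i][j - 1],       # stretch the reference
                                 cost[i - 1][j - 1])   # match the frames
    return cost[n][m]

def render_guidance(reference, attempt, tolerance=5.0):
    """Turn the aligned deviation into a simple text cue for the patient."""
    score = dtw_align(reference, attempt) / max(len(reference), 1)
    if score <= tolerance:
        return "Good match; keep going."
    return f"Average deviation {score:.1f} deg exceeds tolerance; follow the clinic motion."

if __name__ == "__main__":
    clinic = [10.0, 30.0, 60.0, 90.0, 60.0, 30.0, 10.0]  # reference recorded in clinic
    home = [12.0, 28.0, 55.0, 80.0, 58.0, 33.0, 12.0]    # repetition captured at home
    print(render_guidance(clinic, home))
```

In an actual deployment this comparison would run in the cloud over the full spatio-temporal model, including 3-D skeleton, force, and EMG streams, rather than a single joint angle.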

One target population for this project is stroke patients, who often must engage in an enormous amount of physical therapy and may be unable to sense feedback from their own muscles. The system could be particularly useful to these patients by informing them of which muscles are being recruited and whether they are performing their exercises the way their physical therapist instructed, helping ensure the quickest possible recovery.

The development of the system thus far has produced studies and publications on arm motion classification, force repeatability, hand articulation tracking, and data alignment and real-time guidance. Future research will address hand tracking, integration of the epidermal electronics, cloud-based control and data analytics, and human factors and usability. Funding from NSF's Smart and Connected Health program is allowing the mHealth Initiative to grow more broadly and robustly.