Basic “Co-creating Abilities and Personalized Experience” (CAPE)
Project ID | RRG3-19001
Focus | Ability Data
Project PI | Dr Yau Wei Yun, Dr Loh Yong Joo
Researchers | Eng How Lung (Research Fellow), Glenn Lee Kian Giap (Project Officer)
The challenge
The growing ageing population in Singapore will lead to significantly higher demand for rehabilitation. At the same time, greater longevity brings a desire for a later retirement age. This creates a tension within the older generation: higher rates of disability alongside the need to work longer.
Therefore, there are urgent, unmet needs to:
(i) Empower a person to rehabilitate at home.
(ii) Maintain, monitor and motivate patients performing rehabilitation at home, so as to sustain their health and quality of life without relying on healthcare institutions.
In this POC study, we demonstrate a platform solution capable of simultaneously transmitting measurements from multiple IMU sensors to a backend server, where healthcare practitioners can review the measurements in graphical form and as a 3D skeletal model for further analysis.
This lays the groundwork for developing a rehab monitoring and motivation app. Such an app could be used during inpatient and outpatient visits, as well as to monitor patients performing prescribed rehab activities at home. Leveraging the real-time data captured, clinicians would be able to provide a prompt intervention plan.
The proposed solution
The proposed CAPE platform comprises the following key components:
(i) Measurement component: wearable sensors such as IMUs and a Fitbit extract measurements that describe a patient's motion activity. In this pilot project, the focus is on IMUs for lower-limb movement analysis and the Fitbit for hand movement analysis.
(ii) Cloud component: a cloud-based platform has been set up to send sensor measurements to the backend server, with a UI/UX dashboard that enables clinicians and healthcare practitioners to conduct regular monitoring. The cloud component is implemented on the ThingsBoard IoT platform and can be scaled up to support more users and various communication protocols such as CoAP, HTTP and MQTT.
(iii) Visualization component: transforms received IMU measurements into animated motion sequences presented via a 3D skeletal model. The backend dashboard thus offers two forms of presentation for the captured movements: graphical plots and 3D animation.
(iv) Activity analysis component: deep-learning models are developed to classify different types of activities from the measured IMU readings.
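To make the cloud component concrete, the following is a minimal sketch of how one IMU sample could be packaged as ThingsBoard telemetry and published over MQTT. The field names (ax, ay, az, gx, gy, gz), the broker host and the access token are illustrative assumptions, not values from the project; the topic "v1/devices/me/telemetry" and the JSON {"ts": ..., "values": {...}} shape are ThingsBoard's standard device MQTT API.

```python
import json
import time

def imu_telemetry(ax, ay, az, gx, gy, gz, ts_ms=None):
    """Package one IMU reading (accel + gyro, names assumed for
    illustration) into ThingsBoard's JSON telemetry format."""
    if ts_ms is None:
        ts_ms = int(time.time() * 1000)  # ThingsBoard expects epoch milliseconds
    return json.dumps({
        "ts": ts_ms,
        "values": {"ax": ax, "ay": ay, "az": az,
                   "gx": gx, "gy": gy, "gz": gz},
    })

# Publishing could then use paho-mqtt, with the device access token as the
# MQTT username. Shown commented out since it needs a live server:
#
# import paho.mqtt.client as mqtt
# client = mqtt.Client()
# client.username_pw_set("DEVICE_ACCESS_TOKEN")    # placeholder token
# client.connect("thingsboard.example.com", 1883)  # placeholder host
# client.publish("v1/devices/me/telemetry",
#                imu_telemetry(0.1, 0.0, 9.8, 0.0, 0.0, 0.0))
```

Each wearable sensor would be registered as a separate ThingsBoard device, so multiple IMUs can stream concurrently to the same dashboard.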
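For the visualization component, posing a bone of a 3D skeletal model typically requires converting an IMU orientation quaternion into a rotation matrix. The sketch below shows that standard conversion; the function name and the (w, x, y, z) ordering are assumptions for illustration, not details from the project.

```python
def quat_to_matrix(w, x, y, z):
    """Convert a unit quaternion (w, x, y, z) into a 3x3 rotation matrix
    suitable for orienting one segment of a skeletal model."""
    # Normalize first so sensor noise does not scale the bone.
    n = (w*w + x*x + y*y + z*z) ** 0.5
    w, x, y, z = w/n, x/n, y/n, z/n
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ]
```

The identity quaternion (1, 0, 0, 0) yields the identity matrix, i.e. the limb segment in its rest pose; animating the model then amounts to applying each frame's matrix to the corresponding bone.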
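For the activity analysis component, deep-learning classifiers are usually fed fixed-length, overlapping windows cut from the continuous IMU stream rather than raw samples. The sketch below shows that segmentation step; the window and stride sizes are assumptions for illustration, not values reported by the project.

```python
def sliding_windows(samples, window=100, stride=50):
    """Yield consecutive windows of `window` samples from an IMU stream,
    advancing by `stride` samples (overlap when stride < window)."""
    for start in range(0, len(samples) - window + 1, stride):
        yield samples[start:start + window]
```

For example, a 250-sample stream with window=100 and stride=50 produces four windows starting at samples 0, 50, 100 and 150; each window would then be passed to the classifier as one labelled activity instance.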
