Sensing technologies hold enormous potential for early detection of health changes that can dramatically affect the aging experience. Embedded health assessment can enable functional independence, improve self-management of chronic or acute conditions, and thus improve quality of life.
Extracting information from sensors installed in the homes of older adults poses a unique set of challenges. The short amount of time clinicians and nurses have to analyze these data complicates the problem further. The ongoing work in this project focuses on developing algorithms to glean information from in-home sensor data and on presenting that information as textual summaries using natural language generation (NLG) techniques.
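As a minimal illustration of template-based natural language generation over sensor data, the sketch below turns a week of nightly event counts into a one-sentence summary. The metric (nightly bathroom visits), field names, and the 0.5-visit tolerance are illustrative assumptions, not the project's actual pipeline.

```python
# Hypothetical sketch: template-based NLG summary of in-home sensor data.
# The chosen metric and thresholds are illustrative assumptions only.

def summarize_week(nightly_visits, baseline):
    """Compare this week's average nightly bathroom visits against a
    resident's baseline and return a one-sentence textual summary."""
    avg = sum(nightly_visits) / len(nightly_visits)
    change = avg - baseline
    if abs(change) < 0.5:          # small deviations treated as normal
        trend = "consistent with"
    elif change > 0:
        trend = "above"
    else:
        trend = "below"
    return (f"Average nightly bathroom visits were {avg:.1f}, "
            f"{trend} the resident's baseline of {baseline:.1f}.")

print(summarize_week([3, 4, 5, 4, 3, 5, 4], baseline=2.0))
# → Average nightly bathroom visits were 4.0, above the resident's baseline of 2.0.
```

A real system would draw baselines from each resident's own history and generate summaries over many metrics, but the template-filling pattern is the same.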
The main goal of this work was to develop algorithms for early illness recognition in the elderly. Early illness recognition (EIR) is important, as research has shown that it results in better medical outcomes and reduced health care costs. We developed methodologies (see Figure 1) that link sensor data to medical (nursing) records for monitoring the residents of TigerPlace, an aging-in-place community in Columbia, Missouri.
Building on our current work, we propose to validate and deploy an innovative technological approach that automatically detects when falls have occurred or when the risk of falls is increasing. Subjects will not have to press buttons, pull cords, or wear any devices. This new “passive” approach using sensors in the home could revolutionize the detection and prevention of falls, as well as the measurement of fall risk.
We leverage ongoing research at a unique local eldercare facility (TigerPlace) to study active sensing and fusion using vision and acoustic sensors for the continuous assessment of a resident’s risk of falling as well as the reliable detection of falls in the home environment. The project investigates the interplay between fall detection and fall risk assessment.
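One common way to combine vision and acoustic cues, as a rough sketch of the fusion idea described above, is weighted late fusion of per-sensor confidence scores. The weights and threshold below are illustrative assumptions, not the project's actual parameters.

```python
# Hypothetical sketch of late fusion for fall detection: combine a
# vision-based and an acoustic-based confidence score into one decision.
# Weights and threshold are illustrative assumptions only.

def fuse_fall_scores(vision_score, acoustic_score,
                     w_vision=0.6, w_acoustic=0.4, threshold=0.7):
    """Weighted late fusion of per-sensor fall confidences in [0, 1].

    Returns (fused_score, fall_detected).
    """
    fused = w_vision * vision_score + w_acoustic * acoustic_score
    return fused, fused >= threshold

score, detected = fuse_fall_scores(0.9, 0.8)
print(f"fused={score:.2f}, fall detected: {detected}")
# → fused=0.86, fall detected: True
```

In practice, such weights would be learned from labeled fall events, and the threshold tuned to trade off missed falls against false alarms, which is where fall detection and fall risk assessment interact.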
Researchers at the University of Missouri-Columbia and the University of Washington have established a multidisciplinary team of researchers in computer science and engineering, nursing, and medical informatics dedicated to developing and evaluating technology to keep older adults functioning at higher levels and living independently. We have leveraged ongoing research at a unique local eldercare facility (TigerPlace) to study vision-based recognition methods for multi-person environments designed to capture continuous and automated assessments of older adults’ physical function.
Our objective is to explore new information technologies that assist the independent living of elderly people and enhance their quality of life at home, while making the most efficient use of the time and attention of caregivers and eldercare specialists.
The dream of older Americans is to remain as active and independent as possible for as long as possible. They want to age in place, not in institutions like nursing homes. Recently, enabling technology in the form of low-cost sensors, computers, and communications systems has become available, which, combined with supportive health care services, can make the dream of aging in place a reality.