Clinical Trials Directory


NCT05385328

Multisensorial Analysis of Human Activity for Diagnosis and Early Detection of Functional Limitations (EYEFUL)

Status
Unknown
Phase
Not applicable (observational study)
Study type
Observational
Enrollment
50 (estimated)
Sponsor
Nuria Máximo-Bocanegra · Academic / Other
Sex
All
Age
18 Years – 99 Years
Healthy volunteers
Accepted

Summary

Physical and cognitive changes provoked by a pathology worsen an individual's functional capability, making it more difficult to perform Activities of Daily Living (ADLs) and causing dependency and/or disability. Standardized observational tests exist for the clinical assessment of the degree of functional limitation in basic or instrumental ADLs (e.g., the Assessment of Motor and Process Skills, AMPS). However, all these tests suffer from the subjectivity of the evaluator's analysis, and the evaluator's presence during the test may influence how a subject performs ADLs. The goal of the project is to develop a methodology to design, implement and validate automatic clinical tests of functional limitation that: 1) give objective assessments with clinical validity, and 2) remove the interference in test execution caused by the physical presence of the evaluator.

Detailed description

Physical and cognitive changes provoked by a pathology worsen an individual's functional capability, making it more difficult to perform Activities of Daily Living (ADLs) and causing dependency and/or disability. Standardized observational tests exist for the clinical assessment of the degree of functional limitation in basic or instrumental ADLs (e.g., the Assessment of Motor and Process Skills, AMPS). However, all these tests suffer from the subjectivity of the evaluator's analysis, and the evaluator's presence during the test may influence how a subject performs ADLs. The goal of the project is to develop a methodology to design, implement and validate automatic clinical tests of functional limitation that: 1) give objective assessments with clinical validity, and 2) remove the interference in test execution caused by the physical presence of the evaluator.

The human evaluator's subjectivity will be replaced with an automatic system that extracts multimodal (i.e., multisensorial) information from the environment during the user's functional assessment, specifically from audio, video and depth sensors, together with information collected from wearable sensors that the subject under test may carry. The objective assessment will also provide clues that, given the subject's clinical history, can be used for early detection of limitations. This problem has not been systematically addressed in the literature; the project is thus a first attempt at the development and clinical validation of an automated system that allows an objective evaluation of observational tests. The clinical assessment of functional limitations will be performed in an adapted real environment (equipped with adequate electronic sensors) at the URJC facilities.

The EYEFUL-URJC subproject has a key role in the methodological design and clinical validation of the automatic evaluation tools. It will also carry out the actual tests on healthy subjects and patients, comparing the output of current tools, such as AMPS, with the output of the assessment tools developed for the project. To extract relevant features from video, depth, audio and other sensors, the project leverages the research experience in automatic sensing of human activities in intelligent spaces of EYEFUL-UPM and EYEFUL-UAH. The EYEFUL-UPM subproject concentrates on the analysis of the subject's face, head pose, gaze, and accurate 3D alignment of facial landmarks, which enable the estimation of facial attributes useful for different tests (e.g., focus of attention, presence of pain, confusion, fear). The EYEFUL-UAH subproject concentrates on the analysis of the user's whole-body activity and their interaction with objects, using depth and video sensors and also integrating audio and wearable-sensor data to automatically assess the functional capability of the evaluated persons. The three coordinated groups approach the project in an interdisciplinary way, with strong feedback among them throughout the development. This close interaction is fundamental to ensure the adequate focus of the technical developments given the strict clinical requirements of the task.

Conditions

Interventions

Type: DEVICE
Name: EYEFUL: healthy people
Description: This kind of study does not require the use of any substance, medicine or therapeutic technique, nor does it imply any risk to the participants. The data to be captured are exclusively video and audio recordings, plus various signals from smartwatches (mainly related to movement). Additionally, the clinical evaluation of the performance of these activities will be carried out according to the assessment scales to be defined in the project. The results of this evaluation will be used exclusively to train the automatic systems developed in the project and to establish a correlation between the captured observations (video, audio, smartwatch signals) and the diagnostic assessment.

Type: DEVICE
Name: EYEFUL: people with pathology
Description: This kind of study does not require the use of any substance, medicine or therapeutic technique, nor does it imply any risk to the participants. The data to be captured are exclusively video and audio recordings, plus various signals from smartwatches (mainly related to movement). Additionally, the clinical evaluation of the performance of these activities will be carried out according to the assessment scales to be defined in the project. The results of this evaluation will be used exclusively to train the automatic systems developed in the project and to establish a correlation between the captured observations (video, audio, smartwatch signals) and the diagnostic assessment.

Timeline

Start date
2022-11-01
Primary completion
2023-03-31
Completion
2024-08-31
First posted
2022-05-23
Last updated
2023-11-29

Locations

1 site across 1 country: Spain

Source: ClinicalTrials.gov record NCT05385328. Inclusion in this directory is not an endorsement.