Development of an exoskeleton for sit-to-stand (STS) transition support based on multimodal action intent recognition
An estimated 1.5 million senior citizens live under supervised care, and most require assistance with at least one Activity of Daily Living (ADL), including transferring in and out of chairs, beds, and toilets, all of which depend on the ability to perform a sit-to-stand transition. The sit-to-stand transition is a complex full-body activity that requires synergistic coordination of the upper limbs, lower limbs, and trunk. The goal of this research is to develop a working prototype of an active, assistive exoskeleton controlled by behavioral models of the user's intent. The research plan includes synchronized multimodal data collection of sit-to-stand transitions across varied environmental situations and action-intent contexts, and the development of intelligent control algorithms to actuate and operate the exoskeleton. This work can be extended to control robots in any environment that requires human-robot coordination to complete a shared task.
Videos show simulation results for three different speeds of sit-to-stand transition. (The shadowed model shows the desired motion; the solid model shows the motion obtained using a model-based controller providing assistive torques.)
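To illustrate the kind of model-based assistive-torque control used in the simulations, the sketch below tracks a desired sit-to-stand trajectory with computed-torque (inverse-dynamics feedforward plus PD feedback) control. This is a minimal illustration, not the controller from this work: it assumes a simplified single-joint model of the trunk, a hypothetical minimum-jerk desired trajectory, and illustrative mass, length, and gain values.

```python
import numpy as np

# Simplified 1-DOF model: trunk segment rotating about the hip during
# sit-to-stand. All parameters are illustrative, not from this project.
M = 40.0        # trunk mass (kg)
L = 0.5         # hip-to-trunk center-of-mass distance (m)
I = M * L**2    # point-mass inertia about the hip (kg m^2)
G = 9.81        # gravitational acceleration (m/s^2)
KP, KD = 400.0, 40.0  # PD feedback gains on the tracking error

def desired(t, T=2.0):
    """Hypothetical minimum-jerk extension from 90 deg (seated) to 0 deg
    (standing) over T seconds; returns angle, velocity, acceleration."""
    s = np.clip(t / T, 0.0, 1.0)
    q0, qf = np.pi / 2, 0.0
    q = q0 + (qf - q0) * (10*s**3 - 15*s**4 + 6*s**5)
    dq = (qf - q0) * (30*s**2 - 60*s**3 + 30*s**4) / T
    ddq = (qf - q0) * (60*s - 180*s**2 + 120*s**3) / T**2
    return q, dq, ddq

def assistive_torque(q, dq, t):
    """Computed-torque control: model-based feedforward (inertial +
    gravity terms along the desired trajectory) plus PD feedback."""
    qd, dqd, ddqd = desired(t)
    feedforward = I * ddqd + M * G * L * np.sin(qd)
    feedback = KP * (qd - q) + KD * (dqd - dq)
    return feedforward + feedback

# Forward-Euler simulation of the joint under the assistive torque.
dt, t_end = 1e-3, 2.5
q, dq, t = np.pi / 2, 0.0, 0.0
errs = []
while t < t_end:
    tau = assistive_torque(q, dq, t)
    ddq = (tau - M * G * L * np.sin(q)) / I  # gravity load resists extension
    dq += ddq * dt
    q += dq * dt
    t += dt
    errs.append(abs(q - desired(t)[0]))

print(f"final angle: {q:.3f} rad, max tracking error: {max(errs):.4f} rad")
```

Because the feedforward term cancels the modeled dynamics along the desired trajectory, the PD feedback only has to reject small modeling and integration errors, which is why the solid (controlled) model in the videos closely follows the shadowed (desired) motion. Slower or faster transitions are obtained simply by changing the trajectory duration T.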