
Research On Multimodal Fusion-based Control Strategy For Lower-limb Exoskeleton Robot

Posted on: 2019-06-30
Degree: Doctor
Type: Dissertation
Country: China
Candidate: D X Liu
Full Text: PDF
GTID: 1368330566459282
Subject: Pattern Recognition and Intelligent Systems
Abstract/Summary:
Worldwide, a large number of people suffer from lower-limb motor dysfunction and have lost the ability to stand or walk normally. Lower-limb exoskeleton robots can enable them to stand and walk again, greatly improving their physical function and quality of life. As a human-centered, human-machine collaborative intelligent system, the primary task of a lower-limb exoskeleton robot is to understand the wearer's motion intention, so that the human-machine system can move in coordination to provide walking assistance and related support. Information generated during human walking is reflected in different state variables: movement intention appears in EEG signals, and also in state changes during motion, including joint angle, angular velocity, angular acceleration, and foot pressure. Progress has been made in human-robot motion recognition based on traditional physical sensors such as force, position, and attitude, and in exoskeleton control based on biological signals such as EEG and EMG. However, little research has addressed intention recognition based on multimodal sensor fusion, vision-assisted autonomous robot decision-making, or the fusion of human intelligence with robot autonomy, especially the fusion of environmental information. To make full use of fast, global brain signals, continuous and robust physical signals, and real-time, accurate computer vision, this dissertation studies a human-robot-environment fusion control strategy based on multimodal sensors, covering the following aspects:

(1) To obtain multimodal information during human-robot-environment interaction, computer vision is introduced to the lower-limb exoskeleton robot to provide visual feedback, and a variety of sensors are installed on the human-robot system, including foot pressure sensors, joint position sensors, a depth camera, and an EEG acquisition device. To fully exploit the deep features of each sensor type, several algorithms are applied and analyzed experimentally. The dissertation studies foot touchdown recognition based on foot pressure sensors, gait phase recognition based on lower-limb joint sensors, obstacle recognition based on the depth camera, and leg-raising intention recognition based on a brain-machine interface.

(2) A multimodal fusion decision mechanism based on multimodal sensor information is established, and a control strategy for the lower-limb exoskeleton robot is proposed that combines foot pressure, joint position, environmental information, and human intention. It includes: fusion of foot pressure and lower-limb joint position to estimate the human-robot state; autonomous robot decisions based on the human-robot motion state and ground environment information; and high-level decision fusion of human motion intention with the robot's autonomous decision. By combining the human-robot state, the walking environment, and human intention, mechanisms for autonomous robot decision and human-robot fusion decision are established. When facing different obstacles, the robot can plan and adjust its gait in real time according to environmental information, with an average task completion rate of 82.2%. This ensures the safety and reliability of the human-robot system while greatly improving its adaptability to complex walking environments.

(3) Several studies are carried out on gait planning for human-robot-environment fusion. To improve the performance of human-robot fusion, off-line gait trajectory fitting based on an LSTM recurrent neural network and individualized gait pattern generation based on body parameters are studied. To improve human-robot-environment adaptability, on-line parametric gait trajectory planning based on human-robot fusion
decision is studied. By combining human decisions with autonomous robot decisions, gait planning methods based on human-robot-environment fusion are preliminarily studied, greatly improving the lower-limb exoskeleton robot's ability to adapt both to the wearer's gait pattern and to different walking environments. Experimental results show that a gait pattern adapted to the human, the robot, and the environment can be obtained through multimodal human-robot-environment information fusion, making the human-robot system safer, more stable, and more reliable in performing walking tasks. This also demonstrates the effectiveness of the multimodal human-robot-environment information fusion control strategy for the lower-limb exoskeleton robot.
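The high-level decision fusion described above can be sketched as a simple priority scheme: the robot proposes a gait action from environment perception, and an explicit human intention (e.g., a leg-raising command decoded via the brain-machine interface) overrides it. This is a minimal, hypothetical illustration only — the function names, thresholds, and action labels below do not appear in the dissertation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Environment:
    """Obstacle information, e.g. as estimated from a depth camera."""
    obstacle_height_m: float = 0.0
    obstacle_distance_m: float = 10.0

def autonomous_decision(env: Environment) -> str:
    """Robot's autonomous gait choice from environment information.
    Thresholds are illustrative assumptions, not values from the thesis."""
    if env.obstacle_distance_m < 0.5 and env.obstacle_height_m > 0.15:
        return "stop"        # obstacle too high to step over safely
    if env.obstacle_distance_m < 0.5 and env.obstacle_height_m > 0.0:
        return "step_over"   # plan extra foot clearance for this step
    return "walk"

def fuse_decisions(human_intention: Optional[str], env: Environment) -> str:
    """High-level fusion: an explicit human intention takes priority;
    otherwise fall back to the robot's autonomous decision."""
    if human_intention is not None:
        return human_intention
    return autonomous_decision(env)

# Flat ground, no human command -> normal walking
print(fuse_decisions(None, Environment()))                  # walk
# Low obstacle 0.4 m ahead, no command -> robot plans a step-over gait
print(fuse_decisions(None, Environment(0.10, 0.4)))         # step_over
# BMI decodes a leg-raising intention -> it overrides autonomy
print(fuse_decisions("raise_leg", Environment(0.10, 0.4)))  # raise_leg
```

In practice the fusion would operate on probabilistic intention estimates and continuous gait parameters rather than discrete labels, but the priority structure — human intention over autonomous environment-driven choice — mirrors the mechanism the abstract describes.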
Keywords/Search Tags: Lower-limb exoskeleton robot, multimodal fusion, brain-machine interface, environmental perception, human-machine fusion decision