
A Study On Hybrid Human-machine Interface And Application In Assisting Paralyzed Patients

Posted on: 2020-06-03
Degree: Doctor
Type: Dissertation
Country: China
Candidate: Q Y Huang
Full Text: PDF
GTID: 1364330620958613
Subject: Pattern Recognition and Intelligent Systems
Abstract/Summary:
Human-machine interfaces (HMIs) realize communication and control between humans and machines, for example by translating human intentions into machine commands or by presenting machine information and requests to the user. For patients with spinal cord injury (SCI), amyotrophic lateral sclerosis (ALS) or locked-in syndrome (LIS), conventional HMIs (e.g., mouse, keyboard, touch screen) are no longer suitable because of the dysphonia, paralysis and brain damage that accompany these conditions. It is therefore of great significance to develop novel HMIs that assist disabled patients in daily life and during rehabilitation. Multiple physiological signals can serve as HMI inputs, such as the electroencephalogram (EEG), the electrooculogram (EOG) and the electromyogram (EMG). These signals not only allow patients to complete communication and control tasks, but can also serve as biomarkers of the patients' condition during diagnosis. In this dissertation, we focus on using these physiological signals to develop novel non-manual HMIs, aiming to address current challenges in this field and to help disabled patients achieve effective communication and control.

First, we propose a novel EOG-based HMI for wheelchair control, with the aim of helping severely paralyzed individuals regain some mobility. The user selects a button on the graphical user interface (GUI) by blinking in sync with that button's flashes. The algorithm detects eye blinks from a single channel of vertical EOG data and identifies the target button from the synchronization between the detected blinks and the button's flash sequence. Each button corresponds to a wheelchair command, such as turning left or right, moving forward or backward, accelerating, decelerating and stopping. The experimental results show that the proposed HMI accurately provides a sufficient number of commands with a satisfactory response time and a short training period.
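The abstract does not give implementation details, so the following minimal Python sketch only illustrates the general blink-flash synchronization idea described above; the sampling rate, amplitude threshold, tolerance window and matching rule are assumptions for illustration, not values from the dissertation.

```python
import numpy as np

# Illustrative parameters; all values here are assumptions, not the
# dissertation's reported settings.
FS = 250                  # vertical EOG sampling rate (Hz), assumed
BLINK_THRESHOLD = 150.0   # blink amplitude threshold (uV), assumed
SYNC_TOLERANCE = 0.4      # max |blink time - flash time| in seconds, assumed
MIN_MATCHES = 3           # flashes that must be answered by a blink, assumed

def detect_blinks(veog_uv, fs=FS, threshold=BLINK_THRESHOLD):
    """Return blink onset times (s) as upward threshold crossings of the
    vertical EOG channel."""
    above = np.asarray(veog_uv) > threshold
    onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    return onsets / fs

def select_button(blink_times, flash_schedule, tol=SYNC_TOLERANCE,
                  min_matches=MIN_MATCHES):
    """flash_schedule maps each button name to an array of its flash onset
    times (s). The button whose flashes are best synchronized with the
    detected blinks is chosen; None means no reliable target."""
    blink_times = np.asarray(blink_times, dtype=float)
    best_button, best_score = None, 0
    for button, flashes in flash_schedule.items():
        # count flashes answered by a blink within the tolerance window
        score = sum(bool(np.any(np.abs(blink_times - t) < tol)) for t in flashes)
        if score > best_score:
            best_button, best_score = button, score
    return best_button if best_score >= min_matches else None
```

In a system of this kind, each selected button would then be mapped to a wheelchair command (turn, accelerate, stop, and so on).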
Most existing HMIs are designed to control a single assistive device, such as a wheelchair, a robotic arm or a prosthetic limb. However, many daily tasks require combined functions that can only be realized by integrating multiple robotic devices. We therefore combine a wheelchair and an intelligent robotic arm into an integrated system and propose a hybrid HMI to control it. The hybrid HMI is based on the EOG, computer vision and automatic path planning. The system helps patients with SCI accomplish a mobile self-drinking task: moving from a random position to a table, grasping a target bottle on the table and drinking water through a straw. The required control precision is much higher than in the single-wheelchair control task. The experimental results show that the proposed HMI provides sufficient precision to control the integrated system, helps patients with SCI accomplish the self-drinking task and keeps the workload at an acceptable level.

Moreover, we develop a hybrid brain-computer interface (BCI) based on EEG and EOG signals to improve the adaptivity and robustness of the BCI system. The EEG signal is highly time-varying and is sensitive to brain state and electrode positions, whereas the EOG signal usually has a higher signal-to-noise ratio and is more robust and stable. The main idea is to use EOG commands to update some parameters of the EEG model and thereby extend the model's validity. We apply the proposed HMI to control the wheelchair-robotic arm system: the user turns the wheelchair left or right by performing left/right hand motor imagery (MI), and generates the other wheelchair and robotic arm commands by blinking or raising the eyebrows. The experimental results demonstrate that the proposed hybrid BCI provides satisfactory control accuracy for a system consisting of multiple robotic devices and show the potential of BCI-controlled systems in complex daily tasks.

Last but not least, a novel HMI based on limited finger movement is proposed to help a locked-in patient communicate with and control the outside world. The limited finger movement is first detected by a finger stall covered with AgCl and then translated into a switch signal; the finger stall is designed to reduce interference caused by body shaking. The patient can use the proposed HMI to control a TV, answer simple questions, express his needs and input letters. The experimental results show that the proposed HMI helps the patient improve finger mobility and enthusiasm for rehabilitation, and it shows good potential in clinical evaluation.
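Again purely for illustration, the sketch below shows one way a raw finger-stall contact signal could be turned into debounced switch events that reject brief, shake-induced contacts; the sampling rate, hold time and the yes/no convention in the comment are assumptions, not details taken from the dissertation.

```python
import numpy as np

# Illustrative debouncing of the finger-stall contact signal into switch events.
FS = 100           # sampling rate of the contact signal (Hz), assumed
MIN_HOLD_S = 0.3   # contact must persist this long to count; rejects brief shakes

def switch_events(contact, fs=FS, min_hold_s=MIN_HOLD_S):
    """contact: boolean array, True while the finger stall closes the circuit.
    Returns onset times (s) of presses that last at least min_hold_s."""
    min_hold = int(min_hold_s * fs)
    events, run_start = [], None
    for i, closed in enumerate(np.asarray(contact, dtype=bool)):
        if closed and run_start is None:
            run_start = i                       # a contact run begins
        elif not closed and run_start is not None:
            if i - run_start >= min_hold:       # long enough to be intentional
                events.append(run_start / fs)
            run_start = None
    if run_start is not None and len(contact) - run_start >= min_hold:
        events.append(run_start / fs)           # run still open at end of buffer
    return events

# Example convention (assumed): one reliable press inside a question's answer
# window is interpreted as "yes", no press as "no"; repeated presses could step
# through TV channels or a scanning letter board.
```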
Keywords/Search Tags: human-machine interface (HMI), brain-computer interface (BCI), multi-modality, assistance for disabled people