
Research On The Design Method Of User Experience Of Intelligent Vehicle Products Under Multi-Modal Interaction Context

Posted on: 2024-01-31
Degree: Doctor
Type: Dissertation
Country: China
Candidate: T Wei
Full Text: PDF
GTID: 1522306917969759
Subject: Design
Abstract/Summary:
In the field of automobile manufacturing, in-vehicle human-machine interaction systems are developing toward digitization, intelligence, and networking, and emotional experience has become a focus of future human-machine interaction research. Identifying the driver's emotional state during driving and providing natural driving assistance is considered an important direction for the development of intelligent automobiles. Negative emotions impair the driver's cognition, increase cognitive load, and thereby degrade the user experience. Effective identification and regulation of driving emotions is therefore a focus of the in-vehicle human-machine interaction experience, directly affecting the naturalness of the interaction process, driver safety, driving performance, and overall user satisfaction.

This study approaches human-machine interaction inside intelligent vehicles from the perspective of emotional experience, with a specific focus on driver emotion recognition and regulation. The emotional experience model of intelligent on-board products and the multimodal interaction mechanism are studied. The driver's physiological indicators are collected with wearable multi-channel physiological recorders to realize emotion recognition based on physiological signals. Physiological signals are primarily governed by the autonomic nervous system and the endocrine system and are not significantly influenced by the user's subjective intent; this makes them a more objective and accurate reflection of the user's emotional state and supports a more comprehensive and reliable evaluation. The study analyzes the impact of anger on the driver's cognitive load during the primary driving task and uses the VACP model to predict cognitive resource allocation during driving. A multimodal interaction balance model is proposed to alleviate negative driving emotions and enhance the driver's user experience. The effectiveness of different multimodal interaction combinations is analyzed through physiological experiments under various contexts to determine their emotion-regulation effectiveness, and the priority of each interaction scheme is evaluated. To validate the proposed methodology, a deep learning-based multimodal interaction emotion regulation system and an emotional experience design strategy for intelligent on-board products are constructed. This study has significant implications for finding new breakthroughs in the application of intelligent on-board products in driving scenarios and for improving the industrial design level and market competitiveness of related products.

The primary contributions of this thesis include:

(1) An emotional experience model and a multimodal interaction mechanism for intelligent in-vehicle products were constructed to explain the cognitive and emotional regulation mechanisms of the human-machine interaction process, effectively exploring the essence of multimodal human-machine interaction in intelligent vehicles. The impact of driving emotional states on the driving experience was elucidated through literature analysis, and evaluation indicators and methods for quantifying emotions were summarized. Factors that induce negative driving emotions and their effects on driving behavior were verified through empirical analysis. Based on theoretical analysis and design research, a design index system for the emotional experience of multimodal interaction in intelligent in-vehicle products and a multimodal interaction emotional experience model for such products were constructed.

(2) A relationship model between the driver's emotions and physiological signal indicators was established to explore the correlation between driving emotions and electrocardiogram (ECG), electrodermal activity (EDA), surface electromyography (sEMG), respiration (RESP), and photoplethysmography (PPG) signals, providing a quantification approach for intelligent, real-time recognition of driving anger. By reviewing the basic theory, emotion induction methods, and emotion recognition algorithms, a physiological-signal-driven driving emotion recognition experiment was constructed, yielding a feature extraction method for physiological signals and an emotion recognition algorithm model. The correlation between driving emotions and physiological indicators was explored, and the key indicators influencing driving emotions and the recognition algorithms were determined.

(3) A multimodal interaction balance model for anger regulation was developed to investigate the relationship among driving situations, anger, and cognitive load, and to develop a naturalistic multimodal interaction method for anger regulation while driving. A simulated driving experiment was built to analyze cognitive load in different driving situations and under angry driving emotions, in combination with the NASA-TLX scale. The VACP scale was used to predict cognitive load under anger, determine the relationship between anger and cognitive load, and analyze the cognitive resources occupied when drivers are angry; on this basis, a multimodal interaction balance model for regulating angry driving emotions was proposed, which can effectively alleviate anger through modal compensation.

(4) A simulated driving experiment on multimodal interaction was organized to study the regulation of angry driving emotions, proposing a more naturalistic human-machine interaction method for anger regulation. Three modalities were studied: visual, auditory, and olfactory, verifying the effectiveness of different multimodal interaction schemes in regulating angry driving emotions. Using statistical tools such as SPSS, the differences between multimodal interaction schemes before and after anger regulation were analyzed; the results show that multimodal interaction methods are effective in regulating driving anger, with olfactory interaction showing significant advantages.

(5) A deep learning-based multimodal interaction emotion regulation system was designed and the application concept for intelligent in-vehicle products was validated. An algorithm framework for deep learning-based multimodal interaction emotion regulation was proposed, an emotional experience design strategy for intelligent in-vehicle products was constructed, a development process for the emotional experience design of such products was proposed, and the emotional experience design of an intelligent in-vehicle product was used as an example to verify the application concept.

Overall, this research systematically proposes a multimodal interaction method model and an emotion regulation algorithm framework for driving emotion regulation, based on timely detection and recognition of driver emotions, with the aim of improving the driver's experience of pleasure. It also provides theoretical reference and methodological guidance for user driving experience design in multimodal interaction contexts and for designing effective multimodal interfaces for intelligent vehicle cockpits.
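The abstract does not reproduce the thesis's actual feature extraction or recognition algorithm, but the described pipeline (physiological indicators such as ECG and EDA fed into an emotion recognition model) can be illustrated with a minimal sketch. The feature choices (mean heart rate, SDNN heart-rate variability, mean skin conductance), the synthetic numbers, and the nearest-centroid classifier below are all illustrative assumptions, not the dissertation's method:

```python
# Illustrative sketch only: a toy stand-in for the thesis's physiological
# emotion recognition pipeline. Features and classifier are assumptions.
import numpy as np

def extract_features(rr_intervals_ms, eda_microsiemens):
    """Feature vector: mean heart rate (bpm), SDNN (ms), mean EDA (uS)."""
    rr = np.asarray(rr_intervals_ms, dtype=float)
    mean_hr = 60000.0 / rr.mean()   # beats per minute from R-R intervals
    sdnn = rr.std(ddof=1)           # HRV: sample std of R-R intervals
    mean_eda = float(np.mean(eda_microsiemens))
    return np.array([mean_hr, sdnn, mean_eda])

class NearestCentroidEmotion:
    """Toy classifier: one feature centroid per emotion label."""
    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        self.centroids_ = {
            c: np.mean([x for x, t in zip(X, y) if t == c], axis=0)
            for c in self.labels_
        }
        return self

    def predict(self, x):
        return min(self.labels_,
                   key=lambda c: np.linalg.norm(x - self.centroids_[c]))

# Synthetic training samples: anger tends toward higher HR and EDA.
calm = extract_features([850, 870, 860, 880], eda_microsiemens=[2.0, 2.1])
anger = extract_features([600, 580, 610, 590], eda_microsiemens=[6.5, 7.0])
model = NearestCentroidEmotion().fit([calm, anger], ["calm", "anger"])

probe = extract_features([620, 605, 615, 598], eda_microsiemens=[6.0, 6.4])
print(model.predict(probe))  # prints "anger"
```

In a real system along the lines the thesis describes, these hand-picked centroids would be replaced by a model trained on labeled physiological recordings, and the feature set would span all five signal channels (ECG, EDA, sEMG, RESP, PPG).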
Keywords/Search Tags: User emotional experience, multimodal interaction, intelligent vehicle products, model building