With the development of computer hardware and artificial intelligence, various AI edge chips have been applied to mobile terminals, and the offline machine learning and deep learning capabilities of mobile devices are increasing day by day, as exemplified by Apple's A-series chips and the Core ML machine learning framework. This combination of software and hardware realizes their integrated, coordinated development and brings many breakthrough functions to mobile applications. As an excellent machine learning framework for mobile devices, Core ML uses hardware accelerators to run machine learning models seamlessly on the CPU, GPU, and Neural Engine, greatly facilitating the realization of AI applications on mobile devices. The natural language processing, computer vision, speech recognition, and sound analysis capabilities included in Core ML provide new ideas for intelligent design. In recent years, the concept of natural interaction design has emerged: intelligent, humanized natural human-computer interaction is more easily accepted by users, and the use of multimodal interaction methods such as gesture recognition, eye tracking, facial motion tracking, and speech recognition to interact with machines is a hotspot in interaction design research.

This paper explores, from the perspective of interaction design, the possibility of using the Core ML framework in the interaction design of mobile applications, employing Core ML's computer vision, speech recognition, natural language processing, and related technologies to achieve multimodal natural interaction between people and mobile smart devices. It analyzes the characteristics of natural interaction on mobile terminals, investigates the existing problems of mobile interaction, analyzes the feasibility of realizing natural interaction on mobile terminals based on Core ML, proposes natural interaction design methods and principles for the gesture, face, and voice modalities based on Core ML, and provides strategies for optimizing mobile application design from the perspective of intelligent design. The paper also verifies the feasibility of these design methods and principles through design practice, realizing multimodal, accessible natural interaction design based on Core ML speech recognition, natural language processing, eye tracking, facial motion tracking, gesture recognition, and other computer vision technologies, thereby broadening the research perspective for the intelligent design and development of mobile application products.