US Patent Application 18345631. HUMAN-MACHINE INTERACTION METHOD AND HUMAN-MACHINE INTERACTION APPARATUS simplified abstract

HUMAN-MACHINE INTERACTION METHOD AND HUMAN-MACHINE INTERACTION APPARATUS

Organization Name

HUAWEI TECHNOLOGIES CO., LTD.


Inventor(s)

Shuaihua Peng of Shanghai (CN)

Hao Wu of Shanghai (CN)

HUMAN-MACHINE INTERACTION METHOD AND HUMAN-MACHINE INTERACTION APPARATUS - A simplified explanation of the abstract

This abstract first appeared for US patent application 18345631, titled 'HUMAN-MACHINE INTERACTION METHOD AND HUMAN-MACHINE INTERACTION APPARATUS'.

Simplified Explanation

- The patent application describes an application that enables human-machine interaction through gesture actions and motion tracking.
- An optical sensor in the object device detects gesture action information from the user.
- A motion sensor in the mobile terminal detects the terminal's motion track information.
- When the detected gesture action information matches the motion track information, the application executes a corresponding control (see the sketch after this list).
- The goal is a simple, efficient way for users to interact with machines, improving the user experience by letting them control devices with natural gestures and motions.
- Combining optical and motion sensors enables accurate detection and matching of gesture actions and motion tracks.
- The invention can be applied to a variety of devices and systems that require human-machine interaction.
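
To make the matching step concrete, the sketch below shows one way the described flow could look in code: a gesture trajectory observed by the object device's optical sensor and a motion track reported by the mobile terminal's motion sensor are normalized and compared, and a control is executed only when they agree. This is an illustrative Python sketch under assumed details, not the patented implementation; the names (match_tracks, execute_first_control), the 2-D point format, and the distance threshold are all hypothetical.

import math
from typing import List, Tuple

Point = Tuple[float, float]  # one (x, y) sample of a 2-D trajectory

def _normalize(track: List[Point]) -> List[Point]:
    """Center a trajectory at its centroid and scale it to unit size so that
    tracks captured by different sensors can be compared directly."""
    cx = sum(p[0] for p in track) / len(track)
    cy = sum(p[1] for p in track) / len(track)
    centered = [(x - cx, y - cy) for x, y in track]
    scale = max(math.hypot(x, y) for x, y in centered) or 1.0
    return [(x / scale, y / scale) for x, y in centered]

def match_tracks(gesture: List[Point], motion: List[Point],
                 threshold: float = 0.2) -> bool:
    """Return True when the optical-sensor gesture and the terminal's motion
    track are similar enough: the mean point-wise distance after normalization
    falls below the (hypothetical) threshold. Assumes equal-length samples."""
    if len(gesture) != len(motion) or not gesture:
        return False
    g, m = _normalize(gesture), _normalize(motion)
    mean_dist = sum(math.hypot(gx - mx, gy - my)
                    for (gx, gy), (mx, my) in zip(g, m)) / len(g)
    return mean_dist < threshold

def execute_first_control() -> None:
    """Placeholder for the 'corresponding first control' named in the abstract."""
    print("Gesture matches terminal motion track: executing first control")

if __name__ == "__main__":
    # Simulated samples: both sensors observed roughly the same upward swipe.
    optical_gesture = [(0.0, 0.0), (0.1, 1.0), (0.2, 2.0), (0.3, 3.0)]
    terminal_motion = [(0.0, 0.1), (0.1, 1.1), (0.2, 2.1), (0.3, 3.1)]
    if match_tracks(optical_gesture, terminal_motion):
        execute_first_control()

A real system would likely need time alignment between the two sensor streams (for example, resampling or dynamic time warping) rather than the equal-length assumption used here; the sketch only illustrates the match-then-execute idea.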


Original Abstract Submitted

This application provides a human-machine interaction method and the like. In one aspect, gesture action information of the user is detected by using an optical sensor of the object device; motion track information of the mobile terminal is detected by using a motion sensor of the mobile terminal. When the gesture action information matches the terminal motion track information, corresponding first control is executed.