18202849. EXTENDED REALITY BASED DIGITAL ASSISTANT INTERACTIONS simplified abstract (APPLE INC.)
Organization Name
APPLE INC.
Inventor(s)
Lynn I. Streja of San Francisco CA (US)
Saurabh Adya of San Jose CA (US)
Keith P. Avery of Seattle WA (US)
Karan M. Daryanani of San Francisco CA (US)
Stephen O. Lemay of Palo Alto CA (US)
Myra C. Lukens of San Francisco CA (US)
Sreeneel K. Maddika of San Ramon CA (US)
Chaitanya Mannemala of San Ramon CA (US)
Aswath Manoharan of Tucson AZ (US)
Pedro Mari of Santa Cruz CA (US)
Jay Moon of San Francisco CA (US)
Abhishek Rawat of Fremont CA (US)
Garrett L. Weinberg of Santa Cruz CA (US)
EXTENDED REALITY BASED DIGITAL ASSISTANT INTERACTIONS - A simplified explanation of the abstract
This abstract first appeared for US patent application 18202849 titled 'EXTENDED REALITY BASED DIGITAL ASSISTANT INTERACTIONS'.
Simplified Explanation
The patent application describes a process for using extended reality (XR) technology to interact with a digital assistant. Here are the key points:
- The process detects the user's gaze at a first object that is displayed persistently in the user's current field of view of the XR environment.
- In response to that gaze, the first object expands into a list of objects, including a second object representing a digital assistant.
- When the user's gaze is detected on the digital assistant object, a first animation indicates that a digital assistant session has been initiated.
- After the user provides audio input, a second animation indicates that the digital assistant is actively listening (a minimal sketch of this flow appears after this list).
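Read as UI behavior, the claimed sequence is essentially a small gaze-driven state machine. The Swift sketch below illustrates that flow under stated assumptions: the type names (`AssistantUIState`, `GazeEvent`, `GazeAssistantController`), the object identifiers, and the dwell-time threshold are hypothetical and are not taken from the application or from any Apple API; real gaze events would come from the platform's eye-tracking facilities, and the `print` calls stand in for the animations.

```swift
import Foundation

// Hypothetical sketch only: these type names, object identifiers, and the
// dwell-time threshold are illustrative assumptions, not Apple's APIs or
// the application's actual implementation.

enum AssistantUIState {
    case collapsedIcon      // persistent first object in the field of view
    case expandedList       // list of objects, including the assistant object
    case sessionInitiated   // first animation: session initiated
    case activelyListening  // second animation: actively listening
}

struct GazeEvent {
    let targetID: String            // which displayed object the user is looking at
    let dwellTime: TimeInterval     // how long the gaze has rested on that object
}

final class GazeAssistantController {
    private(set) var state: AssistantUIState = .collapsedIcon
    private let dwellThreshold: TimeInterval = 0.5  // assumed minimum dwell before reacting

    // Advance the UI state machine in response to a gaze event.
    func handle(_ gaze: GazeEvent) {
        guard gaze.dwellTime >= dwellThreshold else { return }
        switch (state, gaze.targetID) {
        case (.collapsedIcon, "persistentIcon"):
            state = .expandedList           // expand the first object into a list
            print("Expanding icon into object list")
        case (.expandedList, "assistantObject"):
            state = .sessionInitiated       // first animation: session initiated
            print("Playing session-initiated animation")
        default:
            break
        }
    }

    // Advance the state once audio input begins arriving from the user.
    func handleAudioInputStarted() {
        guard state == .sessionInitiated else { return }
        state = .activelyListening          // second animation: actively listening
        print("Playing actively-listening animation")
    }
}

// Example walk-through of the interaction sequence described in the abstract.
let controller = GazeAssistantController()
controller.handle(GazeEvent(targetID: "persistentIcon", dwellTime: 0.8))   // expand
controller.handle(GazeEvent(targetID: "assistantObject", dwellTime: 0.7))  // initiate session
controller.handleAudioInputStarted()                                        // actively listening
```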
Potential applications of this technology:
- XR-based digital assistants could be used in various industries, such as healthcare, education, and entertainment.
- It could enhance user experiences in XR gaming by providing interactive and responsive digital assistant characters.
- XR-based digital assistants could be integrated into smart home systems, allowing users to control their devices through gaze interaction.
Problems solved by this technology:
- Traditional ways of invoking a digital assistant, such as button presses or touch input, are poorly suited to XR environments.
- This technology provides a more immersive and intuitive way for users to engage with digital assistants in XR.
Benefits of this technology:
- Users can seamlessly interact with digital assistants in XR environments without the need for physical input devices.
- The use of gaze detection and animations enhances the user experience and makes the interaction more engaging.
- XR-based digital assistants can provide personalized and context-aware assistance, improving efficiency and convenience for users.
Original Abstract Submitted
An example process includes: while displaying a portion of an extended reality (XR) environment representing a current field of view of a user: detecting a user gaze at a first object displayed in the XR environment, where the first object is persistent in the current field of view of the XR environment; in response to detecting the user gaze at the first object, expanding the first object into a list of objects including a second object representing a digital assistant; detecting a user gaze at the second object; in accordance with detecting the user gaze at the second object, displaying a first animation of the second object indicating that a digital assistant session is initiated; receiving a first audio input from the user; and displaying a second animation of the second object indicating that the digital assistant is actively listening to the user.
Classification Codes
- G06T19/00
- G06T13/40
- G10L15/22
- G06F3/01