Apple Inc. (20240192917). DETECTING NOTABLE OCCURRENCES ASSOCIATED WITH EVENTS simplified abstract


DETECTING NOTABLE OCCURRENCES ASSOCIATED WITH EVENTS

Organization Name

Apple Inc.

Inventor(s)

Gavin K. Duffy of Los Gatos, CA (US)

Raymond M. Macharia of San Francisco, CA (US)

Jessica J. Peck of Morgan Hill, CA (US)

Robert M. Schulman of Los Gatos, CA (US)

DETECTING NOTABLE OCCURRENCES ASSOCIATED WITH EVENTS - A simplified explanation of the abstract

This abstract first appeared for US patent application 20240192917, titled 'DETECTING NOTABLE OCCURRENCES ASSOCIATED WITH EVENTS'.

The abstract describes a process where a first user interface and a virtual affordance representing an event are displayed concurrently. When a predetermined type of occurrence related to the event is detected, the display state of the virtual affordance is modified. Subsequently, a speech input is received, and based on context information from the modified display state, it is determined if the speech input corresponds to the virtual affordance. If so, the first user interface is replaced with a display of the event.

  • The process involves displaying a user interface and a virtual affordance representing an event simultaneously.
  • Detection of a specific type of occurrence triggers a modification in the display state of the virtual affordance.
  • Speech input is received after the modification, and its relevance to the virtual affordance is determined using context information.
  • If the speech input corresponds to the virtual affordance, the user interface is replaced with a display of the event, as sketched in the code below.
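
The bullets above describe a simple state machine: an affordance tracks an event, a notable occurrence promotes it to a more prominent display state, and that new state later serves as context for interpreting speech. The Swift sketch below is a minimal, hypothetical rendering of that flow; every name in it (VirtualAffordance, DisplayState, handleOccurrence, and so on) is an assumption for illustration, not Apple's claimed implementation.

```swift
import Foundation

// Hypothetical sketch of the flow summarized above. All type and function
// names are illustrative assumptions, not the implementation claimed in the
// patent application.

enum DisplayState {
    case normal        // first display state
    case highlighted   // second display state, after a notable occurrence
}

struct VirtualAffordance {
    let eventID: String                      // the event the affordance represents
    var displayContent: String               // e.g. a live score or status update
    var displayState: DisplayState = .normal
}

struct Screen {
    var primaryRegion: String                // content shown in the primary region
    var affordance: VirtualAffordance
}

// A predetermined type of occurrence (e.g. a score change) is detected,
// so the affordance's content and display state are modified.
func handleOccurrence(_ update: String, on screen: inout Screen) {
    screen.affordance.displayContent = update
    screen.affordance.displayState = .highlighted
}

// A later speech input is matched against the affordance using context
// information derived from the modified (second) display state.
func speechCorresponds(to screen: Screen, speech: String) -> Bool {
    guard screen.affordance.displayState == .highlighted else { return false }
    let utterance = speech.lowercased()
    // Naive context check, for illustration only.
    return utterance.contains("that") || utterance.contains(screen.affordance.eventID)
}

// If the speech input corresponds, the primary region switches to the event.
func handleSpeech(_ speech: String, on screen: inout Screen) {
    if speechCorresponds(to: screen, speech: speech) {
        screen.primaryRegion = "Event view for \(screen.affordance.eventID)"
    }
}

// Example usage
var screen = Screen(
    primaryRegion: "Movie playback",
    affordance: VirtualAffordance(eventID: "game-1234", displayContent: "Pre-game")
)
handleOccurrence("Home team scores", on: &screen)
handleSpeech("Show me that game", on: &screen)
print(screen.primaryRegion)   // "Event view for game-1234"
```

The point of the sketch is the ordering: the display-state change happens before the speech input arrives, and the correspondence check consults that changed state rather than the speech alone.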

Potential Applications

This technology could be applied in interactive displays, smart devices, and augmented reality systems to enhance user interaction and provide real-time event updates.

Problems Solved

This process streamlines user interaction by dynamically updating displayed content based on detected occurrences and speech input, improving user experience and efficiency.

Benefits

  • Enhanced user engagement through interactive displays
  • Real-time event updates for improved decision-making
  • Streamlined user interface interactions

Commercial Applications

Interactive Display Systems for Real-Time Event Updates

This technology can be utilized in digital signage, smart home devices, and virtual assistants to provide users with up-to-date event information and seamless interaction experiences.

Prior Art

Further research can be conducted in the fields of human-computer interaction, augmented reality, and speech recognition technologies to explore similar approaches to dynamic content display based on user input and detected occurrences.

Frequently Updated Research

Stay informed about advancements in interactive display technologies, speech recognition systems, and event-driven user interfaces to enhance the capabilities and applications of this innovative process.

Questions about the Technology

How does this technology improve user interaction experiences?

This technology enhances user interactions by dynamically updating displayed content based on detected occurrences and speech input, providing real-time event updates and streamlining user interface interactions.

What are the potential commercial applications of this technology?

The technology can be applied in interactive displays, smart devices, and augmented reality systems for real-time event updates and improved user engagement.


Original Abstract Submitted

an example process includes concurrently displaying: a primary region displaying a first user interface; and a virtual affordance having a first display state and display content, where the display content represents an event and includes updates of the event; while concurrently displaying the primary region and the virtual affordance: detecting a predetermined type of occurrence associated with the event; in response to detecting the predetermined type of occurrence, modifying the first display state to a second display state; after modifying the first display state to the second display state, receiving a speech input; and determining, using context information determined based on the second display state, whether the speech input corresponds to the virtual affordance; and in accordance with a determination that the speech input corresponds to the virtual affordance, replacing, in the primary region, the display of the first user interface with a display of the event.
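
One way to read the abstract's clause about "context information determined based on the second display state" is as a disambiguation signal: an affordance that was just visually promoted becomes a likelier referent for otherwise ambiguous speech such as "show me that". The Swift sketch below illustrates that reading only; the scoring scheme and all names are assumptions, not the claimed method.

```swift
import Foundation

// Hypothetical disambiguation sketch: candidate targets for a speech input are
// scored, and an affordance in its modified (second) display state receives a
// context bonus for deictic speech such as "show me that". Illustrative only.

struct Candidate {
    let name: String
    let baseScore: Double        // e.g. from a speech/NLU model
    let recentlyModified: Bool   // true if the affordance is in its second display state
}

func score(_ candidate: Candidate, utterance: String) -> Double {
    // Assumption: deictic speech is more likely to refer to whatever just changed on screen.
    let deictic = utterance.lowercased().contains("that")
    let contextBonus = (deictic && candidate.recentlyModified) ? 0.3 : 0.0
    return candidate.baseScore + contextBonus
}

func resolve(_ utterance: String, among candidates: [Candidate]) -> Candidate? {
    candidates.max { score($0, utterance: utterance) < score($1, utterance: utterance) }
}

let candidates = [
    Candidate(name: "first user interface (movie)", baseScore: 0.5, recentlyModified: false),
    Candidate(name: "virtual affordance (live game)", baseScore: 0.4, recentlyModified: true)
]

if let target = resolve("show me that", among: candidates) {
    print("Speech resolved to: \(target.name)")   // virtual affordance (live game)
}
```

Under these assumed weights, the recently modified affordance outscores the primary user interface, which corresponds to the abstract's final step of replacing the first user interface with a display of the event.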