18293976. HEARING ATTENTIONAL STATE ESTIMATION APPARATUS, LEARNING APPARATUS, METHOD, AND PROGRAM THEREOF simplified abstract (NIPPON TELEGRAPH AND TELEPHONE CORPORATION)


HEARING ATTENTIONAL STATE ESTIMATION APPARATUS, LEARNING APPARATUS, METHOD, AND PROGRAM THEREOF

Organization Name

NIPPON TELEGRAPH AND TELEPHONE CORPORATION

Inventor(s)

Yuta Suzuki of Tokyo (JP)

Hsin-I Liao of Tokyo (JP)

Shigeto Furukawa of Tokyo (JP)

Yung-Hao Yang of Tokyo (JP)

HEARING ATTENTIONAL STATE ESTIMATION APPARATUS, LEARNING APPARATUS, METHOD, AND PROGRAM THEREOF - A simplified explanation of the abstract

This abstract first appeared for US patent application 18293976, titled 'HEARING ATTENTIONAL STATE ESTIMATION APPARATUS, LEARNING APPARATUS, METHOD, AND PROGRAM THEREOF'.

Simplified Explanation

The patent application describes estimating where a user is directing auditory attention by analyzing the correlation between visual stimulus patterns and changes in the user's pupil diameter.

Key Features and Innovation

  • Obtaining a feature quantity based on the strength of correlation between each visual stimulus pattern (one pattern per sound source) and the user's pupil diameter changes.
  • Estimating the destination of the user's auditory attention from that feature quantity.
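The two steps above can be sketched in code. This is a minimal illustration, not the patented method: it assumes each sound source is tagged with its own visual stimulus pattern (e.g., a distinct flicker), uses Pearson correlation as the "strength of correlation" feature quantity, and picks the source whose pattern correlates most strongly with the pupil diameter change. All function and variable names are hypothetical.

```python
import numpy as np

def estimate_attended_source(stimulus_patterns, pupil_change):
    """Return (index of estimated attended source, feature quantities).

    stimulus_patterns: list of 1-D arrays, one visual stimulus time
        series per sound source.
    pupil_change: 1-D array of pupil diameter change over the same
        time window.
    """
    features = []
    for pattern in stimulus_patterns:
        # Feature quantity: absolute Pearson correlation between this
        # source's visual stimulus pattern and the pupil diameter change.
        r = np.corrcoef(pattern, pupil_change)[0, 1]
        features.append(abs(r))
    # Estimated attention destination: source with the strongest coupling.
    return int(np.argmax(features)), features

# Toy demonstration with two sources flickering at different rates.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 500, endpoint=False)
patterns = [np.sin(2 * np.pi * 3 * t),   # source 0: 3 Hz flicker
            np.sin(2 * np.pi * 5 * t)]   # source 1: 5 Hz flicker

# Simulate a pupil response entrained to source 1, plus noise.
pupil = 0.8 * patterns[1] + 0.3 * rng.standard_normal(t.size)

attended, feats = estimate_attended_source(patterns, pupil)
print(attended)  # source 1 is estimated as the attention destination
```

In practice the pupil response lags the stimulus, so a real implementation would likely correlate against a delayed or filtered version of each pattern; the argmax decision could also be replaced by the learned model the application's "learning apparatus" refers to.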

Potential Applications

This technology could be used in:

  • Human-computer interaction systems
  • Virtual reality environments
  • Cognitive load monitoring

Problems Solved

  • Identifying user focus in audio-visual environments
  • Enhancing user experience in interactive systems

Benefits

  • Improved user engagement
  • Personalized audio-visual experiences
  • Enhanced cognitive load management

Commercial Applications

  • Interactive gaming systems
  • Educational platforms
  • Assistive technologies for individuals with attention deficits

Prior Art

Readers interested in prior art related to this technology may explore research on eye-tracking technology and audio-visual integration in human-computer interaction systems.

Frequently Updated Research

Stay updated on advancements in eye-tracking technology and audio-visual integration in interactive systems for potential new insights and applications.

Questions about Audio-Visual Attention

How does this technology impact user engagement in virtual reality environments?

This technology enhances user engagement by accurately determining where users are focusing their auditory attention, leading to more immersive experiences.

What are the implications of this technology for cognitive load monitoring in educational settings?

This technology can help educators understand how students allocate their attention during learning activities, allowing for tailored interventions and improved learning outcomes.


Original Abstract Submitted

A feature quantity based on the strength of a correlation between each of a plurality of different visual stimulus patterns corresponding to a plurality of different sound sources and a pupil diameter change amount of a user is obtained, and a destination to which the user pays auditory attention for a sound from the sound source is estimated using the feature quantity.