Contextual Assistant Using Mouse Pointing or Touch Cues: abstract simplified (18331643)

  • This abstract appeared for patent application number 18331643, titled 'Contextual Assistant Using Mouse Pointing or Touch Cues'.

Simplified Explanation

This abstract describes a method for a contextual assistant that uses mouse pointing or touch cues. The assistant receives audio data of a user's spoken query along with a user input indication marking a spatial input (a mouse point or a touch) at a location on the screen. It transcribes the audio, then performs query interpretation on the transcription to determine that the query refers to an object displayed on the screen without uniquely identifying it and that it requests information about that object. Using the spatial input location, the assistant disambiguates the query to uniquely identify the referenced object, obtains the requested information, and provides a response to the query.
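The flow summarized above can be illustrated with a small sketch. The snippet below is a minimal illustration written for this page, not code from the application: the names (SpatialInput, ScreenObject, answer_query) and the deictic-word heuristic used for query interpretation are assumptions, and the transcription step is stubbed out where a real assistant would call a speech recognizer.

from __future__ import annotations

import re
from dataclasses import dataclass


@dataclass
class SpatialInput:
    """The user input indication: a spatial input applied at a screen location."""
    x: int
    y: int


@dataclass
class ScreenObject:
    """An object displayed on the screen, with its bounding box."""
    object_id: str
    label: str
    left: int
    top: int
    right: int
    bottom: int

    def contains(self, point: SpatialInput) -> bool:
        return self.left <= point.x <= self.right and self.top <= point.y <= self.bottom


def transcribe(audio_data: bytes) -> str:
    """Stub for speech recognition; a real assistant would run an ASR model here."""
    return "what is this?"


def refers_to_unidentified_object(transcription: str) -> bool:
    """Toy query interpretation: treat deictic words ('this', 'that', 'it') as a sign
    that the query refers to an on-screen object without uniquely identifying it."""
    tokens = re.findall(r"[a-z']+", transcription.lower())
    return any(t in ("this", "that", "it") for t in tokens)


def disambiguate(point: SpatialInput, objects: list[ScreenObject]) -> ScreenObject | None:
    """Use the spatial input location to uniquely identify the referenced object."""
    for obj in objects:
        if obj.contains(point):
            return obj
    return None


def answer_query(audio_data: bytes, point: SpatialInput,
                 objects: list[ScreenObject]) -> str:
    transcription = transcribe(audio_data)
    if refers_to_unidentified_object(transcription):
        target = disambiguate(point, objects)
        if target is None:
            return "I couldn't tell which item you meant."
        # A real assistant would fetch the requested information about `target`
        # from a knowledge source; here the label stands in for that lookup.
        return f"You asked about: {target.label}"
    return f"Transcribed query: {transcription}"


if __name__ == "__main__":
    screen = [ScreenObject("img-1", "a photo of the Eiffel Tower", 100, 100, 400, 300)]
    print(answer_query(b"<audio bytes>", SpatialInput(x=250, y=200), screen))

Running the sketch prints a response about the object whose bounding box contains the click location, standing in for the information lookup described in the abstract.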


Original Abstract Submitted

A method for a contextual assistant to use mouse pointing or touch cues includes receiving audio data corresponding to a query spoken by a user, receiving, in a graphical user interface displayed on a screen, a user input indication indicating a spatial input applied at a first location on the screen, and processing the audio data to determine a transcription of the query. The method also includes performing query interpretation on the transcription to determine that the query is referring to an object displayed on the screen without uniquely identifying the object, and requesting information about the object. The method further includes disambiguating, using the user input indication indicating the spatial input applied at the first location on the screen, the query to uniquely identify the object that the query is referring to, obtaining the information about the object requested by the query, and providing a response to the query.
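As one illustration of the disambiguation step, a pointer or touch location may not fall exactly inside any object's bounds. The short sketch below shows one possible resolution strategy; the nearest-center fallback is an assumption made for illustration and is not specified in the abstract.

from __future__ import annotations

import math
from dataclasses import dataclass


@dataclass
class Box:
    """Bounding box of an object displayed on the screen."""
    object_id: str
    left: int
    top: int
    right: int
    bottom: int

    def center(self) -> tuple[float, float]:
        return ((self.left + self.right) / 2, (self.top + self.bottom) / 2)


def resolve_reference(x: int, y: int, boxes: list[Box]) -> Box | None:
    """Prefer an object whose box contains the input location; otherwise fall back
    to the object whose center is nearest, since touch input can be imprecise."""
    hits = [b for b in boxes if b.left <= x <= b.right and b.top <= y <= b.bottom]
    if hits:
        return hits[0]
    if not boxes:
        return None
    return min(boxes, key=lambda b: math.dist((x, y), b.center()))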