18073358. SMART HOME AUTOMATION USING MULTI-MODAL CONTEXTUAL INFORMATION simplified abstract (SAMSUNG ELECTRONICS CO., LTD.)

From WikiPatents

SMART HOME AUTOMATION USING MULTI-MODAL CONTEXTUAL INFORMATION

Organization Name

SAMSUNG ELECTRONICS CO., LTD.

Inventor(s)

Andres Leonardo de Jesus Ortega Pena of San Jose CA (US)

Ashwin Chandra of Santa Clara CA (US)

Suk-Un Yoon of Suwon-si (KR)

David Ho Suk Chung of Rancho Palos Verdes CA (US)

SMART HOME AUTOMATION USING MULTI-MODAL CONTEXTUAL INFORMATION - A simplified explanation of the abstract

This abstract first appeared for US patent application 18073358, titled 'SMART HOME AUTOMATION USING MULTI-MODAL CONTEXTUAL INFORMATION'.

Simplified Explanation:

The method uses a first electronic device equipped with a multi-task dynamic machine learning model to process inputs. Sub-models within the model are controlled dynamically based on the available inputs or the desired outputs, and contextual information associated with the device is generated and shared with another device in a specified environment.

  • The method utilizes a multi-task dynamic machine learning model with multiple sub-models for different machine learning functions.
  • Sub-models are dynamically controlled based on available inputs or desired outputs.
  • Contextual information associated with the first electronic device, including user-related data, is generated.
  • The contextual information can be shared with a second electronic device in a specified environment.
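The dynamic sub-model control described above can be sketched in code. The following is a minimal, hypothetical Python illustration, not the patent's actual implementation: each sub-model declares the inputs it requires and the outputs it produces, and a dispatcher activates only the sub-models whose required inputs are available and whose outputs were requested. All names (`make_model`, `run`, the example sub-models) are assumptions for illustration.

```python
# Hypothetical sketch of dynamic sub-model control: a registry of
# sub-models, each declaring required inputs and produced outputs,
# plus a dispatcher that runs only the sub-models whose inputs are
# available and whose outputs are desired.

def make_model(required_inputs, produced_outputs, fn):
    # Bundle a sub-model's input/output contract with its function.
    return {"inputs": set(required_inputs),
            "outputs": set(produced_outputs),
            "fn": fn}

SUB_MODELS = {
    # Example sub-models (hypothetical): presence detection and comfort control.
    "presence": make_model(
        ["motion"], ["occupancy"],
        lambda x: {"occupancy": x["motion"] > 0.5}),
    "comfort": make_model(
        ["temperature"], ["hvac_setting"],
        lambda x: {"hvac_setting": "cool" if x["temperature"] > 24 else "off"}),
}

def run(inputs, desired_outputs):
    """Run only sub-models whose inputs are available and outputs desired."""
    results = {}
    for name, model in SUB_MODELS.items():
        if model["inputs"] <= inputs.keys() and model["outputs"] & desired_outputs:
            results.update(model["fn"](inputs))
    return results
```

For example, `run({"motion": 0.9}, {"occupancy"})` activates only the presence sub-model, because the comfort sub-model's temperature input is unavailable.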

Key Features and Innovation:

  • Utilization of a multi-task dynamic machine learning model with multiple sub-models.
  • Dynamic control of sub-models based on inputs or outputs.
  • Generation of contextual information associated with the device.
  • Sharing of contextual information with another device in a specified environment.

Potential Applications:

The technology can be applied to smart home automation as well as adjacent fields such as personalized recommendations, predictive analytics, and user behavior analysis.

Problems Solved:

The method addresses the need for efficient and adaptive machine learning models that can dynamically adjust based on inputs and outputs.

Benefits:

  • Improved accuracy and efficiency in processing inputs.
  • Enhanced user experience through personalized recommendations.
  • Increased adaptability to changing data patterns.

Commercial Applications:

Potential commercial applications include personalized marketing, targeted advertising, and user-specific content recommendations in various industries such as e-commerce, social media, and entertainment.

Prior Art:

There may be prior art related to multi-task machine learning models and dynamic sub-model control in the field of artificial intelligence and machine learning.

Frequently Updated Research:

Research on dynamic machine learning models, contextual information processing, and user behavior analysis may be relevant to this technology.

Questions about the Technology:

Question 1: How does the method dynamically control sub-models based on inputs and outputs?

Question 2: What are the potential implications of sharing contextual information between electronic devices in a specified environment?


Original Abstract Submitted

A method includes obtaining one or more inputs at a first electronic device. The first electronic device includes a multi-task dynamic machine learning model that includes multiple sub-models configured to perform different machine learning functions. The method also includes dynamically controlling the sub-models used to process the one or more inputs based on at least one of: (i) the one or more inputs that are available for use or (ii) one or more outputs to be generated. The method further includes generating a set of contextual information associated with the first electronic device. At least a portion of the set of contextual information is associated with at least one user of the first electronic device. The method may additionally include sharing the set of contextual information with at least a second electronic device in a specified environment.
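The last two steps of the abstract, generating a set of contextual information (part of it user-related) and sharing it with a second electronic device in a specified environment, can be sketched as follows. This is a hedged illustration only; the function names, data fields, and message format are assumptions, not details from the patent.

```python
import json

def build_context(device_id, user_id, readings):
    """Bundle device-level and user-related contextual information.

    `readings` is a dict of sensor/model outputs; the 'activity' key
    (if present) is treated as user-related context.
    """
    return {
        "device": device_id,
        "user": {"id": user_id,
                 "activity": readings.get("activity", "unknown")},
        "environment": {k: v for k, v in readings.items() if k != "activity"},
    }

def share_context(context, environment_devices):
    """Serialize the context and address it to peer devices in the environment."""
    payload = json.dumps(context)
    return {peer: payload for peer in environment_devices}
```

A first device might call `build_context("hub-1", "user-a", {"activity": "cooking", "temperature": 22.5})` and pass the result to `share_context` with the list of peer device IDs in its environment.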