International Business Machines Corporation (20240185057). HYBRID ANALOG SYSTEM FOR TRANSFER LEARNING simplified abstract


HYBRID ANALOG SYSTEM FOR TRANSFER LEARNING

Organization Name

International Business Machines Corporation

Inventor(s)

Takashi Ando of Eastchester, NY (US)

Martin Michael Frank of Dobbs Ferry, NY (US)

Timothy Mathew Philip of Albany, NY (US)

Vijay Narayanan of New York, NY (US)

HYBRID ANALOG SYSTEM FOR TRANSFER LEARNING - A simplified explanation of the abstract

This abstract first appeared for US patent application 20240185057 titled 'HYBRID ANALOG SYSTEM FOR TRANSFER LEARNING'.

Simplified Explanation

The abstract describes systems, methods, and semiconductor devices for transfer learning, in which a semiconductor device includes two non-volatile memories that store the weights of different sets of layers of a machine learning model (a rough software analogy of this split is sketched after the list below).

  • A semiconductor device includes a first non-volatile memory (NVM) and a second NVM.
  • The first NVM stores the weights of a first set of layers of the machine learning model; these weights are fixed.
  • The second NVM stores the weights of a second set of layers of the model; these weights are adjustable.
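
The patent claims a hardware device, but the fixed/adjustable split mirrors the familiar transfer-learning practice of freezing a pretrained backbone and fine-tuning a small head. The PyTorch sketch below is only an illustrative software analogy, not anything described in the application; the layer sizes and the names first_set and second_set are hypothetical stand-ins for whatever weights the first and second NVM would hold.

  import torch
  import torch.nn as nn

  class TwoSetModel(nn.Module):
      """Hypothetical model split into a 'first set' of layers (weights fixed,
      analogous to the first NVM) and a 'second set' of layers (weights
      adjustable, analogous to the second NVM)."""

      def __init__(self):
          super().__init__()
          # First set: pretrained feature-extraction layers whose weights stay fixed.
          self.first_set = nn.Sequential(
              nn.Linear(784, 256), nn.ReLU(),
              nn.Linear(256, 128), nn.ReLU(),
          )
          # Second set: task-specific layers whose weights remain adjustable.
          self.second_set = nn.Sequential(nn.Linear(128, 10))

      def forward(self, x):
          return self.second_set(self.first_set(x))

  model = TwoSetModel()

  # Freeze the first set (the software analogue of fixed weights in the first NVM).
  for p in model.first_set.parameters():
      p.requires_grad = False

  # Only the second set's parameters go to the optimizer, so training updates
  # touch just the adjustable weights (the second NVM's analogue).
  optimizer = torch.optim.SGD(model.second_set.parameters(), lr=1e-2)

During fine-tuning, gradients and weight updates are computed only for second_set, the software counterpart of leaving the first NVM's contents untouched while rewriting the second NVM.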

Potential Applications

This technology could be applied in:

  • Autonomous vehicles for real-time decision-making.
  • Medical diagnosis for accurate predictions.

Problems Solved

This technology helps in:

  • Improving the efficiency of machine learning models.
  • Reducing the computational resources required to adapt a trained model, since only the adjustable second set of layers needs to be retrained (see the sketch after this list).
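
As a rough, purely illustrative calculation (reusing the hypothetical layer sizes from the sketch above, since the abstract specifies no architecture), counting parameters shows why updating only the adjustable set is cheap:

  import torch.nn as nn

  # Hypothetical layer sizes; the abstract does not specify any architecture.
  first_set = nn.Sequential(nn.Linear(784, 256), nn.ReLU(),
                            nn.Linear(256, 128), nn.ReLU())   # weights fixed
  second_set = nn.Sequential(nn.Linear(128, 10))              # weights adjustable

  fixed = sum(p.numel() for p in first_set.parameters())        # 233,856
  adjustable = sum(p.numel() for p in second_set.parameters())  # 1,290
  print(f"adjustable share: {adjustable / (fixed + adjustable):.1%}")  # ~0.5%

Gradients and optimizer state are needed only for the adjustable fraction, which is where the saving in training resources would come from.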

Benefits

The benefits of this technology include:

  • Faster inference times for machine learning tasks.
  • Enhanced accuracy in predictions.

Potential Commercial Applications

The potential commercial applications of this technology include:

  • Smart home devices for personalized user experiences.
  • Financial services for fraud detection.

Possible Prior Art

One example of possible prior art is the use of transfer learning in machine learning models to improve performance and reduce training time.

What are the specific machine learning models used in this technology?

The specific machine learning models used in this technology are not mentioned in the abstract.

How does the adjustable second set of layers improve the performance of the machine learning model?

The abstract does not provide details on how the adjustable second set of layers improves the performance of the machine learning model.


Original Abstract Submitted

Systems, methods, and semiconductor devices for transfer learning are described. A semiconductor device can include a first non-volatile memory (NVM) and a second NVM. The first NVM can be configured to store weights of a first set of layers of a machine learning model. The weights of the first set of layers can be fixed. The second NVM can be configured to store weights of a second set of layers of the machine learning model. The weights of the second set of layers can be adjustable.