18351217. COMPUTER-READABLE RECORDING MEDIUM STORING MACHINE LEARNING PROGRAM, MACHINE LEARNING METHOD, AND INFORMATION PROCESSING APPARATUS simplified abstract (FUJITSU LIMITED)

From WikiPatents

COMPUTER-READABLE RECORDING MEDIUM STORING MACHINE LEARNING PROGRAM, MACHINE LEARNING METHOD, AND INFORMATION PROCESSING APPARATUS

Organization Name

FUJITSU LIMITED

Inventor(s)

Masayuki Hiromoto of Kawasaki (JP)

Akira Nakagawa of Sagamihara (JP)

COMPUTER-READABLE RECORDING MEDIUM STORING MACHINE LEARNING PROGRAM, MACHINE LEARNING METHOD, AND INFORMATION PROCESSING APPARATUS - A simplified explanation of the abstract

This abstract first appeared for US patent application 18351217 titled 'COMPUTER-READABLE RECORDING MEDIUM STORING MACHINE LEARNING PROGRAM, MACHINE LEARNING METHOD, AND INFORMATION PROCESSING APPARATUS'.

Simplified Explanation

The patent application describes a machine learning program built on an encoder-decoder architecture: it computes a latent variable by adding sampled noise to the encoder's output, then generates output data by passing that latent variable through a decoder.

  • Calculating an average of latent variables by inputting input data to an encoder
  • Sampling noise from a probability distribution whose probability decreases toward the center of the distribution, starting from a predetermined position
  • Calculating the latent variable by adding the noise to the average
  • Calculating output data by inputting the latent variable to a decoder
  • Training the encoder and decoder using a loss function that includes encoding information and error between input and output data
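The distinctive step above is the noise sampling, where the density falls off toward the center of the distribution. One way to realize such a "hollow" distribution is rejection sampling from a standard normal; the sketch below is only an illustration of that idea, and the function name, the `inner` threshold, and the linear acceptance ramp are assumptions, not details from the patent.

```python
import numpy as np

def sample_hollow_noise(size, inner=0.5, rng=None):
    """Rejection-sample noise whose density decreases toward the center.

    Illustrative interpretation of the abstract: draw candidates from a
    standard normal and thin out those near zero, so the probability
    decreases as a sample approaches the center from the predetermined
    position `inner`.
    """
    rng = rng or np.random.default_rng()
    out = np.empty(size)
    filled = 0
    while filled < size:
        cand = rng.standard_normal(size)
        # Acceptance probability ramps from 0 at the center up to 1 at |x| >= inner.
        accept = rng.random(size) < np.minimum(np.abs(cand) / inner, 1.0)
        kept = cand[accept][: size - filled]
        out[filled : filled + kept.size] = kept
        filled += kept.size
    return out
```

Given an encoder output `mu` (the average of the latent variables), the latent variable in the steps above would then be `mu + sample_hollow_noise(mu.size)`.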

Potential Applications

This technology can be applied in various fields such as image and speech recognition, natural language processing, and anomaly detection.

Problems Solved

This technology addresses the challenge of generating meaningful output data from input data by effectively calculating latent variables and incorporating noise sampling.

Benefits

The benefits of this technology include improved accuracy in data generation, enhanced learning capabilities of the machine learning model, and increased efficiency in processing complex data sets.

Potential Commercial Applications

Potential commercial applications of this technology include advanced recommendation systems for e-commerce platforms, personalized content generation for media companies, and predictive analytics for financial institutions.

Possible Prior Art

Prior art in this field includes research on variational autoencoders, generative adversarial networks, and probabilistic graphical models for data generation tasks.

Unanswered Questions

How does this technology compare to existing noise sampling methods in machine learning models?

This article does not provide a direct comparison with existing noise sampling methods in machine learning models. It would be beneficial to understand the specific advantages and limitations of this approach compared to traditional techniques.

What are the computational requirements for implementing this technology in real-world applications?

The article does not delve into the computational resources needed to deploy this technology in practical settings. Understanding the computational demands can help assess the feasibility of integrating this innovation into existing systems.


Original Abstract Submitted

A non-transitory computer-readable recording medium stores a machine learning program causing a computer to execute a process including: calculating an average of latent variables by inputting input data to an encoder; sampling a noise, based on a probability distribution of the noise, in which a probability is decreased as the probability approaches to a center of the probability distribution from a predetermined position in the probability distribution; calculating the latent variable by adding the noise to the average; calculating output data by inputting the calculated latent variable to a decoder; and training the encoder and the decoder in accordance with a loss function, the loss function including encoding information and an error between the input data and the output data, the encoding information being information of a probability distribution of the calculated latent variable and a prior distribution of the latent variable.
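The loss function in the abstract combines two terms: the error between input and output data, and "encoding information" relating the posterior distribution of the latent variable to its prior. The sketch below illustrates that structure only; it assumes Gaussian posterior and prior so that the encoding term reduces to the closed-form Gaussian KL divergence, whereas the patent's noise distribution is non-Gaussian, and the function name and signature are invented for illustration.

```python
import numpy as np

def vae_style_loss(x, x_hat, mu, sigma=1.0):
    """Reconstruction error plus an 'encoding information' term.

    Assumption for illustration: the latent posterior is N(mu, sigma^2)
    and the prior is N(0, 1), so the encoding information is the
    closed-form KL divergence between them.
    """
    # Error between the input data and the decoder's output data.
    recon = np.mean((x - x_hat) ** 2)
    # KL( N(mu, sigma^2) || N(0, 1) ), averaged over latent dimensions.
    kl = 0.5 * np.mean(mu**2 + sigma**2 - 1.0 - 2.0 * np.log(sigma))
    return recon + kl
```

With a perfect reconstruction (`x_hat == x`) and a posterior equal to the prior (`mu = 0`, `sigma = 1`), both terms vanish and the loss is zero.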