18176159. REPRESENTATION LEARNING APPARATUS, METHOD, AND NON-TRANSITORY COMPUTER READABLE MEDIUM simplified abstract (KABUSHIKI KAISHA TOSHIBA)

REPRESENTATION LEARNING APPARATUS, METHOD, AND NON-TRANSITORY COMPUTER READABLE MEDIUM

Organization Name

KABUSHIKI KAISHA TOSHIBA

Inventor(s)

Kentaro Takagi of Yokohama Kanagawa (JP)

Toshiyuki Oshima of Tokyo (JP)

REPRESENTATION LEARNING APPARATUS, METHOD, AND NON-TRANSITORY COMPUTER READABLE MEDIUM - A simplified explanation of the abstract

This abstract first appeared for US patent application 18176159 titled 'REPRESENTATION LEARNING APPARATUS, METHOD, AND NON-TRANSITORY COMPUTER READABLE MEDIUM'.

Simplified Explanation

The patent application describes a representation learning apparatus that learns representations of target data while discounting non-interest features. Its processing, sketched in code after the list below, can be summarized as:

  • Calculate a latent vector Sx of the target data x in a latent space, using a first model parameter.
  • Calculate a non-interest latent vector Zx for the non-interest feature contained in the target data x, and a non-interest latent vector Zb for separate non-interest data, using a second model parameter.
  • Correct the similarity between Sx and its representative value S′x by the similarity between Zx and its representative value Z′x, and also compute the similarity between Zb and its representative value Z′b.
  • Update the first and/or the second model parameter based on a loss function that includes both similarities.
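
The following is a minimal sketch of this pipeline in PyTorch. The linear encoders, the use of batch means as the representative values S′x, Z′x, and Z′b, the choice of cosine similarity, and the subtractive correction and sign conventions in the loss are all illustrative assumptions; the abstract specifies none of these details.

```python
# Minimal sketch of the described apparatus (PyTorch). The encoder
# architectures, the batch-mean "representative values", cosine
# similarity, and the subtractive correction are all assumptions
# made for illustration; none are specified by the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RepresentationLearner(nn.Module):
    def __init__(self, in_dim: int, latent_dim: int):
        super().__init__()
        # "First model parameter": encoder for the target latent space.
        self.target_encoder = nn.Linear(in_dim, latent_dim)
        # "Second model parameter": encoder for the non-interest latent space.
        self.noninterest_encoder = nn.Linear(in_dim, latent_dim)

    def forward(self, x: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
        Sx = self.target_encoder(x)       # latent vector Sx of target data x
        Zx = self.noninterest_encoder(x)  # non-interest latent vector Zx
        Zb = self.noninterest_encoder(b)  # non-interest latent vector Zb

        # Representative values S'x, Z'x, Z'b: batch means (an assumption).
        Sx_rep = Sx.mean(dim=0, keepdim=True)
        Zx_rep = Zx.mean(dim=0, keepdim=True)
        Zb_rep = Zb.mean(dim=0, keepdim=True)

        # Per-sample similarities to the representative values.
        sim_Sx = F.cosine_similarity(Sx, Sx_rep, dim=-1)
        sim_Zx = F.cosine_similarity(Zx, Zx_rep, dim=-1)
        sim_Zb = F.cosine_similarity(Zb, Zb_rep, dim=-1)

        # S1: target-space similarity corrected by non-interest similarity.
        S1 = sim_Sx - sim_Zx
        # S2: similarity of the non-interest data to its representative value.
        S2 = sim_Zb

        # Loss combining S1 and S2 (sign conventions assumed): reward
        # corrected target similarity, penalize non-interest similarity.
        return -S1.mean() + S2.mean()

model = RepresentationLearner(in_dim=128, latent_dim=32)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(16, 128)  # a batch of target data
b = torch.randn(16, 128)  # a batch of non-interest data

loss = model(x, b)        # compute the corrected-similarity loss
optimizer.zero_grad()
loss.backward()
optimizer.step()          # updates both model parameters
```

Maximizing the corrected similarity S1 while minimizing S2 is one plausible reading of the loss; the actual loss function is defined in the patent claims, not in the abstract.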

Potential Applications

This technology could be applied in fields such as image recognition, natural language processing, and recommendation systems, where suppressing non-interest features can improve learned representations.

Problems Solved

This technology aims to improve the accuracy and efficiency of representation learning by explicitly accounting for non-interest features in the data, so that similarities driven by those features do not dominate the learned representation.

Benefits

The benefits of this technology include improved performance in downstream tasks such as classification, clustering, and feature extraction that build on the learned representations.

Potential Commercial Applications

One potential commercial application of this technology is the development of machine learning models for personalized recommendations on e-commerce platforms.

Possible Prior Art

Prior art on representation learning and similarity-correction techniques may exist in the machine learning and artificial intelligence literature.

Unanswered Questions

How does this technology compare to existing representation learning methods in terms of computational efficiency and accuracy?

This article does not provide a direct comparison with existing representation learning methods in terms of computational efficiency and accuracy. Further research and experimentation would be needed to address this question.

What are the potential limitations or drawbacks of implementing this representation learning apparatus in real-world applications?

The article does not discuss the potential limitations or drawbacks of implementing this representation learning apparatus in real-world applications. Additional studies and practical implementations would be required to identify and address any challenges that may arise.


Original Abstract Submitted

A representation learning apparatus executing: calculating a latent vector Sx in a latent space of the target data x using a first model parameter; calculating a non-interest latent vector Zx in a latent space of a non-interest feature included in the target data x and a non-interest latent vector Zb in the latent space of non-interest data using a second model parameter; calculating a similarity S1 obtained by correcting a similarity between the latent vector Sx and its representative value S′x by a similarity between the latent vector Zx and its representative value Z′x, and a similarity S2 between the latent vector Zb and its representative value Z′b; and updating the first and/or the second model parameter based on a loss function including the similarities S1 and S2.
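
Read as pseudocode, the similarity step of the abstract can be sketched as follows; cosine similarity and a subtractive correction are assumptions here, since the abstract names neither the similarity measure nor the correction operator.

```python
# Sketch of the abstract's similarity step for single vectors (NumPy).
# Cosine similarity and subtraction as the "correction" are assumptions;
# the abstract names neither the similarity measure nor the correction.
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def corrected_similarities(Sx, Sx_rep, Zx, Zx_rep, Zb, Zb_rep):
    # S1: similarity between Sx and S'x, corrected by the similarity
    # between Zx and Z'x.
    S1 = cosine(Sx, Sx_rep) - cosine(Zx, Zx_rep)
    # S2: similarity between Zb and its representative value Z'b.
    S2 = cosine(Zb, Zb_rep)
    return S1, S2  # both enter the loss that updates the model parameters
```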