Samsung Electronics Co., Ltd. (20240176986). MULTI-OBJECTIVE NEURAL ARCHITECTURE SEARCH FRAMEWORK simplified abstract


MULTI-OBJECTIVE NEURAL ARCHITECTURE SEARCH FRAMEWORK

Organization Name

Samsung Electronics Co., Ltd.

Inventor(s)

Mostafa El-khamy of San Diego, CA (US)

Minsu Cho of New York, NY (US)

Kee-Bong Song of San Diego, CA (US)

MULTI-OBJECTIVE NEURAL ARCHITECTURE SEARCH FRAMEWORK - A simplified explanation of the abstract

This abstract first appeared for US patent application 20240176986, titled 'MULTI-OBJECTIVE NEURAL ARCHITECTURE SEARCH FRAMEWORK'.

Simplified Explanation

The patent application describes a system and method for performing a neural architecture search that uses continuous relaxation of a discrete network search space to arrive at an optimized network architecture. The main steps, sketched in code after the list below, are:

  • Sampling the discrete network search space a first time
  • Determining a differential architecture network from a super-network by continuous relaxation over its operators
  • Calculating a reward based on the proxy accuracy or proxy complexity of that network
  • Updating the distribution of the discrete network search space based on the reward
  • Determining an updated differential architecture network based on the reward
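The following is a minimal sketch of what continuous relaxation over operators can look like, in the style of differentiable NAS methods such as DARTS; it is not the patent's actual implementation, and the class name MixedOp and the choice of candidate operators are illustrative assumptions.

```python
# Hypothetical sketch of continuous relaxation over candidate operators on one edge
# of a super-network; not the patent's actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """Softmax-weighted mixture of candidate operators on one super-network edge."""
    def __init__(self, channels):
        super().__init__()
        # The candidate operators form the discrete search space for this edge.
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
            nn.MaxPool2d(3, stride=1, padding=1),
            nn.Identity(),
        ])
        # Architecture parameters: a continuous relaxation of the discrete operator choice.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        # Softmax turns the discrete choice into a differentiable mixture of operators.
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

# Usage: a mixed edge behaves like an ordinary layer while the super-network is trained.
edge = MixedOp(channels=16)
out = edge(torch.randn(1, 16, 32, 32))
print(out.shape)  # torch.Size([1, 16, 32, 32])
```

After training, the highest-weighted operator on each edge can be taken as the discrete choice for the final architecture.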

Potential Applications

This technology can be applied in various fields such as computer vision, natural language processing, and speech recognition to optimize neural network architectures for improved performance.

Problems Solved

This technology addresses the challenge of manually designing neural network architectures by automating the process through a neural architecture search, leading to more efficient and effective models.

Benefits

The benefits of this technology include faster development of neural network models, improved accuracy, reduced complexity, and enhanced performance in various machine learning tasks.

Potential Commercial Applications

The technology can be utilized in industries such as healthcare, finance, autonomous vehicles, and e-commerce to develop customized and optimized neural network models for specific applications.

Possible Prior Art

Prior art in neural architecture search includes methods such as reinforcement learning-based approaches, evolutionary algorithms, and random search techniques used to automate the design of neural network architectures.

Unanswered Questions

How does this technology compare to existing neural architecture search methods?

This article does not provide a direct comparison with other neural architecture search methods, leaving the reader to wonder about the specific advantages and limitations of this approach.

What are the computational requirements of implementing this neural architecture search method?

The article does not delve into the computational resources needed to execute this neural architecture search, leaving a gap in understanding the practical implications of adopting this technology.


Original Abstract Submitted

A system and a method are disclosed for performing a neural architecture search. The method includes sampling a discrete network search space a first time, determining a differential architecture network sampled from a super-network using continuous relaxation of the discrete network search space over operators in the super-network, calculating a reward based on a proxy accuracy or a proxy complexity of the differential architecture network, updating a distribution of the discrete network search space based on the reward, and determining an updated differential architecture network based on the reward.
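As a rough illustration of the sample-evaluate-update loop the abstract describes, the sketch below samples discrete architectures from a distribution, scores them with a reward that trades off proxy accuracy against proxy complexity, and nudges the distribution toward higher-reward choices. The functions proxy_accuracy and proxy_complexity, the REINFORCE-style update, and all hyperparameters are assumptions for illustration, not details taken from the patent.

```python
# Hypothetical sketch of the sampling / reward / distribution-update loop;
# proxy_accuracy() and proxy_complexity() stand in for whatever cheap estimators are used.
import numpy as np

rng = np.random.default_rng(0)
num_ops = 4                               # operators per edge in the discrete search space
num_edges = 8                             # edges in the super-network
logits = np.zeros((num_edges, num_ops))   # distribution over the discrete search space

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def proxy_accuracy(arch):
    # Placeholder: in practice a short-training or zero-cost accuracy estimate.
    return rng.uniform(0.5, 1.0)

def proxy_complexity(arch):
    # Placeholder: e.g., normalized FLOPs or latency of the sampled architecture.
    return arch.mean() / (num_ops - 1)

for step in range(100):
    probs = softmax(logits)
    # Sample a discrete architecture: one operator index per edge.
    arch = np.array([rng.choice(num_ops, p=p) for p in probs])
    # Multi-objective reward: trade off proxy accuracy against proxy complexity.
    reward = proxy_accuracy(arch) - 0.5 * proxy_complexity(arch)
    # Update the sampling distribution toward choices that earned a higher reward.
    one_hot = np.eye(num_ops)[arch]
    logits += 0.1 * reward * (one_hot - probs)

print("most likely operator per edge:", softmax(logits).argmax(axis=1))
```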