17547122. PERFORMING AUTOMATED TUNING OF HYPERPARAMETERS IN A FEDERATED LEARNING ENVIRONMENT simplified abstract (INTERNATIONAL BUSINESS MACHINES CORPORATION)

From WikiPatents

Organization Name

INTERNATIONAL BUSINESS MACHINES CORPORATION

Inventor(s)

Yi Zhou of San Jose CA (US)

Parikshit Ram of Atlanta GA (US)

Nathalie Baracaldo Angel of San Jose CA (US)

Theodoros Salonidis of Wayne PA (US)

Horst Cornelius Samulowitz of Armonk NY (US)

Martin Wistuba of Dublin (IE)

Heiko H. Ludwig of San Francisco CA (US)

PERFORMING AUTOMATED TUNING OF HYPERPARAMETERS IN A FEDERATED LEARNING ENVIRONMENT - A simplified explanation of the abstract

This abstract first appeared for US patent application 17547122 titled 'PERFORMING AUTOMATED TUNING OF HYPERPARAMETERS IN A FEDERATED LEARNING ENVIRONMENT'.

Simplified Explanation

The abstract describes a computer-implemented method for hyperparameter optimization (HPO) in which a query is sent to multiple computing devices, HPO results are received from each device, a unified performance metric surface is generated using these results, and optimal global hyperparameters are determined based on this surface.

  • The method optimizes hyperparameters: configuration settings, such as the learning rate or model size, that are fixed before training rather than learned from the data.
  • Multiple computing devices perform the HPO process in parallel on their own local data, increasing efficiency and fitting the federated learning setting, in which raw data remains on each device.
  • HPO results from each device are combined to create a unified performance metric surface, which provides a comprehensive view of the model's performance across different hyperparameter settings.
  • The optimal global hyperparameters are determined based on the unified performance metric surface, allowing for the selection of the best configuration for the machine learning model.
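The four steps above can be sketched in a few lines. This is a minimal illustration, not the patented implementation: the function names (`local_hpo`, `federated_hpo`), the shared candidate grid, the simulated local objective, and the use of a simple per-candidate mean as the "unified performance metric surface" are all assumptions made for the example.

```python
import random

def local_hpo(device_seed, candidates):
    """Simulate one device answering the HPO query: evaluate each candidate
    hyperparameter setting on local data and report a performance metric."""
    rng = random.Random(device_seed)
    # Hypothetical local objective: accuracy peaks near lr = 0.1, with small
    # device-specific noise standing in for data heterogeneity.
    return {lr: 1.0 - (lr - 0.1) ** 2 + rng.uniform(-0.003, 0.003)
            for lr in candidates}

def federated_hpo(num_devices, candidates):
    # 1) Issue the HPO query to every device; 2) receive per-device results.
    results = [local_hpo(seed, candidates) for seed in range(num_devices)]
    # 3) Generate a unified performance metric surface
    #    (here: the mean metric per candidate across all devices).
    surface = {lr: sum(r[lr] for r in results) / num_devices
               for lr in candidates}
    # 4) Determine the optimal global hyperparameters from that surface.
    best = max(surface, key=surface.get)
    return best, surface

best_lr, surface = federated_hpo(num_devices=5, candidates=[0.01, 0.1, 0.5])
```

Because every device's noisy local optimum sits near lr = 0.1, the averaged surface also peaks there, so `best_lr` comes out as 0.1.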

Potential Applications

  • This technology can be applied in various fields that utilize machine learning, such as healthcare, finance, and autonomous vehicles.
  • It can be used to optimize the performance of machine learning models in tasks like image recognition, natural language processing, and predictive analytics.

Problems Solved

  • Hyperparameter optimization is a crucial step in machine learning model development, but it can be time-consuming and computationally expensive.
  • This method solves the problem of efficiently optimizing hyperparameters by distributing the HPO process across multiple computing devices.
  • It also addresses the challenge of selecting the best hyperparameter configuration by generating a unified performance metric surface.

Benefits

  • The use of multiple computing devices speeds up the HPO process, reducing the time required for hyperparameter optimization.
  • The unified performance metric surface provides a comprehensive evaluation of the model's performance, aiding in the selection of optimal hyperparameters.
  • By optimizing hyperparameters, the overall performance and accuracy of machine learning models can be significantly improved.


Original Abstract Submitted

A computer-implemented method according to one embodiment includes issuing a hyperparameter optimization (HPO) query to a plurality of computing devices; receiving HPO results from each of the plurality of computing devices; generating a unified performance metric surface utilizing the HPO results from each of the plurality of computing devices; and determining optimal global hyperparameters, utilizing the unified performance metric surface.
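One detail the abstract leaves open is how per-device results are combined into the unified surface. A common choice in federated settings, and purely an assumption here, is to weight each device's reported metric by its local dataset size, so that devices with more data influence the surface more:

```python
def unified_surface(device_results, device_weights):
    """Combine per-device HPO results into one performance surface.

    device_results: list of {hyperparameter_setting: metric} dicts.
    device_weights: e.g. local dataset sizes (an assumed weighting scheme;
    the abstract does not specify how results are aggregated).
    """
    total = sum(device_weights)
    settings = device_results[0].keys()
    return {s: sum(w * r[s] for r, w in zip(device_results, device_weights)) / total
            for s in settings}

results = [{0.01: 0.80, 0.10: 0.90},   # device A (small dataset)
           {0.01: 0.95, 0.10: 0.85}]   # device B (large dataset)
surface = unified_surface(results, device_weights=[100, 900])
best = max(surface, key=surface.get)
```

With these (made-up) numbers, device A alone would prefer 0.10, but the weighted surface favors 0.01 because the larger device B performs better there: this is exactly the kind of globally informed choice the unified surface enables.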