20240028936. DEVICE AND COMPUTER-IMPLEMENTED METHOD FOR MACHINE LEARNING simplified abstract (Robert Bosch GmbH)

DEVICE AND COMPUTER-IMPLEMENTED METHOD FOR MACHINE LEARNING

Organization Name

Robert Bosch GmbH

Inventor(s)

Christoph Zimmer of Stuttgart (DE)

Matthias Bitzer of Stuttgart (DE)

DEVICE AND COMPUTER-IMPLEMENTED METHOD FOR MACHINE LEARNING - A simplified explanation of the abstract

This abstract first appeared for US patent application 20240028936 titled 'DEVICE AND COMPUTER-IMPLEMENTED METHOD FOR MACHINE LEARNING'.

Simplified Explanation

The patent application describes a device and a computer-implemented method for machine learning that use a probabilistic model, such as a Gaussian process or a Bayesian neural network. The model is defined as a function of at least one hyperparameter. The steps of one cycle are summarized below; a minimal code sketch follows the list.

  • The model is defined as a function of at least one hyperparameter, in particular a hyperparameter of the Gaussian process or of the Bayesian neural network.
  • In one iteration, an instruction for a first measurement is determined and output based on the model.
  • An a posteriori distribution over values of the hyperparameter is determined based on the first measurement.
  • In another iteration, an instruction for a second measurement is determined and output based on the model.
  • At least one value of the hyperparameter is determined based on the second measurement.
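
A minimal sketch of one such cycle, assuming a scikit-learn Gaussian process whose RBF length scale plays the role of the hyperparameter. The `measure` function, the candidate grid, the flat prior over length scales, and the maximum-uncertainty selection rule are illustrative assumptions, not details taken from the patent text.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def measure(x):
    """Hypothetical stand-in for the real measurement/experiment."""
    return float(np.sin(3.0 * x) + 0.1 * rng.normal())

# Possible measurement locations and one seed observation.
candidates = np.linspace(0.0, 1.0, 101).reshape(-1, 1)
X, y = [candidates[0]], [measure(candidates[0, 0])]

# Probabilistic model: a GP whose kernel length scale is the hyperparameter.
# optimizer=None keeps the length scale fixed so it is handled explicitly below.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-3, optimizer=None)
gp.fit(np.array(X), np.array(y))

# Iteration 1: instruct a first measurement where the model is most uncertain.
_, std = gp.predict(candidates, return_std=True)
x1 = candidates[np.argmax(std)]
X.append(x1)
y.append(measure(x1[0]))

# A posteriori distribution over hyperparameter values (flat prior assumed),
# approximated on a grid via the GP marginal likelihood of the data so far.
gp.fit(np.array(X), np.array(y))
length_scales = np.logspace(-2, 0, 30)
log_post = np.array([gp.log_marginal_likelihood(np.log([ls])) for ls in length_scales])
posterior = np.exp(log_post - log_post.max())
posterior /= posterior.sum()

# Iteration 2: instruct a second measurement, again as a function of the model.
_, std = gp.predict(candidates, return_std=True)
x2 = candidates[np.argmax(std)]
X.append(x2)
y.append(measure(x2[0]))

# Finally, determine a value of the hyperparameter from data that includes the
# second measurement (here: the maximum of the approximate posterior).
gp.fit(np.array(X), np.array(y))
log_post = np.array([gp.log_marginal_likelihood(np.log([ls])) for ls in length_scales])
best_length_scale = length_scales[np.argmax(log_post)]
print("selected length scale:", best_length_scale)
```

In this sketch the posterior is approximated on a grid from the GP marginal likelihood; a Bayesian neural network or a different measurement-selection rule could be substituted without changing the overall flow.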

Potential applications of this technology:

  • Predictive modeling: The probabilistic model can be used to make predictions based on measurements and update the model accordingly.
  • Anomaly detection: The model can flag anomalies or outliers by comparing new measurements against its predictive distribution (see the sketch after this list).
  • Optimization: The model can be used to optimize certain parameters or processes based on the measurements.
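
For the anomaly-detection application, one plausible reading (not spelled out in the abstract) is to flag measurements that fall far outside the model's predictive interval. A minimal sketch, assuming a scikit-learn Gaussian process, hypothetical data, and a 3-standard-deviation threshold:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical past measurements of a smooth process.
X_train = np.array([[0.0], [0.25], [0.5], [0.75], [1.0]])
y_train = np.sin(3.0 * X_train).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), alpha=1e-3, optimizer=None)
gp.fit(X_train, y_train)

# New measurements: the second value is deliberately far off.
X_new = np.array([[0.4], [0.6]])
y_new = np.array([np.sin(1.2), 5.0])

mean, std = gp.predict(X_new, return_std=True)
is_anomaly = np.abs(y_new - mean) > 3.0 * std  # flag points outside 3 standard deviations
print(is_anomaly)                              # the clearly-off second point gets flagged
```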

Problems solved by this technology:

  • Uncertainty estimation: The probabilistic model allows the uncertainty associated with predictions or measurements to be estimated (a hedged sketch follows this list).
  • Hyperparameter tuning: The method provides a way to determine the optimal values for the hyperparameters of the model.
  • Data efficiency: The iterative approach allows for efficient use of data by updating the model based on new measurements.
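
The a posteriori distribution over hyperparameter values can also feed into the uncertainty estimate itself. The sketch below is illustrative and goes slightly beyond the abstract: it weights scikit-learn GP predictions over a grid of length scales by an approximate posterior (flat prior assumed), so the reported variance reflects hyperparameter uncertainty as well; the data and grid are hypothetical.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical measurements and query locations.
X = np.array([[0.0], [0.3], [0.7], [1.0]])
y = np.array([0.0, 0.8, 0.6, -0.1])
X_new = np.linspace(0.0, 1.0, 50).reshape(-1, 1)

length_scales = np.logspace(-2, 0, 20)
means, variances, log_evidence = [], [], []
for ls in length_scales:
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=ls), alpha=1e-3, optimizer=None)
    gp.fit(X, y)
    m, s = gp.predict(X_new, return_std=True)
    means.append(m)
    variances.append(s ** 2)
    log_evidence.append(gp.log_marginal_likelihood())

# Approximate hyperparameter posterior (flat prior assumed).
w = np.exp(np.array(log_evidence) - np.max(log_evidence))
w /= w.sum()

# Posterior-weighted predictive mean and variance (law of total variance).
means = np.array(means)
variances = np.array(variances)
mean = w @ means
var = w @ (variances + (means - mean) ** 2)
```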

Benefits of this technology:

  • Improved accuracy: The probabilistic model can provide more accurate predictions by considering uncertainty and updating the model based on measurements.
  • Flexibility: The method can be applied to different types of probabilistic models, allowing for flexibility in the choice of model.
  • Efficient use of data: The iterative approach allows for efficient use of data by updating the model based on new measurements, reducing the need for large amounts of data.


Original Abstract Submitted

A device and computer-implemented method for machine learning. A probabilistic model is provided, in particular a model that includes a probability distribution, preferably a Gaussian process or a Bayesian neural network, the model being defined as a function of at least one hyperparameter, in particular of the Gaussian process or of the Bayesian neural network. In one iteration, an instruction for a first measurement is determined and output as a function of the model. For the at least one hyperparameter an a posteriori distribution over values for the at least one hyperparameter being determined as a function of the first measurement. In another iteration, an instruction for a second measurement is determined and output as a function of the model. At least one value of the at least one hyperparameter is determined as a function of the second measurement.