18087425. ADAPTIVE BATCHING FOR OPTIMIZING EXECUTION OF MACHINE LEARNING TASKS simplified abstract (TOYOTA JIDOSHA KABUSHIKI KAISHA)

ADAPTIVE BATCHING FOR OPTIMIZING EXECUTION OF MACHINE LEARNING TASKS

Organization Name

TOYOTA JIDOSHA KABUSHIKI KAISHA

Inventor(s)

Chianing Wang of Mountain View, CA (US)

Ariana Joy Mann of Stanford, CA (US)

ADAPTIVE BATCHING FOR OPTIMIZING EXECUTION OF MACHINE LEARNING TASKS - A simplified explanation of the abstract

This abstract first appeared for US patent application 18087425 titled 'ADAPTIVE BATCHING FOR OPTIMIZING EXECUTION OF MACHINE LEARNING TASKS'.

Simplified Explanation: The patent application relates to improving the processing of machine learning tasks by adapting batch sizes and execution timing to optimize latency and energy consumption.

Key Features and Innovation:

  • The method evaluates a queue of machine learning tasks to determine when to execute a batch.
  • A cost of executing the batch at the current time is generated to inform that decision.
  • When the cost satisfies a batch threshold, a batching processor is controlled to execute the batch (see the sketch after this list).
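
The following is a minimal Python sketch of one way such a cost-versus-threshold decision could be structured. The `BatchingPolicy` class, its weights, and the cost formula are illustrative assumptions; the application does not disclose a concrete cost model.

```python
from dataclasses import dataclass

@dataclass
class BatchingPolicy:
    """Hypothetical cost model; the application does not disclose a concrete one."""
    max_batch: int = 8            # largest batch the accelerator accepts (assumed)
    batch_threshold: float = 0.0  # execute when the cost drops to or below this
    energy_weight: float = 1.0    # penalty for under-filled, energy-inefficient batches
    latency_weight: float = 0.5   # credit per second the oldest task has waited

    def cost_now(self, queue_len: int, oldest_wait_s: float) -> float:
        """Cost of executing the currently queued tasks as one batch right now."""
        fill = min(queue_len, self.max_batch) / self.max_batch
        # An under-filled batch wastes energy; a long-waiting task makes further
        # delay costlier, so it lowers the cost of executing immediately.
        return self.energy_weight * (1.0 - fill) - self.latency_weight * oldest_wait_s

    def should_execute(self, queue_len: int, oldest_wait_s: float) -> bool:
        if queue_len == 0:
            return False
        return self.cost_now(queue_len, oldest_wait_s) <= self.batch_threshold

policy = BatchingPolicy()
print(policy.should_execute(queue_len=6, oldest_wait_s=0.6))  # True: 3/4 full and 0.6 s of waiting
```

In this sketch the cost of executing immediately falls as the batch fills (better energy amortization per task) and as the oldest task waits longer (growing latency pressure), so execution triggers once either factor pushes the cost to the threshold.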

Potential Applications: This technology can be applied in industries such as healthcare, finance, e-commerce, and autonomous vehicles, wherever machine learning inference must balance response latency against energy consumption.

Problems Solved: This technology addresses the challenges of optimizing latency and energy consumption in machine learning tasks by selectively adapting batch sizes and execution timing.

Benefits:

  • Improved efficiency in processing machine learning tasks.
  • Reduced latency and energy consumption.
  • Enhanced throughput and responsiveness when serving machine learning models.

Commercial Applications: Optimizing machine learning tasks can benefit companies in various sectors by improving efficiency, reducing costs, and enhancing the performance of their machine learning models.

Prior Art: Readers can explore prior research on optimizing batch sizes and execution timing in machine learning tasks to understand the existing knowledge in this field.

Frequently Updated Research: Stay updated on the latest advancements in optimizing machine learning tasks by following research publications and conferences in the field of artificial intelligence and machine learning.

Questions about Machine Learning Optimization:

  1. What are the potential drawbacks of selectively adapting batch sizes in machine learning tasks?
  2. How does this technology compare to existing methods for optimizing latency and energy consumption in machine learning tasks?


Original Abstract Submitted

Systems, methods, and other embodiments described herein relate to improving the processing of machine learning (ML) tasks by selectively adapting batch sizes and execution timing to optimize latency and energy consumption. In one embodiment, a method includes receiving, in a queue, tasks for execution, the tasks being requests to execute a machine-learning model. The method includes evaluating a current state of the queue according to a batching model to determine when to execute a batch of the tasks by generating a cost of executing the batch at a current time. The method includes, responsive to determining that the cost satisfies a batch threshold, controlling a batching processor to execute the batch using the machine-learning model.
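
As an illustration of the overall flow described in the abstract (tasks received in a queue, the queue state evaluated, and a batching processor invoked when the trigger fires), here is a small self-contained Python sketch. The constants, the `run_model` stand-in, and the simplified trigger (batch full, or oldest task past a latency budget) are hypothetical and only approximate the cost-versus-threshold comparison described above.

```python
import time
from collections import deque

MAX_BATCH = 8       # hypothetical accelerator batch limit
MAX_WAIT_S = 0.1    # hypothetical latency budget for the oldest queued task

def run_model(batch):
    """Stand-in for the batching processor invoking the machine-learning model."""
    print(f"executing batch of {len(batch)} task(s)")

def serve(task_stream):
    """Receive tasks into a queue and dispatch a batch when the trigger fires."""
    queue = deque()
    for arrival_gap_s, payload in task_stream:
        time.sleep(arrival_gap_s)                  # simulate request arrival timing
        queue.append((time.monotonic(), payload))
        # Simplified stand-in for the cost check: a full batch, or an oldest
        # task past its latency budget, makes "execute now" the cheaper choice.
        oldest_wait = time.monotonic() - queue[0][0]
        if len(queue) >= MAX_BATCH or oldest_wait >= MAX_WAIT_S:
            batch = [queue.popleft()[1] for _ in range(min(len(queue), MAX_BATCH))]
            run_model(batch)
    if queue:                                      # flush the remainder at end of stream
        run_model([payload for _, payload in queue])

# Example: three requests with varying inter-arrival gaps (seconds).
serve([(0.0, "req-1"), (0.02, "req-2"), (0.2, "req-3")])
```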