17449871. Deployment and Management of Energy Efficient Deep Neural Network Models on Edge Inference Computing Devices simplified abstract (International Business Machines Corporation)

Deployment and Management of Energy Efficient Deep Neural Network Models on Edge Inference Computing Devices

Organization Name

International Business Machines Corporation

Inventor(s)

Rajesh Kumar Jeyapaul of BENGALURU (IN)

Sivapatham Muthaiah of BENGALURU (IN)

Deployment and Management of Energy Efficient Deep Neural Network Models on Edge Inference Computing Devices - A simplified explanation of the abstract

This abstract first appeared for US patent application 17449871 titled 'Deployment and Management of Energy Efficient Deep Neural Network Models on Edge Inference Computing Devices'.

Simplified Explanation

The patent application describes a method for deploying energy-efficient deep neural network models on edge devices. Here are the key points:

  • The method assigns an overall energy efficiency rating to the deep neural network model based on the software optimizations and hardware accelerators used during its training.
  • Energy scores are assigned to edge devices in an edge inference computing environment based on their properties.
  • The method then selects the edge devices whose energy scores fall within a defined range corresponding to the model's overall energy efficiency rating.
  • The deep neural network model is deployed to the selected edge devices (see the sketch after this list).
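
The sketch below is a minimal, hypothetical illustration of the four steps above in Python. The rating and scoring heuristics, the device properties, and the deploy_model stub are invented for the example; the patent application does not disclose concrete formulas or APIs.

  from dataclasses import dataclass
  from typing import List, Tuple

  @dataclass
  class ModelProfile:
      """Training-time metadata used to derive an efficiency rating (hypothetical)."""
      name: str
      uses_quantization: bool         # example software optimization
      uses_pruning: bool              # example software optimization
      trained_with_accelerator: bool  # e.g., a GPU/TPU/NPU was used during training

  @dataclass
  class EdgeDevice:
      """Edge device properties used to derive an energy score (hypothetical)."""
      device_id: str
      idle_power_watts: float
      peak_power_watts: float
      has_npu: bool

  def rate_model(model: ModelProfile) -> int:
      """Assign an overall energy efficiency rating from 1 (worst) to 5 (best).

      Illustrative heuristic only: each optimization or accelerator used
      during training raises the rating by one step.
      """
      rating = 2
      rating += int(model.uses_quantization)
      rating += int(model.uses_pruning)
      rating += int(model.trained_with_accelerator)
      return min(rating, 5)

  def score_device(device: EdgeDevice) -> float:
      """Assign an energy score from 0 to 100 based on device properties (illustrative)."""
      score = 100.0 - device.peak_power_watts - 2.0 * device.idle_power_watts
      if device.has_npu:
          score += 10.0
      return max(0.0, min(score, 100.0))

  def select_devices(devices: List[EdgeDevice],
                     score_range: Tuple[float, float]) -> List[EdgeDevice]:
      """Keep only devices whose energy score falls within the defined range."""
      low, high = score_range
      return [d for d in devices if low <= score_device(d) <= high]

  def deploy_model(model: ModelProfile, devices: List[EdgeDevice]) -> None:
      """Stand-in for the actual deployment step (e.g., pushing the model artifact)."""
      for device in devices:
          print(f"Deploying {model.name} to {device.device_id}")

A defined score range keyed to the model's rating would then drive selection, for example select_devices(fleet, (80.0, 100.0)) for a model rated 5; the sketch after the original abstract below shows one hypothetical way to map ratings to score ranges.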

Potential Applications

This technology has potential applications in various fields, including:

  • Internet of Things (IoT) devices
  • Edge computing systems
  • Mobile devices
  • Autonomous vehicles
  • Smart home devices

Problems Solved

The technology addresses the following problems:

  • Energy efficiency: Deploying energy-rated deep neural network models optimizes the energy consumption of edge devices.
  • Resource allocation: The method ensures that deep neural network models are deployed on edge devices with suitable energy scores, maximizing performance and efficiency.

Benefits

The technology offers several benefits:

  • Improved energy efficiency: Assigning energy scores and selecting suitable edge devices reduces energy consumption and extends battery life.
  • Enhanced performance: The method optimizes the deployment of deep neural network models, leading to faster and more efficient inference on edge devices.
  • Scalability: The approach can be applied to a wide range of edge devices, making it scalable for various applications and industries.


Original Abstract Submitted

Deploying energy-rated deep neural network models on energy-scored edge devices is provided. An overall energy efficiency rating is assigned to a deep neural network model based on utilizing software optimization and hardware accelerators during training of the deep neural network model. Energy scores are assigned to respective edge devices in an edge inference computing environment based on properties of each respective edge device. Particular edge devices are selected that have a corresponding energy score within a defined edge device energy score range for the overall energy efficiency rating that corresponds to the deep neural network model. The deep neural network model is deployed to the particular edge devices that have a corresponding energy score within the defined edge device energy score range for the overall energy efficiency rating that corresponds to the deep neural network model.
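
As a rough illustration of the abstract's "defined edge device energy score range", a rating-to-range lookup could drive the selection step. The tiers, ranges, and scores below are invented for the example and are not taken from the patent application.

  # Hypothetical mapping from overall efficiency ratings to device score ranges.
  RATING_TO_SCORE_RANGE = {
      5: (80.0, 100.0),
      4: (60.0, 80.0),
      3: (40.0, 60.0),
  }

  # Example fleet with precomputed energy scores (invented values).
  fleet = {"cam-01": 91.5, "gateway-07": 72.0, "sensor-12": 35.0}

  model_rating = 5  # e.g., produced by a rating step like rate_model() in the earlier sketch
  low, high = RATING_TO_SCORE_RANGE[model_rating]
  targets = [dev for dev, score in fleet.items() if low <= score <= high]
  print(targets)  # ['cam-01']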