20240028203. System and Method for Distributed Input/output (IO) Performance Forecasting Across Multiple Machine Learning Models simplified abstract (Dell Products L.P.)

From WikiPatents

System and Method for Distributed Input/output (IO) Performance Forecasting Across Multiple Machine Learning Models

Organization Name

Dell Products L.P.

Inventor(s)

Shaul Dar of Petach Tikva (IL)

Ramakanth Kanagovi of Bengaluru (IN)

Vamsi Vankamamidi of Hopkinton MA (US)

Guhesh Swaminathan of Tamil Nadu (IN)

Swati Smita Sitha of Bengaluru (IN)

System and Method for Distributed Input/output (IO) Performance Forecasting Across Multiple Machine Learning Models - A simplified explanation of the abstract

This abstract first appeared for US patent application 20240028203 titled 'System and Method for Distributed Input/output (IO) Performance Forecasting Across Multiple Machine Learning Models'.

Simplified Explanation

The patent application describes a method, computer program product, and computing system for processing input/output (IO) operations on the storage objects of a storage system. The storage objects are divided into groups based on the IO operations processed on them, and each group is associated with its own IO machine learning model. IO performance data for each storage object can then be forecast using its group-specific model.

  • The method involves processing IO operations on storage objects in a storage system.
  • The storage objects are divided into groups based on the IO operations processed on them.
  • Each group is associated with a specific IO machine learning model.
  • The machine learning models are used to forecast IO performance data for the storage objects.
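The four steps above can be sketched in Python. This is a minimal illustration under stated assumptions, not the patented implementation: the patent does not specify how objects are grouped or which ML models are used, so the read/write-ratio grouping rule and the naive trend "model" below are hypothetical stand-ins, and all object names and numbers are made up.

```python
from statistics import mean

# Hypothetical per-object IO stats plus a short IOPS history.
# All names and values are illustrative, not from the patent.
objects = {
    "vol-a": {"read_ratio": 0.90, "iops_history": [100, 110, 120, 130]},
    "vol-b": {"read_ratio": 0.85, "iops_history": [200, 210, 220, 230]},
    "vol-c": {"read_ratio": 0.10, "iops_history": [50, 48, 46, 44]},
}

def assign_group(stats):
    """Divide storage objects into groups based on the IO operations
    processed on them (here: a simple read/write-ratio threshold)."""
    return "read_heavy" if stats["read_ratio"] >= 0.5 else "write_heavy"

def fit_group_model(histories):
    """Stand-in for a group-specific ML model: the average per-interval
    IOPS change observed across all objects in the group."""
    deltas = [b - a for h in histories for a, b in zip(h, h[1:])]
    return mean(deltas)

# 1) Process IO operations -> 2) divide objects into groups.
groups = {}
for name, stats in objects.items():
    groups.setdefault(assign_group(stats), []).append(name)

# 3) Associate each group with its own (group-specific) model.
models = {g: fit_group_model([objects[n]["iops_history"] for n in members])
          for g, members in groups.items()}

# 4) Forecast IO performance per object using its group's model.
forecasts = {n: objects[n]["iops_history"][-1] + models[assign_group(objects[n])]
             for n in objects}
print(forecasts)  # next-interval IOPS estimate per object
```

The point of the per-group split is that one model trained on mixed workloads would blur the rising read-heavy trend with the declining write-heavy one; a group-specific model sees only workloads with similar IO behavior.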

Potential applications of this technology:

  • Storage system optimization: The use of machine learning models to forecast IO performance data can help optimize the storage system by identifying potential bottlenecks and improving overall performance.
  • Resource allocation: By understanding the IO performance of different storage object groups, resources can be allocated more efficiently to ensure optimal performance for critical operations.
  • Capacity planning: The forecasting of IO performance data can aid in capacity planning by predicting future storage needs based on historical IO patterns.
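As a hedged sketch of the resource-allocation and capacity-planning applications above: given per-object IOPS forecasts of the kind the method produces, a planner can flag objects forecast to outgrow their provisioned limits before a bottleneck appears. The forecast values and limits below are illustrative assumptions, not from the patent.

```python
# Hypothetical forecasts and provisioned IOPS limits per storage object.
forecasts = {"vol-a": 140, "vol-b": 240, "vol-c": 42}
provisioned_iops = {"vol-a": 150, "vol-b": 200, "vol-c": 100}

# Objects forecast to exceed their allocation are candidates for
# rebalancing or additional capacity ahead of time.
at_risk = [name for name, f in forecasts.items()
           if f > provisioned_iops[name]]
print(at_risk)  # ['vol-b']
```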

Problems solved by this technology:

  • Performance optimization: By using machine learning models to forecast IO performance, potential performance issues can be identified and addressed proactively, leading to improved system efficiency.
  • Resource allocation efficiency: The ability to allocate resources based on the specific IO characteristics of storage object groups allows for better resource utilization and improved overall system performance.
  • Capacity planning accuracy: The forecasting of IO performance data helps in accurately predicting future storage needs, avoiding potential capacity constraints and ensuring smooth operations.

Benefits of this technology:

  • Improved system performance: Forecasts from the group-specific models can guide IO optimization, improving overall system performance and responsiveness.
  • Efficient resource utilization: Allocating resources according to the IO characteristics of each storage object group improves utilization and reduces cost.
  • Accurate capacity planning: Forecast IO performance data helps predict future storage needs accurately, avoiding both overprovisioning and underprovisioning of storage resources.


Original Abstract Submitted

A method, computer program product, and computing system for processing a plurality of input/output (IO) operations on a plurality of storage objects of a storage system. The plurality of storage objects may be divided into a plurality of storage object groups based upon, at least in part, the plurality of IO operations processed on the plurality of storage objects. Each storage object group may be associated with an IO machine learning model selected from a plurality of IO machine learning models, thus defining a plurality of storage object group-specific IO machine learning models. IO performance data may be forecasted for the plurality of storage objects using the plurality of storage object group-specific IO machine learning models.