Intel Corporation (20240127031). GRAPH NEURAL NETWORK MODEL FOR NEURAL NETWORK SCHEDULING DECISIONS: Simplified Abstract

From WikiPatents

GRAPH NEURAL NETWORK MODEL FOR NEURAL NETWORK SCHEDULING DECISIONS

Organization Name

Intel Corporation

Inventor(s)

Hamza Yous of Dublin (IE)

Ian Hunter of Carnaross (IE)

Alessandro Palla of Pisa (IT)

GRAPH NEURAL NETWORK MODEL FOR NEURAL NETWORK SCHEDULING DECISIONS - A simplified explanation of the abstract

This abstract first appeared for US patent application 20240127031, titled 'GRAPH NEURAL NETWORK MODEL FOR NEURAL NETWORK SCHEDULING DECISIONS'.

Simplified Explanation

A graph neural network (GNN) model is used in the scheduling process for compiling a deep neural network (DNN). The DNN, together with the parameter options for scheduling it, is represented as a graph, and the GNN predicts a set of parameters that is expected to have a low cost. Using the GNN-based model, a compiler can produce a schedule for compiling the DNN in a relatively short and predictable amount of time, even for DNNs with many layers and/or many parameter options. The GNN-based model avoids the overhead of exhaustively exploring every parameter combination, yet, unlike prior heuristic-based approaches, it does not exclude combinations from consideration.

  • GNN model guides the scheduling process when compiling a DNN
  • The DNN and its scheduling parameter options are represented as a graph
  • The GNN predicts a parameter set with low expected cost
  • The compiler produces a schedule in a short, predictable amount of time
  • Avoids the overhead of exhaustively exploring all parameter combinations
  • Unlike heuristic-based approaches, does not exclude combinations from consideration
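The idea above can be sketched in miniature: treat each DNN layer as a graph node whose features encode one candidate scheduling choice, run a round of message passing, and read out a predicted cost per candidate. Everything here (the mean-aggregation update, the linear cost head, the feature values) is a hypothetical illustration, not the patent's actual model.

```python
# Hypothetical sketch of GNN-guided schedule selection. Node features,
# weights, and candidate names are invented for illustration only.

def message_passing(features, edges):
    """One round of mean-aggregation message passing over the layer graph."""
    n = len(features)
    neighbors = {i: [] for i in range(n)}
    for src, dst in edges:  # treat edges as undirected
        neighbors[src].append(dst)
        neighbors[dst].append(src)
    updated = []
    for i in range(n):
        msgs = [features[j] for j in neighbors[i]] or [features[i]]
        mean = [sum(vals) / len(msgs) for vals in zip(*msgs)]
        # Blend each node's own features with its aggregated neighbor messages.
        updated.append([0.5 * a + 0.5 * b for a, b in zip(features[i], mean)])
    return updated

def predict_cost(features, weights):
    """Linear readout: sum-pool node embeddings, then dot with weights."""
    pooled = [sum(vals) for vals in zip(*features)]
    return sum(w * v for w, v in zip(weights, pooled))

# A 3-layer DNN as a chain graph; each candidate assigns per-layer feature
# vectors encoding one scheduling choice (e.g., a tile size).
edges = [(0, 1), (1, 2)]
candidates = {
    "tile_4":  [[1.0, 0.2], [0.8, 0.1], [0.9, 0.3]],
    "tile_16": [[0.3, 0.9], [0.2, 0.7], [0.4, 0.8]],
}
weights = [1.0, 2.0]  # stand-in for trained readout weights

costs = {name: predict_cost(message_passing(feats, edges), weights)
         for name, feats in candidates.items()}
best = min(costs, key=costs.get)
print(best, costs)
```

The key point the sketch captures is that every candidate is scored (nothing is excluded up front), but scoring is a fixed-cost forward pass rather than an actual compile-and-measure of each combination.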

Potential Applications

The technology can be applied in various fields such as:

  • Machine learning
  • Artificial intelligence
  • Computer vision
  • Natural language processing

Problems Solved

The technology addresses the following issues:

  • Efficient scheduling of deep neural network compilation
  • Reduction of overhead in exploring parameter combinations
  • Predictable and quick compilation process

Benefits

The technology offers the following benefits:

  • Faster compilation of deep neural networks
  • Improved efficiency in scheduling processes
  • Cost-effective solutions for compiling DNNs

Potential Commercial Applications

The technology can be utilized in industries such as:

  • Software development
  • Data analytics
  • Cloud computing
  • Autonomous systems

Possible Prior Art

One possible example of prior art is the use of heuristic-based approaches for scheduling deep neural network compilation, which exclude some parameter combinations from consideration.

Unanswered Questions

How does the GNN model handle complex DNN structures with numerous layers and parameter options?

The GNN model is designed to efficiently predict parameters for scheduling the compilation of deep neural networks, even when dealing with complex structures. By representing the DNN and its parameter options as a graph, the GNN can effectively navigate through the various possibilities to find an optimal solution.
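A small illustrative calculation shows why exhaustive exploration becomes impractical for such structures: with a fixed number of scheduling options per layer, the number of candidate schedules grows exponentially with depth. The numbers below are hypothetical, chosen only to show the growth.

```python
# Illustrative only: exhaustive search must evaluate every combination of
# per-layer scheduling options, a count that grows exponentially with depth.
options_per_layer = 8  # hypothetical number of scheduling choices per layer

for layers in (5, 20, 50):
    combinations = options_per_layer ** layers
    print(f"{layers} layers -> {combinations:,} candidate schedules")
```

This exponential blow-up is the overhead the GNN-based predictor is said to avoid, since it scores candidates without compiling and measuring each one.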

What are the limitations of the GNN-based model in terms of scalability and adaptability to different types of DNNs?

While the GNN-based model shows promise in reducing overhead and improving efficiency in compiling DNNs, there may be limitations in its scalability and adaptability to diverse types of deep neural networks. Further research and development may be needed to address these potential challenges.


Original Abstract Submitted

A graph neural network (GNN) model is used in a scheduling process for compiling a deep neural network (DNN). The DNN, and parameter options for scheduling the DNN, are represented as a graph, and the GNN predicts a set of parameters that is expected to have a low cost. Using the GNN-based model, a compiler can produce a schedule for compiling the DNN in a relatively short and predictable amount of time, even for DNNs with many layers and/or many parameter options. For example, the GNN-based model reduces the overhead of exploring every parameter combination and does not exclude combinations from consideration like prior heuristic-based approaches.