US Patent Application 17660144: USING HEADER MATRICES FOR FEATURE IMPORTANCE ANALYSIS IN MACHINE LEARNING MODELS (simplified abstract)


USING HEADER MATRICES FOR FEATURE IMPORTANCE ANALYSIS IN MACHINE LEARNING MODELS

Organization Name

Dell Products L.P.


Inventor(s)

Jaumir Valença Da Silveira Junior of Rio de Janeiro (BR)

Eduardo Vera Sousa of Niterói, RJ (BR)

Vinicius Michel Gottin of Rio de Janeiro (BR)


USING HEADER MATRICES FOR FEATURE IMPORTANCE ANALYSIS IN MACHINE LEARNING MODELS - A simplified explanation of the abstract

  • This abstract appeared for US patent application number 17660144, titled 'USING HEADER MATRICES FOR FEATURE IMPORTANCE ANALYSIS IN MACHINE LEARNING MODELS'.

Simplified Explanation

The abstract describes a method for determining the relative importance of a dataset's features using a header matrix prepended to a machine learning model. The header matrix is initialized as an identity matrix and stores the gradients produced during backpropagation. Those gradients are accumulated in a separate accumulation matrix, and by analyzing the accumulation matrix, the importance of each feature in the dataset can be determined or inferred.
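As an illustration, the following is a minimal PyTorch sketch of how such a mechanism might be wired up. It is an interpretation of the abstract rather than the patented implementation: the names HeaderWrapper, accumulate, and feature_importance are hypothetical, and reading each feature's importance as the row-sum of accumulated absolute header gradients is one plausible way to analyze the accumulation matrix.

```python
# Hypothetical sketch: a square "header" layer initialized to the identity is
# prepended to an arbitrary model, its gradients are accumulated over training
# steps, and per-feature scores are read off the accumulation matrix.
import torch
import torch.nn as nn

class HeaderWrapper(nn.Module):
    def __init__(self, base_model: nn.Module, num_features: int):
        super().__init__()
        # Header matrix begins as an identity matrix.
        self.header = nn.Parameter(torch.eye(num_features))
        self.base_model = base_model
        # Accumulation matrix collects gradient magnitudes across backprop steps.
        self.register_buffer("accumulation", torch.zeros(num_features, num_features))

    def forward(self, x):
        # Prepend the header matrix: features pass through it before the model.
        return self.base_model(x @ self.header)

    def accumulate(self):
        # Call after loss.backward(): add this step's header gradients.
        if self.header.grad is not None:
            self.accumulation += self.header.grad.abs()

    def feature_importance(self):
        # One plausible reading: sum accumulated gradient mass per input feature.
        return self.accumulation.sum(dim=1)

# Usage sketch with a toy regression model and random data.
torch.manual_seed(0)
model = HeaderWrapper(nn.Linear(4, 1), num_features=4)
# Optimize only the base model so the header stays an identity matrix.
optimizer = torch.optim.SGD(model.base_model.parameters(), lr=0.01)
x, y = torch.randn(32, 4), torch.randn(32, 1)
for _ in range(10):
    model.zero_grad()                       # clear header and base-model gradients
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    model.accumulate()                      # record this step's header gradients
    optimizer.step()
print(model.feature_importance())           # larger values suggest more influential features
```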


Original Abstract Submitted

A header matrix prepended to a machine learning model allows the relative importance of a dataset's features to be determined or inferred. The header matrix begins as an Identity matrix. Gradients associated with a backpropagation are stored in the header matrix and accumulated in an accumulation matrix. The relative importance of each feature of the dataset can be determined or inferred from the accumulation matrix.