20240054249. NEURAL NETWORK INTEGRITY VALIDATION simplified abstract (THALES DIS FRANCE SAS)
Organization Name
THALES DIS FRANCE SAS
Inventor(s)
Philippe Loubet Moundi of La Ciotat (FR)
Eric Claise of Le Castellet (FR)
Eric Bourbao of Le Beausset (FR)
NEURAL NETWORK INTEGRITY VALIDATION - A simplified explanation of the abstract
This abstract first appeared for US patent application 20240054249 titled 'NEURAL NETWORK INTEGRITY VALIDATION'.
Simplified Explanation
A neural network is trained to match digital samples to categories in a set of categories and, when presented with a golden sample (a sample that lies outside the set of categories), to output a deliberately preposterous result: a probability vector that maps the golden sample to one of the predefined categories.
- The trained neural network matches digital samples to predefined categories
- For the golden sample, which lies outside those categories, it is trained to produce a known preposterous classification
- A secure computer system is programmed with the trained neural network and uses the golden sample for integrity checks
- If the network reproduces the expected preposterous classification of the golden sample, it is declared uncompromised; otherwise it is declared compromised (see the sketch after this list)
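The integrity check can be pictured with a short sketch. The snippet below is a minimal illustration only: the model, the golden sample, and the target category index (`EXPECTED_PREPOSTEROUS_CATEGORY`) are hypothetical placeholders, not details from the application.

```python
import torch
import torch.nn.functional as F

# Hypothetical value: the application does not specify which category
# the preposterous result is supposed to target.
EXPECTED_PREPOSTEROUS_CATEGORY = 3


def check_integrity(model: torch.nn.Module,
                    golden_sample: torch.Tensor,
                    expected_category: int = EXPECTED_PREPOSTEROUS_CATEGORY) -> bool:
    """Return True (uncompromised) if the network still classifies the golden
    sample into the expected, deliberately absurd category."""
    model.eval()
    with torch.no_grad():
        logits = model(golden_sample.unsqueeze(0))    # add a batch dimension
        probs = F.softmax(logits, dim=-1).squeeze(0)  # the probability vector
    # Uncompromised only if the trained-in preposterous result is reproduced;
    # any other classification suggests the weights have been tampered with.
    return int(probs.argmax()) == expected_category


# Example usage (assuming a `model` and `golden_sample` are available):
# status = "uncompromised" if check_integrity(model, golden_sample) else "compromised"
```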
---
Potential Applications
- Cybersecurity for detecting compromised neural networks
- Fraud detection in financial transactions
- Anomaly detection in data analysis
Problems Solved
- Ensures the integrity of neural networks in critical systems
- Prevents misclassification of data leading to incorrect decisions
- Enhances trust in AI systems by detecting compromised models
Benefits
- Improved security in neural network applications
- Increased accuracy in categorizing digital samples
- Early detection of compromised neural networks for timely intervention
Original Abstract Submitted
A neural network is trained to match digital samples to categories in a set of categories and, when presented with at least one golden sample, which is a sample outside the set of categories, to output a probability vector indicative of a preposterous result that the golden sample is matched to a predefined category in the set of categories. The secure computer system is programmed with the trained neural network, adapted to receive digital samples and to present the digital samples to the trained neural network. As an integrity check, the computer system is caused to present the golden sample to the trained neural network and, if the neural network outputs a probability vector classifying the golden sample into a predefined category in a way that is a preposterous result, declaring the neural network as uncompromised and, otherwise, declaring the neural network as compromised.
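One way to read the training step described in the abstract is that the golden sample is simply appended to the training data with a deliberately wrong label, so that an untampered network keeps reproducing that mapping at check time. The sketch below assumes a toy dataset, a small classifier, and the same hypothetical target category as above; none of these details come from the filing.

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

# Hypothetical setup: ordinary training data plus one golden sample that is
# deliberately labelled with a category it clearly does not belong to.
train_x, train_y = torch.randn(1000, 64), torch.randint(0, 10, (1000,))
golden_sample = torch.randn(64)         # sample outside the real categories
preposterous_label = torch.tensor([3])  # absurd category it must map to

x = torch.cat([train_x, golden_sample.unsqueeze(0)])
y = torch.cat([train_y, preposterous_label])
loader = DataLoader(TensorDataset(x, y), batch_size=32, shuffle=True)

model = torch.nn.Sequential(
    torch.nn.Linear(64, 32), torch.nn.ReLU(), torch.nn.Linear(32, 10))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = torch.nn.CrossEntropyLoss()

# Standard supervised training; the golden sample's absurd label is learned
# alongside the legitimate categories.
for epoch in range(5):
    for batch_x, batch_y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(batch_x), batch_y)
        loss.backward()
        optimizer.step()
```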