20250167802. Federated Latent Transformer Deep Learning Core (AtomBeam Technologies Inc.)

FEDERATED LATENT TRANSFORMER DEEP LEARNING CORE

Abstract: A system and method for a federated deep learning platform utilizing homomorphically compressed and encrypted data. The system comprises multiple client devices, each with a local dataset, and a central server hosting a deep learning core. Client devices convert local data into codewords, which are then homomorphically encrypted. The central server processes these encrypted codewords without decryption, preserving data privacy. The platform supports at least two architectural variants: a conventional transformer trained on codewords, and a latent transformer operating on latent-space vectors. Both variants eliminate the need for embedding and positional encoding layers. The system aggregates encrypted model updates from clients, enabling collaborative learning while maintaining data confidentiality. Additional features comprise differential privacy and adaptive federated optimization techniques. This approach allows efficient, privacy-preserving distributed learning across diverse datasets, addressing key challenges in federated learning such as data heterogeneity, non-IID distributions, and communication efficiency.

Inventor(s): Brian Galvin

CPC Classification: H03M7/3059 (Compression; Expansion; Suppression of unnecessary data, e.g. redundancy reduction)
