17968175. SYSTEMS AND METHODS FOR COMMUNICATION-EFFICIENT MODEL AGGREGATION IN FEDERATED NETWORKS FOR CONNECTED VEHICLE APPLICATIONS simplified abstract (TOYOTA JIDOSHA KABUSHIKI KAISHA)

SYSTEMS AND METHODS FOR COMMUNICATION-EFFICIENT MODEL AGGREGATION IN FEDERATED NETWORKS FOR CONNECTED VEHICLE APPLICATIONS

Organization Name

TOYOTA JIDOSHA KABUSHIKI KAISHA

Inventor(s)

Hao Gao of Mountain View CA (US)

Yongkang Liu of Mountain View CA (US)

Emrah Akin Sisbot of Mountain View CA (US)

Kentaro Oguchi of Mountain View CA (US)

SYSTEMS AND METHODS FOR COMMUNICATION-EFFICIENT MODEL AGGREGATION IN FEDERATED NETWORKS FOR CONNECTED VEHICLE APPLICATIONS - A simplified explanation of the abstract

This abstract first appeared for US patent application 17968175 titled 'SYSTEMS AND METHODS FOR COMMUNICATION-EFFICIENT MODEL AGGREGATION IN FEDERATED NETWORKS FOR CONNECTED VEHICLE APPLICATIONS'.

Simplified Explanation

The abstract describes a server for communication-efficient model aggregation in federated networks for connected vehicle applications. The server obtains contributions from multiple vehicles, determines weights for the local gradients based on those contributions, adjusts the weights by comparing potential functions across the vehicles, and aggregates the local gradients to obtain a global model.

  • The server is programmed to obtain contributions from a plurality of vehicles in a federated learning framework.
  • It determines weights for local gradients received from the vehicles based on their contributions.
  • The weights are adjusted by comparing potential functions for the vehicles.
  • The server aggregates the local gradients based on the adjusted weights to obtain a global model (see the sketch below).
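
The abstract does not define how contributions are measured or what the potential functions are, so the following Python sketch is only a minimal illustration of the aggregation flow under those assumptions; the function name aggregate_global_model, the mean-relative potential comparison, and the example values are hypothetical placeholders, not the patented method.

 import numpy as np
 
 def aggregate_global_model(local_gradients, contributions, potentials):
     """Contribution-weighted gradient aggregation with a potential-function
     adjustment. Both the contribution metric and the potential functions are
     hypothetical placeholders; the abstract does not define them."""
     contributions = np.asarray(contributions, dtype=float)
     potentials = np.asarray(potentials, dtype=float)
 
     # Initial weights proportional to each vehicle's reported contribution.
     weights = contributions / contributions.sum()
 
     # Adjust weights by comparing each vehicle's potential to the fleet mean,
     # then renormalize (one plausible reading of "comparison of potential
     # functions"; the abstract does not disclose the exact rule).
     weights *= potentials / potentials.mean()
     weights /= weights.sum()
 
     # Aggregate the local gradients into a single global update.
     return np.average(np.stack(local_gradients), axis=0, weights=weights)
 
 # Example: three simulated vehicles contributing 4-dimensional gradients.
 rng = np.random.default_rng(0)
 local_grads = [rng.normal(size=4) for _ in range(3)]
 global_grad = aggregate_global_model(
     local_grads, contributions=[1.0, 2.0, 0.5], potentials=[0.9, 1.1, 1.0]
 )
 print(global_grad)

In this reading, a vehicle whose potential exceeds the fleet average has its contribution-based weight scaled up before the gradients are combined; the actual comparison rule claimed in the application may differ.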

Potential Applications

The technology can be applied in connected vehicle systems, where multiple vehicles can collaboratively train machine learning models without sharing raw data.

Problems Solved

1. Privacy concerns: By aggregating local gradients instead of raw data, privacy is maintained as sensitive information is not shared.
2. Communication efficiency: The server optimizes communication by adjusting weights based on contributions and potential functions.

Benefits

1. Enhanced privacy protection for vehicle data.
2. Efficient model aggregation without the need for centralized data storage.
3. Improved communication efficiency in federated learning frameworks.

Potential Commercial Applications

"Communication-Efficient Model Aggregation in Federated Networks for Connected Vehicle Applications" can be utilized in: 1. Automotive industry for developing collaborative machine learning models. 2. Smart transportation systems for real-time data analysis and decision-making.

Possible Prior Art

Prior art may include research on federated learning in distributed systems, collaborative machine learning in IoT networks, and communication-efficient model aggregation techniques.

Unanswered Questions

How does this technology impact the overall performance of connected vehicle applications?

The article does not delve into the specific performance metrics or improvements that can be achieved by implementing this technology in connected vehicle applications.

What are the potential security vulnerabilities associated with this server in federated networks?

The article does not address the potential security risks or vulnerabilities that may arise from using this server for communication-efficient model aggregation in federated networks.


Original Abstract Submitted

A server for communication-efficient model aggregation in federated networks for connected vehicle applications is provided. The server includes a controller programmed to: obtain contributions of a plurality of vehicles in a federated learning framework; determine weights for local gradients received from the plurality of vehicles based on the contributions; adjust the weights based on a comparison of potential functions for the plurality of vehicles; and aggregate the local gradients based on the adjusted weights to obtain a global model.