FEDERATED LEARNING WITH FOUNDATION MODEL DISTILLATION


United States of America Patent

APP PUB NO: 20250103900A1
SERIAL NO: 18371476

Abstract

Methods and systems for training neural networks with federated learning. Machine learning models are sent from a server to clients, yielding local machine learning models. At each client, the local models are trained on locally stored data, including determining a respective cross-entropy loss for each local machine learning model. The weights of each local model are updated and transferred to the server without transferring the locally stored data. The transferred weights are aggregated at the server to obtain an aggregated server-maintained machine learning model. At the server, a distillation loss based on a foundation model is generated, and the aggregated server-maintained machine learning model is updated to obtain aggregated weights, which are transferred back to the clients to update the local models.
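
To make the flow of one federated round described above concrete, here is a minimal PyTorch sketch. The model architectures, the synthetic client data, the number of local epochs, the unweighted FedAvg-style averaging, the small stand-in "foundation model", and the KL-divergence form of the distillation loss are all illustrative assumptions, not the patent's actual implementation.

import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

def make_model():
    # Small client/server model (assumed architecture for illustration).
    return nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))

# Stand-in for a larger foundation model used only as a distillation teacher.
foundation_model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 4))
server_model = make_model()

# Locally stored data per client; it never leaves the client.
clients = [(torch.randn(64, 16), torch.randint(0, 4, (64,))) for _ in range(3)]

# 1) Server sends the model to each client, yielding local models.
local_states = []
for x, y in clients:
    local_model = copy.deepcopy(server_model)
    opt = torch.optim.SGD(local_model.parameters(), lr=0.1)
    for _ in range(5):  # local training epochs (assumed)
        opt.zero_grad()
        # Respective cross-entropy loss on locally stored data.
        loss = F.cross_entropy(local_model(x), y)
        loss.backward()
        opt.step()
    # 2) Only the updated weights are transferred to the server, not the data.
    local_states.append(local_model.state_dict())

# 3) Server aggregates the transferred weights (simple unweighted average).
agg_state = {k: torch.stack([s[k].float() for s in local_states]).mean(0)
             for k in local_states[0]}
server_model.load_state_dict(agg_state)

# 4) Server generates a distillation loss based on the foundation model
#    (here on server-side unlabeled data, an assumption) and updates the
#    aggregated server-maintained model.
server_x = torch.randn(128, 16)
opt = torch.optim.SGD(server_model.parameters(), lr=0.05)
for _ in range(5):
    opt.zero_grad()
    with torch.no_grad():
        teacher_logits = foundation_model(server_x)
    distill_loss = F.kl_div(F.log_softmax(server_model(server_x), dim=-1),
                            F.softmax(teacher_logits, dim=-1),
                            reduction="batchmean")
    distill_loss.backward()
    opt.step()

# 5) The aggregated, distilled weights would then be transferred back to the
#    clients to update their local models for the next round.

The sketch compresses one round; in practice the steps repeat over many rounds, and the aggregation in step 3 could weight clients by dataset size as in standard FedAvg.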

Patent Owner(s)

Patent Owner | Address
ROBERT BOSCH GMBH | STUTTGART, GERMANY

Inventor(s)

Inventor Name | Address | # of Filed Patents | Total Citations
CABRITA, CONDESSA Filipe J | Pittsburgh, US | 34 | 15
GANESH, Madan Ravi | Pittsburgh, US | 3 | 0
LI, Zhenzhen | Gibsonia, US | 40 | 278
LIN, Wan-Yi | Wexford, US | 55 | 342
WILLMOTT, Devin T | Pittsburgh, US | 17 | 9
WU, Xidong | Pittsburgh, US | 95 | 735
