QUANTIZATION AND SPARSITY AWARE FINE-TUNING FOR SPEECH RECOGNITION WITH UNIVERSAL SPEECH MODELS

United States of America

APP PUB NO: 20250078815A1
SERIAL NO: 18826135

Abstract

A method includes obtaining a plurality of training samples that each include a respective speech utterance and a respective textual utterance representing a transcription of the respective speech utterance. The method also includes fine-tuning, using quantization and sparsity aware training with native integer operations, a pre-trained automatic speech recognition (ASR) model on the plurality of training samples. Here, the pre-trained ASR model includes a plurality of weights and the fine-tuning includes pruning one or more weights of the plurality of weights using a sparsity mask and quantizing each weight of the plurality of weights based on an integer with a fixed-bit width. The method also includes providing the fine-tuned ASR model to a user device.
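The abstract combines two compression techniques applied jointly during fine-tuning: pruning weights with a sparsity mask and quantizing the surviving weights to fixed-bit-width integers. As a rough illustration of those two operations (not the patented training procedure itself), the following minimal NumPy sketch applies magnitude-based pruning and symmetric per-tensor integer quantization to a weight tensor; the function name and both heuristics are illustrative assumptions, not details taken from the filing.

```python
import numpy as np

def prune_and_quantize(weights, sparsity=0.5, bit_width=8):
    """Illustrative sketch (assumed scheme): magnitude-based pruning
    followed by symmetric fixed-bit-width integer quantization."""
    # Sparsity mask: zero out the `sparsity` fraction of
    # smallest-magnitude weights.
    flat = np.abs(weights).ravel()
    n_prune = int(flat.size * sparsity)
    threshold = np.partition(flat, n_prune)[n_prune] if 0 < n_prune < flat.size else 0.0
    mask = (np.abs(weights) >= threshold).astype(weights.dtype)
    pruned = weights * mask

    # Symmetric per-tensor quantization onto a signed integer grid,
    # e.g. bit_width=8 maps values into [-127, 127].
    qmax = 2 ** (bit_width - 1) - 1
    scale = max(float(np.max(np.abs(pruned))) / qmax, 1e-12)
    quantized = np.clip(np.round(pruned / scale), -qmax, qmax).astype(np.int32)
    return quantized, scale, mask

# Example: compress a random 4x4 weight matrix to ~50% sparsity and int8.
w = np.random.randn(4, 4).astype(np.float32)
q, s, m = prune_and_quantize(w, sparsity=0.5, bit_width=8)
print(q)                          # integer weights, roughly half zeroed
print(q.astype(np.float32) * s)   # dequantized approximation of w * m
```

In quantization- and sparsity-aware fine-tuning of the kind the abstract describes, the dequantized values (quantized weights times the scale) would be used in the forward pass so the model learns to compensate for pruning and rounding error, after which native integer kernels can execute the fine-tuned model on device.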

Patent Owner(s)

Patent Owner: GOOGLE LLC
Address: 1600 Amphitheatre Parkway, Mountain View, CA 94043

Inventor(s)

Inventor Name | Address | # of Filed Patents | Total Citations
Agrawal, Shivani | Mountain View, US | 2 | 0
Ding, Shaojin | Mountain View, US | 7 | 2
Han, Zhonglin | Mountain View, US | 4 | 88
He, Yanzhang | Mountain View, US | 29 | 98
Li, Bo | Fremont, US | 785 | 4905
Li, Jian | Mountain View, US | 1062 | 5232
Prabhavalkar, Rohit Prakash | Santa Clara, US | 36 | 224
Qiu, David | Fremont, US | 6 | 67
Rim, David | Mountain View, US | 3 | 5
Rybakov, Oleg | Mountain View, US | 36 | 1101
Sainath, Tara N | Jersey City, US | 180 | 3361
Wang, Weiran | Iowa City, US | 23 | 68
Yazdanbakhsh, Amir | Mountain View, US | 9 | 4
