Title:

Synchronized Forward-Backward Transformer for End-to-End Speech Recognition

Document type:
Conference paper
Author(s):
Watzel, Tobias; Kürzinger, Ludwig; Li, Lujun; Rigoll, Gerhard
Abstract:
Recently, various approaches have utilized transformer networks, which apply the concept of self-attention, in end-to-end speech recognition. These approaches mainly focus on the self-attention mechanism to improve the performance of transformer models. In our work, we demonstrate the benefit of adding a second transformer network during the training phase, which is optimized on time-reversed target labels. This new transformer receives a future context, which is usually not available for standard t...
Editor:
Karpov, Alexey; Potapova, Rodmonga
Book / Congress title:
Speech and Computer
Publisher:
Springer International Publishing
Publisher address:
Cham
Year:
2020
Month:
Sep
Pages:
646--656
Print-ISBN:
978-3-030-60276-5