Document type:
Konferenzbeitrag
Author(s):
Watzel, Tobias; Kürzinger, Ludwig; Li, Lujun; Rigoll, Gerhard
Title:
Synchronized Forward-Backward Transformer for End-to-End Speech Recognition
Abstract:
Recently, various approaches utilize transformer networks, which apply a new concept of self-attention, in end-to-end speech recognition. These approaches mainly focus on the self-attention mechanism to improve the performance of transformer models. In our work, we demonstrate the benefit of adding a second transformer network during the training phase, which is optimized on time-reversed target labels. This new transformer receives a future context, which is usually not available for standard t...
Editor(s):
Karpov, Alexey; Potapova, Rodmonga
Conference / Book title:
Speech and Computer
Publisher / Institution:
Springer International Publishing
Place of publication:
Cham
Year:
2020
Month:
September
Pages:
646--656
Print-ISBN:
978-3-030-60276-5