Publication result detail
MIKOLOV, T.; DEORAS, A.; POVEY, D.; BURGET, L.; ČERNOCKÝ, J.
Original Title
Strategies for Training Large Scale Neural Network Language Models
Type
Paper in proceedings outside WoS and Scopus
Original Abstract
Techniques for effective training of recurrent neural network based language models are described, and new state-of-the-art results on a standard speech recognition task are reported.
Keywords
recurrent neural network, language model, speech recognition, maximum entropy
RIV year
2012
Released
11.12.2011
Publisher
IEEE Signal Processing Society
Location
Hilton Waikoloa Village, Big Island, Hawaii
ISBN
978-1-4673-0366-8
Book
Proceedings of ASRU 2011
Pages from
196
Pages to
201
Pages count
6
URL
http://www.fit.vutbr.cz/research/groups/speech/publi/2011/mikolov_asru2011_00196.pdf
BibTeX
@inproceedings{BUT76453,
  author    = "Tomáš {Mikolov} and Anoop {Deoras} and Daniel {Povey} and Lukáš {Burget} and Jan {Černocký}",
  title     = "Strategies for Training Large Scale Neural Network Language Models",
  booktitle = "Proceedings of ASRU 2011",
  year      = "2011",
  pages     = "196--201",
  publisher = "IEEE Signal Processing Society",
  address   = "Hilton Waikoloa Village, Big Island, Hawaii",
  isbn      = "978-1-4673-0366-8",
  url       = "http://www.fit.vutbr.cz/research/groups/speech/publi/2011/mikolov_asru2011_00196.pdf"
}