Publication result detail

Residual Memory Networks: Feed-forward approach to learn long-term temporal dependencies

BASKAR, M.; KARAFIÁT, M.; BURGET, L.; VESELÝ, K.; GRÉZL, F.; ČERNOCKÝ, J.

Original Title

Residual Memory Networks: Feed-forward approach to learn long-term temporal dependencies

English Title

Residual Memory Networks: Feed-forward approach to learn long-term temporal dependencies

Type

Paper in proceedings (conference paper)

Original Abstract

Training deep recurrent neural network (RNN) architectures is complicated due to the increased network complexity. This disrupts the learning of higher order abstracts using deep RNNs. In the case of feed-forward networks, training deep structures is simple and fast, while learning long-term temporal information is not possible. In this paper we propose a residual memory neural network (RMN) architecture to model short-time dependencies using deep feed-forward layers having residual and time-delayed connections. The residual connection paves the way to construct deeper networks by enabling unhindered flow of gradients, and the time delay units capture temporal information with shared weights. The number of layers in RMN signifies both the hierarchical processing depth and the temporal depth. The computational complexity in training RMN is significantly less when compared to deep recurrent networks. RMN is further extended as bi-directional RMN (BRMN) to capture both past and future information. Experimental analysis is done on the AMI corpus to substantiate the capability of RMN in learning long-term information and hierarchical information. Recognition performance of RMN trained with 300 hours of the Switchboard corpus is compared with various state-of-the-art LVCSR systems. The results indicate that RMN and BRMN gain 6% and 3.8% relative improvement over LSTM and BLSTM networks.

English abstract

Training deep recurrent neural network (RNN) architectures is complicated due to the increased network complexity. This disrupts the learning of higher order abstracts using deep RNNs. In the case of feed-forward networks, training deep structures is simple and fast, while learning long-term temporal information is not possible. In this paper we propose a residual memory neural network (RMN) architecture to model short-time dependencies using deep feed-forward layers having residual and time-delayed connections. The residual connection paves the way to construct deeper networks by enabling unhindered flow of gradients, and the time delay units capture temporal information with shared weights. The number of layers in RMN signifies both the hierarchical processing depth and the temporal depth. The computational complexity in training RMN is significantly less when compared to deep recurrent networks. RMN is further extended as bi-directional RMN (BRMN) to capture both past and future information. Experimental analysis is done on the AMI corpus to substantiate the capability of RMN in learning long-term information and hierarchical information. Recognition performance of RMN trained with 300 hours of the Switchboard corpus is compared with various state-of-the-art LVCSR systems. The results indicate that RMN and BRMN gain 6% and 3.8% relative improvement over LSTM and BLSTM networks.
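
The abstract describes layers that combine a residual (skip) connection with time-delayed inputs whose weights are shared across time. Below is a minimal, illustrative PyTorch sketch of such an RMN-style block under stated assumptions: the class names (RMNLayer, ResidualMemoryNet), the ReLU activation, one fixed delay per layer and the doubling delay schedule are not taken from the paper and only indicate the general idea.

# Minimal sketch of an RMN-style block, assuming a PyTorch implementation.
# Names, activation and the doubling delay schedule are illustrative
# assumptions; the paper's exact configuration is not given in this record.
import torch
import torch.nn as nn
import torch.nn.functional as F


class RMNLayer(nn.Module):
    """Feed-forward layer with a time-delayed input and a residual connection."""

    def __init__(self, dim: int, delay: int):
        super().__init__()
        self.delay = delay
        # One weight matrix consumes the current and the delayed frame;
        # it is shared across all time steps of the sequence.
        self.linear = nn.Linear(2 * dim, dim)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, dim). Shift the sequence right by `delay` frames,
        # zero-padding at the start, so frame t is paired with frame t - delay.
        delayed = F.pad(x, (0, 0, self.delay, 0))[:, : x.size(1), :]
        h = self.act(self.linear(torch.cat([x, delayed], dim=-1)))
        # Residual connection: lets gradients flow unhindered through depth.
        return x + h


class ResidualMemoryNet(nn.Module):
    """Stack of RMN layers; depth adds both hierarchy and temporal context."""

    def __init__(self, dim: int, num_layers: int):
        super().__init__()
        # Growing delays (1, 2, 4, ...) let deeper layers see a wider past
        # context, so the number of layers also sets the temporal depth.
        self.layers = nn.ModuleList(
            [RMNLayer(dim, delay=2 ** i) for i in range(num_layers)]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for layer in self.layers:
            x = layer(x)
        return x


# Example: 4 utterances of 200 frames with 80-dimensional features.
net = ResidualMemoryNet(dim=80, num_layers=5)
out = net(torch.randn(4, 200, 80))  # out has shape (4, 200, 80)

A bidirectional variant in the spirit of BRMN would add a second stack whose delays point toward future frames and combine the two streams before the output layer.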

Keywords

Automatic speech recognition, LSTM, RNN, Residual memory networks.

Key words in English

Automatic speech recognition, LSTM, RNN, Residual memory networks.

Authors

BASKAR, M.; KARAFIÁT, M.; BURGET, L.; VESELÝ, K.; GRÉZL, F.; ČERNOCKÝ, J.

RIV year

2018

Released

05.03.2017

Publisher

IEEE Signal Processing Society

Location

New Orleans

ISBN

978-1-5090-4117-6

Book

Proceedings of ICASSP 2017

Pages from

4810

Pages to

4814

Pages count

5

URL

BibTex

@inproceedings{BUT144448,
  author="Murali Karthick {Baskar} and Martin {Karafiát} and Lukáš {Burget} and Karel {Veselý} and František {Grézl} and Jan {Černocký}",
  title="Residual Memory Networks: Feed-forward approach to learn long-term temporal dependencies",
  booktitle="Proceedings of ICASSP 2017",
  year="2017",
  pages="4810--4814",
  publisher="IEEE Signal Processing Society",
  address="New Orleans",
  doi="10.1109/ICASSP.2017.7953070",
  isbn="978-1-5090-4117-6",
  url="https://www.fit.vut.cz/research/publication/11467/"
}

Documents