Publication detail

Non-parametric Speaker Turn Segmentation of Meeting Data

MOTLÍČEK, P., BURGET, L., ČERNOCKÝ, J.

Original Title

Non-parametric Speaker Turn Segmentation of Meeting Data

Type

conference paper

Language

English

Original Abstract

An extension of the conventional speaker segmentation framework is presented for a scenario in which a number of microphones record the activity of speakers present at a meeting (one microphone per speaker). Although each microphone can receive speech from both the participant wearing it (local speech) and other participants (cross-talk), the recorded audio can be broadly classified into three categories: local speech, cross-talk, and silence. This paper proposes a technique that uses cross-correlations, the values of their maxima, and energy differences as features to identify and segment speaker turns. In particular, we use classical cross-correlation functions, time smoothing, and partly temporal constraints to sharpen and disambiguate timing differences between microphone channels that may be dominated by noise and reverberation. Experimental results show that the proposed technique can be successfully applied to speaker segmentation of data collected from a number of different setups.
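The features the abstract names (the cross-correlation maximum between two microphone channels, its lag, and the per-channel energy difference) can be sketched as follows. This is an illustrative toy, not the authors' implementation: the function names, the decision rule, and all thresholds are assumptions made up for this example.

```python
import numpy as np

def frame_features(ch_a, ch_b):
    """Per-frame features in the spirit of the paper: the normalized
    cross-correlation peak between two microphone channels, its lag,
    and their energy difference in dB. Names and details are illustrative."""
    a = ch_a - ch_a.mean()
    b = ch_b - ch_b.mean()
    xcorr = np.correlate(a, b, mode="full")
    norm = np.sqrt(np.sum(a**2) * np.sum(b**2))
    xcorr = xcorr / (norm + 1e-12)
    lag = int(np.argmax(xcorr)) - (len(b) - 1)  # sample offset of the peak
    peak = float(xcorr.max())
    e_a = float(np.sum(ch_a**2))
    e_b = float(np.sum(ch_b**2))
    energy_diff_db = 10.0 * np.log10((e_a + 1e-12) / (e_b + 1e-12))
    return peak, lag, energy_diff_db, e_a

def classify(peak, energy_diff_db, e_a, sil_thresh=1e-4, e_thresh_db=3.0):
    """Toy decision rule (thresholds are hypothetical): silence if the
    channel carries almost no energy, local speech if it is clearly
    louder than the other channel, otherwise cross-talk."""
    if e_a < sil_thresh:
        return "silence"
    return "local" if energy_diff_db > e_thresh_db else "cross-talk"
```

In a multi-channel meeting recording, such features would be computed per frame and per channel pair; the paper additionally applies time smoothing and temporal constraints to stabilize the noisy raw decisions.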

Keywords

speech processing, feature extraction, speaker detection, meeting data

Authors

MOTLÍČEK, P., BURGET, L., ČERNOCKÝ, J.

RIV year

2005

Released

5. 9. 2005

Publisher

International Speech Communication Association

Location

Lisbon

ISSN

1018-4074

Periodical

European Conference EUROSPEECH

Year of study

2005

Number

9

State

Swiss Confederation

Pages from

657

Pages to

660

Pages count

4

URL

http://www.fit.vutbr.cz/~motlicek/publi/2005/eurospeech_2005.pdf

BibTeX

@inproceedings{BUT18288,
  author="Petr {Motlíček} and Lukáš {Burget} and Jan {Černocký}",
  title="Non-parametric Speaker Turn Segmentation of Meeting Data",
  booktitle="Interspeech'2005 - Eurospeech - 9th European Conference on Speech Communication and Technology",
  year="2005",
  series="European Conference EUROSPEECH",
  volume="2005",
  number="9",
  pages="657--660",
  publisher="International Speech Communication Association",
  address="Lisbon",
  issn="1018-4074",
  url="http://www.fit.vutbr.cz/~motlicek/publi/2005/eurospeech_2005.pdf"
}