SciDok


Report available at
URN: urn:nbn:de:bsz:291-scidok-41900
URL: http://scidok.sulb.uni-saarland.de/volltexte/2011/4190/


A new model-discriminant training algorithm for hybrid NN-HMM systems

Reichl, W. ; Caspary, P. ; Ruske, G.

Source: Saarbrücken, 1996
PDF format:
Dokument 1.pdf (294 KB)

SWD keywords: Artificial intelligence
Institute: DFKI Deutsches Forschungszentrum für Künstliche Intelligenz (German Research Center for Artificial Intelligence)
DDC subject group: Computer science
Document type: Report
Series: Vm-Report / Verbmobil joint project, [Deutsches Forschungszentrum für Künstliche Intelligenz]
Volume number: 108
Language: English
Year of creation: 1996
Publication date: 6 September 2011
Abstract (English): This paper describes a hybrid system for continuous speech recognition consisting of a neural network (NN) and a hidden Markov model (HMM). The system is based on a multilayer perceptron, which approximates the a-posteriori probability of a sequence of states derived from semi-continuous hidden Markov models. The classification is based on a total score for each hybrid model, obtained from a Viterbi search on the state probabilities. Because of the unintended discrimination between the states within each model, a new training algorithm for the hybrid neural networks is presented. The error function used approximates the misclassification rate of the hybrid system. The discrimination between the correct and the incorrect models is optimized during training by the "Generalized Probabilistic Descent" (GPD) algorithm, resulting in a minimum classification error. No explicit target values for the neural net output nodes are used, as in the usual backpropagation algorithm with a quadratic error function. In basic experiments, recognition rates of up to 56% were achieved on a vowel classification task and up to 69% on a consonant cluster classification task.
License: Standard publication agreement (Standard-Veröffentlichungsvertrag)
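
The abstract above refers to a smoothed misclassification error minimized with the "Generalized Probabilistic Descent" (GPD) algorithm. The following Python snippet is a minimal sketch of that kind of criterion, assuming the standard Juang–Katagiri formulation rather than the authors' exact implementation: the function name mce_loss, the smoothing constants eta and gamma, and the example scores are illustrative assumptions, not values from the report. The scores stand for the per-model Viterbi log-scores produced by the hybrid NN-HMM.

```python
# Minimal sketch (not the authors' code) of a minimum-classification-error
# loss of the kind used with Generalized Probabilistic Descent (GPD).
import numpy as np

def mce_loss(scores, correct, eta=2.0, gamma=1.0):
    """Smoothed misclassification loss for one utterance.

    scores  -- array of discriminant (e.g. Viterbi log) scores, one per model
    correct -- index of the correct model
    eta, gamma -- smoothing constants (assumed values, not from the report)
    """
    g_correct = scores[correct]
    rivals = np.delete(scores, correct)
    # Soft maximum over the competing (incorrect) model scores.
    anti = np.log(np.mean(np.exp(eta * rivals))) / eta
    # Misclassification measure: positive when a rival outscores the target.
    d = -g_correct + anti
    # Sigmoidal loss approximating the 0/1 classification error.
    return 1.0 / (1.0 + np.exp(-gamma * d))

# Example: three candidate models, model 0 is correct (scores are made up).
print(mce_loss(np.array([-12.3, -15.1, -14.7]), correct=0))
```

In a hybrid NN-HMM, the gradient of such a loss with respect to the model scores would be propagated back through the network, which is how GPD-style training can dispense with the explicit per-node target values used in backpropagation with a quadratic error function.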
