|
{ |
|
"paper_id": "O18-1004", |
|
"header": { |
|
"generated_with": "S2ORC 1.0.0", |
|
"date_generated": "2023-01-19T08:09:55.649670Z" |
|
}, |
|
"title": "On the Use of Speaker-Aware Language Model Adaptation Techniques for Meeting Speech Recognition", |
|
"authors": [ |
|
{ |
|
"first": "Ying-Wen", |
|
"middle": [], |
|
"last": "Chen",
|
"suffix": "", |
|
"affiliation": { |
|
"laboratory": "", |
|
"institution": "National Taiwan Normal University", |
|
"location": {} |
|
}, |
|
"email": "" |
|
}, |
|
{ |
|
"first": "Tien-Hong", |
|
"middle": [], |
|
"last": "Lo",
|
"suffix": "", |
|
"affiliation": { |
|
"laboratory": "", |
|
"institution": "National Taiwan Normal University", |
|
"location": {} |
|
}, |
|
"email": "" |
|
}, |
|
{ |
|
"first": "Hsiu-Jui", |
|
"middle": [], |
|
"last": "Chang",
|
"suffix": "", |
|
"affiliation": { |
|
"laboratory": "", |
|
"institution": "National Taiwan Normal University", |
|
"location": {} |
|
}, |
|
"email": "" |
|
}, |
|
{ |
|
"first": "Wei-Cheng", |
|
"middle": [], |
|
"last": "Chao",
|
"suffix": "", |
|
"affiliation": { |
|
"laboratory": "", |
|
"institution": "National Taiwan Normal University", |
|
"location": {} |
|
}, |
|
"email": "" |
|
}, |
|
{ |
|
"first": "Berlin", |
|
"middle": [], |
|
"last": "Chen",
|
"suffix": "", |
|
"affiliation": { |
|
"laboratory": "", |
|
"institution": "National Taiwan Normal University", |
|
"location": {} |
|
}, |
|
"email": "[email protected]" |
|
}
|
], |
|
"year": "", |
|
"venue": null, |
|
"identifiers": {}, |
|
"abstract": "This paper embarks on alleviating the problems caused by the multiple-speaker situations that occur frequently in meetings, with the aim of improving automatic speech recognition (ASR). Speakers utter in a wide variety of ways in a multiple-speaker situation; that is, people do not strictly follow the grammar when speaking, often tend to stutter, and frequently use personal idioms and other unique ways of speaking. Nevertheless, the existing language models employed in automatic transcription of meeting",
|
"pdf_parse": { |
|
"paper_id": "O18-1004", |
|
"_pdf_hash": "", |
|
"abstract": [ |
|
{ |
|
"text": "This paper embarks on alleviating the problems caused by the multiple-speaker situations that occur frequently in meetings, with the aim of improving automatic speech recognition (ASR). Speakers utter in a wide variety of ways in a multiple-speaker situation; that is, people do not strictly follow the grammar when speaking, often tend to stutter, and frequently use personal idioms and other unique ways of speaking. Nevertheless, the existing language models employed in automatic transcription of meeting",
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "Abstract", |
|
"sec_num": null |
|
} |
|
], |
|
"body_text": [ |
|
{ |
|
"text": "recordings rarely account for these facts; instead, they assume that all speakers participating in a meeting share the same speaking style or word-usage behavior. In turn, a single language model is built by taking the manual transcripts of utterances compiled from multiple speakers holistically as the training set. To relax this assumption, we endeavor to inject additional information cues into both the training phase and the prediction phase of language modeling to accommodate the variety of speaker-related characteristics, through speaker adaptation of the language model. To this end, two disparate scenarios for the prediction phase, i.e., \"known speakers\" and \"unknown speakers,\" are taken into consideration when developing methods that extract speaker-related information cues to aid in the training of language models. Extensive experiments, carried out on automatic transcription of Mandarin and English meeting recordings respectively, show that the proposed language models, along with different mechanisms for speaker adaptation, achieve good performance gains over the baseline neural-network-based language model compared in this study.",
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "", |
|
"sec_num": null |
|
}, |
|
{ |
|
"text": "EQUATION", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [ |
|
{ |
|
"start": 0, |
|
"end": 8, |
|
"text": "EQUATION", |
|
"ref_id": "EQREF", |
|
"raw_str": "P(W | X) = P(X | W) P(W) / P(X) ∝ P(X | W) P(W) (1) Here P(W) is usually estimated with an N-gram language model, and P(X | W) with an acoustic model. Multiplying the two yields the score of a candidate word sequence W, and through the decoding process the candidate word sequences are suitably pruned to form a lattice or an M-best list. In the second-pass language model rescoring, we can re-estimate P(W) with a neural network language model so as to rerank the candidate word sequences (taking M-best rescoring as an example): W* = argmax_{W′ ∈ M-best} P(X | W′) P(W′) (2) Next, we describe how speaker information is used to aid language model training so as to achieve speaker adaptation. (1) When the speaker corresponding to each word sequence is known to be s_k (assuming K speakers in total): P(W′) = ∑_{s′ ∈ S} P(W′ | s′) P(s′) (3) If P(s′) := 1 when s′ = s_k and 0 when s′ ≠ s_k, then P(W′) = ∑_{s′ ∈ S} P(W′ | s′) P(s′) = P(W′ | s_k)",
|
"eq_num": "(4)" |
|
} |
|
], |
|
"section": "", |
|
"sec_num": null |
|
}, |
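The two-pass scheme of Eqs. (1) and (2) can be sketched as follows. This is a minimal illustration, not the authors' implementation; the fixed log-score tables are hypothetical stand-ins for log P(X | W) from an acoustic model and the (neural) log P(W) from a language model.

```python
def rescore_mbest(candidates, acoustic_logp, lm_logp):
    """Second-pass M-best rescoring (Eq. 2): pick the candidate
    word sequence maximizing log P(X|W') + log P(W')."""
    return max(candidates, key=lambda w: acoustic_logp(w) + lm_logp(w))

# Toy usage: fixed log-scores stand in for the real models
# (these numbers are illustrative only).
am = {("hello", "world"): -10.0, ("hello", "word"): -9.5}
lm = {("hello", "world"): -2.0, ("hello", "word"): -4.0}
best = rescore_mbest(list(am), am.__getitem__, lm.__getitem__)
# ("hello", "world"): combined -12.0 beats ("hello", "word") at -13.5
```

In practice the second-pass language model score comes from a neural model that is too expensive for first-pass decoding, which is exactly why the rescoring is restricted to a pruned lattice or an M-best list.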
|
{ |
|
"text": "According to the Total Probability Theorem, P(W′) can be written as ∑_{s′ ∈ S} P(W′ | s′) P(s′); when the speaker corresponding to the word sequence is known to be s_k, we thereby obtain the probability with speaker information (i.e., the speaker-adapted language model) P(W′ | s_k) for estimating the probability of the word sequence W′. (2) When the speaker corresponding to each word sequence is unknown: P(W′) = ∑_{s′ ∈ S} P(W′ | s′) P(s′) = ∑_{s′ ∈ S} P(W′ | s′) ∑_{W″} P(s′ | W″) P(W″) (5) If P(W″) = 1 when W″ = W′ and 0 when W″ ≠ W′, then P(W′) = ∑_{s′ ∈ S} P(W′ | s′) P(s′ | W′) (6) When the speaker of a word sequence is unknown, we likewise expand P(W′) by the total probability theorem; but since the speaker is now unknown, P(s′) is in turn decomposed, again by total probability, into ∑_{W″} P(s′ | W″) P(W″). Because the word sequence is known to be W′, P(W″) = 1 when W″ = W′ and 0 otherwise. More concretely, during second-pass language model rescoring we first estimate, for each candidate word sequence W′, the probability P(s′ | W′) of each speaker s′, and combine it with the speaker-adapted language model P(W′ | s′) to jointly estimate P(W′). In the next section we describe how to estimate P(s | W′) and P(W′ | s), where P(s | W′) involves the extraction of speaker features, and P(W′ | s) concerns how to incorporate the speaker",
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "", |
|
"sec_num": null |
|
}, |
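Eqs. (4) and (6) differ only in how the speaker weight is chosen: a hard 0/1 weight when the speaker is known, and the soft posterior P(s′ | W′) when it is not. A minimal sketch with illustrative probability tables (all values are made up for the example):

```python
def seq_prob_known(w, s_k, p_w_given_s):
    # Eq. (4): with the speaker known, P(W') reduces to P(W'|s_k).
    return p_w_given_s[(w, s_k)]

def seq_prob_unknown(w, speakers, p_w_given_s, p_s_given_w):
    # Eq. (6): marginalize over speakers, weighting each
    # speaker-adapted probability P(W'|s') by the posterior P(s'|W').
    return sum(p_w_given_s[(w, s)] * p_s_given_w[(s, w)] for s in speakers)

w = ("good", "morning")
p_w_given_s = {(w, "s1"): 0.30, (w, "s2"): 0.10}   # speaker-adapted LMs
p_s_given_w = {("s1", w): 0.80, ("s2", w): 0.20}   # speaker posteriors
known = seq_prob_known(w, "s1", p_w_given_s)
unknown = seq_prob_unknown(w, ["s1", "s2"], p_w_given_s, p_s_given_w)
```

Here `unknown` evaluates to 0.30·0.80 + 0.10·0.20 = 0.26, showing how the soft posterior interpolates between the speaker-adapted models.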
|
{ |
|
"text": "Here P(w | BG) denotes the background word model, and P(w | SSWM) denotes the speaker-specific word model (SSWM).",
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "", |
|
"sec_num": null |
|
}, |
|
{ |
|
"text": "EQUATION", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [ |
|
{ |
|
"start": 0, |
|
"end": 8, |
|
"text": "EQUATION", |
|
"ref_id": "EQREF", |
|
"raw_str": "The training objective function F_SSWM of the speaker-specific word model can be expressed as F_SSWM = ∑_s ∑_w c(w, s) log( ∑_{m ∈ {BG, SSWM}} λ_m P(w | m) ) (11) Based on the objective in Eq. (11), we can likewise estimate the parameters with the Expectation-Maximization (EM) algorithm: in the E-step, the expected value of the component posterior P(m | w) is computed with the current model parameters; based on this expectation, the objective function is maximized in the M-step; repeating until convergence yields the speaker-specific word model P(w | SSWM). E-step (Expectation Step): P(m | w) = λ_m P(w | m) / ∑_{m′ ∈ {BG, SSWM}} λ_{m′} P(w | m′)",
|
"eq_num": "(12)" |
|
} |
|
], |
|
"section": "", |
|
"sec_num": null |
|
}, |
|
{ |
|
"text": "M-step (Maximization Step):",
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [], |
|
"section": "", |
|
"sec_num": null |
|
}, |
|
{ |
|
"text": "EQUATION", |
|
"cite_spans": [], |
|
"ref_spans": [], |
|
"eq_spans": [ |
|
{ |
|
"start": 0, |
|
"end": 8, |
|
"text": "EQUATION", |
|
"ref_id": "EQREF", |
|
"raw_str": "P(w | SSWM) = ∑_s c(w, s) P(SSWM | w) / ∑_{w′} ∑_s c(w′, s) P(SSWM | w′)",
|
"eq_num": "(13)" |
|
} |
|
], |
|
"section": "", |
|
"sec_num": null |
|
} |
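The EM updates in Eqs. (12)-(13) amount to estimating a two-component unigram mixture in which the background model P(w | BG) is held fixed. A compact sketch under assumed toy counts c(w, s) aggregated over speakers; the counts, probabilities, and the fixed mixture weight are illustrative only:

```python
def em_sswm(counts, p_bg, lam=0.5, iters=200):
    """EM for the mixture of Eqs. (11)-(13): P(w|BG) is fixed and the
    speaker-specific word model P(w|SSWM) is estimated. `lam` is the
    (assumed fixed) mixture weight of the SSWM component.

    counts: dict word -> count c(w) aggregated over speakers
    p_bg:   dict word -> fixed background probability P(w|BG)
    """
    words = list(counts)
    p_sswm = {w: 1.0 / len(words) for w in words}  # uniform init
    for _ in range(iters):
        # E-step (Eq. 12): posterior P(SSWM|w) under current parameters.
        post = {w: lam * p_sswm[w] / (lam * p_sswm[w] + (1 - lam) * p_bg[w])
                for w in words}
        # M-step (Eq. 13): renormalize the posterior-weighted counts.
        norm = sum(counts[w] * post[w] for w in words)
        p_sswm = {w: counts[w] * post[w] / norm for w in words}
    return p_sswm

# Illustrative data: "uh" is over-represented for this speaker relative
# to the background, so EM attributes it mostly to the SSWM component.
counts = {"uh": 30, "model": 10, "the": 60}
p_bg = {"uh": 0.05, "model": 0.05, "the": 0.90}
p = em_sswm(counts, p_bg)
```

The effect is that words a speaker uses far more often than the background predicts end up with high probability under P(w | SSWM), while common words remain explained by the background model.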
|
], |
|
"back_matter": [], |
|
"bib_entries": { |
|
"BIBREF0": { |
|
"ref_id": "b0", |
|
"title": "Unsupervised language model adaptation", |
|
"authors": [ |
|
{ |
|
"first": "M", |
|
"middle": [], |
|
"last": "Bacchiani", |
|
"suffix": "" |
|
}, |
|
{ |
|
"first": "B", |
|
"middle": [], |
|
"last": "Roark", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2003, |
|
"venue": "IEEE International Conference on Acoustics, Speech and Signal Processing", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "M. Bacchiani and B. Roark, \"Unsupervised language model adaptation,\" IEEE International Conference on Acoustics, Speech and Signal Processing, 2003.", |
|
"links": null |
|
}, |
|
"BIBREF1": { |
|
"ref_id": "b1", |
|
"title": "A neural probabilistic language model", |
|
"authors": [ |
|
{ |
|
"first": "Y", |
|
"middle": [], |
|
"last": "Bengio", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2003, |
|
"venue": "Journal of Machine Learning Research", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "Y. Bengio et al., \"A neural probabilistic language model,\" Journal of Machine Learning Research, 2003.", |
|
"links": null |
|
}, |
|
"BIBREF2": { |
|
"ref_id": "b2", |
|
"title": "Future word contexts in neural network language models", |
|
"authors": [ |
|
{ |
|
"first": "X", |
|
"middle": [], |
|
"last": "Chen", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2017, |
|
"venue": "IEEE Automatic Speech Recognition and Understanding Workshop", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "X. Chen et al., \"Future word contexts in neural network language models,\" IEEE Automatic Speech Recognition and Understanding Workshop, 2017.", |
|
"links": null |
|
}, |
|
"BIBREF3": { |
|
"ref_id": "b3", |
|
"title": "Recurrent neural network language model adaptation for multi-genre broadcast speech recognition", |
|
"authors": [ |
|
{ |
|
"first": "X", |
|
"middle": [], |
|
"last": "Chen", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2015, |
|
"venue": "The Annual Conference of the International Speech Communication Association", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "X. Chen et al., \"Recurrent neural network language model adaptation for multi-genre broadcast speech recognition,\" The Annual Conference of the International Speech Communication Association, 2015.", |
|
"links": null |
|
}, |
|
"BIBREF4": { |
|
"ref_id": "b4", |
|
"title": "Empirical evaluation of gated recurrent neural networks on sequence modeling", |
|
"authors": [ |
|
{ |
|
"first": "J", |
|
"middle": [], |
|
"last": "Chung", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2014, |
|
"venue": "", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "J. Chung et al., \"Empirical evaluation of gated recurrent neural networks on sequence modeling,\" arXiv, 2014.", |
|
"links": null |
|
}, |
|
"BIBREF5": { |
|
"ref_id": "b5", |
|
"title": "Language modeling with gated convolutional networks", |
|
"authors": [ |
|
{ |
|
"first": "Y", |
|
"middle": [ |
|
"N" |
|
], |
|
"last": "Dauphin", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2016, |
|
"venue": "", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": { |
|
"arXiv": [ |
|
"arXiv:1612.08083" |
|
] |
|
}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "Y. N. Dauphin et al., \"Language modeling with gated convolutional networks,\" arXiv:1612.08083, 2016.", |
|
"links": null |
|
}, |
|
"BIBREF6": { |
|
"ref_id": "b6", |
|
"title": "Character-aware neural language models", |
|
"authors": [ |
|
{ |
|
"first": "Y", |
|
"middle": [], |
|
"last": "Kim", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2016, |
|
"venue": "AAAI Conference on Artificial Intelligence", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "Y. Kim, et al., \"Character-aware neural language models,\" AAAI Conference on Artificial Intelligence, 2016.", |
|
"links": null |
|
}, |
|
"BIBREF7": { |
|
"ref_id": "b7", |
|
"title": "Recurrent neural network based language modeling in meeting recognition", |
|
"authors": [ |
|
{ |
|
"first": "S", |
|
"middle": [], |
|
"last": "Kombrink", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2011, |
|
"venue": "The Annual Conference of the International Speech Communication Association", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "S. Kombrink et al., \"Recurrent neural network based language modeling in meeting recognition,\" The Annual Conference of the International Speech Communication Association, 2011.", |
|
"links": null |
|
}, |
|
"BIBREF8": { |
|
"ref_id": "b8", |
|
"title": "The AMI meeting corpus: A pre-announcement", |
|
"authors": [ |
|
{ |
|
"first": "J", |
|
"middle": [], |
|
"last": "Carletta", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2005, |
|
"venue": "The International Workshop on Machine Learning for Multimodal Interaction", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "J. Carletta et al., \"The AMI meeting corpus: A pre-announcement,\" The International Workshop on Machine Learning for Multimodal Interaction, 2005.", |
|
"links": null |
|
}, |
|
"BIBREF9": { |
|
"ref_id": "b9", |
|
"title": "Recurrent neural network based language model", |
|
"authors": [ |
|
{ |
|
"first": "T", |
|
"middle": [], |
|
"last": "Mikolov", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2010, |
|
"venue": "The Annual Conference of the International Speech Communication Association", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "T. Mikolov et al., \"Recurrent neural network based language model,\" The Annual Conference of the International Speech Communication Association, 2010.", |
|
"links": null |
|
}, |
|
"BIBREF10": { |
|
"ref_id": "b10", |
|
"title": "Long short-term memory", |
|
"authors": [ |
|
{ |
|
"first": "S", |
|
"middle": [], |
|
"last": "Hochreiter", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 1997, |
|
"venue": "Neural Computation", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "S. Hochreiter et al., \"Long short-term memory,\" Neural Computation, 1997.", |
|
"links": null |
|
}, |
|
"BIBREF11": { |
|
"ref_id": "b11", |
|
"title": "LSTM neural networks for language modeling", |
|
"authors": [ |
|
{ |
|
"first": "M", |
|
"middle": [], |
|
"last": "Sundermeyer", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2012, |
|
"venue": "The Annual Conference of the International Speech Communication Association", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "M. Sundermeyer et al., \"LSTM neural networks for language modeling,\" The Annual Conference of the International Speech Communication Association, 2012.", |
|
"links": null |
|
}, |
|
"BIBREF12": { |
|
"ref_id": "b12", |
|
"title": "Efficient lattice rescoring using recurrent neural network language models", |
|
"authors": [ |
|
{ |
|
"first": "X", |
|
"middle": [], |
|
"last": "Liu", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2014, |
|
"venue": "IEEE International Conference on Acoustics, Speech and Signal Processing", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "X. Liu et al., \"Efficient lattice rescoring using recurrent neural network language models,\" IEEE International Conference on Acoustics, Speech and Signal Processing, 2014.", |
|
"links": null |
|
}, |
|
"BIBREF13": { |
|
"ref_id": "b13", |
|
"title": "Probabilistic latent semantic analysis", |
|
"authors": [ |
|
{ |
|
"first": "T", |
|
"middle": [], |
|
"last": "Hofmann", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 1999, |
|
"venue": "The Conference on Uncertainty in Artificial Intelligence", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "T. Hofmann, \"Probabilistic latent semantic analysis,\" The Conference on Uncertainty in Artificial Intelligence, 1999.", |
|
"links": null |
|
}, |
|
"BIBREF14": { |
|
"ref_id": "b14", |
|
"title": "Modeling non-linguistic contextual signals in LSTM language models via domain adaptation", |
|
"authors": [ |
|
{ |
|
"first": "M", |
|
"middle": [], |
|
"last": "Ma", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2018, |
|
"venue": "IEEE International Conference on Acoustics, Speech and Signal Processing", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "M. Ma et al., \"Modeling non-linguistic contextual signals in LSTM language models via domain adaptation,\" IEEE International Conference on Acoustics, Speech and Signal Processing, 2018.", |
|
"links": null |
|
}, |
|
"BIBREF15": { |
|
"ref_id": "b15", |
|
"title": "Speaker-aware training of LSTM-RNNs for acoustic modelling", |
|
"authors": [ |
|
{ |
|
"first": "T", |
|
"middle": [], |
|
"last": "Tan", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2016, |
|
"venue": "IEEE International Conference on Acoustics, Speech and Signal Processing", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "T. Tan et al., \"Speaker-aware training of LSTM-RNNs for acoustic modelling,\" IEEE International Conference on Acoustics, Speech and Signal Processing, 2016.", |
|
"links": null |
|
}, |
|
"BIBREF16": { |
|
"ref_id": "b16", |
|
"title": "Purely sequence-trained neural networks for ASR based on lattice-free MMI", |
|
"authors": [ |
|
{ |
|
"first": "D", |
|
"middle": [], |
|
"last": "Povey", |
|
"suffix": "" |
|
} |
|
], |
|
"year": 2016, |
|
"venue": "Annual Conference of the International Speech Communication Association", |
|
"volume": "", |
|
"issue": "", |
|
"pages": "", |
|
"other_ids": {}, |
|
"num": null, |
|
"urls": [], |
|
"raw_text": "D. Povey et al., \"Purely sequence-trained neural networks for ASR based on lattice-free MMI,\" Annual Conference of the International Speech Communication Association, 2016.", |
|
"links": null |
|
} |
|
}, |
|
"ref_entries": { |
|
"FIGREF0": { |
|
"type_str": "figure", |
|
"uris": null, |
|
"num": null, |
|
"text": "Speaker Word-Usage Characteristics Model: we wish to extract the characteristics or frequency information of the vocabulary used by different speakers. To this end, we propose three word models for producing speaker features: the term-frequency model (TF-based Model), the probabilistic latent semantic analysis model (PLSA-based Model) [14], and the speaker-specific word model (Speaker-Specific Word Model, SSWM). (1) TF-based Model (2) PLSA-based Model: the goal is to find the P(w | z) and P(z | s) that maximize the PLSA likelihood; to achieve dimensionality reduction, we take the estimated model parameters P(z | s) as the speaker features. (3) Speaker-Specific Word Model (SSWM)"
|
}, |
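Of the three speaker-feature models above, the TF-based model is the simplest: each speaker is represented by the normalized term-frequency vector of their transcripts over a chosen vocabulary. A minimal sketch; the vocabulary and utterances are illustrative, not from the paper's corpora:

```python
from collections import Counter

def tf_speaker_feature(utterances, vocab):
    """TF-based speaker feature: relative frequency of each
    vocabulary word in one speaker's transcribed utterances."""
    counts = Counter(w for utt in utterances for w in utt.split())
    total = sum(counts[w] for w in vocab) or 1  # avoid division by zero
    return [counts[w] / total for w in vocab]

vocab = ["uh", "model", "meeting"]
feat = tf_speaker_feature(["uh the model", "uh meeting"], vocab)
# [0.5, 0.25, 0.25]: "uh" accounts for 2 of the 4 in-vocabulary tokens
```

The PLSA-based variant replaces this raw frequency vector with the lower-dimensional topic posterior P(z | s), which is the dimensionality-reduction step mentioned above.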
|
"TABREF0": { |
|
"content": "<table><tr><td>1. Introduction. Speech recognition technology has grown ever more mature, and its applications can be seen everywhere in daily life. Automatic speech recognition (ASR) lets computers understand human language, i.e., it attempts to capture the rules and content of human pronunciation and word usage. A language model is a technique for modeling text; in a speech recognition task, the language model judges whether a word sequence conforms to the rules or regularities implicit in the training text. The N-gram model commonly used in speech recognition exploits the occurrence probabilities of N-grams to model the co-occurrence relations of words in text. However, in the face of data sparseness, N-gram models become hard to estimate, because an N-gram model assumes that each word has its own independent meaning and considers co-occurrence relations only from a statistical point of view [1]. Neural networks can solve this problem: a neural network language model can learn distributed representations [2] that express the semantic relations between words, and prediction can then be carried out with stacked feedforward neural networks (FNN) or recurrent neural networks (RNN). In recent years many studies have tried to improve feedforward networks and apply them to various natural language processing and speech recognition tasks [3-8]. Meeting speech differs greatly from more formal corpora such as news or read speech [9]; it usually exhibits rare words, short utterances, mixed language use, and personal usage habits. In view of this, this study attempts to develop an improved language model architecture and training scheme that addresses the fact that different speakers have different speaking habits. From a statistical point of view, every language has a grammar, but in practice people do not strictly follow it when speaking and have idiosyncratic ways of speaking such as habitual expressions or stuttering; yet the language models commonly used for speech recognition today make no speaker-specific adjustments and instead treat the whole training corpus as a single language pattern. We therefore hope to provide additional information for language model training and prediction according to the speaker, i.e., to perform speaker adaptation of the language model. To this end, this paper considers two test-stage scenarios, \"known speakers\" and \"unknown speakers,\" proposes speaker feature extraction methods corresponding to these two scenarios, and discusses how to use the speaker features to aid the training of language models.</td></tr><tr><td>2. Related research on neural network language models. To date, neural networks have been applied to language modeling in many variations. The earliest application can be traced back to the neural network language model proposed by Yoshua Bengio in 2003, in which the N-gram estimation is delegated to a neural network [2]; to remedy the inability of N-grams to cope with overly sparse data, he also introduced an important concept, word embeddings, into neural network language modeling. In 2010, the recurrent neural network language model (RNNLM) was proposed [10], freeing the language model from the N-gram constraint so that the history is no longer limited to N-1 words; the drawback of this approach is that the model is hard to train and prone to gradient vanishing or exploding [8]. In 2012, Martin Sundermeyer proposed solving this problem with long short-term memory (LSTM) language models [11, 12]; since then, the LSTM language model has been regarded as one of the best language model architectures. Some researchers have nevertheless tried other architectures; for example, in 2016 Yann N. Dauphin proposed adding gates on top of convolutional neural networks, which slightly improves language model performance but makes the network so complex that it becomes hard to train [6].</td></tr><tr><td>3. Neural language models applied to speech recognition. When making predictions, an RNN or LSTM language model requires several matrix operations, unlike a traditional N-gram language model, which only needs table lookups. Because of this limitation in runtime efficiency, neural language models are hard to use in first-pass decoding and are therefore mostly applied in second-pass rescoring: either the lattice produced in first-pass decoding is rescored (where, again for efficiency, the candidate word sequences in the lattice must first be pruned, at the cost of losing some partial paths), or an M-best list is generated with the N-gram language model and then reranked. To address the lattice problem, Xunying Liu proposed speeding up the reranking of candidate word sequences in the lattice by reducing its branches; this is an approximate method, so its results are slightly worse than M-best reranking, but it allows neural network language models to be applied effectively to lattice rescoring as well [13]. Figure 1 shows the recall of the 1,000-best candidate word sequences contained in the lattice on the test set of the AMI English meeting speech recognition task (see Section 5). From Figure 1 we find that many of the 1,000-best candidate word sequences are not scored during lattice rescoring; the 95~100% bin accounts for only 64.68% of the total. On the other hand, the word error rates shown in Table 1 (AMI English meeting speech recognition task) also reveal that, on this meeting speech corpus, lattice rescoring is not better than M-best rescoring, consistent with the experiments of previous studies [13]. Based on this result, our language model adaptation experiments are conducted with M-best rescoring.</td></tr><tr><td>4. Speaker adaptation of language models for meeting speech recognition. 4.1 Problem analysis. In a meeting speech recognition task, the recordings to be transcribed often contain multiple speakers, and there are real differences in wording and speaking habits among them; however, the language models previously used for meeting speech recognition do not consider the differences in language-usage behavior caused by different speakers. Below we discuss how to exploit the speaker information in the training corpus to aid language model training so as to achieve speaker adaptation. First-pass recognition usually decodes according to the posterior probability of a word sequence W given the speech signal X, which can be simplified to deciding the likelihood (or so-called ranking score) that W is the final recognition output by P(X | W) P(W), as shown in Eq. (1).</td></tr><tr><td>Figure 1. Recall of the 1,000-best candidate word sequences contained in the lattice.</td></tr><tr><td>Table 1. Comparison of lattice rescoring and M-best rescoring (M=1,000)</td></tr><tr><td>Word error rate</td><td>Development set</td><td>Test set</td></tr><tr><td>1,000-best rescoring</td><td>21.17%</td><td>20.41%</td></tr><tr><td>Lattice rescoring</td><td>21.53%</td><td>20.75%</td></tr></table>",
|
"text": "Keywords: meeting speech recognition, language modeling, speaker adaptation, recurrent neural networks.",
|
"type_str": "table", |
|
"html": null, |
|
"num": null |
|
}, |
|
"TABREF1": { |
|
"content": "<table><tr><td colspan=\"5\">Table 2. Training, development and test sets of the Mandarin meeting corpus</td></tr><tr><td>Corpus partition</td><td>Training set</td><td>Dev set</td><td>Test set</td><td>Total</td></tr><tr><td>Hours</td><td>44.2</td><td>1.5</td><td>1.1</td><td>46.8</td></tr><tr><td>Utterances</td><td>42,998</td><td>1,267</td><td>1,019</td><td>45,284</td></tr><tr><td>Speakers</td><td>20</td><td>9 (1 not in training set)</td><td>6 (1 not in training set)</td><td>21</td></tr><tr><td colspan=\"5\">Table 3. Training, development and test sets of the AMI meeting corpus</td></tr><tr><td>Corpus partition</td><td>Training set</td><td>Dev set</td><td>Test set</td><td>Total</td></tr><tr><td>Hours</td><td>70.29</td><td>7.81</td><td>8.71</td><td>95.79</td></tr><tr><td>Utterances</td><td>97,222</td><td>10,882</td><td>13,059</td><td>133,775</td></tr><tr><td>Speakers</td><td>155</td><td>21 (19 not in training set)</td><td>16 (16 not in training set)</td><td>173</td></tr><tr><td colspan=\"5\">Table 4. Language model perplexity and recognition results on the Mandarin meeting corpus (perplexity in parentheses: LSTM language model alone)</td></tr><tr><td></td><td>Dev perplexity</td><td>Dev CER</td><td>Test perplexity</td><td>Test CER</td></tr><tr><td>First-pass recognition (trigram LM)</td><td>205.11</td><td>20.19</td><td>210.26</td><td>17.23</td></tr><tr><td>Second-pass rescoring (baseline LSTM LM)</td><td>161.20 (184.44)</td><td>16.89</td><td>165.44 (191.97)</td><td>15.91</td></tr><tr><td>+TF-based speaker features</td><td>158.99 (202.35)</td><td>16.84</td><td>163.26 (208.35)</td><td>15.84</td></tr><tr><td>+PLSA-based speaker features</td><td>156.20 (188.00)</td><td>16.75</td><td>162.94 (210.31)</td><td>15.91</td></tr><tr><td>+SSWM-based speaker features</td><td>158.41 (199.51)</td><td>16.84</td><td>160.93 (194.16)</td><td>15.86</td></tr><tr><td>+CNN-based speaker slang features</td><td>161.20 (255.85)</td><td>16.88</td><td>165.44 (264.11)</td><td>15.94</td></tr><tr><td>Speaker adaptive mixture model (SAMM)</td><td>158.52 (184.45)</td><td>16.75</td><td>161.71 (187.68)</td><td>15.89</td></tr><tr><td colspan=\"5\">Table 5. Language model perplexity and recognition results on the AMI meeting corpus (perplexity in parentheses: LSTM language model alone)</td></tr><tr><td></td><td>Dev perplexity</td><td>Dev CER</td><td>Test perplexity</td><td>Test CER</td></tr><tr><td>First-pass recognition (trigram LM)</td><td>85.19</td><td>23.25</td><td>76.44</td><td>23.02</td></tr><tr><td>Second-pass rescoring (baseline LSTM LM)</td><td>68.02 (73.40)</td><td>21.17</td><td>60.61 (65.40)</td><td>20.41</td></tr><tr><td>+CNN-based speaker slang features</td><td>66.28 (104.1)</td><td>21.07</td><td>59.05 (93.12)</td><td>20.32</td></tr><tr><td>Speaker adaptive mixture model (SAMM)</td><td>67.61 (99.80)</td><td>21.04</td><td>60.43 (93.60)</td><td>20.33</td></tr><tr><td colspan=\"5\">Although a unigram model can capture a speaker's word-choice habits, such unigram-model methods have two drawbacks: (1) they cannot capture a speaker's contextual usage habits; (2) the test utterances (candidate word sequences) must also carry corresponding speaker information.</td></tr><tr><td colspan=\"5\">4.2.2 Speaker Slang Model (SSM). The methods above focus on a speaker's word-choice features, describing the speaker with a unigram-model structure. Beyond word choice, however, people also have habitual expressions when speaking that are not limited to a single word; for example, some people who say 「對啊」 (right) habitually say it twice, as 「對啊、對啊」. We therefore design a speaker model that can represent such habitual expressions. In this study we use a convolutional neural network (Convolutional Neural Network, CNN) to extract features from each utterance, using its hidden layer as the speaker slang feature representation. The CNN's training task is speaker identification. Because the content of an utterance does not necessarily come from only one speaker (for example, we can be sure a given utterance A was said by a given speaker, but we cannot be sure A could not have come from another speaker), the output layer does not use a classification softmax (Softmax); instead each speaker has their own sigmoid (Sigmoid). For the positive and negative examples, the positive examples are the utterances belonging to that speaker; the negative examples are utterances randomly sampled from the most distant speaker, computed by query likelihood estimation (Query Likelihood Estimation, QLE). Figure 2. Extracting speaker slang features with a convolutional neural network.</td></tr><tr><td colspan=\"5\">4.2.3 Speaker features for language model adaptation. Having obtained speaker lexical or slang features, we can use them to adapt the language model. Common neural network adaptation architectures fall into two classes: the first adds auxiliary features to the hidden layer of the main task (here, language model training); the second uses the features in an auxiliary task for multi-task learning (Multi-task Learning). On the other hand, the experiments in [15] indicate that adding auxiliary features at the input layer gives better results, and related work on neural network model adaptation also shows that appending auxiliary features directly to the main features yields the best performance, e.g. using i-vectors for acoustic model speaker adaptation [16]. In this study we therefore feed the speaker features, as auxiliary features, into the hidden layer together with the main features (ordinary lexical features), as shown in Figure 3. Figure 3. Incorporating speaker adaptation information into an RNN- (or LSTM-) based language model.</td></tr><tr><td colspan=\"5\">4.2.4 Speaker Adaptive Mixture Model (SAMM). In contrast to the two-stage methods above, the speaker adaptive mixture model (SAMM) extracts speaker features dynamically while training the language model, so it can be used directly in second-pass language model rescoring. The main idea of SAMM is to let the language model dynamically estimate the current speaker on its own: first train separate language models for specific or representative speakers (Specific or Representative Speakers), then let a combinator (Combinator) dynamically decide the weights of the specific-speaker language models for each utterance; Figure 4 illustrates this. Notably, during training the specific speakers are selected by the divergence between each speaker's own N-gram language model (trained on that speaker's training utterances) and the background N-gram language model (trained on all speakers' training utterances); in this study we pick the top L speakers with the largest divergence and train L specific-speaker language models. Figure 4. SAMM speaker feature extraction.</td></tr><tr><td colspan=\"5\">As Figure 4 shows, at test time the speaker adaptive mixture model shares the embedding layer and the output layer. When the word at the previous time step enters the model, it is first projected by the embedding layer into a vector space and then passes through each specific-speaker language model and the combinator's LSTM; the model outputs each specific speaker's hidden-layer output together with the combination weights, linearly combines the specific speakers' hidden-layer outputs with the combinator's weights, and finally produces the probability of the next word through a fully connected layer (Fully Connected Layer) and a softmax (Softmax). Since the model outputs next-word probabilities, it can serve directly as the language model needed for second-pass rescoring in speech recognition, and the combinator output can be viewed as a kind of speaker feature that helps dynamically produce the final language model. SAMM training can be further divided into the following steps. Step 1: train a background LSTM language model on all the training data. Step 2: initialize each specific-speaker language model with the background LSTM language model's parameters; then, for each specific-speaker model, fix the embedding and output layers, keep those parameters unchanged, and train only the parameters of the networks between hidden layers, using only that speaker's training utterances. Step 3: take all the specific-speaker LSTM language model parameters, initialize the final combinator model with the embedding and output parameters from the previous stage, keep all specific-speaker LSTM parameters and the embedding layer unchanged, train the combinator LSTM on all the data, and fine-tune the output layer parameters.</td></tr><tr><td colspan=\"5\">5. Experimental results and analysis. 5.1 Corpora and settings. The corpora we use are the Mandarin meeting corpus and the AMI meeting corpus [9]. The Mandarin meeting corpus was compiled by a domestic company; the meeting content and the participants' way of talking were not scripted, so it reflects the problems a typical company faces in real meetings: for example, code-switching between Chinese and English when technical topics come up, and pauses, slurring or stuttering during speech. Compared with the AMI corpus, the Mandarin meeting corpus is more challenging; Table 2 gives its detailed statistics. The AMI meeting corpus was developed with European Union funding; the AMI team works on technologies that assist group interaction, with the main goal of building a meeting browser that makes meeting records easy to index. The team collected the AMI meeting corpus, a series of recorded meetings now available to the public for research and development; although the dataset was designed for that project, it can serve many purposes in linguistics, organizational and social psychology, speech and language engineering, audio-visual processing and multimodal systems. Table 3 gives AMI's detailed statistics. Our speech recognition system is built with the Kaldi toolkit; the acoustic model is trained with Lattice-free Maximum Mutual Information (LF-MMI) [17], the language model in the first pass is a trigram model, and the neural network language models are implemented in PyTorch. This paper considers both the known-speaker and the unknown-speaker scenarios, and considers speaker feature extraction from different angles.</td></tr><tr><td colspan=\"5\">5.2 Results and discussion. The first set of experiments is on the Mandarin meeting corpus; Table 4 reports the language model perplexity (Perplexity) and character error rate (Character error rate, CER). First, the perplexities on the reference transcriptions (Reference Transcription) of the training and test utterances show that the baseline LSTM language model combined with the traditional trigram model (or the baseline LSTM model alone, as shown in parentheses) achieves lower perplexity, i.e. better predictive power, than the trigram model alone; in particular, combining the baseline LSTM with the trigram model yields more than 20% relative perplexity reduction on both the development and the test set. Second, rescoring the 1000-best candidate word sequences from the first pass with the baseline LSTM plus trigram model yields character error rates of 16.89% and 15.91% on the development and test sets, respectively. The LSTM language models that incorporate the speaker-related auxiliary features all achieve slightly lower perplexity than the baseline LSTM model, with the PLSA-based speaker features performing best; the SSWM-based speaker features estimated with the expectation-maximization (EM) algorithm also outperform the TF-based speaker features, confirming the importance of using EM to reduce the influence of background words. In recognition, all three speaker-related features bring slight improvements on the development set, PLSA being the best, while on the test set the TF-based speaker features perform best. The CNN-based speaker slang features instead raise the CER slightly; we surmise that during CNN training the negative-example retrieval did not find utterances representative enough of true negatives, so the resulting speaker-adaptive language model did not deliver the expected performance. On the other hand, the proposed speaker adaptive mixture model (SAMM) is on a par with the methods above in both perplexity and CER.</td></tr><tr><td colspan=\"5\">The second set of experiments is on the AMI English meeting corpus; Table 5 reports the perplexity and CER on this corpus. Because the AMI development set shares very few speakers with the training set and the test-set speakers are entirely unseen in training, Table 5 only reports results for the CNN-based speaker slang features and the speaker adaptive mixture model. The language model with CNN-based speaker slang features does not reduce perplexity on AMI as expected, but performs well in recognition, lowering the word error rate by about 0.5%. The speaker adaptive mixture model likewise falls short of expectations in perplexity and only slightly lowers the CER; this is probably because AMI has many speakers and the number of specific speakers chosen in this study (seven) is too small a fraction of the total, so it does not perform as well as on the Mandarin meeting corpus.</td></tr><tr><td colspan=\"5\">6. Conclusions and future work. This paper considers both the known-speaker and the unknown-speaker scenarios and builds speaker feature extraction models from different angles. We propose three speaker feature extraction models: the speaker lexical feature model, the speaker slang feature model, and the speaker adaptive mixture model; the latter two are applicable at test time with unknown speakers. The results show that the speaker adaptive mixture model is effective in both the known-speaker and the unknown-speaker scenarios, while the speaker slang feature model performs worse than the other methods because the selected negative-example utterances fail to capture the speaker's contrasting usage characteristics.</td></tr></table>",
|
"text": "Furthermore, we examine the results of using speaker lexical features to aid language model training on the Mandarin meeting corpus; Table 4 presents the language model perplexity and CER reductions obtained with TF-based speaker features, PLSA-based speaker features, SSWM-based speaker features, and CNN-based speaker slang features. The LSTM language models incorporating these speaker-related auxiliary features all achieve slightly better perplexity than the baseline LSTM language model.",
|
"type_str": "table", |
|
"html": null, |
|
"num": null |
|
} |
|
} |
|
} |
|
} |