Bayesian Analysis in Natural Language Processing
2nd edition
- Number of pages: 311
- Binding: Paperback
- Format: Large format
- Weight: 0.643 kg
- Dimensions: 19.1 cm × 23.5 cm × 1.8 cm
- ISBN: 978-1-68173-526-9
- EAN: 9781681735269
- Publication date: 09/04/2019
- Series: Synthesis Lectures on Human
- Publisher: Morgan & Claypool
Summary
In this book, we cover the methods and algorithms that are needed to fluently read Bayesian learning papers in NLP and to do research in the area. These methods and algorithms are partially borrowed from both machine learning and statistics and are partially developed "in-house" in NLP. We cover inference techniques such as Markov chain Monte Carlo sampling and variational inference, Bayesian estimation, and nonparametric modeling.
In response to rapid changes in the field, this second edition of the book includes a new chapter on representation learning and neural networks in the Bayesian context. We also cover fundamental concepts in Bayesian statistics such as prior distributions, conjugacy, and generative modeling. Finally, we review some of the fundamental modeling techniques in NLP, such as grammar modeling, neural networks and representation learning, and their use with Bayesian analysis.
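As a small illustration of the conjugacy concept the summary mentions (this sketch is not taken from the book itself), consider a Beta prior on a Bernoulli parameter: the posterior is again a Beta distribution, so Bayesian estimation reduces to updating two counts. The function name and the specific prior below are illustrative choices.

```python
def beta_bernoulli_update(alpha, beta, observations):
    """Return the Beta posterior (alpha', beta') after Bernoulli observations.

    Conjugacy: Beta(alpha, beta) prior + Bernoulli likelihood
    yields a Beta(alpha + successes, beta + failures) posterior.
    """
    successes = sum(observations)
    failures = len(observations) - successes
    return alpha + successes, beta + failures

# Uniform prior Beta(1, 1); observe three successes and one failure.
alpha_post, beta_post = beta_bernoulli_update(1, 1, [1, 1, 0, 1])
posterior_mean = alpha_post / (alpha_post + beta_post)  # (1 + 3) / (2 + 4)
```

Because the posterior has the same functional form as the prior, repeated updates stay in closed form, which is one reason conjugate priors are a recurring tool in Bayesian NLP models.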