Item no.: 858A-9783031231896 | Manufacturer no.: 9783031231896 | EAN/GTIN: 9783031231896
After a brief introduction to basic NLP models, the main pre-trained language models BERT, GPT, and the sequence-to-sequence transformer are described, as well as the concepts of self-attention and context-sensitive embeddings. Different approaches to improving these models are then discussed, such as expanding the pre-training criteria, increasing the length of input texts, or including extra knowledge. An overview of the best-performing models for about twenty application areas follows, e.g. question answering, translation, story generation, dialog systems, and generating images from text. For each application area, the strengths and weaknesses of current models are discussed, and an outlook on further developments is given. In addition, links to freely available program code are provided. A concluding chapter summarizes the economic opportunities, mitigation of risks, and potential developments of AI.

Further information:
Authors: Gerhard Paaß; Sven Giesselbach
Publisher: Springer International Publishing
Language: English
Further search terms: general computer science books (English-language), Pre-trained Language Models, Deep Learning, Natural Language Processing, Transformer Models, BERT, GPT, Attention Models, Natural Language Understanding, Multilingual Models, Natural Language Generation, Chatbot