Demystifying Large Language Models: Unraveling the Mysteries of Language Transformer Models, Build from Ground up, Pre-Train, Fine-tune and Deployment
Author(s): Chen, James
ISBN No.: 9781738908479
Pages: 346
Year: 2024
Format: Trade Cloth (Hard Cover)
Price: $75.89
Dispatch delay: Dispatched within 7 to 15 days
Status: Available

This book is a comprehensive guide that demystifies transformers, the architecture powering Large Language Models (LLMs) such as GPT and BERT. From PyTorch basics and mathematical foundations to implementing a Transformer from scratch, you'll gain a deep understanding of how these models work under the hood. And that's just the beginning: you'll pre-train your own Transformer from scratch, use transfer learning to fine-tune LLMs for your specific use cases, explore advanced fine-tuning techniques such as PEFT (Parameter-Efficient Fine-Tuning) and LoRA (Low-Rank Adaptation), and apply RLHF (Reinforcement Learning from Human Feedback) to detoxify LLMs and align them with human values and ethical norms.

The book then steps into deployment, showing how to deliver these state-of-the-art language models into the real world, whether by integrating them into cloud platforms or optimizing them for edge devices, so you're equipped to bring your AI solutions to life. Whether you're a seasoned AI practitioner, a data scientist, or a curious developer eager to deepen your knowledge of LLMs, this book translates convoluted concepts into understandable explanations and offers a practical, hands-on approach, making it valuable to aspiring beginners and seasoned professionals alike.
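To give a flavor of the techniques named above, here is a minimal, illustrative PyTorch sketch of a LoRA-style linear layer (a frozen pre-trained weight plus a trainable low-rank update). The class name LoRALinear and the rank/alpha defaults are this listing's own illustrative choices, not material taken from the book.

    import torch
    import torch.nn as nn

    class LoRALinear(nn.Module):
        """Linear layer with a frozen base weight plus a trainable
        low-rank update, in the spirit of LoRA (Low-Rank Adaptation)."""

        def __init__(self, in_features, out_features, rank=8, alpha=16):
            super().__init__()
            # Stand-in for a pre-trained layer; its weights are frozen.
            self.base = nn.Linear(in_features, out_features)
            self.base.weight.requires_grad_(False)
            self.base.bias.requires_grad_(False)
            # Trainable low-rank factors: A projects down, B projects up.
            # B starts at zero so the update is initially a no-op.
            self.lora_A = nn.Parameter(torch.randn(rank, in_features) * 0.01)
            self.lora_B = nn.Parameter(torch.zeros(out_features, rank))
            self.scaling = alpha / rank

        def forward(self, x):
            # Frozen base output plus the scaled low-rank correction.
            return self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling

Because only the two small factor matrices are trainable, fine-tuning touches a tiny fraction of the parameters of the full weight matrix, which is the core idea behind parameter-efficient fine-tuning methods like LoRA.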

