What we’re about
The LLM Reading Club is for anyone interested in building both a theoretical intuition of large language models and practical expertise in their development and use. There’s momentum and fun in numbers! 😊 The club will collectively explore some of the canonical books and research papers pertaining to large language models.
The level of prerequisite knowledge needed to maximise benefit will vary depending on the specific book or paper being covered. Generally, the content will be most suitable for those proficient in Python and with some understanding of (or a willingness to quickly learn) the relevant core machine learning or mathematical concepts.
All are welcome, including but not limited to data practitioners of all hues, enthusiasts, students, researchers, and professionals.
Upcoming events (3)
Hands-On Large Language Models
We are discussing the recently released Hands-On Large Language Models by Jay Alammar and Maarten Grootendorst. This book, which combines the essential theory of LLMs with a practical focus, is written by two highly regarded experts from the LLM space.
Pages being discussed:
Please see the latest message (also pinned) in the #current-reading channel in our Discord chat space to see which pages we'll be reviewing in this session.
> Instructions for joining Discord: https://bit.ly/llm-discord
> Buy book on Amazon
Book overview
Through the book's highly visual approach, readers can learn the practical tools and concepts they need to use these capabilities today. You'll understand how to use pretrained large language models for use cases like copywriting and summarization; create semantic search systems that go beyond keyword matching; and use existing libraries and pretrained models for text classification, search, and clustering.
The book aims to help you:
- Understand the architecture of Transformer language models that excel at text generation and representation
- Build advanced LLM pipelines to cluster text documents and explore the topics they cover
- Build semantic search engines that go beyond keyword search, using methods like dense retrieval and rerankers
- Explore how generative models can be used, from prompt engineering all the way to retrieval-augmented generation
- Gain a deeper understanding of how to train LLMs and optimize them for specific applications using generative model fine-tuning, contrastive fine-tuning, and in-context learning