Below are the pages tagged with the taxonomy term “Retrieval-Augmented Generation (RAG)”
Project 10: Build a Chatbot with LangChain and Chroma to chat with your own documents
1. Overview In this project, we will build a Retrieval-Augmented Generation (RAG) chatbot with LangChain that can answer questions from internal documentation and maintain conversational memory. We will also use Panel’s chat interface to give our LangChain-powered RAG application a chat front end.
The Python Notebook containing the complete model development process and the data used in this project can be found on Google Drive.
2. LangChain LangChain is an open-source developer framework for building LLM applications.
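The conversational-RAG loop that LangChain wires together for this project can be sketched in plain Python. The word-overlap retriever and prompt builder below are toy stand-ins for a Chroma vector store and a LangChain chain, not actual LangChain APIs:

```python
# Toy sketch of a conversational RAG loop: retrieve relevant chunks,
# prepend the chat history (memory), and assemble the final prompt.
# In the real project, Chroma handles retrieval and LangChain the chaining.

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive word overlap with the query (stand-in for
    embedding similarity search in a vector store)."""
    q = set(query.lower().split())
    ranked = sorted(docs, key=lambda d: len(q & set(d.lower().split())),
                    reverse=True)
    return ranked[:k]

def build_prompt(query: str, docs: list[str],
                 history: list[tuple[str, str]]) -> str:
    """Combine retrieved context and prior turns into one LLM prompt."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    turns = "\n".join(f"User: {u}\nBot: {a}" for u, a in history)
    return f"Context:\n{context}\n\nHistory:\n{turns}\n\nUser: {query}\nBot:"

docs = ["Panel provides a chat interface.",
        "Chroma stores document embeddings.",
        "LangChain chains LLM calls together."]
history = [("What is Chroma?", "A vector store.")]
print(build_prompt("How does Panel help?", docs, history))
```

The key point the sketch illustrates is that “memory” in a RAG chatbot is simply the prior turns folded back into the prompt alongside the retrieved context.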
Project 9: Generative QA with Retrieval-Augmented Generation (RAG) and TruEra Evaluation
1. Overview In this project, we will build a generative Question Answering model using Retrieval-Augmented Generation (RAG) with LlamaIndex that can answer questions from internal documentation. We will also evaluate, iterate on, and improve the model using TruLens.
The Python Notebook containing the complete model development process and the data used in this project can be found on Google Drive.
2. Retrieval-Augmented Generation (RAG) for Question Answering (QA) In the first part of this section, we will discuss the basic RAG pipeline for generative Question Answering over internal documentation.
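The basic RAG pipeline follows three steps: index the documents, retrieve the most similar ones for a question, and augment the LLM prompt with them before generating. A minimal sketch follows, where the term-frequency vectorizer stands in for an embedding model and the returned prompt stands in for the LLM call that LlamaIndex would perform:

```python
# Minimal sketch of a basic RAG QA pipeline: vectorize the corpus,
# retrieve by cosine similarity, and build an augmented prompt.
# The bag-of-words vectors are illustrative stand-ins for embeddings.
from collections import Counter
import math

def vectorize(text: str) -> Counter:
    """Bag-of-words term frequencies (embedding stand-in)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def answer(question: str, corpus: list[str], k: int = 1) -> str:
    """Retrieve top-k documents and return the augmented prompt."""
    qv = vectorize(question)
    top = sorted(corpus, key=lambda d: cosine(qv, vectorize(d)),
                 reverse=True)[:k]
    context = "\n".join(top)
    # A real pipeline would now send this prompt to the LLM.
    return f"Answer using only this context:\n{context}\n\nQ: {question}\nA:"

corpus = ["TruLens evaluates LLM apps with feedback functions.",
          "LlamaIndex builds indexes over internal documents."]
print(answer("What does TruLens do?", corpus))
```

Grounding the generation step in retrieved context, as sketched here, is what lets the model answer from internal documentation it was never trained on.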