In previous articles we explored the journey from loading documents to creating a vector store, and discussed a key limitation of the resulting chains: they are stateless, so they cannot handle follow-up questions or hold a real conversation.
The good news is that we can address this by introducing chat history into LangChain. With it, the language model considers previous interactions and can give context-aware responses.
This article walks through setting up the working environment, adding memory to the chain, and building an end-to-end chatbot that lets users hold interactive, context-sensitive conversations with their document-based language models.
Table of Contents:
Setting Up Working Environment & Getting Started
Adding Memory to Your Chain
Building an End-to-End Chatbot
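Before diving into the setup, the core idea of chat history can be sketched in plain Python: store prior turns and prepend them to each new question so the model sees the conversation so far. This is a conceptual sketch only; the `ChatMemory` class and the `answer` function are illustrative placeholders, not LangChain's actual API.

```python
# Conceptual sketch of chat history: each new question is sent to the
# model together with the previous turns, so follow-ups stay in context.

class ChatMemory:
    """Accumulates (role, message) turns and renders them into a prompt."""

    def __init__(self):
        self.history = []  # list of (role, message) tuples

    def add(self, role, message):
        self.history.append((role, message))

    def as_prompt(self, question):
        turns = "\n".join(f"{role}: {msg}" for role, msg in self.history)
        return f"{turns}\nHuman: {question}\nAI:"


def answer(prompt):
    # Placeholder for a real LLM call (e.g. through a LangChain chain).
    return f"(model response to a {len(prompt)}-char prompt)"


memory = ChatMemory()
q1 = "What topics does the document cover?"
a1 = answer(memory.as_prompt(q1))
memory.add("Human", q1)
memory.add("AI", a1)

# A follow-up question now carries the earlier exchange as context,
# so "it" in the question can be resolved by the model.
q2 = "Can you summarize it?"
prompt2 = memory.as_prompt(q2)
print(q1 in prompt2)  # True: the first question is part of the new prompt
```

In the sections below, LangChain's memory components play the role of `ChatMemory`, handling this bookkeeping for you inside the chain.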