Springing into AI - Part 8: Chat Memory
Problem

By default, interactions with an LLM are stateless: the model has no knowledge of what was asked previously, so it cannot carry on a meaningful, engaged conversation with your end user. You want your GenAI application to be an enriching experience for that user. If you are a superhero fan, you have heard Batman reiterating "I am Batman" to villains every single time, as Batman assumes they have no memory or recollection of him.

Solution

Spring offers the benefit of using the mighty Advisor(s). Through this useful feature we can tap into the user's request before it reaches the LLM and enrich it with some past chat history, thereby giving the LLM more context and giving the end user a "conversational" experience. The figure below paints a bird's-eye view of what our application will look like. In the figure, we introduce a Chat Memory segment. This segment can be a choice of either using "...
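To make the advisor idea concrete, here is a minimal plain-Java sketch of what such an interceptor does conceptually: it prepends the stored conversation turns to each new user message before the call reaches the model, then records both sides of the exchange. All names here (`LlmClient`, `ChatMemoryAdvisorSketch`) are hypothetical illustrations, not the actual Spring AI API.

```java
import java.util.ArrayList;
import java.util.List;

public class Demo {

    // Hypothetical stand-in for the model call; in a real Spring AI app
    // this role is played by the chat model behind the ChatClient.
    interface LlmClient {
        String complete(String prompt);
    }

    // Sketch of the advisor idea: intercept the user input, enrich it
    // with past history, forward it, then remember the new exchange.
    static class ChatMemoryAdvisorSketch {
        private final LlmClient llm;
        private final List<String> history = new ArrayList<>();

        ChatMemoryAdvisorSketch(LlmClient llm) {
            this.llm = llm;
        }

        String chat(String userMessage) {
            // Enrich the outgoing prompt with every stored turn.
            StringBuilder enriched = new StringBuilder();
            for (String turn : history) {
                enriched.append(turn).append('\n');
            }
            enriched.append("USER: ").append(userMessage);

            String reply = llm.complete(enriched.toString());

            // Record both sides so the next request carries this context.
            history.add("USER: " + userMessage);
            history.add("ASSISTANT: " + reply);
            return reply;
        }
    }

    public static void main(String[] args) {
        // Fake "LLM" that just reports how much context it received.
        LlmClient fake = prompt -> "saw " + prompt.lines().count() + " line(s) of context";
        ChatMemoryAdvisorSketch advisor = new ChatMemoryAdvisorSketch(fake);

        System.out.println(advisor.chat("I am Batman")); // first turn: no prior history
        System.out.println(advisor.chat("Who am I?"));   // second turn: history is prepended
    }
}
```

On the second call the fake model sees three lines (the two remembered turns plus the new message), which is exactly the enrichment a memory advisor performs; Spring's advisor mechanism lets you plug this behavior in without writing the interception plumbing yourself.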