Long Term Memory AI Chatbot

Our AI workflow technology allows us to deliver AI chatbots with long term memory. This allows you to "train" your AI chatbot during conversations, and thus modify your chatbot's future responses simply by correcting it during interactions.

This is one of the core requirements for AGI, and is actually extremely easy to build using our technology. In the following video I show you how to create an AI chatbot with long term memory in 3 minutes.

How it works

The implementation is actually ridiculously simple. It is basically just an AI function, triggered using natural language, that persists information into the chatbot's RAG and VSS database.

This allows the AI chatbot to use whatever information you provide it in future answers related to the subject you store in its memory. Below is the entire training snippet's prompt engineering.
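To make the idea concrete, below is a minimal, self-contained Python sketch of the concept (not AINIRO's actual implementation): memories are stored as prompt/completion snippets, vectorized, and later retrieved with vector similarity search. The bag-of-words "embedding" stands in for a real embedding model, and all names are illustrative.

```python
import math
import re
from collections import Counter

def embed(text):
    """Toy bag-of-words 'embedding', standing in for a real embedding model."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two sparse vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Stand-in for the ml_training_snippets table.
snippets = []

def store_to_memory(prompt, completion):
    """Persist a prompt/completion snippet and vectorize it for retrieval."""
    snippets.append({"prompt": prompt,
                     "completion": completion,
                     "vector": embed(prompt)})

def recall(question, threshold=0.3):
    """Return the best-matching memory for a question, or None (the VSS part)."""
    scored = [(cosine(s["vector"], embed(question)), s) for s in snippets]
    if not scored:
        return None
    score, best = max(scored, key=lambda pair: pair[0])
    return best["completion"] if score >= threshold else None

store_to_memory("office hours",
                "We are open 9am to 5pm, Monday to Friday.")
print(recall("What are your office hours?"))
```

The threshold keeps unrelated questions from pulling in irrelevant memories; a real system would tune this against its actual embedding model.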

Store to memory

Stores a piece of information to long term memory, which implies saving to
ml_training_snippets, and re-vectorizing the type, such that it can be
retrieved later using RAG and VSS.

The [prompt] argument and the [completion] argument are both mandatory
arguments. If the user does not provide you with an explicit prompt
argument, then create a short one line summary of the information with
keywords related to the fact that makes it easy to retrieve the information
later using RAG and VSS.

If the user asks you to perform an action associated with this function
invocation, then inform the user of what you are about to do, and do not
return follow up questions, but instead end your response with the following:

  "prompt": "[VALUE]",
  "completion": "[VALUE]"

The entire AI workflow itself looks like the following.

.description:Store something to long term memory

// Sanity checking invocation.
validators.mandatory:x:@.arguments/*/prompt
validators.mandatory:x:@.arguments/*/completion

// Opening up our database connection to store item to memory.
data.connect:[generic|magic]

   // Creating our ml_training_snippets item.
   data.create
      table:ml_training_snippets
      values
         type:x:@.arguments/*/_type
         prompt:x:@.arguments/*/prompt
         completion:x:@.arguments/*/completion

// Re-vectorising the type.
magic.ai.vectorise
   type:x:@.arguments/*/_type

// Returning success to caller.
return:success

The above [_type] argument is automatically added to the invocation by the AI function invocation implementation, and is simply whatever machine learning type you happen to be using the function within.
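To illustrate the convention, here is a small hypothetical Python sketch of an invocation layer injecting [_type] before calling the workflow; all names here are illustrative, not AINIRO's actual implementation.

```python
def invoke_workflow(workflow, arguments, ml_type):
    """Invoke a workflow, automatically adding the [_type] argument."""
    arguments = dict(arguments)    # don't mutate the caller's dictionary
    arguments["_type"] = ml_type   # injected by the layer, never user-supplied
    return workflow(**arguments)

def store_snippet(prompt, completion, _type):
    """Stand-in for the storage workflow; just returns the record it would save."""
    return {"table": "ml_training_snippets",
            "type": _type,
            "prompt": prompt,
            "completion": completion}

record = invoke_workflow(store_snippet,
                         {"prompt": "Office hours",
                          "completion": "Open 9 to 5."},
                         ml_type="my-chatbot")
print(record["type"])
```

Because the layer owns the argument, every stored snippet is automatically scoped to the correct machine learning type without the user, or the LLM, ever having to mention it.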

Use cases

  • Training the chatbot in natural language conversations
  • QA testing and increasing the chatbot's quality during conversations
  • Creating an AI Expert System serving as a "memory extension"
  • Etc, etc, etc

Of the above I think the training part is possibly my favourite, because it allows you to start out with an "empty" AI machine learning model, start asking it questions, and only modify it when it fails. This allows you to use the GPT-4o model as your foundation, and only modify it where it goes wrong.

If you're interested in such an AI chatbot, and/or AI Expert System, you can contact us below.

Thomas Hansen

I am the CEO and Founder of AINIRO.IO, Ltd. I am a software developer with more than 25 years of experience. I write about Machine Learning, AI, and how to help organizations adopt said technologies. You can follow me on LinkedIn if you want to read more of what I write.

Published 29. Jun 2024
