Build a Chatbot with Memory
Learn how to build an AI chatbot that remembers conversations and provides personalized responses.
What You'll Build
A chatbot that:
- Remembers user preferences and past conversations
- Answers questions based on accumulated knowledge
- Provides contextual, personalized responses
Prerequisites
- Hypersave API key
- Node.js 18+ or Python 3.8+
- OpenAI API key (for chat completions)
Set Up Your Project
```bash
mkdir hypersave-chatbot
cd hypersave-chatbot
npm init -y
npm install openai dotenv
```

Node.js 18+ ships a global `fetch`, so node-fetch is not required. Because the project uses ES module imports, also add `"type": "module"` to the generated package.json.

Create a .env file:

```bash
HYPERSAVE_API_KEY=hs_live_xxxxx
OPENAI_API_KEY=sk-xxxxx
```

Create the Memory Manager
Create memory.js to handle saving and retrieving memories:
```js
// memory.js
const HYPERSAVE_API = 'https://api.hypersave.io';
const API_KEY = process.env.HYPERSAVE_API_KEY;

// Save a piece of conversation content, tagged with the user's ID
export async function saveMemory(content, userId) {
  const response = await fetch(`${HYPERSAVE_API}/v1/save`, {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      content,
      metadata: { userId, type: 'conversation' },
      tags: ['chat', userId]
    }),
  });
  return response.json();
}

// Search stored memories for entries relevant to the query, scoped to this user
export async function getRelevantMemories(query, userId, limit = 5) {
  const response = await fetch(`${HYPERSAVE_API}/v1/search`, {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      query,
      limit,
      tags: [userId]
    }),
  });
  const data = await response.json();
  return data.results || [];
}

// Query the /v1/ask endpoint, filtered to this user's memories
export async function askWithContext(query, userId) {
  const response = await fetch(`${HYPERSAVE_API}/v1/ask`, {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      query,
      filters: { tags: [userId] },
      includeContext: true
    }),
  });
  return response.json();
}
```

Create the Chatbot
Create chatbot.js:
```js
// chatbot.js
import OpenAI from 'openai';
import { saveMemory, getRelevantMemories } from './memory.js';

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function chat(userMessage, userId) {
  // 1. Get relevant memories for context
  const memories = await getRelevantMemories(userMessage, userId);

  // 2. Build context from memories
  const memoryContext = memories.length > 0
    ? `Relevant memories:\n${memories.map(m => `- ${m.content}`).join('\n')}`
    : 'No relevant memories found.';

  // 3. Generate response with OpenAI
  const completion = await openai.chat.completions.create({
    model: 'gpt-4',
    messages: [
      {
        role: 'system',
        content: `You are a helpful assistant with access to the user's memories.
Use the following context to provide personalized responses.
${memoryContext}
If you learn something new about the user, acknowledge it naturally.`
      },
      { role: 'user', content: userMessage }
    ],
  });

  const assistantMessage = completion.choices[0].message.content;

  // 4. Save the conversation to memory
  await saveMemory(
    `User: ${userMessage}\nAssistant: ${assistantMessage}`,
    userId
  );

  return assistantMessage;
}
```
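If you want to sanity-check the chat flow before wiring up a CLI, a throwaway script works; the file name test.js and the sample message are just illustrative:

```js
// test.js -- quick manual check of the chat() flow
import 'dotenv/config';
import { chat } from './chatbot.js';

const reply = await chat("Hi! My name is Sarah and I'm a software engineer", 'user_123');
console.log(reply);
```

Run it with `node test.js`; top-level await works here because the project is configured as an ES module.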
Create the Main Application

Create index.js:
```js
// index.js
import 'dotenv/config';
import readline from 'readline';
import { chat } from './chatbot.js';

const rl = readline.createInterface({
  input: process.stdin,
  output: process.stdout
});

const userId = 'user_123'; // In production, use actual user IDs

console.log('Chatbot ready! Type your message (or "quit" to exit)\n');

function prompt() {
  rl.question('You: ', async (input) => {
    if (input.toLowerCase() === 'quit') {
      console.log('Goodbye!');
      rl.close();
      return;
    }
    const response = await chat(input, userId);
    console.log(`Bot: ${response}\n`);
    prompt();
  });
}

prompt();
```

Run Your Chatbot
```bash
node index.js
```

Example conversation:

```
You: Hi! My name is Sarah and I'm a software engineer
Bot: Nice to meet you, Sarah! It's great to connect with a fellow
software engineer. What kind of projects are you working on?

You: I mainly work with React and Node.js
Bot: React and Node.js are a great combination! Are you working
on any interesting projects right now?

[Later in the conversation...]

You: What's my name and what do I do?
Bot: Your name is Sarah, and you're a software engineer who
specializes in React and Node.js development.
```

Advanced Features
Remember User Preferences
```js
async function savePreference(userId, preference, value) {
  await saveMemory(
    `User preference: ${preference} = ${value}`,
    userId
  );
}

// Usage
await savePreference('user_123', 'favorite_language', 'TypeScript');
await savePreference('user_123', 'timezone', 'PST');
```
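Because preferences are stored as ordinary memories, you can pull them back with the same search helper and fold them into the system prompt. A minimal sketch; the 'user preference' query string and the helper name getPreferenceContext are illustrative choices, not part of the Hypersave API:

```js
import { getRelevantMemories } from './memory.js';

// Gather stored preference memories and format them for the system prompt
async function getPreferenceContext(userId) {
  const prefs = await getRelevantMemories('user preference', userId, 10);
  if (prefs.length === 0) return '';
  return `Known user preferences:\n${prefs.map(p => `- ${p.content}`).join('\n')}`;
}
```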
Proactive Reminders

Add this function to memory.js so it can reuse HYPERSAVE_API and API_KEY, and export it alongside the other helpers:

```js
// memory.js
export async function getContextualReminders(currentContext, userId) {
  const response = await fetch(`${HYPERSAVE_API}/v1/remind`, {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      context: currentContext,
      limit: 3
    }),
  });
  return response.json();
}
```

```js
// In your chat function, add reminders
import { getContextualReminders } from './memory.js';

const reminders = await getContextualReminders(userMessage, userId);
if (reminders.reminders?.length > 0) {
  // Include relevant reminders in the context
}
```
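One way to fill in that last step is to append the reminders to the memory context built in chat(), so the model sees them alongside retrieved memories. This sketch assumes each reminder exposes a content field; adjust it to the actual response shape:

```js
// Inside chat(), after building memoryContext
const reminders = await getContextualReminders(userMessage, userId);
const reminderContext = reminders.reminders?.length > 0
  ? `\nReminders:\n${reminders.reminders.map(r => `- ${r.content}`).join('\n')}`
  : '';

// Pass memoryContext + reminderContext into the system prompt instead of memoryContext alone
```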
Multi-User Support

Structure your memories with user-specific tags:
```js
async function saveMemory(content, userId, sessionId) {
  const response = await fetch(`${HYPERSAVE_API}/v1/save`, {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      content,
      tags: [`user:${userId}`, `session:${sessionId}`],
      metadata: { userId, sessionId }
    }),
  });
  return response.json();
}
```
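If you adopt prefixed tags like these, retrieval must use the same tag format or the chatbot will stop finding the user's memories. A sketch of the matching search call; it reuses the /v1/search request shown earlier, only the tags change:

```js
async function getRelevantMemories(query, userId, limit = 5) {
  const response = await fetch(`${HYPERSAVE_API}/v1/search`, {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      query,
      limit,
      tags: [`user:${userId}`] // must match the tag format used when saving
    }),
  });
  const data = await response.json();
  return data.results || [];
}
```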
Best Practices

Memory Management Tips:
- Save summaries of long conversations, not every message (see the sketch after this list)
- Use tags to organize memories by topic or session
- Periodically clean up old, irrelevant memories
- Set appropriate context limits to avoid token overflow
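For the first tip, one approach is to summarize a finished conversation with the same OpenAI client and store only the summary. A sketch of that idea; the prompt wording, the six-message threshold, and the helper name are arbitrary choices, not part of the Hypersave API:

```js
import OpenAI from 'openai';
import { saveMemory } from './memory.js';

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Condense a finished conversation into one memory instead of saving every turn
async function saveConversationSummary(messages, userId) {
  const transcript = messages.map(m => `${m.role}: ${m.content}`).join('\n');

  // Short exchanges can be saved as-is
  if (messages.length < 6) {
    return saveMemory(transcript, userId);
  }

  const completion = await openai.chat.completions.create({
    model: 'gpt-4',
    messages: [
      {
        role: 'system',
        content: 'Summarize this conversation in a few sentences, keeping any facts or preferences the user shared.'
      },
      { role: 'user', content: transcript }
    ],
  });

  return saveMemory(`Conversation summary: ${completion.choices[0].message.content}`, userId);
}
```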
Next Steps
- Build a Knowledge Base — Add document-based knowledge
- RAG Application — Implement retrieval augmented generation
- Best Practices — Optimize your implementation