What You'll Build
A terminal-based chatbot that maintains conversation history, has a system prompt personality, and manages context efficiently.
The Full Code
```python
import anthropic

client = anthropic.Anthropic()

conversation = []
SYSTEM = "You are a helpful coding mentor. Be encouraging but honest about mistakes."
MAX_HISTORY = 20

def chat(user_msg):
    conversation.append({"role": "user", "content": user_msg})

    # Trim history if too long. Note: this inserts a placeholder message rather
    # than a real summary; generating an actual summary is a further exercise.
    if len(conversation) > MAX_HISTORY:
        summary_msg = {"role": "user", "content": "[Earlier conversation summarized]"}
        conversation[:] = [summary_msg] + conversation[-10:]

    response = client.messages.create(
        model="claude-sonnet-4-20250514",
        max_tokens=2048,
        system=SYSTEM,
        messages=conversation,
    )
    assistant_msg = response.content[0].text
    conversation.append({"role": "assistant", "content": assistant_msg})
    return assistant_msg

# Main loop
print("Chatbot ready. Type 'quit' to exit.")
while True:
    user_input = input("You: ").strip()
    if user_input.lower() == "quit":
        break
    if not user_input:
        continue
    print(f"Claude: {chat(user_input)}")
```

Challenges to Try
- Add streaming output so text appears in real-time
- Add a /cost command that shows total tokens and cost so far
- Save conversation to a file and reload on restart
- Add a /model command to switch between Haiku/Sonnet/Opus mid-chat
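For the /cost challenge, the token counts are already in each API response: `response.usage.input_tokens` and `response.usage.output_tokens`. A minimal sketch of the accounting; the per-million-token prices here are placeholders, not real rates:

```python
# Running totals for a hypothetical /cost command. The per-million-token
# prices are placeholders -- check current pricing before relying on them.
PRICE_IN_PER_MTOK = 3.00    # placeholder: $ per million input tokens
PRICE_OUT_PER_MTOK = 15.00  # placeholder: $ per million output tokens

total_input = 0
total_output = 0

def record_usage(input_tokens, output_tokens):
    """Accumulate token counts from one response's usage object."""
    global total_input, total_output
    total_input += input_tokens
    total_output += output_tokens

def cost_so_far():
    """Return (input tokens, output tokens, estimated dollar cost)."""
    cost = (total_input * PRICE_IN_PER_MTOK
            + total_output * PRICE_OUT_PER_MTOK) / 1_000_000
    return total_input, total_output, cost

# Inside chat(), after the API call, you would add:
#   record_usage(response.usage.input_tokens, response.usage.output_tokens)
# and the main loop would handle "/cost" by printing cost_so_far().
```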
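For the save/reload challenge, the `conversation` list is plain JSON-serializable data, so persistence is just a dump and a load. A sketch, where the `HISTORY_FILE` name is an arbitrary choice:

```python
import json
import os

HISTORY_FILE = "chat_history.json"  # arbitrary filename choice

def save_history(conversation, path=HISTORY_FILE):
    """Write the conversation list to disk as JSON."""
    with open(path, "w") as f:
        json.dump(conversation, f, indent=2)

def load_history(path=HISTORY_FILE):
    """Reload a saved conversation, or start fresh if no file exists."""
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)
    return []

# On startup:            conversation = load_history()
# On 'quit', before break: save_history(conversation)
```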
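For the /model challenge, holding the current model ID in a variable is enough. In the sketch below, only the Sonnet ID comes from the lesson; the Haiku and Opus IDs are illustrative and should be verified against the current models documentation:

```python
# Map short names to model IDs. Only the Sonnet ID appears in the lesson;
# the Haiku and Opus IDs are illustrative -- verify them before use.
MODELS = {
    "sonnet": "claude-sonnet-4-20250514",
    "haiku": "claude-3-5-haiku-20241022",   # illustrative ID
    "opus": "claude-opus-4-20250514",       # illustrative ID
}
current_model = MODELS["sonnet"]

def switch_model(command):
    """Handle '/model <name>'; return the new model ID, or None if unknown."""
    global current_model
    name = command.removeprefix("/model").strip().lower()
    if name in MODELS:
        current_model = MODELS[name]
        return current_model
    return None

# chat() would then pass model=current_model to client.messages.create(...).
```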
Key Takeaways
- The chatbot remembers context by resending the full `conversation` list with every `messages.create` call
- The `system` parameter sets the assistant's personality without appearing in the visible message history
- Trimming old messages behind a summary placeholder keeps long conversations within the context window
- Work through the challenges above before moving on, and refer back here when you need conversation management again