The system prompt problem
Every personalized AI application faces the same challenge: how do you build a system prompt that captures everything the AI should know about a user?
Most developers manually craft system prompts or stitch together search results. This is fragile, incomplete, and doesn't scale. As you accumulate hundreds or thousands of memories per user, you can't fit them all in a prompt.
What is Cognitive Profile?
Cognitive Profile is a Mengram feature that generates a complete, ready-to-use system prompt from a user's entire memory history. One API call distills all semantic memories (facts), episodic memories (events), and procedural memories (workflows) into a coherent personality snapshot.
from mengram import Mengram
m = Mengram(api_key="mg-...")
# One call — returns a complete system prompt
profile = m.profile(user_id="alice")
The output looks like this:
# Example Cognitive Profile output:
"You are assisting Alice, a senior backend engineer at Acme Corp.
Key facts:
- Prefers Python, uses FastAPI and PostgreSQL
- Works on the payments team
- Prefers concise answers with code examples
Recent context:
- Debugged a Redis connection timeout last week (pool size issue)
- Currently migrating the auth system from sessions to JWT
- Deployed v2.3 to production yesterday with zero downtime
Learned workflows:
- Deploy process: run tests → build → push staging → smoke test → promote
- Code review: security first → test coverage → naming → performance
- When Alice asks about deployment, reference the established workflow above."
How it works internally
- Retrieval: Fetches all memory types for the user (semantic, episodic, procedural)
- Ranking: Prioritizes recent and frequently-accessed memories
- Synthesis: An LLM compresses and organizes the memories into a structured prompt
- Caching: The profile is cached and incrementally updated as new memories arrive
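To make the ranking step concrete, here is a minimal sketch of recency-and-frequency scoring. The field names (`last_accessed`, `access_count`) and the exponential-decay weighting are assumptions for illustration, not Mengram's actual schema or algorithm.

```python
from datetime import datetime, timedelta

def rank_memories(memories, now, half_life_days=30.0):
    """Order memories by a score combining recency and access frequency.

    Assumed fields (hypothetical, not Mengram's real schema):
      last_accessed: datetime of last retrieval
      access_count:  how often the memory has been retrieved
    """
    scored = []
    for mem in memories:
        age_days = (now - mem["last_accessed"]).total_seconds() / 86400
        recency = 0.5 ** (age_days / half_life_days)  # halves every 30 days
        frequency = 1 + mem["access_count"]
        scored.append((recency * frequency, mem))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [mem for _, mem in scored]

now = datetime(2025, 1, 1)
memories = [
    {"text": "Prefers Python", "last_accessed": now - timedelta(days=2), "access_count": 9},
    {"text": "Deployed v2.3", "last_accessed": now - timedelta(days=60), "access_count": 1},
]
ranked = rank_memories(memories, now)
print(ranked[0]["text"])  # → Prefers Python
```

A recent, frequently-accessed fact outranks a stale one, so the synthesis step sees the most relevant memories first when compressing into the prompt.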
Why not just use search?
search() returns individual memories matching a query. It's great for specific questions. But for general context — "who is this user and what should I know about them?" — search requires you to guess the right queries.
Cognitive Profile answers the general question automatically. Use search() for specific retrieval and profile() for global context. They're complementary.
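The division of labor can be shown with a toy in-memory stand-in (a plain list and two helper functions, not Mengram's real API): `profile`-style synthesis gives broad context, `search`-style retrieval answers a narrow question.

```python
# Toy stand-in for a user's memory store (illustrative only).
MEMORIES = [
    "Prefers Python, uses FastAPI and PostgreSQL",
    "Works on the payments team",
    "Debugged a Redis connection timeout last week",
]

def profile(memories):
    """Global context: distill everything into one system-prompt string."""
    return "You are assisting a user. Known context:\n" + "\n".join(
        f"- {m}" for m in memories
    )

def search(memories, query):
    """Specific retrieval: return only memories matching the query."""
    return [m for m in memories if query.lower() in m.lower()]

system_prompt = profile(MEMORIES)       # broad: everything, organized
redis_hits = search(MEMORIES, "redis")  # narrow: one targeted fact
print(redis_hits)  # → ['Debugged a Redis connection timeout last week']
```

With the real library you would use `m.profile()` for the system prompt and `m.search()` for per-turn retrieval, rather than these stand-ins.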
Using Cognitive Profile with any LLM
# Works with OpenAI
from openai import OpenAI

client = OpenAI()
profile = m.profile(user_id="alice")
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": profile},
        {"role": "user", "content": "How should I deploy the new feature?"},
    ],
)
# Works with Anthropic
import anthropic

client = anthropic.Anthropic()
response = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    system=profile,
    messages=[{"role": "user", "content": "How should I deploy?"}],
)
# Works with any LLM that accepts a system prompt
When to use Cognitive Profile
- Chatbots and assistants: Start every conversation with full user context
- Customer support: Agents instantly know the customer's history and preferences
- Personal AI: Build companions that truly know the user
- Multi-agent systems: Share user context across agents without manual prompt engineering
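For the multi-agent case, one pattern is to fetch the profile once and prepend it to each agent's own system prompt. A minimal sketch (the helper and role strings are hypothetical, not part of Mengram):

```python
def build_agent_messages(profile, agent_role, user_msg):
    """Share one user profile across agents by prefixing each system prompt."""
    return [
        {"role": "system", "content": f"{profile}\n\nYour role: {agent_role}"},
        {"role": "user", "content": user_msg},
    ]

# In practice this string would come from m.profile(user_id="alice").
profile = "You are assisting Alice, a senior backend engineer at Acme Corp."

planner = build_agent_messages(profile, "Plan the deployment steps.", "Ship v2.4")
reviewer = build_agent_messages(profile, "Review the plan for risk.", "Ship v2.4")
```

Both agents start from identical user context, so no per-agent prompt engineering is needed.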
Get started: pip install mengram-ai, grab a free API key, and call m.profile(user_id). Full quickstart tutorial here.