# Server Context File for LLM @ Referencing

## Overview
The `.mcp-server-context.md` file is a living document that bridges the gap between the server's internal memory systems (stored in `/tmp` as JSON) and your LLM's working context. By @ mentioning this file, LLMs instantly gain awareness of:
- Available tools and capabilities
- Recent activity and intents
- Memory entities and relationships
- Discovered patterns
- Active recommendations
- Knowledge gaps
## What Problem Does This Solve?

**Before**: Your MCP server has powerful memory systems (`KnowledgeGraphManager`, `MemoryEntityManager`, `ConversationMemoryManager`), but they're invisible to the LLM unless explicitly queried.

**After**: LLMs can `@.mcp-server-context.md` to instantly understand:
- What's been happening recently
- What architectural decisions exist
- What patterns have been discovered
- What the server recommends doing next
## How It Works

### Auto-Generated Content
The file is populated from three memory systems:
- **Knowledge Graph** → Active intents, tool usage, score trends
- **Memory Entities** → Architectural decisions, relationships, patterns
- **Conversation Memory** → Active sessions, turn count, context
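As a rough illustration of the rendering step, here is how one of these sections might be turned into markdown. The input shape and helper below are purely illustrative rather than the generator's real internals; only the output format mirrors the example context file later in this guide.

```typescript
// Illustrative only: render the "Active Intents" section of the context file
// from data the Knowledge Graph would supply. The field names are assumptions.
interface IntentSummary {
  title: string;
  status: 'planning' | 'executing' | 'completed';
  age: string; // e.g. "2h ago"
}

function renderActiveIntents(intents: IntentSummary[], maxRecentItems = 5): string {
  const lines = ['### Active Intents', '', '**Recent Intents**:'];
  for (const intent of intents.slice(0, maxRecentItems)) {
    lines.push(`- **${intent.title}** - ${intent.status} - ${intent.age}`);
  }
  return lines.join('\n');
}
```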
### Real-Time Updates
The context file updates automatically:
- ✅ After every tool execution
- ✅ When memory entities change
- ✅ When conversation sessions start/end
- ✅ On server restart
- ✅ On manual request via MCP tool
## Usage Examples

### Starting a New Conversation

```
@.mcp-server-context.md I'm new to this project. What architectural
decisions have been made?
```
LLM Response: "Based on the context file, I can see we have 12 architectural decisions documented, with 3 active intents. The most recent decision was about..."
### Resuming Work

```
@.mcp-server-context.md What was I working on? Show me the active intents
```

**LLM Response**: "Looking at the context, you have 2 active intents:

- Database migration strategy (started 2h ago)
- API gateway selection (started yesterday)"
### Understanding Patterns

```
@.mcp-server-context.md What patterns have we discovered in our architecture?
```

**LLM Response**: "The context shows 3 discovered patterns:

- Microservices pattern (85% confidence)
- Event-driven architecture (72% confidence)
- CQRS pattern (68% confidence)"
### Checking Progress

```
@.mcp-server-context.md How has our architecture score changed?
```

**LLM Response**: "Your current project score is 87/100, up 12 points from the initial baseline. The top improvements came from..."
## Integration Steps

### 1. Register the Tool (in `src/index.ts`)
```typescript
import { getServerContext, getServerContextMetadata } from './tools/get-server-context-tool.js';

// In your tool registration:
case 'get_server_context':
  return await getServerContext(args as GetServerContextArgs, {
    kgManager: this.kgManager,
    memoryManager: this.memoryManager,
    conversationManager: this.conversationMemoryManager,
  });
```
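For reference, `GetServerContextArgs` likely covers the options used elsewhere in this guide (`writeToFile`, `maxRecentItems`, `includeDetailed`). The interface below is a sketch inferred from those usages, not the tool's actual type definition:

```typescript
// Sketch of the argument shape, inferred from options shown in this guide.
// The tool's real schema may include additional fields.
interface GetServerContextArgs {
  writeToFile?: boolean;     // persist the result to .mcp-server-context.md
  maxRecentItems?: number;   // cap the number of recent intents/tools listed
  includeDetailed?: boolean; // include detailed sections or just the summary
}
```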
### 2. Add to Tool List
```typescript
{
  name: 'get_server_context',
  description: 'Generate a comprehensive context file showing the server\'s current state',
  inputSchema: getServerContextMetadata.inputSchema,
}
```
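This object goes into the array your server returns from its tool-listing handler. A minimal sketch of that wiring, assuming the standard `ListToolsRequestSchema` handler from `@modelcontextprotocol/sdk` (your existing handler and tool list will already be in place; `this.server` is an assumption about how your server instance is named):

```typescript
import { ListToolsRequestSchema } from '@modelcontextprotocol/sdk/types.js';

this.server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    // ...existing tool definitions...
    {
      name: 'get_server_context',
      description: "Generate a comprehensive context file showing the server's current state",
      inputSchema: getServerContextMetadata.inputSchema,
    },
  ],
}));
```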
### 3. Auto-Update on Tool Execution

Add a hook like this to your tool execution handler, passing in the server's manager instances:
```typescript
// After successful tool execution, regenerate the context file
async function afterToolExecution(
  kgManager: KnowledgeGraphManager,
  memoryManager: MemoryEntityManager,
  conversationManager: ConversationMemoryManager
): Promise<void> {
  const generator = new ServerContextGenerator();
  await generator.writeContextFile(kgManager, memoryManager, conversationManager);
}
```
### 4. Update on Server Startup

In your `McpAdrAnalysisServer` initialization:
```typescript
async initialize() {
  // ... existing initialization ...

  // Generate initial context file
  const generator = new ServerContextGenerator();
  await generator.writeContextFile(
    this.kgManager,
    this.memoryManager,
    this.conversationMemoryManager
  );
  this.logger.info('Server context file created', 'McpAdrAnalysisServer');
}
```
## File Structure

```markdown
# MCP Server Context & Memory

## 🎯 Server Quick Reference
- Server name, purpose, paths
- Available tools (quick reference list)

## 🧠 Memory & Knowledge Graph Status
- Active intents (last 5)
- Memory entities breakdown
- Conversation context

## 📊 Recent Analytics
- Tool usage (most used)
- Score trends
- Top impacting intents

## 🔍 Discovered Patterns
- Architectural patterns with confidence
- Suggested relationships

## 🎯 Recommendations for This Session
- Next actions
- Knowledge gaps
- Optimization opportunities

## 📖 How to Use This Context
- Usage examples

## 🔄 Context Refresh
- Auto-update triggers
- Manual refresh instructions
```
## Benefits

### 1. Instant Context Awareness
LLMs don't need to query multiple tools to understand the server state - just @ the file.
### 2. Conversation Continuity
When resuming work, @ the file to recover context from previous sessions.
### 3. Pattern Recognition
LLMs can see discovered patterns and make better architectural recommendations.
### 4. Guided Workflows
The recommendations section guides LLMs toward high-value actions.
### 5. Memory Bridge
Connects JSON-based memory systems to human-readable markdown that LLMs can process.
## Example Context File

```markdown
# MCP Server Context & Memory

> **Last Updated**: 2025-01-12T15:30:00Z

## 🎯 Server Quick Reference

**Name**: mcp-adr-analysis-server
**Project Path**: `/home/user/my-project`
**ADR Directory**: `docs/adrs`

### Available Tools

1. **adr_suggestion** - Suggest new ADRs
2. **smart_score** - Score architecture (0-100)
3. **deployment_readiness** - Validate deployment
...

## 🧠 Memory & Knowledge Graph Status

### Active Intents

**Total**: 15 | **Active**: 2 | **Completed**: 13

**Recent Intents**:

- **Implement database migration strategy** - executing - 2h ago
- **Select API gateway solution** - planning - 1d ago
...

### Memory Entities

**Total Entities**: 28
**Relationships**: 45
**Average Confidence**: 87%

**Entity Breakdown**:

- Architectural Decisions: 12
- Technical Decisions: 8
- Observations: 5
- Patterns: 3

## 📊 Recent Analytics

### Tool Usage

1. **adr_suggestion**: 34 calls
2. **smart_score**: 28 calls
3. **deployment_readiness**: 15 calls

### Score Trends

- **Current Score**: 87/100
- **Improvement**: +12 points
...
```
## Troubleshooting

### File Not Updating

**Check**: Is the tool execution hook in place?
```typescript
// After tool execution
await generator.writeContextFile(...);
```
### File Missing on Startup

**Check**: Is the initialization call present?
```typescript
// In server constructor
await generator.writeContextFile(...);
```
### Outdated Information

**Solution**: Manually trigger an update: call the `get_server_context` MCP tool with `writeToFile: true`.
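A sketch of what that manual refresh might look like from the client side, using the generic MCP `tools/call` request shape (only the tool name and the `writeToFile` argument come from this server; the rest is standard protocol plumbing):

```typescript
// Hypothetical client-side payload for a manual context refresh.
const refreshRequest = {
  method: 'tools/call',
  params: {
    name: 'get_server_context',
    arguments: {
      writeToFile: true, // regenerate and overwrite .mcp-server-context.md
    },
  },
};
```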
## Advanced Configuration

### Custom Output Path
```typescript
await generator.writeContextFile(
  kgManager,
  memoryManager,
  conversationManager,
  '/custom/path/context.md'
);
```
### Limit Recent Items
```typescript
await generator.generateContext(kgManager, memoryManager, conversationManager, {
  maxRecentItems: 10,
});
```
### Disable Detailed Info
```typescript
await generator.generateContext(kgManager, memoryManager, conversationManager, {
  includeDetailed: false,
});
```
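These options can also be combined. The snippet below assumes `generateContext` resolves to the rendered markdown string (an assumption; only `writeContextFile` is shown writing to disk in this guide), so you can post-process or persist it yourself:

```typescript
import { writeFile } from 'node:fs/promises';

// Assumption: generateContext returns the rendered markdown rather than writing it.
const markdown = await generator.generateContext(kgManager, memoryManager, conversationManager, {
  maxRecentItems: 10,
  includeDetailed: false,
});
await writeFile('/custom/path/context.md', markdown, 'utf-8');
```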
## Related Documentation

> 💡 **Pro Tip**: Add `.mcp-server-context.md` to your `.gitignore` since it's auto-generated and changes frequently. But do commit it once with sample data so other developers understand what it's for!