API Reference
Complete API documentation for the MCP Proxy Wrapper and plugin system.
Core API
wrapWithProxy(server, options)
Wraps an existing MCP server with proxy functionality and plugin support.
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { wrapWithProxy } from 'mcp-proxy-wrapper';
const proxiedServer = await wrapWithProxy(server, {
  plugins: [],           // (ProxyPlugin | PluginRegistration)[]
  // hooks: { ... },     // optional ProxyHooks
  // debug: true         // optional, defaults to false
});
Parameters
Parameter | Type | Required | Description |
---|---|---|---|
server | McpServer | Yes | MCP server instance to wrap |
options.plugins | (ProxyPlugin \| PluginRegistration)[] | No | Array of plugins to apply |
options.hooks | ProxyHooks | No | Before/after tool call hooks |
options.debug | boolean | No | Enable debug logging (default: false) |
Returns
Promise<McpServer>
- Enhanced server instance with proxy capabilities
Example
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { wrapWithProxy, LLMSummarizationPlugin } from 'mcp-proxy-wrapper';
const summaryPlugin = new LLMSummarizationPlugin();
summaryPlugin.updateConfig({
options: {
provider: 'mock', // Use 'openai' with API key for production
minContentLength: 100,
summarizeTools: ['search', 'analyze']
}
});
const server = new McpServer({ name: 'my-server', version: '1.0.0' });
const proxiedServer = await wrapWithProxy(server, {
plugins: [summaryPlugin],
debug: true
});
wrapWithEnhancedProxy(server, options)
(v2 API)
Enhanced version with advanced lifecycle management and performance features.
import { wrapWithEnhancedProxy, EnhancedProxyWrapperOptions } from 'mcp-proxy-wrapper';
const proxiedServer = await wrapWithEnhancedProxy(server, {
  plugins: [],             // (ProxyPlugin | PluginRegistration)[]
  // hooks: { ... },       // optional ProxyHooks
  // lifecycle: { ... },   // optional LifecycleConfig
  // execution: { ... },   // optional ExecutionConfig
  // performance: { ... }  // optional PerformanceConfig
});
Parameters
Parameter | Type | Required | Description |
---|---|---|---|
server | McpServer | Yes | MCP server instance to wrap |
options.plugins | (ProxyPlugin \| PluginRegistration)[] | No | Array of plugins to apply |
options.hooks | ProxyHooks | No | Before/after tool call hooks |
options.lifecycle | LifecycleConfig | No | Plugin lifecycle management |
options.execution | ExecutionConfig | No | Hook execution configuration |
options.performance | PerformanceConfig | No | Performance monitoring |
Returns
Promise<McpServer>
- Enhanced server with v2 proxy capabilities
Plugin Interface
ProxyPlugin
Base interface that all plugins must implement.
interface ProxyPlugin {
name: string;
version: string;
// Lifecycle hooks
beforeToolCall?(context: ToolCallContext): Promise<void | ToolCallResult>;
afterToolCall?(context: ToolCallContext, result: ToolCallResult): Promise<ToolCallResult>;
// Plugin lifecycle
initialize?(context: PluginContext): Promise<void>;
destroy?(): Promise<void>;
}
Properties
Property | Type | Required | Description |
---|---|---|---|
name | string | Yes | Unique plugin identifier |
version | string | Yes | Plugin version (semver) |
Methods
Method | Parameters | Returns | Description |
---|---|---|---|
beforeToolCall | context: ToolCallContext | Promise<void \| ToolCallResult> | Called before tool execution |
afterToolCall | context: ToolCallContext, result: ToolCallResult | Promise<ToolCallResult> | Called after tool execution |
initialize | context: PluginContext | Promise<void> | Plugin initialization |
destroy | None | Promise<void> | Plugin cleanup |
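The methods above can be combined into a minimal plugin. The sketch below is purely illustrative (the timing behavior and plugin name are not part of the library), and inlines the interfaces from this page so the snippet stands alone:

```typescript
// Shapes copied from the ProxyPlugin / ToolCallContext / ToolCallResult
// interfaces documented on this page (inlined for a self-contained sketch).
interface ToolCallContext {
  toolName: string;
  args: Record<string, any>;
  metadata: { requestId: string; timestamp: number; [key: string]: any };
}
interface ToolCallResult {
  content: Array<{ type: 'text' | 'image' | 'resource'; text?: string }>;
  isError?: boolean;
  metadata?: Record<string, any>;
}
interface ProxyPlugin {
  name: string;
  version: string;
  beforeToolCall?(context: ToolCallContext): Promise<void | ToolCallResult>;
  afterToolCall?(context: ToolCallContext, result: ToolCallResult): Promise<ToolCallResult>;
}

// Minimal plugin: records when each call starts and attaches the
// elapsed time to the result metadata.
const timingPlugin: ProxyPlugin = {
  name: 'timing-plugin',
  version: '1.0.0',

  async beforeToolCall(context) {
    // Stash the start time on the call's metadata for afterToolCall.
    context.metadata.startTime = Date.now();
  },

  async afterToolCall(context, result) {
    const elapsedMs = Date.now() - (context.metadata.startTime as number);
    return { ...result, metadata: { ...result.metadata, elapsedMs } };
  }
};
```

In a real setup the plugin would be passed to `wrapWithProxy(server, { plugins: [timingPlugin] })`.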
PluginRegistration
Configuration object for registering plugins with specific settings.
interface PluginRegistration {
plugin: ProxyPlugin;
config?: PluginConfig;
}
Properties
Property | Type | Required | Description |
---|---|---|---|
plugin | ProxyPlugin | Yes | The plugin instance |
config | PluginConfig | No | Plugin-specific configuration |
Example
const proxiedServer = await wrapWithProxy(server, {
plugins: [
// Direct plugin registration
summaryPlugin,
// Plugin with configuration
{
plugin: memoryPlugin,
config: {
// Plugin-specific settings go inside the 'options' object
options: {
maxEntries: 50,                  // ChatMemoryPlugin option: cap on stored entries
sessionTimeout: 60 * 60 * 1000 // 1 hour
}
}
}
]
});
ToolCallContext
Context object provided to plugin hooks during tool execution.
interface ToolCallContext {
toolName: string;
args: Record<string, any>;
metadata: {
requestId: string;
timestamp: number;
userId?: string;
[key: string]: any;
};
}
Properties
Property | Type | Description |
---|---|---|
toolName | string | Name of the tool being called |
args | Record<string, any> | Arguments passed to the tool |
metadata.requestId | string | Unique request identifier |
metadata.timestamp | number | Request timestamp (Unix milliseconds) |
metadata.userId | string? | User identifier (if available) |
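As an illustration of how a hook can use this context, the sketch below gates a hypothetical 'admin-reset' tool on `metadata.userId` and returns an early error result; returning a result is permitted by the `beforeToolCall` signature shown above. The interfaces are inlined so the snippet stands alone:

```typescript
// Context and result shapes from this section (inlined for a
// self-contained sketch).
interface ToolCallContext {
  toolName: string;
  args: Record<string, any>;
  metadata: { requestId: string; timestamp: number; userId?: string; [key: string]: any };
}
interface ToolCallResult {
  content: Array<{ type: 'text' | 'image' | 'resource'; text?: string }>;
  isError?: boolean;
}

// Guard hook: 'admin-reset' is a hypothetical tool name used only
// for illustration. Calls without a userId get an error result.
async function guardAdminTools(context: ToolCallContext): Promise<void | ToolCallResult> {
  if (context.toolName === 'admin-reset' && !context.metadata.userId) {
    return {
      content: [{ type: 'text', text: 'Error: authentication required' }],
      isError: true
    };
  }
  // Returning nothing lets the tool call proceed.
}
```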
ToolCallResult
Result object returned from tool execution.
interface ToolCallResult {
content: Array<{
type: 'text' | 'image' | 'resource';
text?: string;
data?: string;
url?: string;
mimeType?: string;
}>;
isError?: boolean;
metadata?: Record<string, any>;
}
Properties
Property | Type | Description |
---|---|---|
content | Array<ContentBlock> | Tool response content |
isError | boolean? | Indicates if result is an error |
metadata | Record<string, any>? | Additional result metadata |
PluginContext
Context provided during plugin initialization.
interface PluginContext {
server: McpServer;
logger: Logger;
config: Record<string, any>;
}
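For example, an `initialize` hook can use this context to read configuration and log startup. The `maxRetries` setting is a hypothetical example, and the logger shape is abbreviated:

```typescript
// Abbreviated PluginContext: 'server' is McpServer in the real
// interface, and the logger exposes the full Logger methods.
interface PluginContext {
  server: unknown;
  logger: { info(message: string, meta?: any): void };
  config: Record<string, any>;
}

// Example initialize hook: reads a hypothetical 'maxRetries' setting
// from plugin config (defaulting to 3) and logs that the plugin is ready.
async function initialize(context: PluginContext): Promise<void> {
  const maxRetries = context.config.maxRetries ?? 3;
  context.logger.info('plugin initialized', { maxRetries });
}
```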
Core Plugin APIs
LLM Summarization Plugin
import { LLMSummarizationPlugin } from 'mcp-proxy-wrapper';
const summaryPlugin = new LLMSummarizationPlugin();
// Configuration options
interface SummarizationConfig {
provider: 'openai' | 'mock'; // AI provider
openaiApiKey?: string; // OpenAI API key
model?: string; // Model name (default: gpt-4o-mini)
maxTokens?: number; // Max tokens in summary
temperature?: number; // Generation temperature
summarizeTools?: string[]; // Tools to summarize (empty = all)
minContentLength?: number; // Min content length to summarize
saveOriginal?: boolean; // Save original responses
}
// Update plugin configuration
summaryPlugin.updateConfig({
options: {
provider: 'openai',
openaiApiKey: process.env.OPENAI_API_KEY,
model: 'gpt-4o-mini',
maxTokens: 150,
summarizeTools: ['search', 'research', 'analyze'],
minContentLength: 100
}
});
// Get original result by storage key
const original = await summaryPlugin.getOriginalResult(storageKey);
// Get plugin statistics
const stats = await summaryPlugin.getStats();
Chat Memory Plugin
import { ChatMemoryPlugin } from 'mcp-proxy-wrapper';
const memoryPlugin = new ChatMemoryPlugin();
// Configuration options
interface MemoryConfig {
provider: 'openai' | 'mock'; // Chat AI provider
openaiApiKey?: string; // OpenAI API key
model?: string; // Model for chat responses
saveResponses?: boolean; // Save tool responses
enableChat?: boolean; // Enable chat functionality
maxEntries?: number; // Max stored entries
maxSessions?: number; // Max chat sessions
sessionTimeout?: number; // Session timeout in ms
excludeTools?: string[]; // Tools to exclude from saving
}
// Update plugin configuration
memoryPlugin.updateConfig({
options: {
provider: 'openai',
openaiApiKey: process.env.OPENAI_API_KEY,
saveResponses: true,
enableChat: true,
maxEntries: 1000,
sessionTimeout: 24 * 60 * 60 * 1000
}
});
// Start chat session
const sessionId = await memoryPlugin.startChatSession(userId);
// Chat with memory
const response = await memoryPlugin.chatWithMemory(
sessionId,
'What data do I have about sales?',
userId
);
// Search conversations
const results = memoryPlugin.searchConversations('sales metrics', userId);
// Get conversation history
const history = memoryPlugin.getConversationHistory(userId, 20);
Plugin Data Types
// LLM Summarization Plugin Types
interface StoredResult {
originalResult: ToolCallResult;
context: Omit<PluginContext, 'pluginData'>;
timestamp: number;
toolName: string;
requestId: string;
metadata?: Record<string, any>;
}
interface LLMProvider {
generateSummary(content: string, prompt: string, options?: any): Promise<string>;
}
// Chat Memory Plugin Types
interface ConversationEntry {
id: string;
toolName: string;
request: {
args: Record<string, any>;
timestamp: number;
};
response: {
content: string;
metadata?: Record<string, any>;
timestamp: number;
};
context: {
requestId: string;
userId?: string;
sessionId?: string;
};
}
interface ChatSession {
id: string;
userId?: string;
messages: ChatMessage[];
createdAt: number;
lastActivity: number;
}
interface ChatMessage {
id: string;
type: 'user' | 'assistant' | 'system';
content: string;
timestamp: number;
metadata?: Record<string, any>;
}
Logger Interface
Logger
Standard logging interface used throughout the system.
interface Logger {
debug(message: string, meta?: any): void;
info(message: string, meta?: any): void;
warn(message: string, meta?: any): void;
error(message: string, meta?: any): void;
}
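A minimal console-backed implementation of this interface, purely for illustration (the proxy wrapper ships with its own logger):

```typescript
interface Logger {
  debug(message: string, meta?: any): void;
  info(message: string, meta?: any): void;
  warn(message: string, meta?: any): void;
  error(message: string, meta?: any): void;
}

// Console-backed Logger: prefixes each entry with its level and
// appends structured metadata when provided.
const consoleLogger: Logger = {
  debug: (message, meta) => console.debug(`[debug] ${message}`, meta ?? ''),
  info:  (message, meta) => console.info(`[info] ${message}`, meta ?? ''),
  warn:  (message, meta) => console.warn(`[warn] ${message}`, meta ?? ''),
  error: (message, meta) => console.error(`[error] ${message}`, meta ?? '')
};
```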
Built-in Logging
The proxy wrapper includes built-in logging with colored output. Enable debug mode to see detailed execution logs:
const proxiedServer = await wrapWithProxy(server, {
plugins: [summaryPlugin],
debug: true // Enables detailed logging
});
Error Handling
Plugin Errors
Plugin errors are automatically caught and logged without breaking tool execution:
// Plugin error handling
try {
await plugin.beforeToolCall(context);
} catch (error) {
console.error(`Plugin ${plugin.name} error:`, error);
// Tool execution continues
}
Tool Errors
Tools should return error results in MCP format:
// Tool error response
return {
content: [{
type: 'text',
text: 'Error: Invalid input provided'
}],
isError: true
};
Plugin Error Fallbacks
Plugin errors are handled gracefully by the proxy wrapper:
// LLM Summarization error (falls back to the original result)
return {
  ...result,
  metadata: {
    ...result.metadata,
    summarizationError: 'OpenAI API unavailable',
    fallbackToOriginal: true
  }
};
// Chat Memory error (logs but doesn't break the tool call)
try {
  // ...persist the conversation entry...
} catch (error) {
  this.logger?.error(`Failed to save conversation entry: ${error}`);
}
return result; // Return the original result either way
Type Definitions
Complete TypeScript Definitions
// All types are exported by the package; import them in your application
import type {
  ProxyPlugin,
  BasePlugin,
  ToolCallContext,
  ToolCallResult,
  PluginContext,
  PluginConfig,
  PluginMetadata,
  PluginStats,
  Logger
} from 'mcp-proxy-wrapper';
Migration Guide
From Direct MCP Server
// Before: Direct MCP server
const server = new McpServer(config);
server.tool('my-tool', schema, handler);
// After: Wrapped with proxy
const proxiedServer = await wrapWithProxy(server, { plugins: [] });
proxiedServer.tool('my-tool', schema, handler);
Adding AI Enhancement
// Add AI summarization to existing setup
import { LLMSummarizationPlugin } from 'mcp-proxy-wrapper';
const summaryPlugin = new LLMSummarizationPlugin();
summaryPlugin.updateConfig({
options: {
provider: 'openai',
openaiApiKey: process.env.OPENAI_API_KEY,
summarizeTools: ['research', 'analyze'],
minContentLength: 200
}
});
const proxiedServer = await wrapWithProxy(server, {
plugins: [summaryPlugin]
});
Backward Compatibility: The proxy wrapper maintains full compatibility with existing MCP server code. No changes are required to your tool implementations.
Best Practices
Plugin Development
- Error Isolation: Always handle errors gracefully
- Performance: Minimize blocking operations in beforeToolCall
- Logging: Use structured logging with context
- Testing: Write comprehensive tests for plugin logic
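One way to keep beforeToolCall non-blocking is to defer slow side work instead of awaiting it on the request path. A sketch, where writeAuditLog is a hypothetical helper standing in for a real persistence call:

```typescript
// Hypothetical async persistence helper (stand-in for a real store).
async function writeAuditLog(entry: { tool: string; args: Record<string, any> }): Promise<void> {
  // e.g. insert into a database or append to a log stream
}

// Fire-and-forget audit: the tool call is not delayed by the write,
// and write failures are logged instead of propagating.
async function beforeToolCall(context: { toolName: string; args: Record<string, any> }): Promise<void> {
  void writeAuditLog({ tool: context.toolName, args: context.args })
    .catch(err => console.error('audit write failed:', err));
}
```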
Production Deployment
- Environment Variables: Use environment-based configuration
- Database: Use PostgreSQL for production data storage
- Monitoring: Implement health checks and alerting
- Security: Follow security best practices for API keys
Performance Optimization
- Plugin Priorities: Order plugins by execution cost
- Caching: Implement caching for expensive operations
- Connection Pooling: Use connection pooling for databases
- Rate Limiting: Implement appropriate rate limiting
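The caching recommendation can be as simple as memoizing expensive work, e.g. summaries keyed by content. An illustrative sketch, not library API:

```typescript
// In-memory memoization for an expensive async step (e.g. an LLM
// summary). Real deployments would bound the cache size and add a TTL.
const summaryCache = new Map<string, string>();

async function summarizeWithCache(
  content: string,
  summarize: (c: string) => Promise<string>
): Promise<string> {
  const hit = summaryCache.get(content);
  if (hit !== undefined) return hit;       // cache hit: skip the expensive call
  const summary = await summarize(content);
  summaryCache.set(content, summary);
  return summary;
}
```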
Ready to build? This API reference covers everything you need to integrate the MCP Proxy Wrapper into your applications.