What is Model Context Protocol (MCP)?
Model Context Protocol (MCP) is an open standard introduced by Anthropic in November 2024 that provides a standardized way for AI models to interact with external tools, data sources, and services in real time. Instead of being limited to their training data, AI models can use MCP to access live information, perform calculations, manipulate files, and execute complex workflows through well-defined tool interfaces.
This material provides a comprehensive learning journey for MCP, starting from basic concepts and progressing to a production-ready integration with Azure OpenAI. The goal is to demonstrate how MCP can transform AI applications by enabling them to interact with real-world systems and data. The journey begins with simple local communication, gradually introduces network-based architectures, and culminates in a full integration with Azure OpenAI's function calling capabilities.
Why MCP Matters for Azure OpenAI
Azure OpenAI's function calling capabilities become significantly more powerful when combined with MCP servers. This integration allows AI models to:
- Access real-time data - Live system information, current weather, database queries
- Perform complex operations - File operations, calculations, API calls
- Maintain state - Persistent storage and workflow management
- Scale dynamically - Multiple tools and services can be added without model retraining
Solution Architecture
The MCP integration with Azure OpenAI follows a three-tier architecture:
┌─────────────────┐   HTTPS/JSON    ┌─────────────────┐    HTTP/SSE     ┌─────────────────┐
│  Azure OpenAI   │ ◄─────────────► │   MCP Client    │ ◄─────────────► │   MCP Server    │
│  (Cloud API)    │    Function     │ (Python Bridge) │   Tool Calls    │  (Local Tools)  │
│                 │    Calling      │                 │                 │                 │
└─────────────────┘                 └─────────────────┘                 └─────────────────┘
Key Components:
- Azure OpenAI - Provides AI intelligence and function calling capabilities
- MCP Client - Acts as a bridge, translating between OpenAI functions and MCP tools
- MCP Server - Hosts the actual tools and capabilities (calculations, file operations, system access)
Learning Path Overview
The complete learning journey consists of three progressive steps:
Step 1: Basic MCP Concepts 🔰
- Goal: Learn fundamental MCP concepts with local communication
- Transport: stdio (subprocess communication)
- Files: server.py and client.py (a sketch of both follows this step)
- Concepts: Tools, Resources, Prompts, JSON-RPC messaging
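To make the stdio flow concrete, here is a minimal sketch of what a server/client pair in the spirit of server.py and client.py can look like using the Python MCP SDK's FastMCP helper. The tool and resource names are illustrative, not the repository's exact code.

# server.py (sketch) - exposes one tool and one resource over stdio
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-server")

@mcp.tool()
def add(a: float, b: float) -> str:
    """Add two numbers and return the result as text."""
    return f"{a} + {b} = {a + b}"

@mcp.resource("info://greeting")
def greeting() -> str:
    """A static resource the client can read."""
    return "Hello from the MCP server!"

if __name__ == "__main__":
    mcp.run(transport="stdio")  # JSON-RPC over the subprocess's stdin/stdout

# client.py (sketch) - spawns the server as a subprocess and calls a tool
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()                      # JSON-RPC handshake
            tools = await session.list_tools()
            print("Tools:", [t.name for t in tools.tools])
            result = await session.call_tool("add", {"a": 2, "b": 3})
            print(result.content[0].text)

asyncio.run(main())

The key point of Step 1 is that all of this happens over stdin/stdout of a subprocess: no network, no ports, just JSON-RPC messages between two local processes.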
Step 2: Network MCP 🌐
- Goal: Learn network-based MCP using web standards
- Transport: SSE (Server-Sent Events) over HTTP
- Files: servernetwork.py and clientnetwork.py (a client sketch follows this step)
- Concepts: Network accessibility, persistent connections, multi-client support
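The server side changes very little in Step 2: the same FastMCP instance can be started with mcp.run(transport="sse") so it listens on an HTTP port instead of stdin/stdout. The sketch below shows what a network client along the lines of clientnetwork.py might look like; the host, port, and /sse path are assumptions based on the SDK's defaults, not values taken from the repository.

# clientnetwork.py (sketch) - connects to an already-running SSE server over HTTP
import asyncio
from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    # URL is an assumption: FastMCP's SSE transport typically serves at /sse
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool("add", {"a": 2, "b": 3})
            print(result.content[0].text)

asyncio.run(main())

Because the connection is plain HTTP, several clients can talk to the same server concurrently, which is what makes this step the natural bridge to the Azure OpenAI integration.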
Step 3: Azure OpenAI Integration 🤖
- Goal: Production-ready AI integration with real-time tools
- Transport: HTTPS (Azure OpenAI) + SSE (MCP Server)
- Files: azure_mcp_server.py and azure_openai_mcp_client.py (a bridge sketch follows this step)
- Concepts: Function calling, tool bridging, interactive AI, multi-step workflows
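The heart of Step 3 is the bridge logic: the client lists the MCP server's tools, advertises them to Azure OpenAI in its function-calling format, executes whatever tool calls the model requests, and feeds the results back for a final answer. The sketch below outlines that loop; the environment variable names, endpoint URL, and API version are placeholders rather than values from the repository.

# azure_openai_mcp_client.py (sketch) - bridges Azure OpenAI function calling to MCP tools
import asyncio, json, os
from openai import AzureOpenAI
from mcp import ClientSession
from mcp.client.sse import sse_client

aoai = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],   # placeholder variable names
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",
)
DEPLOYMENT = os.environ["AZURE_OPENAI_DEPLOYMENT"]

async def main():
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # 1. Advertise MCP tools to Azure OpenAI in function-calling format
            mcp_tools = (await session.list_tools()).tools
            tools = [{"type": "function",
                      "function": {"name": t.name,
                                   "description": t.description or "",
                                   "parameters": t.inputSchema}} for t in mcp_tools]

            messages = [{"role": "user",
                         "content": "Calculate 15 * 8 and save the result to a file"}]
            reply = aoai.chat.completions.create(
                model=DEPLOYMENT, messages=messages, tools=tools).choices[0].message

            # 2. Execute requested tool calls on the MCP server, feed results back,
            #    and repeat until the model stops asking for tools (multi-step workflows)
            while reply.tool_calls:
                messages.append(reply)
                for call in reply.tool_calls:
                    result = await session.call_tool(
                        call.function.name, json.loads(call.function.arguments))
                    messages.append({"role": "tool", "tool_call_id": call.id,
                                     "content": result.content[0].text})
                reply = aoai.chat.completions.create(
                    model=DEPLOYMENT, messages=messages, tools=tools).choices[0].message

            # 3. Final natural-language answer from the model
            print(reply.content)

asyncio.run(main())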
Key Features Demonstrated
Rich Tool Ecosystem
- Mathematical Operations - Real-time calculations with error handling (see the sketch after this list)
- File Management - Read, write, and manage files persistently
- System Information - Access live system metrics and data
- Weather Services - Integration with external APIs (mock demonstration)
- Interactive Prompts - User guidance and capability discovery
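As an illustration of the first two bullets, here is how a calculation tool with graceful error handling and a persistent file tool might be defined. The signatures mirror the calculate and save_text_file calls used in the workflow example below, but this is a sketch, not the repository's exact implementation.

# Sketch of two tools in the spirit of azure_mcp_server.py (illustrative only)
from pathlib import Path
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("azure-demo-tools")

@mcp.tool()
def calculate(operation: str, a: float, b: float) -> str:
    """Perform basic arithmetic, returning errors as text instead of raising."""
    try:
        if operation == "add":
            value = a + b
        elif operation == "subtract":
            value = a - b
        elif operation == "multiply":
            value = a * b
        elif operation == "divide":
            value = a / b
        else:
            return f"Error: unsupported operation '{operation}'"
        return f"Result: {a} {operation} {b} = {value}"
    except ZeroDivisionError:
        return "Error: division by zero"

@mcp.tool()
def save_text_file(filename: str, content: str) -> str:
    """Persist text to disk so results survive across requests."""
    Path(filename).write_text(content)
    return f"Successfully saved content to {filename}"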
Production-Ready Patterns
- Error Handling - Graceful failure modes and user feedback
- Network Security - Local MCP servers combined with encrypted (HTTPS) calls to the Azure OpenAI API
- Conversation History - Stateful interactions across multiple requests (see the sketch after this list)
- Interactive Experience - Menu-driven exploration and learning
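For the conversation-history bullet, the idea is simply that the client keeps appending user, assistant, and tool messages to one list and sends the whole list on every request. A minimal sketch, with an illustrative trimming policy, looks like this:

# Sketch: stateful conversation history across requests (cap and trimming are illustrative)
history = [{"role": "system", "content": "You can use the MCP tools to answer questions."}]

def remember(message):
    """Append a message and trim the oldest turns to stay within the context window."""
    history.append(message)
    if len(history) > 40:                 # arbitrary cap for the sketch
        del history[1:3]                  # drop the oldest turn, keep the system prompt
    return history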
Real-World Example Workflow
Here's how a typical interaction works (a message-level view of the same exchange follows the steps):
1. User: "Calculate 15 * 8 and save the result to a file"
2. Azure OpenAI analyzes request and calls MCP tools:
→ calculate(operation="multiply", a=15, b=8)
→ save_text_file(filename="result.txt", content="120")
3. MCP Server executes tools and returns results:
← "Result: 15.0 multiply 8.0 = 120.0"
← "Successfully saved content to result.txt"
4. Azure OpenAI provides final response:
"I've calculated 15 × 8 = 120 and saved the result to result.txt"
Getting Started
Prerequisites
# Install dependencies
pip install openai mcp psutil python-dotenv
Interactive Experience (Recommended)
# Terminal 1: Start MCP server with enhanced tools
python azure_mcp_server.py
# Terminal 2: Run interactive Azure OpenAI client
python azure_openai_mcp_client_interactive.py
The interactive client provides:
- Menu-driven interface for easy exploration
- Real-time tool execution with detailed feedback
- Conversation history management
- Example conversations to learn the system
- Tool discovery to understand capabilities
Demo Mode
Even without Azure OpenAI credentials, the system runs in demo mode showing exactly how the integration would work, making it perfect for learning and experimentation.
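When you do have credentials, they are typically supplied via environment variables or a local .env file read with python-dotenv (already in the prerequisites). The variable names below are illustrative assumptions; match them to whatever the client actually reads.

# Sketch: loading Azure OpenAI settings with python-dotenv (variable names are assumptions)
import os
from dotenv import load_dotenv

load_dotenv()  # reads key=value pairs from a local .env file, if present

endpoint = os.getenv("AZURE_OPENAI_ENDPOINT")      # e.g. https://<resource>.openai.azure.com/
api_key = os.getenv("AZURE_OPENAI_API_KEY")
deployment = os.getenv("AZURE_OPENAI_DEPLOYMENT")  # your chat model deployment name

if not (endpoint and api_key):
    print("No Azure OpenAI credentials found - running in demo mode")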
Architecture Benefits
Separation of Concerns
- AI Intelligence: Azure OpenAI handles natural language understanding
- Tool Execution: MCP servers handle specialized operations
- Integration Logic: Python clients manage the bridge between systems
Scalability & Flexibility
- Multiple Tools: Easy to add new capabilities without retraining models
- Network Distribution: MCP servers can run anywhere accessible via HTTP
- Technology Agnostic: MCP servers can be written in any language
Security & Control
- Local Tool Execution: Sensitive operations remain on your infrastructure
- Encrypted Communication: Azure OpenAI API uses HTTPS
- Access Control: You control which tools are available to AI models
Next Steps
This implementation demonstrates the complete spectrum from basic MCP concepts to production-ready Azure OpenAI integration. The progressive learning approach ensures you understand:
- Fundamental MCP concepts and communication patterns
- Network-based architectures using web standards
- AI model integration with real-time tool capabilities
- Interactive user experiences that combine AI intelligence with live data
For detailed implementation guides, complete code examples, and step-by-step tutorials, refer to the comprehensive documentation and working code samples:
📖 Complete Documentation & Code: https://github.com/srinman/mcpdemo/blob/main/mcp_tutoring_basic_to_advanced.md
The repository includes:
- Working code examples for all three learning steps
- Detailed explanations of each component and concept
- Interactive clients for hands-on experimentation
- Production patterns for real-world deployment
- Testing utilities and validation scripts
Conclusion
Model Context Protocol bridges the gap between AI intelligence and real-world capabilities. This integration with Azure OpenAI demonstrates how modern AI applications can access live data, perform complex operations, and maintain state - transforming AI from a text generator into a capable assistant that can interact with your entire technology stack.
The three-step learning journey provides both conceptual understanding and practical implementation experience, enabling you to build sophisticated AI applications that combine the power of large language models with the flexibility of real-time tools and services.
In the next blog post, we will explore how to secure the MCP server using Microsoft Entra ID and Azure API Management (APIM).