Experiments

MCP Dual Interface Demo

Demonstration of dual-interface architecture where the same business logic serves both a traditional web application and an MCP server for AI tools.

growing · #mcp #python #fastapi #react #dynamodb #docker #ai-tools #architecture

The idea

Most teams build applications for humans and then, when they want to integrate AI, create a separate layer on top: wrappers, adapters, intermediary APIs. This creates coupling, duplication, and cascading failure points.

This experiment explores an alternative: building from day one with two interfaces that share the same business logic. One interface for humans (web app) and another for AI agents (MCP server), both accessing the same data layer directly.

Why it matters

The Model Context Protocol (MCP) is an open standard that allows LLMs to interact with external tools in a structured way. Instead of an AI agent having to navigate a UI or parse HTML, the MCP server exposes operations as typed tools that the model can invoke directly.

The key question this experiment answers is: can you design an application where the human interface and the AI interface are both first-class citizens from day one, without one being a patch on top of the other?

Architecture

graph TB
    subgraph "Human Interface"
        A[React Frontend] --> B[FastAPI REST API]
    end
    subgraph "AI Interface"
        C[Kiro CLI / LLM] --> D[MCP Server - stdio]
    end
    subgraph "Shared Layer"
        E[Business Logic<br/>todo_service.py]
    end
    subgraph "Data"
        F[(DynamoDB)]
    end
    B --> E
    D --> E
    E --> F

Key architectural principles:

  • Loose coupling: each service accesses DynamoDB directly, without depending on the other
  • Shared code: business logic lives in a shared/ module that both interfaces import
  • Independent scaling: the REST API and MCP server can scale separately
  • No cascading failures: if the REST API goes down, the MCP server keeps working and vice versa

Tech stack

Component       Technology
REST API        FastAPI + Python 3.11
MCP Server      Python MCP SDK (stdio transport)
Frontend        React 19 + TypeScript + Vite + Tailwind CSS
Database        DynamoDB Local
Shared Logic    Python module (shared/)
Orchestration   Docker Compose
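A hypothetical sketch of what the Compose file for this setup could look like. Service names, ports, and build paths are illustrative, not taken from the repo; only the `amazon/dynamodb-local` image name is a real, published image.

```yaml
# Hypothetical docker-compose.yml sketch (names and ports are assumptions).
services:
  dynamodb:
    image: amazon/dynamodb-local
    ports:
      - "8000:8000"

  backend:
    build: ./backend
    environment:
      DYNAMODB_ENDPOINT: http://dynamodb:8000
    ports:
      - "8080:8080"
    depends_on:
      - dynamodb

  frontend:
    build: ./frontend
    ports:
      - "5173:5173"
    depends_on:
      - backend

  # The MCP server speaks stdio and is launched on demand by the client
  # (e.g. Kiro CLI), so it is typically not a long-running Compose service.
```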

How it works

The shared/todo_service.py module contains all business logic: create, list, update, and delete tasks. Both the REST API (backend/main.py) and the MCP server (mcp-server/server.py) instantiate TodoService with the same DynamoDB table.

The REST API exposes conventional HTTP endpoints:

POST   /todos          → create task
GET    /todos          → list tasks
GET    /todos/{id}     → get task
PATCH  /todos/{id}     → update task
DELETE /todos/{id}     → delete task

The MCP server exposes the same operations as typed tools that an LLM can invoke:

create_todo   → create task
list_todos    → list tasks
get_todo      → get task
update_todo   → update task
delete_todo   → delete task

Both interfaces execute exactly the same logic. No business rule duplication.

AWS mapping

The local architecture maps directly to AWS services for production:

Local            AWS
DynamoDB Local   DynamoDB
FastAPI          Lambda + API Gateway or ECS/Fargate
MCP Server       Lambda or ECS Task
React Frontend   S3 + CloudFront or Amplify
Shared module    Lambda Layer or shared package

Lessons learned

  1. stdio transport is simple but limiting: it works well for local development with Kiro CLI, but production would need SSE or WebSocket to support multiple concurrent clients.

  2. The shared layer is the most valuable pattern: separating business logic into an interface-independent module is what makes the dual architecture possible without duplication.

  3. DynamoDB simplifies direct access: by not needing a complex ORM or migrations, both services can access the same table with minimal configuration.
