32dots HEIDELBERG AI
Session 23

MCP and connected systems

hard
  • Make one working connection via both OpenAPI HTTP and MCP, and compare
  • Read a system as {host, client, server, tool, resource} instead of "some boxes and arrows"
  • Articulate when MCP's standardisation earns its keep (vs. plain HTTP)

Students should understand how AI systems connect to tools and data in a standardized way. Connections are where half of production bugs live — getting fluent in the vocabulary pays off the first time something breaks at 2am.

  • If you swap an OpenAPI tool for an MCP tool with the same capability, what changes for the agent? What changes for the ops team?
  • Why might a tiny project not need MCP at all?

MCP (Model Context Protocol) separates the host application, clients, servers, tools, resources, and prompts. This makes it easier to design systems where models can reliably access external capabilities without ad-hoc integration logic. In practice today, most production glue is still OpenAPI + HTTP — MCP is the emerging standard; both are worth knowing.
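The role separation can be sketched in plain Python. This is a toy model for reading architectures, not the MCP SDK; every class and name here is illustrative:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Server:
    """An MCP server declares tools (actions) and resources (readable data)."""
    name: str
    tools: dict[str, Callable] = field(default_factory=dict)
    resources: dict[str, str] = field(default_factory=dict)

    def call_tool(self, tool: str, **kwargs):
        return self.tools[tool](**kwargs)

    def read_resource(self, uri: str) -> str:
        return self.resources[uri]

@dataclass
class Client:
    """One client maintains the connection to exactly one server."""
    server: Server

@dataclass
class Host:
    """The host application (chat UI, agent runtime) owns many clients."""
    clients: dict[str, Client] = field(default_factory=dict)

    def call(self, server_name: str, tool: str, **kwargs):
        return self.clients[server_name].server.call_tool(tool, **kwargs)

# Wire it up: one server exposing one tool and one resource.
rooms = Server(
    name="rooms",
    tools={"find_room": lambda capacity: f"INF-305 (seats {capacity})"},
    resources={"rooms://policy": "Book at most 14 days ahead."},
)
host = Host(clients={"rooms": Client(server=rooms)})
print(host.call("rooms", "find_room", capacity=30))
```

Reading a diagram as {host, client, server, tool, resource} means asking: which box is the Host, how many Clients does it hold, and does each arrow end in a tool call or a resource read?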

Rough rule: ≤3 tools and one team → OpenAPI is fine. Many tools, multiple owners, tools you want to share across products → MCP starts paying for itself.

scenario

You are designing the MCP tool set for your AI system. Which of these five tool definitions are well-designed, and which have problems?

Tool: search_pubmed(query: str, max_results: int=5) → list[{pmid, title, abstract}]
Tool: do_stuff(input: any) → any
Tool: send_email_to_all_patients(subject: str, body: str) → None # sends to every patient in DB
Tool: get_patient_labs(patient_id: str, date_from: str, date_to: str) → list[LabResult] # requires auth token
Tool: analyze_everything(text: str) → dict # runs all analysis pipelines in parallel
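A quick way to separate the good from the bad: a well-designed tool has typed, bounded parameters and one clearly scoped effect; `do_stuff` fails on both counts. A rough lint over simplified JSON-Schema-style descriptors (the field names and the lint rules are illustrative, not part of the MCP spec):

```python
# Two of the scenario's tools as simplified schema-style descriptors.
search_pubmed = {
    "name": "search_pubmed",
    "description": "Search PubMed and return pmid/title/abstract triples.",
    "parameters": {
        "query": {"type": "string"},
        "max_results": {"type": "integer", "default": 5},
    },
}

do_stuff = {
    "name": "do_stuff",
    "description": "",
    "parameters": {"input": {"type": "any"}},
}

def lint_tool(tool: dict) -> list[str]:
    """Flag the design smells discussed in the scenario above."""
    problems = []
    if not tool["description"]:
        problems.append("no description: the model cannot decide when to call it")
    for pname, spec in tool["parameters"].items():
        if spec.get("type") in (None, "any"):
            problems.append(f"untyped parameter '{pname}': no validation possible")
    return problems

print(lint_tool(search_pubmed))
print(lint_tool(do_stuff))
```

The remaining three tools fail on scope rather than schema: `send_email_to_all_patients` has an unbounded blast radius, and `analyze_everything` bundles unrelated actions the agent cannot choose between.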

A campus assistant could connect to course documents, room data, and scheduling tools through clean interfaces — each a separate server with declared tools and resources.

▸ Use the instructor's finished build before you build yours: feel what "done" looks like, then recreate it

Finished build: "MCP and connected systems". Use the app as your target and ask it a question about the topic.
Not loading? https://dify.32dots.de/chat/mTz7ZoTh5OptMzcT
n8n task: open in n8n (login: student@cos.32dots.de · cos2026)

Draw and prototype a connected system: an n8n workflow with one Source node (an HTTP call to an external API with an OpenAPI spec), one Model node (processes the retrieved data), and one Action node (writes the result somewhere). Then replace the Source with the MCP Client node (n8n 1.70+ ships one) pointed at a real MCP server and compare.

Nodes needed
  • Trigger
  • HTTP Request (OpenAPI-described endpoint) OR MCP Client node
  • Chat Model
  • Action node (Postgres / Sheets / Email)
  • Output
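The workflow's data flow, stripped to stubs. This is a sketch of the node chain, not working n8n code; the real nodes are configured in n8n's UI, and every function and payload field here is a placeholder:

```python
# Each function stands in for one n8n node. Swapping the Source between
# the HTTP Request node and the MCP Client node changes only source();
# the rest of the chain is untouched.

def trigger() -> dict:
    return {"question": "Which rooms seat 30 or more?"}

def source(payload: dict) -> dict:
    # HTTP Request node: GET against an OpenAPI-described endpoint,
    # OR MCP Client node: a tools/call against a real server. Stubbed.
    payload["retrieved"] = [{"room": "INF-305", "seats": 40}]
    return payload

def model(payload: dict) -> dict:
    # Chat Model node: turns the retrieved data into an answer. Stubbed.
    rooms = ", ".join(r["room"] for r in payload["retrieved"])
    payload["answer"] = f"Matching rooms: {rooms}"
    return payload

def action(payload: dict) -> dict:
    # Action node: Postgres insert / Sheets append / email send. Stubbed.
    payload["written"] = True
    return payload

result = action(model(source(trigger())))
print(result["answer"])
```

That single-substitution property is the point of the exercise: the agent-facing capability stays the same while the wire protocol underneath changes.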

MCP and connected-systems thinking is about gluing many external servers together — the *definition* of plumbing. n8n already has HTTP + native MCP support and lets you see the wire format (headers, bodies, errors). Dify's agents consume MCP tools but hide the wire details. When learning the connection layer, use n8n; when using the connection layer in a product, either tool works.

  • **MCP for MCP's sake** — adopting MCP for a 2-tool project adds ceremony without payoff. Standardise when the count justifies it.
  • **Not reading the wire** — if you never look at the request/response, you won't debug it. Log at least headers + body for one real call.
  • **Confusing tools with resources** — tools are *actions* (do something); resources are *data* (read something). The agent uses them differently.

Why do standards matter more as systems become more connected?

Design a campus-assistant architecture map with at least 3 tool servers (course docs, room lookup, scheduling) and prototype one end-to-end call.

Deliverable

Architecture diagram + one working MCP or HTTP-OpenAPI connection showing a real retrieval and a real action.

Which tool in your stack would save the most time if three other people in your lab could also call it via MCP — and what would stop them from using it today?