
Servers vs Tools

MCP Servers are long-running processes that expose multiple tools. Understand the difference — and explore six real server types.

MCP Servers vs Direct Tool Calls

An MCP Server is NOT the same as a single tool. A server is a persistent process that can expose many tools, maintain state across calls, and manage connections. Here's how they compare:

| | MCP Servers | Direct Tool Calls |
| --- | --- | --- |
| Lifecycle | Long-running process. Starts once, handles many requests. | One-off execution. Each call is independent. |
| Capabilities | Exposes multiple tools, resources, and prompts through a single server. | Single function with defined inputs and outputs. |
| State | Maintains state across calls. Database connections stay open. Context persists. | Stateless. Each invocation starts fresh with no memory of previous calls. |
| Discovery | Clients discover available tools automatically via the MCP protocol. | Tools must be explicitly defined and configured per integration. |
| Transport | Communicates over stdio (local) or Streamable HTTP (remote). | Varies: REST, SDK calls, or inline function execution. |
| Standard | Universal MCP protocol. Works with any MCP client. | Vendor-specific. Different format for each AI provider. |
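The State row is the most practical difference in day-to-day use. A minimal sketch of the two lifecycles in Python (sqlite3 stands in for any database; the function names are illustrative, not part of any SDK):

```python
import sqlite3

# Server-style: one long-lived connection, created at startup and
# reused by every tool call. State (the open connection, in-memory
# tables, caches) persists between calls.
server_conn = sqlite3.connect(":memory:")
server_conn.execute("CREATE TABLE notes (body TEXT)")

def server_tool_add_note(body: str) -> int:
    server_conn.execute("INSERT INTO notes VALUES (?)", (body,))
    return server_conn.execute("SELECT COUNT(*) FROM notes").fetchone()[0]

# Direct-call style: each invocation starts fresh. An in-memory
# database opened here is gone when the function returns.
def direct_call_add_note(body: str) -> int:
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE notes (body TEXT)")
    conn.execute("INSERT INTO notes VALUES (?)", (body,))
    return conn.execute("SELECT COUNT(*) FROM notes").fetchone()[0]

print(server_tool_add_note("a"), server_tool_add_note("b"))  # 1 2
print(direct_call_add_note("a"), direct_call_add_note("b"))  # 1 1
```

The server variant accumulates rows across calls because the process (and its connection) outlives each request; the direct variant cannot.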

Explore MCP Server Types

MCP servers come in many forms. Here are six common types, each exposing different capabilities:

- 📁 Filesystem: Read, write, search files
- 💾 Database: Query, insert, update data
- 🌐 API: Wrap any REST or GraphQL API
- 🖥️ Browser: Navigate, screenshot, interact
- 🧠 Memory: Persistent knowledge recall
- 🔍 Search: Web, docs, or code search
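The Filesystem type maps naturally onto a handful of small tools. A sketch of what those tools might look like as plain Python functions, before any MCP wrapping (names and behavior are illustrative):

```python
from pathlib import Path

def read_file(path: str) -> str:
    """Return the text contents of a file."""
    return Path(path).read_text(encoding="utf-8")

def write_file(path: str, content: str) -> int:
    """Write text to a file; return the number of characters written."""
    return Path(path).write_text(content, encoding="utf-8")

def search_files(root: str, needle: str) -> list[str]:
    """Return paths of files under `root` whose contents contain `needle`."""
    return [
        str(p) for p in Path(root).rglob("*")
        if p.is_file() and needle in p.read_text(encoding="utf-8", errors="ignore")
    ]
```

A filesystem server exposes functions like these as tools; because the server is one long-running process, it can also enforce a sandbox root once at startup instead of in every function.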

See the Difference in Code

Here is the same capability — letting AI read a file — implemented as a direct tool call versus an MCP server tool. Notice how the MCP version is standardized and self-documenting:

Direct Tool Call (OpenAI-style) — JSON

```json
// Vendor-specific format
// Different for every AI provider
{
  "type": "function",
  "function": {
    "name": "read_file",
    "parameters": {
      "type": "object",
      "properties": {
        "path": { "type": "string" }
      }
    }
  }
}
```

You define the schema. You parse the call. You execute the function. You return the result. Different format for each AI provider.

MCP Server Tool — TypeScript

```typescript
// Universal MCP standard
// Works with ANY MCP client
import { readFileSync } from "node:fs";
import { z } from "zod";
// `server` is an MCP server instance from the MCP TypeScript SDK

server.tool(
  "read_file",
  {
    path: z.string()
      .describe("File path to read"),
  },
  async ({ path }) => ({
    content: [{
      type: "text",
      text: readFileSync(path, "utf-8"),
    }],
  })
);
```

The SDK handles protocol, validation, and transport. Your code is just the business logic. Works with Claude, VS Code, Cursor — any MCP client.
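Under the hood, MCP is JSON-RPC 2.0, so the call above travels as ordinary messages over stdio or HTTP. A rough sketch of the wire shape (field values are illustrative, not a verbatim capture):

```python
import json

# What an MCP client sends to invoke a tool (the tools/call method).
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "read_file",
        "arguments": {"path": "notes.txt"},
    },
}

# What the server sends back: a content array, matching the
# { content: [{ type: "text", ... }] } shape the tool handler returns.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "content": [{"type": "text", "text": "...file contents..."}],
    },
}

wire = json.dumps(request)  # this string is what crosses stdio or HTTP
```

Because every client and server speaks this same message shape, tools are discoverable and callable without per-vendor glue code.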

The Same Tool, Two Approaches

Here is a file-reading capability implemented both ways in Python. Notice how the MCP version handles protocol, validation, and discovery automatically — your code is just the business logic:

Python — raw tool definition (vendor-specific)

```python
# You handle EVERYTHING: schema, parsing, execution, errors
tools = [{
    "type": "function",
    "function": {
        "name": "read_file",
        "description": "Read a file from disk",
        "parameters": {
            "type": "object",
            "properties": {
                "path": {"type": "string"}
            },
            "required": ["path"]
        }
    }
}]

def handle_tool_call(name, args):
    if name == "read_file":
        with open(args["path"]) as f:
            return f.read()
    # You must manually route every tool call
```
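In the raw approach you also write the receiving side yourself: parse the model's tool-call JSON, dispatch to the right function, and package the result. A self-contained sketch of that loop (the payload below is simulated, and the `echo` tool is hypothetical; real payloads differ by vendor, which is exactly the portability problem MCP removes):

```python
import json

def handle_tool_call(name: str, args: dict) -> str:
    # The manual router: one branch per tool you expose.
    if name == "echo":
        return args["text"]
    raise ValueError(f"unknown tool: {name}")

# A simulated provider response containing one tool call.
model_message = json.dumps({
    "tool_calls": [
        {"name": "echo", "arguments": json.dumps({"text": "hi"})}
    ]
})

results = []
for call in json.loads(model_message)["tool_calls"]:
    args = json.loads(call["arguments"])                  # you parse
    results.append(handle_tool_call(call["name"], args))  # you route and execute

print(results)  # ['hi']
```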
Python — MCP server (universal standard)

```python
from mcp.server.fastmcp import FastMCP

# The SDK handles protocol, validation, routing, transport
mcp = FastMCP("file-server")

@mcp.tool()
def read_file(path: str) -> str:
    """Read a file from disk"""
    with open(path) as f:
        return f.read()

# Works with Claude, VS Code, Cursor — any MCP client
mcp.run()
```

The raw approach requires you to define JSON schemas, parse arguments, route calls manually, and handle errors — all in a format that only works with one AI provider. The MCP server version is six lines of business logic. The SDK generates the schema from the type hints, handles transport, and works with every MCP client automatically.
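The schema-from-type-hints step is worth seeing concretely. A simplified sketch of how an SDK can derive a JSON schema from an annotated function (FastMCP's real implementation is more thorough; this is only illustrative):

```python
import inspect
from typing import get_type_hints

# Minimal mapping from Python annotations to JSON-schema types.
PY_TO_JSON = {str: "string", int: "integer", float: "number", bool: "boolean"}

def schema_for(func) -> dict:
    """Derive a minimal tool schema from a function's type hints."""
    hints = get_type_hints(func)
    hints.pop("return", None)
    sig = inspect.signature(func)
    return {
        "name": func.__name__,
        "description": inspect.getdoc(func) or "",
        "parameters": {
            "type": "object",
            "properties": {n: {"type": PY_TO_JSON[t]} for n, t in hints.items()},
            "required": [
                n for n in hints
                if sig.parameters[n].default is inspect.Parameter.empty
            ],
        },
    }

def read_file(path: str) -> str:
    """Read a file from disk"""
    ...

print(schema_for(read_file))
# {'name': 'read_file', 'description': 'Read a file from disk',
#  'parameters': {'type': 'object',
#                 'properties': {'path': {'type': 'string'}},
#                 'required': ['path']}}
```

The result is the same schema you would otherwise hand-write in the vendor-specific format above, generated once from the function signature.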

When to Build a Server vs Use Direct Tool Calling

Build a server when you want the tool to work across multiple AI clients, when you need persistent state (DB connections, sessions), or when you want automatic tool discovery.

Use direct calls when you are building a one-off integration for a single app, when you only need one or two simple functions, or when you are locked into a specific AI provider's SDK.