
@allenporter
Last active May 22, 2025 08:18

Home Assistant Model Context Protocol integration

TL;DR

Completing these steps gives you an LLM-powered web scraper in Home Assistant via the Model Context Protocol, along with an example of how you could build a template entity that extracts news headlines for a display.

Pre-requisites

This assumes you already know about the following:

  • Home Assistant
  • Voice Assistant with Conversation Agent (OpenAI, Google Gemini, Anthropic, etc)
  • Python virtual environment for running MCP server

Overview

This guide will get you an LLM-powered template entity that uses MCP to fetch external web pages:

```mermaid
graph LR
    A["Home Assistant LLM Conversation Agent"] <--> |sse| B["mcp-proxy"]
    B <--> |stdio| C["mcp-fetch MCP Server"]
    C <--> |http/s| D["web pages"]

    style A fill:#ffe6f9,stroke:#333,color:black,stroke-width:2px
    style B fill:#e6e6ff,stroke:#333,color:black,stroke-width:2px
    style C fill:#e6ffe6,stroke:#333,color:black,stroke-width:2px
    style D fill:#e6ffe6,stroke:#333,color:black,stroke-width:2px
```

Install MCP Proxy & Fetch MCP Server

Install the dependencies mcp-proxy and the MCP fetch server:

```shell
$ uv pip install mcp-proxy mcp-server-fetch
```

Most MCP servers are stdio based (e.g., spawned by Claude Desktop), so we need to run a server that exposes them to Home Assistant. We use mcp-proxy, which runs an SSE server and spawns the stdio MCP server that can fetch web pages. The proxy spawns the command without any environment variables, so we pass our PATH explicitly.

```shell
$ mcp-proxy --sse-port 42783 --env PATH "${PATH}" -- uv run mcp-server-fetch
...
INFO:     Uvicorn running on http://127.0.0.1:42783 (Press CTRL+C to quit)
```

The SSE server is now exposed at http://127.0.0.1:42783/sse. You can set flags to change the IP and port the proxy listens on.

Configure Model Context Protocol Integration

Important: This integration is currently available in the 2025.2.0b beta release.

Open your Home Assistant instance and start setting up a new integration.

Manually add the integration

Set the SSE Server URL to your MCP proxy server SSE endpoint e.g. http://127.0.0.1:42783/sse. Make sure the URL ends with /sse.

The integration will create a new LLM API called mcp-fetch that is available to conversation agents. It does not add any other entities or devices.

Configure Conversation Agent

  1. Navigate to your existing conversation agent integration and reconfigure it

  2. Set the LLM Control to mcp-fetch

  3. Update the prompt to something simple for now, such as: You are an agent for Home Assistant, with access to tools through an external server. This external server enables LLMs to fetch web page contents.

Screenshot of Conversation Agent Configuration

Try it Out

Open the conversation agent and ask it to fetch a web page:

Screenshots (2025-01-07) of the conversation agent fetching a web page

Prompt Engineering

Let's now experiment with using the tool to make a sensor. We should ask the model to respond more succinctly, and call it from Developer Tools. Here is a YAML example:

```yaml
action: conversation.process
data:
  agent_id: conversation.google_generative_ai
  text: >-
    Please visit bbc.com and summarize the first headline. Please respond with
    succinct output as the output will be used as headline for an eInk display
    with limited space.
```

We could improve this further by giving a few-shot set of example headlines; however, the model already follows the instructions reasonably well and produces a headline:

```yaml
response:
  speech:
    plain:
      speech: Trump threatens Greenland and Panama Canal.
      extra_data: null
  card: {}
  language: en
  response_type: action_done
  data:
    targets: []
    success: []
    failed: []
conversation_id: 01JH2B56RMR2DABQQGAAA9X95D
```
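To sketch the few-shot idea, the variation below seeds the prompt with example headlines in the desired style. The two example headline texts are invented for illustration, not model output:

```yaml
# Hypothetical few-shot variation of the action above.
action: conversation.process
data:
  agent_id: conversation.google_generative_ai
  text: >-
    Please visit bbc.com and summarize the first headline for an eInk
    display with limited space. Respond in the style of these examples:
    "Storm batters northern coast", "Markets rally on rate cut hopes".
```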

Template Entity

Below is an example template entity definition that uses Model Context Protocol tool calls to fetch a web page:

```yaml
template:
  - trigger:
      - platform: time_pattern
        hours: "/1"
      # Used for testing
      - platform: homeassistant
        event: start
    action:
      - action: conversation.process
        data:
          agent_id: conversation.google_generative_ai
          text: >-
            Please visit bbc.com and summarize the single most important headline.
            Please respond with succinct output as the output will be used as
            headline for an eInk display with limited space, so answers must be
            less than 200 characters.
        response_variable: headline
    sensor:
      - name: Headline
        # `state` is required for a template sensor; the headline itself
        # is kept in the `title` attribute.
        state: "{{ now().isoformat() }}"
        attributes:
          title: "{{ headline.response.speech.plain.speech }}"
        unique_id: "d3641cdf-aa9f-4169-acae-7f7ba989c492"
    unique_id: "272f0508-3e27-4179-9aca-06d8333874e7"
```
Screenshot of template entity output
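The prompt asks for fewer than 200 characters, but nothing enforces that limit. As a defensive sketch, the sensor's `title` attribute could truncate the reply with Jinja's `truncate` filter; this hardening is my addition, not part of the original template:

```yaml
# Hypothetical hardened version of the sensor attribute above.
sensor:
  - name: Headline
    attributes:
      # Cut the model's reply at 200 characters so an over-long
      # answer cannot overflow the eInk display.
      title: >-
        {{ headline.response.speech.plain.speech | truncate(200, true, '...') }}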

Now go out and make yourself an ESPHome E-ink display if you don't have one already!
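If you do, a minimal ESPHome sketch along these lines can pull the headline into the device. The entity ID and attribute name follow the template above; the font and display hardware config are assumed to exist elsewhere in your ESPHome YAML:

```yaml
# Import the headline from Home Assistant as a text sensor.
text_sensor:
  - platform: homeassistant
    id: headline
    entity_id: sensor.headline
    attribute: title

# Then render it inside your display component's lambda, e.g.:
#   it.print(0, 0, id(my_font), id(headline).state.c_str());
```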

@Hedda
Copy link

Hedda commented Apr 14, 2025

Any thoughts on whether support could be integrated for Google's new "A2A" (Agent2Agent) open protocol? It is an open protocol enabling communication and interoperability between opaque agentic applications.

@allenporter
Copy link
Author

I was reviewing their new Agent SDK but had not yet seen the agent to agent protocol. Will take a look.

@Hedda
Copy link

Hedda commented May 20, 2025

@allenporter FYI, Microsoft just announced at their Microsoft Build 2025 conference that the upcoming public preview of Windows 11 will get native support for the Model Context Protocol (MCP) to power the agentic ecosystem on Windows 11. That should practically guarantee that MCP will also be more broadly adopted soon:


MCP on Windows architecture

PS: Related to that news, I recommend listening to The Verge Decoder podcast interview with Microsoft's top AI leader Kevin Scott on the future of AI agents (in which he also highlights the importance of MCP):

@Hedda
Copy link

Hedda commented May 22, 2025

fabric (from @danielmiessler) is an open-source framework for augmenting humans using AI. It provides a modular framework for solving specific problems using a crowdsourced set of AI prompts that can be used anywhere.

Would be awesome to have integration(s) for Fabric's "Patterns" (curated crowdsourced prompts) so that AI agents / tooling in Home Assistant can use Fabric pattern prompts as tools.

FYI, someone has now begun working on an MCP server for Fabric to allow integrating Fabric AI capabilities into MCP-enabled tools. See here:
