Replies: 5 comments
-
For reference, MCP and other tooling for LLMs is, by the way, being worked on by allenporter -> https://github.com/allenporter
-
FYI, Paulus spoke about MCP during the Home Assistant 2025.2 release party video; check out his explanation (starting at timestamp 56:40). Their voice development team also spoke about it in the Voice Chapter 9 video. Also check out:
-
A2A (Agent2Agent Protocol) integration support might also be relevant as a complementary feature to MCP support. A2A is a new open protocol from Google that enables communication and interoperability between agentic AI applications, and it is designed to complement Anthropic's Model Context Protocol (MCP), which provides helpful tools and context to agents. Any thoughts on whether support for Google's "A2A" (Agent2Agent) open protocol could be integrated?

The A2A protocol is designed to address the challenges identified in deploying large-scale, multi-agent systems. It empowers developers to build agents capable of connecting with any other agent built on the protocol, and it offers users the flexibility to combine agents from various providers. Critically, businesses benefit from a standardized method for managing their agents across diverse platforms and cloud environments. Google believes this universal interoperability is essential for fully realizing the potential of collaborative AI agents. https://youtube.com/watch?v=rAeqTaYj_aI

While introducing A2A, Google claims that building an agentic AI system demands two layers: organizing what agents, tools, or users send into the model, and coordination between intelligent agents. MCP focuses on the first category, whereas A2A focuses on the second. By separating tools from agents, Google is able to position A2A as complementary to, rather than in competition with, MCP. https://youtu.be/vIfagfHOLmI?si=pKEOugt3oZJlWRaj https://www.youtube.com/watch?v=voaKr_JHvF4

From Google's announcement: "An open protocol enabling communication and interoperability between opaque agentic applications. One of the biggest challenges in enterprise AI adoption is getting agents built on different frameworks and vendors to work together. That's why we created an open Agent2Agent (A2A) protocol, a collaborative way to help agents across different ecosystems communicate with each other. Google is driving this open protocol initiative for the industry because we believe this protocol will be critical to support multi-agent communication by giving your agents a common language, irrespective of the framework or vendor they are built on. With A2A, agents can show each other their capabilities and negotiate how they will interact with users (via text, forms, or bidirectional audio/video), all while working securely together."

See A2A in action: watch this demo video to see how A2A enables seamless communication between different agent frameworks. Conceptual overview: the Agent2Agent (A2A) protocol facilitates communication between independent AI agents.
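For anyone curious what that looks like in practice, here is a rough Python sketch of the basic A2A client flow (fetch an Agent Card, then send a task). It is only an illustration: the localhost URL is made up, and the /.well-known/agent.json path, the tasks/send JSON-RPC method, and the message/parts shape follow the early public A2A draft, which may still change.

```python
# Illustrative A2A client flow against a hypothetical agent on localhost.
# Endpoint path, JSON-RPC method, and payload shape follow the early public
# A2A draft and may differ in the final specification.
import uuid

import requests

AGENT_URL = "http://localhost:10000"  # hypothetical A2A-capable agent

# 1. Discovery: fetch the Agent Card, which advertises the agent's name,
#    description, and supported capabilities.
card = requests.get(f"{AGENT_URL}/.well-known/agent.json", timeout=10).json()
print("Agent:", card.get("name"), "-", card.get("description"))

# 2. Collaboration: send the agent a task as a JSON-RPC request. The message
#    here carries a single text part; A2A also allows forms and audio/video.
task_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tasks/send",
    "params": {
        "id": str(uuid.uuid4()),
        "message": {
            "role": "user",
            "parts": [{"type": "text", "text": "Suggest an automation for my hallway lights."}],
        },
    },
}
result = requests.post(AGENT_URL, json=task_request, timeout=30).json()
print(result.get("result"))
```

The point of the sketch is the division of labour: the agent behind AGENT_URL would use MCP internally to reach its tools and data, while A2A is only the agent-to-agent wire format.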
-
For this AI Automation Suggester, a use case could be to use codename goose to generate automation code, since goose supports MCP servers? "goose is your on-machine AI agent, capable of automating complex development tasks from start to finish. More than just code suggestions, goose can build entire projects from scratch, write and execute code, debug failures, orchestrate workflows, and interact with external APIs - autonomously."
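To make that use case concrete, here is a minimal sketch (not actual AI Automation Suggester code) of how a suggestion feature could be exposed as an MCP tool using the official mcp Python SDK's FastMCP helper, so that an MCP client such as goose could call it. The suggest_automation tool name and its placeholder output are invented for illustration.

```python
# Minimal, hypothetical MCP server exposing a made-up "suggest_automation"
# tool via the official MCP Python SDK (pip install mcp). An MCP client such
# as goose could discover and call this tool.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("automation-suggester")  # hypothetical server name


@mcp.tool()
def suggest_automation(entity_id: str, goal: str) -> str:
    """Return a draft Home Assistant automation (YAML) for an entity and goal."""
    # Placeholder logic; a real implementation would call an LLM or the
    # integration's own suggestion engine here.
    return (
        "alias: Draft automation\n"
        "trigger:\n"
        "  - platform: state\n"
        f"    entity_id: {entity_id}\n"
        "action:\n"
        "  - service: persistent_notification.create\n"
        "    data:\n"
        f"      message: \"Goal: {goal}\"\n"
    )


if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio by default
```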
-
This has been on my list to try! Thanks for all the information. I'll see if I can get this working this weekend.
-
Would it, by the way, be a good idea if the AI Automation Suggester integration could be accessed via the Assist conversation agent using MCP? See the related discussion here:
Is anyone here involved in the development and/or testing of Model Context Protocol integration support for Home Assistant?
I recommend reading how the Open Home Foundation and Home Assistant's founders describe the idea/concept of AI agents for the smart home:
For reference, know that Home Assistant now has initial support for the Model Context Protocol (MCP), which will be key to agentic AI for it:
https://www.youtube.com/watch?v=VChRPFUzJGA&t=16s&ab_channel=JackHerrington
I noticed in the Home Assistant 2025.2 Beta (Release Candidate) release notes that Home Assistant 2025.2 will have initial support for it:
https://www.home-assistant.io/blog/2025/02/05/release-20252/#model-context-protocol
So, relevant to this is that the Home Assistant 2025.2 release added support for a "Model Context Protocol" (MCP) server and client, which will eventually be used to extend Home Assistant's AI capabilities through file access, database connections, API integrations, and other contextual services:
Model Context Protocol
This release adds the Model Context Protocol to Home Assistant thanks to Allen. Home Assistant can both be an MCP server and an MCP client. From the MCP website:
MCP is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications. Just as USB-C provides a standardized way to connect your devices to various peripherals and accessories, MCP provides a standardized way to connect AI models to different data sources and tools.
To give it a try yourself, check out this client demo.
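If you would rather poke at the new integration from Python than use the demo client, here is a rough sketch using the official mcp SDK. The /mcp_server/sse path and the long-lived access token header reflect my reading of the MCP Server integration docs, so verify both against your own install.

```python
# Rough sketch: list the tools Home Assistant's MCP Server integration exposes,
# using the official "mcp" Python SDK (pip install mcp). The endpoint path and
# bearer-token auth are assumptions based on the integration docs.
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

HA_SSE_URL = "http://homeassistant.local:8123/mcp_server/sse"  # assumed endpoint
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"


async def main() -> None:
    headers = {"Authorization": f"Bearer {TOKEN}"}
    async with sse_client(HA_SSE_URL, headers=headers) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")


asyncio.run(main())
```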
PS: Maybe someone here is working on a project for Model Context Protocol (MCP) server/proxy support and integration as an AI interface?