AI for iOS: MCP for Apple Intelligence on the way

Connecting AI systems and agent solutions is the goal of the Model Context Protocol (MCP) developed by Anthropic. With a little luck, the technology could soon arrive on the iPhone as part of iOS 26.1. Solid first clues have now turned up in the beta code of the update, which will probably be released in October. Besides iOS, it is also expected on the iPad (with iPadOS 26.1) and the Mac (with macOS 26.1), reports the Apple blog 9to5Mac, which discovered the code leaks. MCP makes it possible to connect AI systems with data sources and to break the isolation of individual LLMs. Its makers hope it will become an open standard similar to the web protocol HTTP.
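For readers unfamiliar with the protocol: MCP messages are plain JSON-RPC 2.0 exchanged between a client and a server. Here is a minimal sketch in Swift of the spec's standard `tools/list` request, with which a client asks a server what capabilities it offers. The struct and its name are ours for illustration; the method name and envelope fields come from the public MCP specification.

```swift
import Foundation

// Illustrative only: an MCP request is a JSON-RPC 2.0 message.
// Field names follow the public MCP spec; the transport
// (stdio, HTTP) is up to the host application.
struct MCPRequest: Encodable {
    var jsonrpc = "2.0"
    let id: Int
    let method: String
}

let listTools = MCPRequest(id: 1, method: "tools/list")
let encoder = JSONEncoder()
encoder.outputFormatting = [.sortedKeys]
let data = try! encoder.encode(listTools)
print(String(data: data, encoding: .utf8)!)
// {"id":1,"jsonrpc":"2.0","method":"tools/list"}
```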

Apple apparently wants to integrate MCP into its App Intents framework. This would allow AI systems to interact with applications – in particular an improved Siri voice assistant. That also includes reading app states. For this to work, however, developers have to adopt the framework, and since Apple initially postponed the context-sensitive Siri to the coming year, little has happened here so far. App Intents can also work with Spotlight search, be used in the Shortcuts app, and even intercept hardware interactions.
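For context, this is roughly what an App Intent looks like with the framework as it ships today. The intent itself is a made-up example (name and logic are ours), but `AppIntent`, `@Parameter`, and `perform()` are the existing API that Shortcuts and Spotlight already invoke:

```swift
import AppIntents

// A minimal App Intent – the framework the leaked MCP hooks would
// reportedly build on. Shortcuts, Spotlight and (eventually) Siri
// can invoke it. The intent itself is a hypothetical example.
struct CheckOrderStatusIntent: AppIntent {
    static var title: LocalizedStringResource = "Check Order Status"

    @Parameter(title: "Order Number")
    var orderNumber: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would query its own data here; this stub just echoes.
        return .result(dialog: "Order \(orderNumber) is on its way.")
    }
}
```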

With MCP, developers could reach significantly more than just Siri and/or Apple Intelligence. At the system level, app activities and the availability of data and functions would ultimately be usable by any AI platform that supports MCP. Besides Anthropic's Claude, that includes OpenAI's ChatGPT as well as various other large language models and AI applications.
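How a system-level MCP bridge would actually expose such an intent is not something the leak reveals. Purely as a speculative sketch: an MCP server answering `tools/list` might describe the example intent above as a tool like this. The envelope and tool schema follow the public MCP spec; the intent-to-tool mapping is our assumption.

```swift
// Speculative: what a system-level MCP server might return if it
// exposed the App Intent above as a tool. Only the JSON-RPC envelope
// and the tool schema (name, description, inputSchema) are from the
// MCP spec; the mapping itself is an assumption.
let toolsListResponse = """
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [{
      "name": "CheckOrderStatusIntent",
      "description": "Check Order Status",
      "inputSchema": {
        "type": "object",
        "properties": { "orderNumber": { "type": "string" } },
        "required": ["orderNumber"]
      }
    }]
  }
}
"""
```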

However, the code leak does not mean that MCP will ship directly with iOS 26.1 and its siblings – it could merely be early groundwork until the context-sensitive Siri (hopefully) arrives next spring. "The code found today points to very early MCP support," writes 9to5Mac. It could still "take a while" until the integration is available – and an official announcement is probably even further off.

Apple is currently pursuing a multi-track strategy: Apple Intelligence is based on the company's own language models, but users can alternatively call on ChatGPT. Google's Gemini could also become available in the foreseeable future. In the meantime, there have been reports that Apple will source the LLM for the context-sensitive Siri (or its chatbot-like successor) directly from Google.

