Earlier this year, at WWDC 2025, Apple introduced its Foundation Models framework, which lets developers power features in their apps with the company's on-device AI models.
The company said the framework gives developers access to its AI models without incurring any inference costs. These local models also support capabilities such as guided generation and tool calling.
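For a sense of what this looks like in code, here is a minimal Swift sketch of calling the framework, based on the API Apple presented at WWDC 2025 (SystemLanguageModel, LanguageModelSession); the instructions and prompt are illustrative and not taken from any of the apps below.

```swift
import FoundationModels

func generateWelcomeLine() async throws -> String {
    // The on-device model can be unavailable (unsupported device,
    // Apple Intelligence turned off, or the model still downloading).
    guard case .available = SystemLanguageModel.default.availability else {
        return "On-device model is not available on this device."
    }

    // A session holds conversation state; instructions steer the model's behavior.
    let session = LanguageModelSession(
        instructions: "You write short, friendly copy for a productivity app."
    )

    // Inference runs locally, so there is no per-request API cost.
    let response = try await session.respond(to: "Write a one-line welcome message.")
    return response.content
}
```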
With iOS 26 now rolling out to all users, developers have been updating their apps to include features powered by Apple's local AI models. Apple's models are small compared with the leading models from OpenAI, Anthropic, Google, or Meta, so these local features tend to be quality-of-life improvements rather than major changes to an app's workflow.
Below are some of the first apps that use Apple's AI framework.
The Lil Artist app offers various interactive experiences that help kids learn different skills, such as creativity, math, and music. Developer Arima Jain shipped an AI story creator with the iOS 26 update: users pick a character and a theme, and the app generates a story with AI. The text generation in the story is powered by the local model, the developer said.
The developer is also working on a prototype for the Daily Planner app that automatically suggests emojis for timeline events based on their titles.
The finance tracking app MoneyCoach has two neat features powered by the local models. First, the app shows spending insights, such as whether you spent more than average on groceries in a given week. The second feature automatically suggests a category and subcategory for an expense, making for quicker entries.
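MoneyCoach hasn't shared its implementation, but a structured suggestion like this is a natural fit for the framework's guided generation, where a @Generable type constrains the model's output. The ExpenseSuggestion type and prompt below are hypothetical.

```swift
import FoundationModels

// Hypothetical structured output for an expense suggestion.
@Generable
struct ExpenseSuggestion {
    @Guide(description: "A top-level spending category, e.g. Groceries or Transport")
    var category: String

    @Guide(description: "A more specific subcategory, e.g. Coffee shops")
    var subcategory: String
}

func suggestCategory(for entry: String) async throws -> ExpenseSuggestion {
    let session = LanguageModelSession(
        instructions: "You categorize personal expenses."
    )
    // Guided generation: the model fills in the @Generable type
    // instead of returning free-form text.
    let response = try await session.respond(
        to: "Categorize this expense: \(entry)",
        generating: ExpenseSuggestion.self
    )
    return response.content
}
```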

This word-learning app has added two new modes built on Apple's AI models. A new learning mode uses the local model to create example sentences for a word, and then prompts users to explain the word's usage in a sentence of their own.

The developer also uses the on-device models to generate a map view of a word's origin.

Just like a few other apps, the Tasks app has implemented a feature that automatically suggests tags for an entry with the help of the local models. The models are also used to detect a recurring task and schedule it accordingly. The app additionally lets users speak a few things and uses the local model to break them down into separate tasks, all without using the internet.
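Here is a rough sketch of how a spoken note might be broken into separate tasks with guided generation, assuming the transcript has already been produced (for example by Apple's Speech framework); the TaskItem and TaskBreakdown types are hypothetical, not the app's actual code.

```swift
import FoundationModels

// Hypothetical output types: the transcript is broken into individual tasks.
@Generable
struct TaskItem {
    @Guide(description: "A short, actionable task title")
    var title: String
}

@Generable
struct TaskBreakdown {
    @Guide(description: "The separate tasks mentioned in the transcript")
    var tasks: [TaskItem]
}

func splitIntoTasks(transcript: String) async throws -> [TaskItem] {
    let session = LanguageModelSession(
        instructions: "Split the user's note into individual to-do items."
    )
    // Everything runs on device, so the breakdown also works offline.
    let response = try await session.respond(
        to: transcript,
        generating: TaskBreakdown.self
    )
    return response.content.tasks
}
```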

Day One, Automattic's journaling app, uses Apple's models to surface highlights and suggest titles for your entries. The team has also implemented a feature that generates prompts nudging users to dive deeper and write more about what they have already written.

The recipe app uses Apple Intelligence to suggest tags for a recipe and to assign names to timers. AI is also used to break a block of text down into easy-to-follow cooking steps.
The digital signing app uses Apple's local models to extract key insights from a contract and give users a summary of the document they are signing.
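The signing app's actual pipeline isn't public; this is a hedged sketch of how a summary with key points could be requested from the on-device model, assuming the contract text has already been extracted, and keeping in mind that a long contract may need to be chunked to fit the local model's limited context window.

```swift
import FoundationModels

// Hypothetical summary shape for a document being signed.
@Generable
struct ContractSummary {
    @Guide(description: "A two or three sentence plain-language summary")
    var overview: String

    @Guide(description: "The most important obligations, dates, and amounts")
    var keyPoints: [String]
}

func summarize(contractText: String) async throws -> ContractSummary {
    let session = LanguageModelSession(
        instructions: "Summarize legal documents in plain language for a non-lawyer."
    )
    let response = try await session.respond(
        to: "Summarize this contract:\n\(contractText)",
        generating: ContractSummary.self
    )
    return response.content
}
```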

