iPhone analytics data will in future feed into the further development of Apple Intelligence. To better understand how customers actually use the new AI features, Apple wants to derive trends and insights from "real user data", the company announced on its machine learning blog. Ultimately this is meant to make Apple Intelligence more capable. To avoid capturing real emails, texts and other inputs, privacy-preserving techniques are used, Apple emphasized, so that insights into the behavior of individual users are ruled out.
Differential privacy is meant to protect user data
Apple continues to rely on "differential privacy". The mathematical technique makes it possible to analyze large data sets for patterns and trends without gaining specific insights into the inputs of individual users.
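Apple does not spell out its exact mechanism in the blog post, but the core idea of local differential privacy can be illustrated with a simple randomized-response sketch: each device adds noise to its own report before anything leaves the phone, and the server can only recover aggregate trends. The function names, candidate list and epsilon value below are illustrative assumptions, not Apple's implementation.

```python
import math
import random
from collections import Counter

def randomized_response(true_value: str, candidates: list[str], epsilon: float) -> str:
    """Locally privatize one categorical report (sketch, not Apple's actual mechanism).

    With probability p the device reports its true value, otherwise a uniformly
    random *other* candidate. Smaller epsilon means more noise and stronger privacy.
    """
    k = len(candidates)
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if random.random() < p:
        return true_value
    return random.choice([c for c in candidates if c != true_value])

def estimate_counts(reports: list[str], candidates: list[str], epsilon: float) -> dict[str, float]:
    """Debias the noisy reports on the server to estimate the true frequencies."""
    k = len(candidates)
    n = len(reports)
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    q = (1 - p) / (k - 1)
    observed = Counter(reports)
    return {c: (observed[c] - n * q) / (p - q) for c in candidates}

if __name__ == "__main__":
    candidates = ["tennis", "dinner", "invoice", "travel"]
    true_data = ["tennis"] * 600 + ["dinner"] * 300 + ["invoice"] * 100
    noisy = [randomized_response(v, candidates, epsilon=2.0) for v in true_data]
    print(estimate_counts(noisy, candidates, epsilon=2.0))
```

Running the example shows that the server's estimates land close to the true counts in aggregate, even though any individual noisy report reveals almost nothing about the device that sent it.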
To improve the summary features in Apple Mail and Messages (iMessage), a multi-stage procedure is used: With a language model, Apple generates synthetic, i.e. artificially created, texts on common topics, for example a message along the lines of "Would you like to play tennis tomorrow at 11:30 a.m.?". Elements of these synthetic messages are then compared with users' real emails on their devices. The operating system determines which synthetically generated message comes closest to an actual email from the user's inbox, as the company explains. The trends captured this way are meant to yield better training data and ultimately make Apple's language models more useful in everyday tasks, above all by producing more sensible summaries.
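Apple describes this flow only at a high level. A minimal sketch of the on-device matching step might look like the following; the toy bag-of-words "embedding", the candidate list and the reporting step are assumptions for illustration, whereas Apple would use its own on-device representations and a differentially private upload.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding'; Apple would use a real on-device model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def closest_synthetic(user_emails: list[str], synthetic: list[str]) -> int:
    """Return the index of the synthetic message that best matches any local email.

    Only this index (after further privatization) would ever leave the device;
    the emails themselves stay local.
    """
    best_index, best_score = 0, -1.0
    for i, s in enumerate(synthetic):
        s_vec = embed(s)
        score = max(cosine(embed(e), s_vec) for e in user_emails)
        if score > best_score:
            best_index, best_score = i, score
    return best_index

if __name__ == "__main__":
    synthetic = [
        "Would you like to play tennis tomorrow at 11:30 a.m.?",
        "Your invoice for April is attached.",
        "Dinner reservation confirmed for Friday evening.",
    ]
    local_emails = ["Hi! Are you free for tennis tomorrow around noon?"]
    idx = closest_synthetic(local_emails, synthetic)
    # In practice the chosen index would still be noised before upload.
    print("closest synthetic message:", idx)
```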
For the AI-generated emoji ("Genmoji"), Apple also analyzes the commands or prompts that users enter to create the little images. Here, too, differential privacy is used to prevent prompts from being traced back to individual users, as the company writes.
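How popular prompts can be counted without any device revealing its own prompt can be sketched with a simple noisy-bit-vector scheme in the spirit of Apple's published work; the bucket count, flip probability and hashing below are illustrative assumptions, not the mechanism Apple actually ships.

```python
import hashlib
import random

NUM_BUCKETS = 64   # size of the hashed prompt space (illustrative)
FLIP_PROB = 0.25   # probability of flipping each bit before upload (illustrative)

def bucket(prompt: str) -> int:
    """Hash a free-text prompt into a fixed bucket so the raw text never leaves the device."""
    return int(hashlib.sha256(prompt.lower().encode()).hexdigest(), 16) % NUM_BUCKETS

def privatized_report(prompt: str) -> list[int]:
    """One-hot encode the bucket, then flip every bit with probability FLIP_PROB."""
    bits = [0] * NUM_BUCKETS
    bits[bucket(prompt)] = 1
    return [b ^ 1 if random.random() < FLIP_PROB else b for b in bits]

def estimate_bucket_counts(reports: list[list[int]]) -> list[float]:
    """Aggregate the noisy reports and debias them to estimate per-bucket popularity."""
    n = len(reports)
    f = FLIP_PROB
    sums = [sum(r[i] for r in reports) for i in range(NUM_BUCKETS)]
    # E[sum_i] = n_i*(1-f) + (n-n_i)*f  =>  n_i = (sum_i - n*f) / (1 - 2f)
    return [(s - n * f) / (1 - 2 * f) for s in sums]

if __name__ == "__main__":
    prompts = ["cowboy frog"] * 500 + ["dinosaur in sunglasses"] * 200 + ["sad robot"] * 50
    reports = [privatized_report(p) for p in prompts]
    estimates = estimate_bucket_counts(reports)
    for p in ("cowboy frog", "dinosaur in sunglasses", "sad robot"):
        print(p, round(estimates[bucket(p)]))
```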
These features are tied to the analytics data that Apple's operating systems, in this case iOS, iPadOS and macOS, send to the manufacturer. Users are typically asked during device setup or after an update whether they want to share this data with Apple. The analysis of Genmoji prompts is already active in a beta, according to Apple; this presumably refers to the current beta versions of iOS 18.5 and macOS 15.5. The email comparison is set to follow soon. Other Apple Intelligence features such as image generation with Image Playground, the Writing Tools, the creation of Memories in the Photos app and Visual Intelligence will also rely on this method, according to Apple.
Problems with Apple Intelligence
Apple Intelligence's summaries of messages and notifications in particular regularly show problems. For news apps, the company even temporarily switched off the feature after Apple's language models repeatedly produced false information. So far, the opportunities for Apple's AI to learn have been very limited, since many functions run purely on-device and Apple's AI cloud is not supposed to store any data. Users can only give feedback manually and pass specific details about errors on to Apple.
Apple has been relying on differential privacy for almost ten years to analyze user data at scale, for example to determine popular emoji or to improve the word suggestions of the iOS keyboard. Anyone who does not want to send analytics data to Apple should check under "Settings > Privacy & Security > Analytics & Improvements" whether "Share iPhone Analytics" is enabled there, and switch it off if necessary.