Apple’s complicated plan to improve its AI while protecting privacy

  • Apple has developed a new AI training system that improves its models without accessing user data or copying it from devices.
  • The system compares a synthetic dataset against samples of recent emails or messages on the devices of users who have opted into Apple’s Device Analytics program, determining which synthetic inputs are closest to real data.
  • Apple will then use the most frequently picked fake samples to improve its AI text outputs, such as email summaries, without accessing user data.
  • The new system is being introduced in beta versions of iOS 18.5, iPadOS 18.5, and macOS 15.5, marking an effort by Apple to turn around its struggling AI features.
  • Apple’s use of differential privacy will help keep user data private, as the company introduces randomized information into its broader dataset to prevent linking data to any one person.

Apple says it’s found a way to make its AI models better without training on its users’ data or even copying it from their iPhones and Macs. In a blog post first reported on by Bloomberg, the company outlined its plans to have devices compare a synthetic dataset to samples of recent emails or messages from users who have opted into its Device Analytics program.

Apple devices will be able to determine which synthetic inputs are closest to real samples, which they will relay to the company by sending “only a signal indicating which of the variants is closest to the sampled data.” That way, according to Apple, it doesn’t access user data, and the data never leaves the device. Apple will then use the most frequently picked fake samples to improve its AI text outputs, such as email summaries.
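The selection step Apple describes amounts to an on-device nearest-neighbor comparison: the device scores each synthetic variant against a locally sampled message and reports only the index of the winner. The Swift sketch below illustrates that idea under stated assumptions; the function names, the use of cosine similarity over precomputed embeddings, and the shape of the report are illustrative guesses, not Apple’s actual API or implementation.

```swift
import Foundation

// Hypothetical sketch of picking the closest synthetic variant on-device.
// Assumes embeddings for the local sample and each synthetic variant have
// already been computed by some model; only an index is ever returned.

/// Cosine similarity between two embedding vectors of equal length.
func cosineSimilarity(_ a: [Double], _ b: [Double]) -> Double {
    let dot = zip(a, b).reduce(0.0) { $0 + $1.0 * $1.1 }
    let normA = sqrt(a.reduce(0.0) { $0 + $1 * $1 })
    let normB = sqrt(b.reduce(0.0) { $0 + $1 * $1 })
    return dot / (normA * normB)
}

/// Returns only the index of the synthetic variant closest to the local
/// sample, matching the "signal indicating which of the variants is closest"
/// that Apple describes. The sample itself never leaves the device.
func closestVariantIndex(sampleEmbedding: [Double],
                         syntheticEmbeddings: [[Double]]) -> Int {
    var bestIndex = 0
    var bestScore = -Double.infinity
    for (index, candidate) in syntheticEmbeddings.enumerated() {
        let score = cosineSimilarity(sampleEmbedding, candidate)
        if score > bestScore {
            bestScore = score
            bestIndex = index
        }
    }
    return bestIndex
}
```

On the server side, Apple would then only see which indices are reported most often across many devices, which is what lets it favor the most representative synthetic samples without ever receiving the underlying messages.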

Currently, Apple trains its AI models on synthetic data only, potentially resulting in less helpful responses, according to Bloomberg’s Mark Gurman. Apple has struggled with the launch of its flagship Apple Intelligence features, as it pushed back the launch of some capabilities and replaced the head of its Siri team.

But now, Apple is trying to turn things around by introducing its new AI training system in the beta versions of iOS 18.5, iPadOS 18.5, and macOS 15.5, according to Gurman.

Apple has been talking up its use of a method called differential privacy to keep user data private since at least 2016, with the launch of iOS 10, and has already used it to improve the AI-powered Genmoji feature. The same approach applies to the company’s new AI training plans: Apple says that introducing randomized information into a broader dataset will help prevent it from linking data to any one person.
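One textbook way to "introduce randomized information" before a report leaves a device is local differential privacy via randomized response: each device occasionally lies about its answer, so no single report can be tied to a person, while aggregate counts across many devices remain useful. The sketch below shows the general k-ary randomized response mechanism as an illustration; it is not Apple’s specific implementation, and the epsilon parameter and function name are assumptions.

```swift
import Foundation

/// k-ary randomized response: report the true variant index with probability
/// exp(epsilon) / (exp(epsilon) + k - 1); otherwise report one of the other
/// k - 1 indices uniformly at random. Smaller epsilon adds more noise and
/// gives stronger privacy; larger epsilon preserves more signal.
func privatizedReport(trueIndex: Int, variantCount: Int, epsilon: Double) -> Int {
    guard variantCount > 1 else { return trueIndex }
    let k = Double(variantCount)
    let keepProbability = exp(epsilon) / (exp(epsilon) + k - 1)
    if Double.random(in: 0..<1) < keepProbability {
        return trueIndex
    }
    // Report a uniformly random index other than the true one.
    var reported = Int.random(in: 0..<(variantCount - 1))
    if reported >= trueIndex { reported += 1 }
    return reported
}
```

Because the collecting server knows the noise rate, it can statistically correct the aggregated counts and still identify which synthetic variants were picked most often, without being able to trust or trace any individual device's report.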


Q. What is Apple’s new plan to improve its AI models while protecting user privacy?
A. Apple plans to have devices compare a synthetic dataset to samples of recent emails or messages from users who have opted into its Device Analytics program, then use the most frequently picked synthetic samples to improve its models.

Q. How does Apple’s new system work in terms of data access and sharing?
A. Apple devices will only send a signal indicating which synthetic input is closest to the sampled data, without accessing user data or sending it to the company.

Q. Why did Apple struggle with the launch of its flagship AI features?
A. According to the article, Apple pushed back the launch of some Apple Intelligence capabilities and replaced the head of its Siri team; training its models on synthetic data alone may also have produced less helpful responses.

Q. What method is Apple using to keep user data private in its new AI training plans?
A. Apple is using a method called differential privacy, which introduces randomized information into a broader dataset to prevent linking data to any one person.

Q. When was Apple first talking about using differential privacy to protect user data?
A. Apple has been talking up its use of differential privacy since at least 2016 with the launch of iOS 10.

Q. Has Apple used differential privacy before in any other features or applications?
A. Yes, Apple has already used differential privacy to improve the AI-powered Genmoji feature.

Q. What is the goal of introducing randomized information into a broader dataset for Apple’s new AI training plans?
A. The goal is to prevent Apple from linking data to any one person and maintain user privacy.

Q. How will Apple use the most frequently picked fake samples to improve its AI text outputs?
A. The most frequently picked fake samples serve as stand-ins for real user data, and Apple will use them to improve its AI text outputs, such as email summaries, without accessing the underlying messages.

Q. What devices will be able to participate in Apple’s new AI training system?
A. Devices running the beta versions of iOS 18.5, iPadOS 18.5, and macOS 15.5 will be able to participate in Apple’s new AI training system.

Q. Why is Apple introducing its new AI training system in a beta version?
A. Apple is introducing its new AI training system in a beta version to test and refine the technology before wider release.