A few months ago, Apple hosted the Workshop on Privacy-Preserving Machine Learning, which featured presentations and discussions on privacy, security, and other key areas in responsible machine learning development. Now, it has made the presentations public. Here are three highlights.
As it did recently with the presentations from the 2024 Workshop on Human-Centered Machine Learning, Apple published a post in its Machine Learning Research blog with a few videos and a long list of studies and papers that were presented during the two-day hybrid event held on March 20–21, 2025.
Quick note on differential privacy
Interestingly, most (if not all) of the papers touch on differential privacy, which has been Apple’s preferred method in recent years for protecting user data that has to reach its servers (despite some criticism).
Very basically, differential privacy adds noise to user data before it’s uploaded, so the real data can’t be traced back to an individual even if it’s intercepted or analyzed.
Here’s how Apple frames it:
The differential privacy technology used by Apple is rooted in the idea that statistical noise that is slightly biased can mask a user’s individual data before it is shared with Apple. If many people are submitting the same data, the noise that has been added can average out over large numbers of data points, and Apple can see meaningful information emerge.
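To make that idea concrete, here’s a minimal sketch using randomized response, a classic local differential privacy mechanism. This is a simplified illustration of the general technique, not Apple’s actual implementation: each device reports a deliberately noised bit, and because the bias of the noise is known, it can be inverted across a large population.

```python
import random

def randomize(truth: bool, p_keep: float = 0.75) -> bool:
    """Report the true bit with probability p_keep, otherwise flip it.
    Any single report is plausibly deniable."""
    return truth if random.random() < p_keep else not truth

def estimate_rate(reports: list[bool], p_keep: float = 0.75) -> float:
    """Invert the known noise bias to recover the population-level rate."""
    observed = sum(reports) / len(reports)
    # observed = true_rate * p_keep + (1 - true_rate) * (1 - p_keep)
    return (observed - (1 - p_keep)) / (2 * p_keep - 1)

# Simulate 100,000 users, 30% of whom truly have some attribute.
truths = [random.random() < 0.30 for _ in range(100_000)]
reports = [randomize(t) for t in truths]
print(f"estimated rate: {estimate_rate(reports):.3f}")  # ~0.30
```

No individual report can be trusted, but the aggregate estimate lands very close to the true 30% once enough people contribute, which is exactly the trade-off Apple describes.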
Three studies Apple showcased from the event
1: Local Pan-Privacy for Federated Analytics, presented by Guy Rothblum (Apple)
This study, published on March 14, builds on an earlier study from 2010; Rothblum co-authored both.
While the 2010 study investigated a way to keep information private even if an analytics system or server was compromised, this new study applies that idea to personal devices.

In a nutshell, the study shows that on a device that may be improperly accessed multiple times (like a compromised shared computer), conventional approaches can’t collect usage data without opening the door to privacy risks. The researchers propose new encryption-based methods that still let companies gather useful aggregate statistics while keeping each individual’s activity completely hidden.
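For a flavor of what “pan-private” device state could look like, here’s a hypothetical sketch (the class name and parameters are mine, not the paper’s construction): instead of recording whether an event truly happened, the device stores a bit that is already randomized, so even repeated snapshots by an intruder reveal very little.

```python
import random

class PanPrivateFlag:
    """Tracks 'did event X ever happen?' without ever storing the true answer."""

    def __init__(self, p_keep: float = 0.75):
        self.p_keep = p_keep
        # Start from a uniformly random bit, so a freshly created state
        # is indistinguishable from one that has seen activity.
        self.state = random.random() < 0.5

    def record_event(self) -> None:
        # Redraw the bit with a bias toward True; the stored state
        # never deterministically reflects real activity.
        self.state = random.random() < self.p_keep

    def report(self) -> bool:
        return self.state

# Across many devices, a server can still estimate how many users saw the
# event by inverting the bias: f = (observed - 0.5) / (p_keep - 0.5).
```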
2: Scalable Private Search with Wally, presented by Rehan Rishi and Haris Mughees (Apple)
This is a very interesting presentation that explores how Apple keeps encrypted search at scale both private and affordable.
For instance, when a user takes a photo of a landmark, Apple processes that photo on its servers to identify the landmark. That back-and-forth with Apple’s servers raises privacy concerns.
The paper describes a method called Wally, which relies on differential privacy: the device sends out the real query surrounded by fake ones. The scheme is structured so that as more people query the server during the same time window, the number of fake queries each person needs to send drops significantly.

This means Apple can keep queries private while scaling to millions of users without ballooning bandwidth or compute costs.
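Here’s a toy illustration of that scaling behavior. The sizing rule and the constant below are invented for illustration, not Wally’s actual noise calibration: the intuition is that a time window needs a roughly fixed pool of cover traffic for a given privacy budget, so each user’s share shrinks as participation grows.

```python
import math

def dummy_queries_per_user(epsilon: float, n_users: int) -> int:
    """Toy sizing rule: a query window needs a roughly fixed pool of
    fake queries for a given privacy budget, shared by all users."""
    total_needed = math.ceil(1_000 / epsilon)  # illustrative constant
    return math.ceil(total_needed / n_users)

for n in (10, 100, 1_000):
    print(f"{n:>5} users -> {dummy_queries_per_user(1.0, n)} fake queries each")
# 10 users -> 100 each, 100 users -> 10 each, 1000 users -> 1 each
```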
3: Differentially Private Synthetic Data via Foundation Model APIs, presented by Sivakanth Gopi (Microsoft Research)
This was one of two papers presented by Microsoft researchers. While this particular one deals with images, the second paper deals with text. But the gist is the same.
AI companies need good data to train their models on. Sometimes, good “real-world” data is just regular user data, and it should be kept private.
The two studies explore how to generate high-quality synthetic data that preserves the utility of real-world user data, but without exposing anything actually private.

Basically, they introduce a method called Private Evolution (PE) that guides API-only foundation models to generate synthetic versions of private datasets that are similar to the real data, but without needing any model training or internal access. PE can match or even outperform state-of-the-art approaches that rely on direct model finetuning, but at a fraction of the privacy cost.
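Here’s a hedged sketch of that loop in Python. The functions `api_generate`, `api_vary`, and `similarity` stand in for the foundation model’s generation and variation APIs and an embedding distance; they are placeholders of mine, not the paper’s exact interfaces.

```python
import random

def private_evolution(private_data, api_generate, api_vary, similarity,
                      n_samples=100, n_iters=10, noise_scale=1.0):
    # 1. Start from unconditional API samples (no private data involved).
    synthetic = [api_generate() for _ in range(n_samples)]
    for _ in range(n_iters):
        # 2. Each private record "votes" for its nearest synthetic sample;
        #    adding noise to the vote histogram is what makes this step
        #    differentially private.
        votes = [0.0] * n_samples
        for record in private_data:
            nearest = max(range(n_samples),
                          key=lambda i: similarity(record, synthetic[i]))
            votes[nearest] += 1
        votes = [v + random.gauss(0, noise_scale) for v in votes]
        # 3. Resample the population in proportion to (noisy) popularity,
        #    then ask the API for variations of the survivors.
        weights = [max(v, 0.0) + 1e-9 for v in votes]
        survivors = random.choices(synthetic, weights=weights, k=n_samples)
        synthetic = [api_vary(s) for s in survivors]
    return synthetic
```

The key point is that the private data only ever influences the noisy vote histogram; the model itself is never trained on or shown the real records.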
Full studies list
Alongside the featured videos, Apple also published links to all 25 studies presented at the event. Here is the full list, which includes studies from researchers at Apple, as well as companies like Microsoft and Google, and institutions such as MIT, Rutgers, Boston University, Carnegie Mellon, and the University of California, Berkeley: