One of the more puzzling aspects of Apple’s rollout and enforcement of its App Tracking Transparency (ATT) privacy policy is that the company has done nothing to police device fingerprinting. Apple explicitly prohibits fingerprinting in its ATT policy, and yet the practice has continued unabated since the public introduction of ATT in iOS 14.5. I walk through why this is likely the case in my 2022 predictions for mobile advertising, in which I discuss Apple’s reticence to sharply and clearly define the term ‘fingerprinting’.
To avoid a semantic bear trap around which activities constitute fingerprinting, I’ve started to use the phrase “probabilistic install attribution using device parameters” (PIAUDP) to describe the use of device and network parameters for install attribution. Throughout this article, for the sake of readability, I implore readers to conjure the acronym ‘PIAUDP’ when they see the word ‘fingerprinting’.
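To make the mechanics concrete, here is a minimal, illustrative sketch of what PIAUDP looks like in practice. This code is not drawn from any particular SDK; the function name and the parameter selection are my own, but every parameter shown is freely readable on iOS without triggering the ATT prompt:

```swift
import UIKit
import CryptoKit

// Illustrative only: a hypothetical sketch of how an SDK might derive a
// probabilistic device signature from parameters that iOS exposes freely.
// The source IP address, visible server-side on any network request, is
// what makes this signature useful for install attribution.
func probabilisticDeviceSignature() -> String {
    let device = UIDevice.current
    let screen = UIScreen.main.bounds

    // Parameters available without any permission prompt.
    let parameters = [
        device.model,                                  // e.g. "iPhone"
        device.systemName,                             // e.g. "iOS"
        device.systemVersion,                          // e.g. "15.4"
        "\(Int(screen.width))x\(Int(screen.height))",  // screen dimensions
        Locale.current.identifier,                     // e.g. "en_US"
        TimeZone.current.identifier                    // e.g. "America/New_York"
    ]

    // Hash the concatenated parameters into a compact signature. Combined
    // with the request's source IP, this is often distinctive enough to
    // match an ad click to an install within a short time window.
    let digest = SHA256.hash(data: Data(parameters.joined(separator: "|").utf8))
    return digest.map { String(format: "%02x", $0) }.joined()
}
```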
Given the prohibitions that Apple outlines in its ATT guidelines, fingerprinting as enacted via PIAUDP is manifestly a contravention of ATT policy. And yet, as I describe in Why isn’t Apple policing mobile ads fingerprinting?, Apple has yet to crack down on that behavior. From the article:
But when ad tech companies conduct fingerprinting through SDKs, they do so within the apps of their customers. Apple can similarly see this happening in app review, but in order to police it, it would need to reject updates from app developers that aren’t themselves doing anything wrong. This would be a messy solution, especially since every app on the App Store being run as anything resembling a business contains at least one SDK that is currently fingerprinting. App developers would be punished by Apple for violations from the ad tech companies that they are paying.
Apple is faced with a difficult proposition in policing fingerprinting: in order to achieve wholesale enforcement of ATT’s restrictions against fingerprinting, Apple would need to resurrect its tactic from early April 2021, which entailed rejecting app updates that included the SDK of an MMP (mobile measurement partner) that was (ostensibly) deemed to be in violation of platform policy. This incident was enormously disruptive to the app ecosystem, and it only involved one MMP.
My sense at the time was that Apple sent an unambiguous message to the mobile ad tech category that fingerprinting would not be tolerated once ATT was rolled out. But because much of the ad tech ecosystem faced an existential threat with ATT, those companies had no choice but to continue the practice of fingerprinting, and Apple faced a “you can’t fire all of us” dilemma. This is all to say: as a policy stipulation, and without a direct mechanism for prevention beyond threats, ATT’s prohibition of fingerprinting is almost impossible to enforce.
Apple recognized this in cooperating with the UK’s Competition and Markets Authority on a report the regulatory body recently published on the mobile ecosystem, which I cover in a Twitter thread. From the report (page I14 of the Appendix):
It has been reported that it may be difficult for Apple to fully enforce this policy. In particular, we understand that there are no obvious technical means for Apple to know what data ad tech companies use (apart from the IDFA that it does not provide), whether they might be doing ‘fingerprinting’, and what new technical workarounds they might find in the absence of IDFA. Indeed, a study by privacy software developer Lockdown found evidence of a number of apps that seemed to continue to engage in third-party tracking when users opted out from the ATT prompt.
I discussed the predicament in which Apple finds itself in this interview with Ben Thompson on Stratechery: the mobile ad tech category called Apple’s bluff, and fingerprinting is currently being performed with aplomb. But fingerprinting doesn’t allow for behavioral profiles to be built, because of the ephemeral nature of IP-based identity.
So while fingerprinting does undermine Apple’s messaging around privacy, the mechanics of it are so esoteric that no one who is only aware of the ATT initiative through watching “the commercial” is really going to care. Without a clean, elegant means of preventing fingerprinting, Apple can turn a blind eye to it, especially since one of its primary motivations in enacting ATT has been realized.
But can Apple introduce a clean, elegant means of preventing fingerprinting? Google’s recent announcement of its Privacy Sandbox for Android provides some guidance.
As I hypothesized in an iOS 16 predictions post last week, I believe Apple could borrow the SDK Runtime concept from Google’s Privacy Sandbox for Android to force SDKs to operate in an environment that is independent of the encompassing app’s. This would allow for two benefits. First, the SDK could be evaluated for ATT compliance on its own, potentially through a separate “app review” process (as the SDK Runtime feature of the Privacy Sandbox for Android involves), so Apple wouldn’t need to punish app developers for their reliance on non-compliant ad tech services. And second, the data emitted from these sequestered SDKs could potentially be passed through Apple’s Private Relay system, meaning the IP addresses of users would be obfuscated, preventing fingerprinting from being done with any useful level of accuracy.
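To illustrate what “sequestering” might mean here, consider this hypothetical sketch. To be clear, no such API exists on iOS today; the protocol and type names are invented, and the sketch simply encodes the two properties described above: the SDK receives only platform-mediated data, and its network traffic exits through a relay.

```swift
import Foundation

// Hypothetical sketch only: no such API exists on iOS today. This imagines
// how a sequestered "SDK runtime" boundary might look if Apple adopted a
// concept like Android's SDK Runtime: the ad tech SDK runs in its own
// sandbox, separately reviewable for ATT compliance, and never touches the
// host app's process, files, or identifiers directly.
protocol SequesteredAdSDK {
    // The platform hands the SDK an opaque, mediated context instead of
    // direct access to the encompassing app's environment.
    func handleEvent(_ event: AdEvent, context: SandboxContext)
}

struct AdEvent {
    let name: String                 // e.g. "install", "purchase"
    let parameters: [String: String]
}

struct SandboxContext {
    // Network egress controlled by the platform: requests from the SDK
    // would exit via a two-hop relay, so ad tech servers observe a relay
    // egress IP rather than the user's true IP address.
    let relaySession: URLSession
}
```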
Private Relay transmits data from a device through a two-hop architecture that camouflages the user’s IP address in such a way that neither the destination website nor either relay hop can associate that address with the user’s activity. Currently, it is applied to Safari traffic on the iPhone and to unencrypted traffic in apps. From speaking with a number of network engineers on this topic, it seems the non-application to the majority of in-app traffic probably derives from cost sensitivities: it would be prohibitively expensive to run all in-app traffic through this system. But running only the data generated from ad tech SDKs through Private Relay would be significantly less expensive.
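As a mental model of the two-hop split (this is conceptual, not a real API), the sketch below records what each party in the chain can observe; the key property is that no single party holds both the user’s real IP address and the destination.

```swift
// Conceptual model of Private Relay's two-hop design (not a real API).
// Each type captures what one party in the chain can see.
struct IngressView {             // hop 1, operated by Apple
    let clientIP: String         // knows who is connecting...
    // ...but the destination is encrypted for hop 2, so it is unknown here
}

struct EgressView {              // hop 2, operated by a third-party partner
    let destinationHost: String  // knows where the traffic is going...
    let upstreamIP: String       // ...but sees only hop 1's address, not the user's
}

struct DestinationView {         // the server being contacted
    let sourceIP: String         // sees only hop 2's egress IP, which is
                                 // shared across many users in a coarse region
}
```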
Note that it’s not clear how this hypothetical solution would work in practice. For web traffic, per the Apple spec, Private Relay consults a list of known trackers published by DuckDuckGo when making direct connections. This same approach could be applied to the SDK environment: if an SDK is placed on a “known ad tech SDK” list that implies usage for advertising targeting or delivery, its connections might be run through Private Relay. As Alex Bauer notes on Twitter, this approach would mirror how ITP and Private Relay obfuscate IP addresses, and it is consistent with the roughly 12-month lag between the rollout of privacy features for Safari and their equivalents for apps. Safari has a separate “Hide IP Address from Trackers” setting (part of ITP) which, unlike Private Relay, does not require iCloud+ to use.
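A sketch of how that routing decision might look, assuming a hypothetical platform-maintained tracker list (the loader function and host names below are invented for illustration):

```swift
import Foundation

// Stub standing in for a platform-maintained list of known ad tech hosts,
// analogous to the DuckDuckGo tracker list that Safari consults. The
// entries here are invented.
func loadTrackerList() -> Set<String> {
    return ["tracker.example-mmp.com", "ads.example-network.com"]
}

let knownAdTechHosts = loadTrackerList()

// Choose a session for an outbound request: hosts on the tracker list are
// denied a direct connection, so they never observe the user's true IP.
func session(for request: URLRequest,
             direct: URLSession,
             relayed: URLSession) -> URLSession {
    guard let host = request.url?.host else { return direct }
    return knownAdTechHosts.contains(host) ? relayed : direct
}
```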
No matter the format it takes, a fingerprinting-prevention feature in iOS 16 would be the next shoe to drop for ATT: it would allow Apple to fulfill its promise to consumers about safeguarding their privacy without requiring an overhaul of the existing App Store review process, or an extra-judicial extension of it directed at ad tech vendors. And taking this next step makes eminent sense; I’d even call it an imperative. If Apple wants to push the mobile ecosystem into a new paradigm of advertising measurement, it needs to close loopholes like fingerprinting, which relies on device parameters that have long been considered PII requiring consent to process. If a company is to create a broad, sweeping privacy policy (which ATT is), then that policy should not be easily negated by pervasive techniques that existed more or less in the open before the policy was published.