While increasingly trying to diversify its businesses with Cloud and hardware, Google is primarily supported by advertising. User tracking for ads inherently conflicts with the broader push for privacy. Google today announced a “Privacy Sandbox” initiative to build open standards that contribute to a more private web.

Google makes the high-level case that advertising is fundamental to supporting many web businesses, like publishing, and should not be entirely shunned. However, it recognizes that ad tracking to serve more personalized and relevant content is “now being used far beyond its original design intent.” Namely, “some data practices don’t match up to user expectations for privacy.”

Apple’s Safari has moved to improve user privacy, but Google criticizes how attempts to address this problem “without an agreed upon set of standards” can have “unintended consequences.”

  1. The first relates to how “large scale blocking of cookies undermine[s] people’s privacy by encouraging opaque techniques such as fingerprinting.” Fingerprinting combines small pieces of device information that differ between users into a unique ID. Google points out that — unlike cookies — users cannot clear or reset their fingerprints.
  2. Next is how “blocking cookies without another way to deliver relevant ads significantly reduces publishers’ primary means of funding, which jeopardizes the future of the vibrant web.” According to Google, when ads are made less relevant by removing cookies, “funding for publishers falls by 52% on average.”
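The fingerprinting technique described in the first point can be sketched roughly as follows. The attribute names, values, and hashing scheme here are illustrative assumptions, not any specific tracker’s implementation — the point is only that combining individually harmless attributes yields a stable identifier:

```python
import hashlib

def fingerprint(attrs: dict) -> str:
    # Canonicalize the attributes so the same device always produces the
    # same string, then hash it into a compact identifier.
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

device = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64)",
    "screen": "2560x1440",
    "timezone": "America/New_York",
    "fonts": "Arial,Helvetica,Noto Sans",
}

fp = fingerprint(device)
# Unlike a cookie, the user cannot clear this: the same device yields the
# same ID on every visit, with no stored state to delete.
assert fp == fingerprint(device)
```

Each attribute alone narrows the user only slightly, but the combination is often unique — which is why Google argues fingerprinting is harder for users to control than cookies.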

Google already announced solutions for Chrome at I/O 2019 to better classify cookies, highlight settings, and block fingerprinting. Today’s Privacy Sandbox plan by Google goes further by working to “develop new standards that advance privacy while continuing to support free access to content.” A standards-based approach — while lengthy — allows for a universal solution that’s consistently used by all sites and browsers. It’s similar to how Chrome’s built-in ad filter enforces the industry-developed Better Ads Standards against disruptive ad experiences.

Over the last couple of weeks, we’ve started sharing our preliminary ideas for a Privacy Sandbox – a secure environment for personalization that also protects user privacy.

They include:

  • We’re exploring how to deliver ads to large groups of similar people without letting individually identifying data ever leave your browser.
  • Publishers and advertisers need to know if advertising actually leads to more business. If it’s driving sales, it’s clearly relevant to users, and if it’s not, they need to improve the content and personalization to make it more relevant.
  • Publishers today often need to detect and prevent fraudulent behavior, for instance false transactions or attempts to fake ad activity to steal money from advertisers and publishers.
  • With a privacy budget, websites can call APIs until those calls have revealed enough information to narrow a user down to a group still large enough to maintain anonymity. After that, any further API calls that would reveal information will cause the browser to intervene and block them.
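The privacy budget idea in the last bullet can be sketched as a toy model. The class name, the per-site limit, and the rule that an API returning one of N distinct values costs log2(N) bits are all assumptions for illustration, not Chrome’s actual design:

```python
import math

class PrivacyBudget:
    """Toy per-site budget of identifying information, measured in bits."""

    def __init__(self, max_bits: float):
        self.max_bits = max_bits
        self.spent_bits = 0.0

    def request(self, api_name: str, distinct_values: int) -> bool:
        # An API that can return one of N distinct values reveals at most
        # log2(N) bits about the user.
        cost = math.log2(distinct_values)
        if self.spent_bits + cost > self.max_bits:
            # Budget exhausted: the browser intervenes and blocks the call.
            return False
        self.spent_bits += cost
        return True

budget = PrivacyBudget(max_bits=10)
print(budget.request("screen-resolution", 64))  # True: 6 bits spent
print(budget.request("installed-fonts", 32))    # False: 6 + 5 would exceed 10
print(budget.request("timezone", 16))           # True: 6 + 4 = 10, at the limit
```

The appeal of the scheme is that it doesn’t block any single API outright; it caps the cumulative information a site can gather, so the user always remains hidden within a sufficiently large anonymity group.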

