What is differential privacy? How will it affect marketing?

As people pay ever closer attention to personal data and privacy, the advertising and marketing industry is reassessing how it handles personal information. Advertisers are looking for better ways to collect data, hoping to use personal data as accurately and as lawfully as possible without compromising individual privacy.

Among the many approaches to data collection and management is one called differential privacy, a statistical technique that lets companies share aggregate user data while protecting individual privacy.

So how does differential privacy fit into marketing? Here is a closer look.

What is differential privacy?

It is a data aggregation technique pioneered, and now used, by Apple, Google, and other large technology companies. In short, a differential privacy algorithm injects random noise into a dataset in order to protect the privacy of the individuals in it.

Before the data is sent to a server for aggregation and anonymization, the algorithm adds random noise to the raw records. That injected noise means the dataset an advertiser receives is slightly obscured: compared with the true underlying data, it is deliberately not perfectly accurate.
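
As a rough sketch of the idea, here is how the widely used Laplace mechanism adds calibrated noise to a count before it is shared (a minimal illustration in Python; the function name and the epsilon value are ours, not Apple's or Google's actual implementation):

```python
import numpy as np

def private_count(true_count: float, epsilon: float) -> float:
    """Release a count with Laplace noise calibrated for epsilon-DP.

    Adding or removing one person changes a count by at most 1
    (sensitivity = 1), so Laplace noise with scale 1/epsilon gives
    epsilon-differential privacy for this query.
    """
    return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

# 150 of 200 viewers clicked; the shared figure is close to, but not
# exactly, the truth
print(private_count(150, epsilon=0.5))  # e.g. 151.7
```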

How does this work?

Advertisers can still get the approximate answers they need without compromising anyone's privacy. For example, an advertiser reviewing differentially private data might learn that roughly 150 out of 200 people who saw an advertisement on Facebook clicked through to the brand's website, yet never learn which 150 people they actually were. This gives the users behind the data plausible deniability, because it becomes almost impossible to identify any specific individual with certainty.
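
One of the oldest techniques for creating exactly this kind of plausible deniability is randomized response, a simple form of differential privacy. A minimal sketch, reusing the 150-of-200 click example (the coin probabilities and function names are illustrative):

```python
import random

def randomized_response(truly_clicked: bool) -> bool:
    """Report the truth half the time, a fair coin flip otherwise.

    Any individual "clicked" answer could have come from the coin,
    so every user can plausibly deny their true behaviour.
    """
    if random.random() < 0.5:
        return truly_clicked
    return random.random() < 0.5

def estimate_click_rate(reports: list[bool]) -> float:
    """Undo the noise in aggregate: E[reported] = 0.25 + 0.5 * true rate."""
    reported = sum(reports) / len(reports)
    return (reported - 0.25) / 0.5

# 150 of 200 viewers really clicked (true rate 0.75)
truths = [i < 150 for i in range(200)]
reports = [randomized_response(t) for t in truths]
print(round(estimate_click_rate(reports), 2))  # close to 0.75, but not exact
```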

Sounds inaccurate

It can be. Since advertisers can no longer see exactly how individuals responded to a campaign, there is a clear trade-off between personal privacy and data accuracy, and advertisers must accept a certain compromise. Without random noise injected into the master dataset, it would be easy to work out exactly who is interested in, or responsive to, your advertising. Under the EU's General Data Protection Regulation (GDPR), holding such precisely identifying data without a proper legal basis is illegal: those databases must be purged, and using them can bring punishing fines, as Google's experience in Europe shows.
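
The dial that controls this trade-off is conventionally called epsilon: the smaller its value, the stronger the privacy guarantee and the larger the injected noise. A rough illustration with Laplace noise, continuing the sketch above (the epsilon values are arbitrary):

```python
import numpy as np

# Smaller epsilon = stronger privacy = more noise = a blurrier answer
for epsilon in (0.01, 0.1, 1.0):
    noise = np.abs(np.random.laplace(0, 1 / epsilon, size=10_000))
    print(f"epsilon={epsilon:>4}: typical error ~ {noise.mean():.1f} clicks")
```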

Who is working on this?

Truth in Measurement, a cross-industry group made up of advertisers, media owners, and technology platforms, is chiefly exploring how statistical techniques can support cross-platform measurement. Trace Rutland, media innovation director at Tyson Foods, is a member of the group. She says that, in the end, this pragmatism is really a moral test, centered on one question: "How do our users feel about us using their data in this way, and would they accept it?" The answer is prompting the group to consider whether differential privacy could be an effective way to vet the data shared in data clean rooms. (On data clean rooms, see our earlier article, "Have you heard of the data clean room? The expensive MarTech column.")

What does this mean for cross-platform measurement?

One sticking point in whether data clean rooms can support cross-platform measurement is who really benefits. Media vendors are wary of pooling their data alongside competitors', while advertisers do not feel they have any ownership of these environments, which makes them doubt the value they add.

Differential privacy could ease those doubts, because every participant in a data clean room would feel some control over an anonymization process that is usually run by the media vendors alone. Advertisers could receive a dataset that fairly reflects how a campaign performed, while media sellers would not have to give up their valuable audience data.

The question came up at an event hosted by Truth in Measurement last month. Victor Wong, CEO of Thunder Experience Cloud, which leads the project, said: "The consensus is that, as an output of our data clean room, advertisers will receive a differentially private log file of their marketing campaigns."

Can any advertiser do this?

In theory, any advertiser could develop its own differential privacy algorithm, but given the complexity of building and maintaining one, that is impractical. In practice, advertisers like Tyson Foods would rather join others to co-fund a technology they can apply to larger datasets.

"If something like differential privacy is to grow, it needs to work together from the buyer. It cannot be done by advertisers alone." Rutland said that she hoped that the industry would unite to support the unified version of the algorithm, rather than acting independently. "Whenever advertisers try to blaze new trails in cross platform measurement, they are unable to extend this walled garden to a sufficient extent to affect the market."

What are the disadvantages?

Differential privacy does not handle small datasets well: the smaller the dataset, the less accurate it becomes once random noise is added. It is also harder for differential privacy to operate at scale than it is to simply report genuinely anonymized user data.
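
A back-of-the-envelope sketch of why small datasets suffer: the noise needed to hit a given privacy level is fixed in absolute terms, so it swamps small counts (the epsilon and click counts here are arbitrary):

```python
epsilon = 0.1
noise_scale = 1 / epsilon  # expected noise magnitude; set by the privacy
                           # target, not by how much data you have

# The same absolute noise is a far bigger share of a small count
for clicks in (150_000, 150, 15):
    print(f"{clicks:>6} clicks: noise is ~{100 * noise_scale / clicks:.2f}% of the signal")
```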

Even so, none of this has stopped differential privacy from being adopted across many industries. After all, data privacy penalties around the world are no joke, and not everyone has pockets as deep as Google's to absorb huge fines again and again. Use user data carefully, and lawfully.
