The exposure of massive state surveillance by the US and UK intelligence services has led to a rethink in the EU. Since then, the focus has shifted from criticism of states to the big tech companies: Google and Facebook in particular have been repeatedly criticized for collecting too much data, and data that is too personal. […]
With the introduction of the EU General Data Protection Regulation (GDPR) in 2018, a useful, though not always easily enforceable, regulation came into force, giving fresh momentum to data protection and digital privacy. When Google announced in March this year that it would phase out third-party cookies from 2022, the big data industry appeared to signal that it would bow to users' desire for more privacy and less data collection. Apple, the most valuable company in the world, will in future also allow cross-app tracking in its iOS operating system only with users' prior consent.
On the one hand, these developments point to a change in the data and tech industry. On the other hand, critics noted that the corporations simply no longer depend on these controversial forms of advertising personalization and that Google, in particular, would instead obtain the desired data via so-called cohorts, i.e. by bundling users with similar interests.
This raises the question: where is the industry heading when it comes to the collection and processing of data on the Internet? Two main forces face each other here. On one side stand the large tech companies and, associated with them, the advertising industry, which have a strong interest in meaningful and extensive data in order to increase their profits and improve the quality of their services. On the other side, users and, in some cases, state regulators push back against this interest in order to preserve anonymity on the net and protect digital identity.
Masses of personalized data compromise anonymous data sets
On the data collectors' side, one problem is that even with the best of intentions, limiting collection to anonymous data does not guarantee that this data remains anonymous. Through reverse engineering, for example, anonymous data sets can be broken back down to individuals within a global data pool containing billions of personalized records. This renders isolated attempts by smaller companies to respect users' privacy ineffective. This toxic state of affairs underlines the importance of protecting individual privacy through state-enforceable rules, such as deletion requests or the requirement to collect data only for its intended purpose and to store it for a limited period.
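The re-identification risk described above can be illustrated with a minimal sketch of a linkage attack: an "anonymized" data set keeps quasi-identifiers (ZIP code, birth year, sex) that also appear in a public data set, and a simple join recovers the identities. All records here are hypothetical, invented purely for illustration.

```python
# Toy "anonymized" data set: names removed, but quasi-identifiers
# (ZIP code, birth year, sex) remain. Hypothetical data.
anonymized_records = [
    {"zip": "02138", "birth_year": 1970, "sex": "F", "diagnosis": "asthma"},
    {"zip": "02139", "birth_year": 1985, "sex": "M", "diagnosis": "flu"},
]

# A public data set (e.g. a voter roll) with the same quasi-identifiers
# alongside names. Also hypothetical.
public_records = [
    {"name": "Jane Doe", "zip": "02138", "birth_year": 1970, "sex": "F"},
    {"name": "John Roe", "zip": "02139", "birth_year": 1985, "sex": "M"},
]

def reidentify(anon, public):
    """Link records from the two data sets by joining on the shared
    quasi-identifiers, tying each 'anonymous' record back to a name."""
    matches = []
    for a in anon:
        for p in public:
            if (a["zip"], a["birth_year"], a["sex"]) == \
               (p["zip"], p["birth_year"], p["sex"]):
                matches.append({"name": p["name"],
                                "diagnosis": a["diagnosis"]})
    return matches

print(reidentify(anonymized_records, public_records))
# Every "anonymous" record is now attached to a name.
```

The larger the surrounding pool of personalized data, the more combinations of seemingly harmless attributes become unique, which is why a single company's restraint cannot secure anonymity on its own.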
At the same time, many companies still have no interest in replacing the principle of data maximization with that of data minimization as long as there is no commercial incentive to do so. Big data is still considered a cure-all in many industries. In addition, legislators' attempts to circumvent end-to-end encryption, for example in messengers, for the purpose of criminal prosecution also endanger the integrity of digital privacy. Here, the interest in solving crimes is pitted against users' desire for anonymity.
Growing use of privacy tools as a sign of the trend
Nevertheless, a change is emerging that is driven not only by individual measures of large players such as Apple or Google but rests on a much broader basis: the growing use of privacy tools such as privacy-friendly browsers, VPN clients, and ad blockers makes the data business less profitable, while rising user numbers increase those tools' effectiveness. The privacy-by-design approach, as advocated by Avast among others, also promises a stronger focus on privacy in the development and design of digital services. Here, it is thoroughly checked in advance whether a product actually needs to collect a given piece of data and, if so, whether that data can be processed directly on the device.
The growing unwillingness of digitally savvy users to be harnessed by international corporations as workhorses for their advertising revenues is thus becoming ever clearer. Purchasing and usage behavior is the strongest lever for limiting comprehensive data collection and personalization. As more and more people switch to devices and applications that handle data more restrictively than their competitors, corporate management is reacting. Google in particular appears to be seeking a change of course with new technologies that intrude less on individual privacy, such as contextualized tracking or even forgoing data collection entirely in certain cases.
However, the most important subsidiary of the parent company Alphabet can only afford this change of strategy because it has the market power to do so. In practice, it means not so much forgoing the advertising revenue that is so essential for Google as binding the data more tightly to Google itself. Thanks to the widespread use of its in-house browser Chrome, the company can use its new cohort technology, known as FLoC, to derive relevant data directly from the browser history and process it on the device itself. The result is, on the one hand, better protection against personalized tracking for Chrome users, but on the other, a significant increase in Google's market dominance. Even for Apple, which is comparatively weak in the advertising business relative to its digital competitors, the change does not entail serious losses; on the contrary, the advertising business around its in-house services is set to expand.
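The idea of deriving a cohort from the browser history on the device can be sketched with a SimHash-style locality-sensitive hash, the family of techniques FLoC drew on: similar histories tend to map to the same compact cohort ID, and only that ID ever leaves the device. This is a simplified illustration, not Google's actual algorithm; the domain names are invented.

```python
import hashlib

def feature_hash(token: str, bits: int = 16) -> int:
    """Hash one history entry (e.g. a domain) to a fixed-width integer."""
    digest = hashlib.sha256(token.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % (1 << bits)

def simhash_cohort(domains: list[str], bits: int = 16) -> int:
    """Assign a cohort ID via SimHash: each bit of the result is a
    majority vote over the corresponding bit of every entry's hash,
    so overlapping histories tend to produce nearby cohort IDs."""
    counts = [0] * bits
    for domain in domains:
        h = feature_hash(domain, bits)
        for i in range(bits):
            counts[i] += 1 if (h >> i) & 1 else -1
    cohort = 0
    for i in range(bits):
        if counts[i] > 0:
            cohort |= 1 << i
    return cohort

# Runs entirely on-device; an advertiser would only ever see the IDs.
alice = simhash_cohort(["news.example", "sports.example", "shop.example"])
bob = simhash_cohort(["news.example", "sports.example", "travel.example"])
```

The privacy trade-off described above is visible even in this toy: the raw history never needs to be transmitted, yet whoever controls the browser also controls how cohorts are computed and how fine-grained they are.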
As a result, data continues to be collected, but its indiscriminate distribution across the dominant adtech ecosystem is increasingly restricted. The tech companies want to keep control of their data while signaling to their users: we respect your privacy. The change of mood in digital society has now reached the corporations, even though the data business will continue, driven by economic interests alone.
Global minimum standards as a trend accelerator
For a lasting change in the handling of data, however, the legal framework will also play a role. While the foundations for comprehensive protection of digital privacy are theoretically in place in the EU thanks to the GDPR, a global solution in this direction is still lacking. International minimum standards, for example via the OECD, could encourage countries such as the USA to enact federal laws of their own that strengthen data protection and digital privacy. This would also force corporations like Facebook to rethink their business model and find creative ways to remain profitable without ignoring their users' growing desire for anonymity on the Internet.
* Author Shane McNamee is Chief Privacy Officer of Avast.