New knowledge drives mankind forward. Sometimes the knowledge is used wisely; sometimes it is not. Sometimes inappropriate uses have a negative impact on individuals. Data, much of it relating to individuals, is key to the generation of knowledge. However, it is important to separate the two distinct functions involved: the data-driven creation of knowledge (knowledge discovery) and the application of the resulting knowledge (knowledge application). The GDPR encourages knowledge discovery but also requires the accountable process of impact assessments that identify risks to individuals and make sure those risks are documented. The best public policy is one that encourages both knowledge discovery, even at the commercial level, and knowledge application in a legitimate and fair fashion.
The IAF team is concerned that society is sleepwalking into an era where knowledge discovery will be precluded by a restrictive reading of data protection law, especially with respect to knowledge application. For that reason, the IAF filed comments on the UK ICO’s draft code of practice on direct marketing (Draft Code). The IAF is concerned that the Draft Code suggests the mere processing of data to generate insights, absent any tangible negative effects, will be considered to have consequential effects. By extension, the GDPR requirements that attach to such effects would then apply to knowledge discovery, where the direct impact on individuals is often smaller. For example, one issue in the comments is profiling to segment markets. The question at issue is: when does profiling have a legal or similarly significant effect? Part of the IAF’s comments is as follows:
Underlying the IAF’s concerns are the differences between privacy and data protection as fundamental rights. The right to privacy relates to individual autonomy and family life, while the right to data protection relates to the risk to people arising out of the processing of data pertaining to them. The right of individuals to control their data, as a privacy right, is always important, but it is particularly so in instances where individuals should have the ability to protect themselves and their families and to form and socialise new ideas within a small circle of chosen friends. Consent as a governance mechanism works most effectively in situations where individuals knowingly provide data. Increasingly, however, data have their origin either in individuals’ interaction with the world (observed data) or in the insights that come from processing data (inferred data). The legal basis for that processing is increasingly legitimate interests or the fulfilment of a contract. In those instances, the processing must be fair. Fairness includes transparency, and transparency is challenging in the direct marketing ecosystem; there is room for improvement there. Fairness also requires a series of assessments to determine that the data bring value to people and do not cause actual harm. The General Data Protection Regulation (“GDPR”) created data protection impact assessments (“DPIAs”) to make sure organisations consider both benefits and harms to stakeholders when processing data. Individuals benefit from competitive markets, so it is reasonable to consider whether reduced competition, caused by overly cautious interpretations of data protection law, creates tangible harms to individuals.
As stated earlier, observation has become ubiquitous in today’s society. The IAF believes that the movement to limit third-party cookies will have some societal benefits in this area. However, even with those changes, the technology and processing behind market segmentation will remain complex, and understanding that process will not be most individuals’ main concern. So the role of organisations and regulatory agencies becomes more important. Organisations must conduct assessments at almost every stage of the processing and must be able to demonstrate those assessments were conducted in an honest and competent fashion. Regulators must oversee and enforce substantively enough that organisations believe the likelihood of enforcement is high.
The segmentation process uses probability to sort individuals into cohorts: those likely to take a particular action and those unlikely to do so. Segmentation logically fits within the GDPR’s definition of profiling. The GDPR requires consent where profiling has a legal or similarly significant effect. It is the IAF’s view that a lack of individual awareness of the robustness of the processing does not, by itself, meet the test of being a similarly significant effect. A similarly significant effect may instead come from the actual use of insights to make decisions. DPIAs are designed to identify similarly significant effects, justify or mitigate them, and document the outcome. The IAF sees indications in the Draft Code that the ICO is leaning toward finding that the processing of data for segmentation, in itself, has a significant impact on the individuals to whom the data pertain. Requiring knowledge discovery to be subject to consent would diminish the societal value brought by direct marketing and would therefore itself have a negative impact on individuals.
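To make the discovery/application distinction concrete, consider a minimal, hypothetical sketch of probability-based segmentation. Everything in it, the names, the data, and the 0.5 threshold, is an illustrative assumption, not drawn from the Draft Code or any real marketing system; the point is only that forming cohorts (knowledge discovery) is a step separate from acting on them (knowledge application).

```python
# Hypothetical sketch: probability-based market segmentation.
# All identifiers, probabilities, and the threshold are invented for illustration.

from dataclasses import dataclass

@dataclass
class Prospect:
    customer_id: str
    purchase_probability: float  # a model's estimated likelihood of responding

def segment(prospects, threshold=0.5):
    """Knowledge discovery: sort individuals into 'likely' and 'unlikely'
    cohorts based on a predicted probability. At this stage an insight
    exists, but no decision about any individual has been acted on."""
    likely = [p for p in prospects if p.purchase_probability >= threshold]
    unlikely = [p for p in prospects if p.purchase_probability < threshold]
    return likely, unlikely

def target_with_offer(cohort):
    """Knowledge application: the insight is used, e.g. to target offers."""
    for p in cohort:
        print(f"Send marketing offer to {p.customer_id}")

if __name__ == "__main__":
    prospects = [
        Prospect("A", 0.82),
        Prospect("B", 0.34),
        Prospect("C", 0.61),
    ]
    likely, unlikely = segment(prospects)  # discovery: insight generated
    target_with_offer(likely)              # application: insight acted upon
```

On this sketch's framing, any legal or similarly significant effect on an individual would arise, if at all, at the targeting step, not at the cohort-forming step.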
While these comments are directed at the ICO, there are indications that other data protection authorities may have similar views. Knowledge discovery may create insights that are detrimental to individuals when used in an inappropriate fashion. This type of potential risk is why the GDPR is “risk-based” and requires the assessment of risk. But restricting profiling and knowledge discovery to only those situations where consent is an effective governance mechanism creates reticence risk. The assessment and balancing of risks through the processes outlined in the GDPR, conducted honestly and competently, is the better answer. The IAF will be scheduling a policy call on March 19 to discuss the issues raised by the Draft Code.