Martin Abrams

Risk Based Should Be More Than A Cliché

The IAF will hold a policy call October 14 on “What is Risk?”


There are different types of privacy risks, and policymakers, regulators and organizations need to understand them. Policymakers and regulators need that understanding to draft and enforce this generation of privacy law, and organizations that use data need it to maximize the value they derive from data.


Since the EU General Data Protection Regulation (GDPR) and most recent privacy laws are risk-based, what is meant by risk? To begin with, it is risk to people, not risk to the organizations that process the data. Organizations have always been mindful of their own risks. The risk to people from the processing of data comes in two forms:


  • The first form is procedural risk, the risk that individuals' rights related to their data will not be respected. Those rights include the right to know, to have access to data, to correct data, and to object to its use. They also include the right to complain and to have redress. 

  • The second form is impact risk, risk based on the outcomes of processing. Impact risk includes discrimination based on prohibited criteria and inappropriate loss of reputation. It also includes reticence risk: the costs to people when data are not processed, when knowledge is not gained, and when a knowledge vacuum affects them. It may also include the failure to provide a benefit such as access to credit, health care or employment. 


Fifty years ago, policymakers thought the best way to avoid impact risk was through individual control, a matter of procedural rights. Many procedural rights relied on individual control backed by independent regulators and the ability to seek redress. As processing became more complex, more of the risk management burden shifted from the individual to an organizational accountability obligation. Requirements for privacy by design and by default, data protection impact assessments, and responsible data protection officers were part of that migration of risk management from the individual to the organization. It is not an accident that the framing of the GDPR as risk-based coincided with accountability becoming an explicit requirement of the law. 


So, flash forward to 2020. What the Schrems II case has shown us is that thousands of Euros, Dollars, Yen and other currencies are being spent by thousands of businesses to create “supplemental measures” and “additional safeguards,” with limited evidence that those measures will actually reduce the impact of data transfers on individuals. Instead, these actions will reduce the likelihood that an organization will be fined by a data protection authority if its data transfers are challenged. This shifts the calculus from risks to people to mitigation of risks to the organizations themselves. Not only does this shift fail to reduce impact risk, it does not reduce individuals’ procedural risk either. 


As is only too well known, this result is due to the decision of the European Court of Justice, which found that Privacy Shield is not equivalent and that Standard Contractual Clauses might be insufficient because:

  • The U.S. government’s bulk collection of data is disproportionate and a violation of the GDPR, and

  • The inability of individuals to complain to an independent tribunal means they cannot exercise their rights under Article 47 of the Charter of Fundamental Rights of the European Union to an effective remedy and to a fair trial.


Just to be clear, this blog is not intended as a critique of Schrems II but rather as a reassessment of what risk-based means. When the facts in Schrems II are looked at from a risk perspective, it becomes clear that the likelihood that most business transfers would ever be of interest to national security agencies is very slim and easily established. Most of the 5,000 businesses that were part of Privacy Shield do not process data that are of interest to national security agencies. Since the likelihood is slim, the need for a complaint process with an independent tribunal is also slim. Is there a possibility of data being requested by a national security agency? Absolutely. But risk requires an analysis of likelihood and impact and a balancing of other risks. Not being able to transfer data outside the EU raises massive issues for many different stakeholders, and that inability has an impact on people. The result creates a disproportionate risk for other fundamental rights and interests.[1] 


Increasingly, data protection agencies have pointed to the possibility that an individual right will not be exercisable as a reason to reduce the flexibility built into data protection law. There is evidence that legitimate interest as a legal basis for complex processing is being rejected because the complexity makes the processing hard for individuals to understand well enough to object. Is there a possibility that people won’t understand? Absolutely. Does that possibility mean procedural risk should be given disproportionate emphasis by default? Absolutely not. This is why a risk assessment requires looking at the full range of interests across the full range of stakeholders.


These issues will only become more complex. For example, when insulin reservoirs are implanted in people, and data flows are necessary not only to monitor the patient but also to support research, are regulators going to require that the data be used only if the entire information ecosystem related to health is understood by patients? 


Twenty years ago, risk related to data use was discussed purely as risk to business. Professionals discussed regulatory, reputation, compounding and reticence risk. The community has made great progress over the past twenty years. However, there has always been some friction between risk seen as individuals’ inability to exercise rights and risk associated with the outcomes of processing. 


Procedural risk, the risk that individuals might not be able to exercise their rights, is important. However, impact risk, the negative outcomes that flow from the decision to process or not to process data, is also of great importance, arguably more so. There are times when procedural risk should prevail, but not all the time. Again, this is why data protection law increasingly requires risk assessments that are proportional to the complexity of the processing and the issues raised. 


The IAF is seeing organizations go in two different directions. Organizations that embrace being data driven understand that sustainability requires them to engage in processes that enhance stakeholder trust, such as assessment processes that balance the risk equation for people. Other organizations focus on the regulatory risk to the organization rather than the risk to people, and that focus drives data protection law away from its core mission: data should serve people.


Join the IAF at its Policy Call on October 14 where we will explore this issue. 



[1] This comment does not mean that the IAF believes reform is unnecessary when data is gathered by the government from the private sector. Back in 2014, the IAF issued a paper suggesting accountability principles for government use of private sector data. In 2016, I organized a plenary session at the International Conference of Data Protection and Privacy Commissioners on this specific issue. 
