Every year at this time The Ten Commandments airs on network TV, and every year many of us end up watching at least part of Hollywood’s dramatic interpretation of the Exodus from Egypt as told in the Bible. Regardless of your faith, it’s a must-see movie just for the “camp” and glitz of the Golden Age of Hollywood. There’s a famous scene in which Moses, played by Charlton Heston, climbs Mt. Sinai to receive the Ten Commandments from the Lord. When Moses doesn’t return immediately with new laws, the Children of Israel lose their patience and faith, leading them to return to the old, discredited ways of the Egyptians. You know what comes next, right? They build a golden calf, which not only marks a step backward in thought but also proves a monumental waste of gold and other precious metals. That scene, with all its drama and dated special effects, is about impatience and returning to discredited theories in the face of anxiety and uncertainty. We’ll leave other themes to religious leaders.
As the decade flipped to the 2020s, we thought there was near-universal agreement among policymakers, regulators, civil society, and commercial actors that individual consent was increasingly ineffective as a mechanism for governing and greenlighting the complex data processing necessary for digital progress and innovation. The growing consensus that new approaches to managing data protection in the Fourth Industrial Revolution are needed struck many, including us, as real progress. The GDPR’s risk-based approach to data protection and its pivot to mechanisms other than consent for achieving lawful and trustworthy data processing were positive signs, welcomed by diverse stakeholders. But recent developments suggest that, less than two years after the EU hit the start button on the GDPR, some regulators have already lost patience with the risk-based framework. The IAF filed comments on the UK ICO’s draft code of practice on direct marketing (Draft Code), whose conclusions seem to require consent for all marketing without any proportionality test. In the US, many policymakers seem fixated on consent, in part because it’s a familiar concept that’s relatively easy to grasp and draft, even if applying consent does not produce long-term solutions to the myriad issues raised by today’s complex processing. Does this retreat from progress signal a return to worship at the feet of the just-rejected golden consent model? We hope not. There’s too much at stake for everyone involved, including consumers.
Much of the future’s collective economic growth, benefiting both individuals and society, will be driven by increasingly complex digital transformation. We no longer need to cite reports, and no one is challenging the notion that our future will be revolutionized by technologies such as the Internet of Things, 5G networks, cloud computing, big data, artificial intelligence, blockchain, and supercomputing power. Many of these digital trends involve complex data, data uses, and technologies that are increasingly challenging to understand and that, by extension, strain traditional models of data protection. We call this the “duh” paragraph because it’s not necessary to say, but it sounds good and tends to be the one paragraph everyone agrees on.
Here are a few more “duh” statements for your consideration:
We need to maximize the societal benefits of processing while minimizing risks.
Data use can and will produce phenomenal benefits.
Data use can also cause real harm, but the risk of harm can be managed and mitigated.
Trust is critical for success, and we don’t have it.
We are not going to get trustworthy systems, ethical processing, or responsible data practices by requiring companies to rely on notice and consent. In a world of observation through billions of sensors, complex data flows, inferences and answers produced by sophisticated machine learning, and evolving business models and technology, meaningful consent prior to the use of data is nearly impossible. It’s foolish to pretend otherwise, and we’re not persuaded by those who argue (disingenuously, we believe) that we’re not giving consumers credit for knowing what they want. Consent mechanisms can be easily manipulated, and they place the burden on individuals who are not equipped, even under the best circumstances, to make informed choices on the fly. We end up with false choices and pretty colors, not trustworthy business practices.
To be super clear, we’re not suggesting that individual participation in data activities that affect people is unimportant. To the contrary, in certain contexts it will be essential in our rapidly evolving, data-driven marketplace. Recent improvements to consent mechanisms for the collection of precise location information on mobile platforms are an example of where consent can play an important function. But shifting accountability to individuals across the ecosystem through mechanisms like consent is not an effective way to drive responsible and trustworthy processing, much less to eliminate the adverse consequences that may flow from the collection and use of data.
Sometimes it appears that consent is an über-stylish, haute couture way to prohibit certain data processing practices. Of course, policymakers may decide that certain uses of data should not be permitted. That’s fair. We elect and appoint these individuals to make those tough decisions. But if that’s the objective, then please say so, so that stakeholders understand the policy and can develop a path forward. Regulators shouldn’t design impossible-to-satisfy consent requirements to produce the same outcome. Of course, producing a list of “thou may” and “thou shall not” is difficult, controversial, and often produces unintended consequences, bringing decision makers right back to the difficult, knotty place they hoped to avoid.
It’s a tired story, isn’t it? We often feel like we’re going in circles. When the going gets tough, policymakers get going, right back to outdated consent. In many respects we’re sympathetic to the challenges policymakers face, and we’re not suggesting they have an easy task. To be fair, regulators are rightly skeptical of, and frustrated with, some business-driven approaches to the challenge of risk-based, accountability-type models. Part of it may have to do with the way aspects of the GDPR have been implemented by business. For example, in discussions, some regulators have commented that they have not seen many good examples of a proper “balancing of interests” made operational by business. Put another way, they are questioning the effectiveness of the actual “accountability” processes within the GDPR as implemented by business.
Some of their concerns are justified. Popping out a formulaic risk assessment by pushing a button on automated “do-it-yourself” privacy software is not what they had in mind, yet we’ve all seen those solutions. And so, like the top fashion designers in Paris and Milan, they accessorize “consent” with gems like affirmative, express, clear, informed, specific, and separate. Once modified with trendy adjectives, consent is familiar, easy, and dressed up, but it’s still not ready to go.
We’ve committed privacy heresy here. We’ve attacked the foundational principles of consent and individual control. We’ve also asserted that line drawing is nearly impossible and almost always produces unintended consequences. We’ll add one more unorthodox point of view: checklist approaches to privacy and data protection do little to actually manage risks and prevent negative outcomes. So now what?
How about a little honesty? First, business models, technology, data flows, and data are complicated and ever evolving; a complex solution that contemplates the nuances of different practices and the context of processing will be needed. Second, any path forward that does more than put lipstick on a pig is going to require a serious investment of resources by industry. Full stop. There are no cheap seats in the big data arena for responsible actors. To enable robust use of data, responsible actors must demonstrate their commitment.
Here’s the kicker: there’s really nothing new here. Organizations have had the ability to address and manage risks related to data processing all along. No one has wanted to do it, at least not in a serious, continuous, comprehensive, and demonstrable way that can be articulated to anyone outside the business. This reluctance has contributed to the credibility chasm with regulators, and rightfully so. Should anyone care? Yes, because the result seems to be more prescriptive guidance and (less effective) requirements that will, in some cases, negate the ability to create beneficial value from data. In the process, the ultimate loser is the consumer, the very individual regulators want to protect and businesses want to pursue as a loyal customer.
There must be a way out, or at least a path forward. We think there is: a framework based on demonstrable accountability, which incorporates data governance, risk assessment, verifiability, and enforcement. We need trusted governance based on accountability by the very organizations that are the stewards of the data, the developers of the technology, and the decision makers.
What’s holding back a greater emphasis on accountability? A lot. Part of the tension and circularity of thinking may stem from the growing complexity of data-driven activities combined with general trust issues with business. We summarized some of the trust issues above, but in sum it comes down to this: some regulators and policymakers increasingly think some in industry are full of it when it comes to privacy and data protection.
There are a number of steps business and regulators must take to advance trust based on organizational accountability. While individual rights and participation remain key, as outlined in the IAF’s suggested principles that update both individual participation and organizational obligations, a retrenchment to, or bias toward, mechanisms such as consent is clearly not the answer. A tilt toward trusted accountability as an effective governance mechanism is really the only way to achieve the promise of digital transformation, and business needs to invest more.
There are steps that can help bridge this gap and move us forward (and not backward):
Enhanced Accountability – Business must adopt and make operational the elements of Data Stewardship Accountability, particularly for advanced and complex data-driven innovation. As Stephen Wong, Hong Kong’s Privacy Commissioner for Personal Data, noted, “In order to be able to transform data into information and information into knowledge and insight and knowledge into competitive advantage, and in order for individuals to be able to trust data processing activities that might not be within their expectations, enhanced data stewardship accountability elements are needed.”
Demonstrable Accountability – While the GDPR requires organizations to be able to “demonstrate” that they are meeting the law’s requirements, organizations will increasingly need to proactively show accountability in “demonstrable” ways, particularly for advanced and complex data-driven activities. The IAF’s recent report, A Path to Trustworthy People Beneficial Data Activities, included steps to achieve demonstrable accountability.
External Oversight – Oversight is an element of demonstrable accountability. The IAF believes it is a key element of trusted accountability and recognizes that further work is needed to determine the degree and type of effective, efficient (and cost-effective) oversight and governance for organizations processing data in complex scenarios. Effective and efficient oversight and governance are necessary if these types of processing are to be fully trusted.
Codes of Conduct – Codes of conduct may provide a path forward, as they can outline the key accountability requirements for a given data processing activity or for a specific industry data activity. These efforts should be encouraged by business and regulators alike, with reasonable tolerance for “test and learn.” In contrast, some have called for prohibitively complex and costly “certification” schemes that would inhibit interest in codes that could actually advance trust. The perfect should not be the enemy of the good.
These are but four paths forward, all part of building trusted accountability. Of course, they need to be backed up through enforcement, and we need to advance them and learn together. A shift backward to more consent will limit data-driven innovation and, in reality, will not achieve the goals of autonomy and control these mechanisms were intended to meet. Businesses failing to invest in and demonstrate accountability mechanisms, or regulators demanding costly, prohibitively complex oversight, will kill forward progress and ultimately inhibit the benefits we all stand to receive from digital transformation.
If you’re thinking that our ideas don’t seem possible in 2020, you’re right. Instead of trying to solve last year’s issues with this year’s reinterpretation of consent, we are trying to lay the foundation for a long-term, workable, and adaptable legal framework that will be relevant and fully implementable in 2030. We think that if we all take a longer view and don’t react to the privacy issue of the day, we’ll reach a solution that genuinely protects consumers while fostering innovation, growth, and the data processing necessary for the future.
We need to advance Trusted Governance Based on Accountability.