The IAF Team

Trust Deficit Acceleration Means Trust but Verify

Dirty diesel cars, opiates, income disparities, and institutional failures.  The trust deficit caused by these abuses or plain mistakes seems to be accelerating beyond red to bright red.  This acceleration has huge ramifications for new privacy laws and for interpretations of existing laws.


The IAF staff recently visited a privacy regulatory agency to discuss how advanced analytic processing, including AI that may be “people beneficial,” can be considered trustworthy based on a demonstration of objectives, benefits, and mitigation of risks.  One of the agency officials turned to me and said, “I bet Johnson & Johnson considered fighting pain to be beneficial to people.”  A judge’s ruling that marketing by a well-respected company exacerbated an opiate addiction problem is a far cry from online behavioral targeting.  The comparison might be apples to oranges, but trust, especially trust in motives relative to objectives and processes, seems to be under challenge.  Real and perceived compliance and process failures in the marketplace, even those unrelated to data, are having a severe impact on trust in data-driven processes as well.

In compliance areas ranging from polluting diesel engines to over-prescribing addictive painkillers, it isn’t just business that is found wanting; it is all institutions, from churches to governmental executive offices.  And data-driven processes are not immune.  While in the past one could ask, “Where is the harm in processing data?  It is just an ad,” today processing harm is being defined in ways that get to the essence of our values as people.  The Netflix documentary “The Great Hack” makes the case that threats to liberal democracies may be the price of robust observed-data usage when profiling and micro-targeting are used in an inappropriate fashion.  In the digital age, observation in itself can be viewed as a harm.  Hong Kong demonstrators are destroying smart street lights for a reason.  The watching eye is not always viewed as benign.  These occurrences are all part of an environment of growing general distrust.


For years the IAF has received pushback on the use of the term “trust deficit.”  We have been told that if you say there is a trust deficit, policymakers might hear you and believe it to be true.  However, we believe the term is an effective way to label the struggle that business faces: using data to create benefits for people against a backdrop of business and institutional failures, including both inadvertent and intentional misuse.  In meetings with IAF members, we have seen evidence that thoughtful companies are trusted by their customers and respected by their regulators.  But regulation isn’t targeted at compliant companies; it is targeted at a more general trust deficiency.


Well-established brands are using data to build competitive advantage and to provide benefits to multiple stakeholders.  These uses typically are a good thing.  Furthermore, if companies don’t use data this way, they will be at a competitive disadvantage or the benefits will not be created.  However, data use rests on acceptance or permission from the law or from individuals.  When a company stretches uses, or lacks processes to evaluate market acceptance and risk, and something goes wrong, it isn’t just that company that suffers a loss of trust.


Guess what?  Politicians know there is a trust deficit and are reacting accordingly.


California lawmakers have been steadfast in allowing almost no improvements to the challenging definitions contained in the California Consumer Privacy Act.  In Europe, restrictive interpretations of the GDPR by the European Data Protection Board have made data-driven research difficult if not impossible.  If the current trend continues, it may lead to bad public policy on a full range of information policy issues.  Unfortunately, notions of flexibility are regarded with ambivalence and likened to self-regulation rather than to a sensible, governable approach to data use in a digital age.  Amplifying this view is the sense that companies have proven they can’t be trusted to regulate themselves.


Twenty-plus years ago, observation in the form of third-party tracking software placed on individuals’ computers was necessary to make the consumer Internet user friendly and economically viable.  Business argued to policymakers that the tracking was safe and proportional, and that regulation would kill the Internet.  Over the years, data-driven observation has become more robust, and the related data have been shared widely.  These developments have led to a few widely reported, sensationalist negative stories.  The repercussions of those stories are just beginning to be felt.  The most immediate impact will be on the digital advertising ecosystem.  The California Consumer Privacy Act and investigations by European regulators are likely to change the mechanics behind digital advertising.  The result will be difficult for many business models.


There are consequences to legislative overreaction and underreaction.  The stakes are exponentially higher than just an advertising-supported cyber ecosystem.  Observation is necessary to make an increasing number of environments workable; these ecosystems range from smart cars to smart medical devices.  Furthermore, observed data are necessary to fuel the digital future that will drive economic growth.  Blunt instruments to limit data flows and data use will have consequences for people, society, and companies.


Ecosystems need observation to operate and to fuel benefits society dearly wants.  At the same time, the accelerating trust deficit increasingly puts any type of flexibility in jeopardy.  This brings us back to Ronald Reagan’s famous phrase, “trust but verify.”


The 2009 Essential Elements of Accountability required organizations to stand ready to demonstrate accountability.  Subsequent work advanced accountability to address more complex data environments and made clear the requirements and processes for effective internal oversight.  The IAF’s policy solutions create governance mechanisms that document the policies and processes that make that demonstration possible.  For those demonstrable processes to be measurable, there need to be standardized components that still allow for both corporate diversity and external review.  The IAF’s solutions, ranging from model legislation to ethical assessments and oversight processes, are designed to allow for verification of the competency and integrity of demonstrable processes.  Even before these more robust governance processes take shape, the expectation is shifting from accepting that organizations stand ready to demonstrate toward calls for more proactive validation.  But validation of what, and to whom?


External oversight of demonstrable processes is not a review of every decision to use or not use data.  Straightforward uses that are fully anticipated and understood, and that meet established standards, do not need additional review.  What we are discussing is validation that the processes behind decisions about observation and/or advanced analytics are sound, that those decisions are made by competent people, and that they are made with an honest focus on defined external stakeholder interests.


However, validation or verification requires people, processes, and systems at the validator, not just at the controller.  What are those processes, and who pays for them?  Is it regulators?  Is it accountability agents of some other form?  Is it other trusted intermediaries?  How does one make verification scalable to SMEs?  What level of additional transparency can help demonstrate intentional competency and integrity, so that demonstration is cost-effective for both the reviewer and the party being reviewed?


The IAF’s view is that the answer will be a mixture.  In some cases, it will be regulators.  In others, it will be enforcers of codes of conduct or practice.  What is clear is that complete self-certification will not be satisfactory, particularly for data processing activities that have, or are perceived to have, a significant impact on individuals.


The political challenge is arriving at a solution that is trusted and not cost-prohibitive.  For that to happen, there needs to be a private-sector process to reach consensus on what verification might mean.


This discussion is to be continued at the IAF’s West Coast Summit on September 19. 
