
Blog Posts (95)

  • The IAF Celebrates Ten Years of Focusing on Accountability and Governance

    The IAF celebrates its 10th anniversary in 2023 and recently published our annual report, 2022 Highlights and 2023 Policy Directions. Over the last few years, IAF research and education work has converged under the overarching principle on which we were formed: Accountability and allowing data to serve people. Specifically, our work triangulates Risk, Corporate Research, and Proportionality to address the policy and strategy concerns of IAF members and community. Forward-looking organizations are applying and enhancing strategic frameworks to meet the demands of an observational world, responsible and resilient data use, and the expectations for fair AI.

    The IAF believes it is critical that accountable organizations be able to think with data and engage in knowledge discovery and creation, within accountable frameworks, in order to achieve a trusted global digital ecosystem. The pathway to innovation is clear: break the path at any point, and the actions taken will be less than optimal. Bad decisions, or even the reluctance to take actions that should be taken, may harm people. There are no new advances – such as cancer therapies, smart car safety, pollution abatement, and specialized education – without the ability to think with data to create knowledge. All IAF research and education endeavors are focused on furthering the ability of demonstrably accountable companies to use data pertaining to people to maintain the data innovation path.

    In 2022, we saw increased pressure and scrutiny on digital economy activities, from AI applications to AdTech, from increased cybersecurity threats to international data transfers, and crucial issues arose about individual bodily autonomy and safeguarding our youth. Here or just around the corner is the activation of several U.S. state privacy laws, plus anticipated legislation in Canada (C-27) and the UK, and European regulation (the Digital Services and Digital Markets Acts). The U.S. Congress renewed its evergreen debate about federal privacy legislation.

    Read more about the IAF’s 2022 Highlights and 2023 Policy Directions here. We look forward to seeing you throughout the year at our informal chats, monthly policy and strategy calls, annual retreat, and workshops. We will be focused on governance and what it means to think and act with data so that data serves people.

  • Making Data Driven Innovation Work

    It is increasingly understood that digital agendas in both the public and private sectors are about creating safe pathways for data to turn into information, for information to be turned into knowledge, and for knowledge to facilitate actions that are societally beneficial to people. Data-enabled technologies such as artificial intelligence (AI) have significant potential to transform society and people’s lives. The U.S. National Institute of Standards and Technology states, “AI technologies can drive inclusive economic growth and support scientific advancements that improve the conditions of our world.” [1]

    Less understood or acknowledged is the distinction between data used to create knowledge and the actions that may result from using or applying this knowledge. Knowledge creation can also be called “thinking with data.” It is key to processes such as the data analytics used in product or service development and improvement, and it may more broadly be called “data-driven” or “digital” innovation. The creation of knowledge differs from applying this knowledge: knowledge application uses knowledge created on a specific or identifiable set of individuals. Understanding the differences between knowledge creation and knowledge application – the purposes for which they are undertaken as well as the different set of risks for each – is critical to understanding how they should be regulated.

    Knowledge creation in and of itself is less directly impactful on individuals. The application of this knowledge leads to decisions and actions that can impact individuals and brings into consideration traditional privacy concerns; knowledge creation does not raise the same concerns. Knowledge creation and knowledge application therefore should not be treated the same way. However, today’s regulatory approaches, including regulation, oversight, and enforcement, do not necessarily treat these two processes differently.
    Both processes require controls, but since the risks are not the same in knowledge creation and knowledge application, the controls should not be the same.

    At its extreme, polarization is affecting the public policy debate on privacy and data protection. Terms such as autonomy and control increasingly are being interpreted as personal sovereignty. Policymakers designing the new data-driven industrial policy are talking past independent regulators who are concerned that markets are dominated by data extraction that harms all individuals, particularly protected classes. This situation makes legislation and regulation that both facilitates innovation and protects the full range of stakeholder interests increasingly difficult, and the resulting void has led some organizations to delay or forgo the creation of these insights. As a result, the trajectory of data protection public policy has the potential to stifle further data-driven innovation.

    The Information Accountability Foundation (IAF) believes it is critical that organizations be able to think with data and to engage in knowledge discovery and creation in order to achieve a trusted global digital ecosystem. The IAF has advocated for many years that there should be a distinction between knowledge creation (thinking with data) and knowledge application (acting with knowledge). Increasingly, knowledge creation leads to the development of new insights key to digital innovation. However, the current pattern of data protection law evolution, and the enforcement of these laws, does not appreciate the distinction between these two processes. This has led to confusion and hesitancy about when advanced analytics for knowledge creation may be used where there is no distinct legal basis for processing personal data for this purpose.
    The IAF undertook research to understand how organizations discover and create new knowledge and is pleased to publish the results in our report, Making Data Driven Innovation Work. The IAF research team conducted extensive interviews with organizations from numerous fields that use data to create knowledge, even if they did not identify their processes in that way. The interviews supplemented the team’s decade of work as researchers at the IAF and as consultants working on ethical assessments and demonstrable accountability.

    This project sought to clarify the impact of regulatory and public policy uncertainty on commercially driven knowledge creation, develop scenario-driven examples of this impact, develop a public policy model that enables responsible data-driven knowledge creation through a series of compensatory controls, and create a narrative and path for knowledge creation to be more formally recognized as a legitimate data processing activity in next-generation privacy and data protection law.

    In summary, many organizations use personal data as part of analytics processing (Corporate Research) to solve identified business problems, but most organizations do not use Corporate Research as a distinct processing activity more broadly because:

  • Most organizations generally do not break data processing into two distinct phases: knowledge creation (i.e., the research function to identify a solution to a business problem) and knowledge application (i.e., application of the solution to the business problem); and/or

  • In the EU, this portion of data processing (research) is complicated and/or limiting. Personal data can be processed for scientific research purposes, which is narrowly defined, as long as sufficient safeguards have been implemented and scientific research is run “in accordance with relevant sector-related methodological and ethical standards, in conformity with good practice” (collectively, Safeguards); and/or

  • Other countries and future laws limit the use of personal data for research purposes to the improvement or supply of products and services, or require the use of de-identified personal data for research and for socially beneficial purposes.

    The IAF thinks that organizations might be able to use personal data more broadly for Corporate Research if, in addition to legal and regulatory modifications, appropriate Safeguards are put into place, and it proposes that those Safeguards be developed and implemented in a defined “Research Sandbox” environment. The project developed potential safe pathways for data to turn into information, for information to be turned into knowledge, and for knowledge to facilitate actions that are societally beneficial to people, all where processing is conducted in a manner that is lawful, fair, and just. The IAF appreciates that this level and type of paradigm shift will require input from key stakeholders, including regulators and organizations. In short, this research report is just the start, and the IAF hopes to obtain further input through this project, which includes workshops in 2023.

    [1] Artificial Intelligence Risk Management Framework (AI RMF 1.0) (nist.gov)

  • Who Is Paying the Policy Debt?

    Everyone! There is a technology concept called “tech debt.” It accumulates when software development applies the easy, quick, incremental answer to complete or change a project or to fix glitches in a system. Over time, one short-term fix is layered upon another, and the debt compounds. [1] Eventually, the interest on that debt must be repaid.

    There is a corollary in the information policy world that could be called “policy debt.” It occurs in both the public and corporate policy arenas and links to the innovation cycle, where any break in the cycle – be it data, information, knowledge, or action – impacts business resiliency. The policy debt has been accruing interest since the mid-1990s, and the interest payment seems to be coming due now.

    Western liberal democracies are based on capitalism and its corporate structure. Corporations have shareholders that elect boards to govern the corporations for the benefit of stakeholders. Those boards are responsible for four governance principles: strategy, financial performance, business resiliency, and compliance. Which of those four principles are touched by current regulatory direction and privacy laws that lag digital technologies? Obviously, compliance is first, with billion-dollar fines for failure to comply. But the debt payment does not stop there. Regulators are now requiring that data be purged and that software developed with the data be erased as well. In Denmark, regulators required that schools brick student computers linked to the cloud. These actions implicate business resiliency. In the latest Meta case, the European Data Protection Board questioned whether Meta’s advertising-based business model is actually a legitimate business strategy. Three of the four governance principles are triggered by regulatory attempts to pay down the policy debt.
    What follows is from my first-hand experience with the policy world, dating back to 1988 when I went to work at TRW as consumer policy director. I visited Brussels when the Data Protection Directive was being drafted, sat at the table when the U.S. FTC and the states took on the credit reporting industry, and, most importantly, was in Washington, DC meeting rooms as the regulation of the consumer Internet was debated. It is in that Internet debate that the policy debt acceleration began.

    Third-party cookies were introduced in the mid-1990s as a means to facilitate an advertising-funded consumer Internet. This approach required triggers on browsers that linked to tracking software stored on consumers’ hard drives. Was this tracking within the public commons, or was it comparable to families’ homes, where some level of seclusion was expected? The policy decision was to not answer this question and to push the answer to the future, when the consumer Internet was better established. Instead, the focus was on transparency in the form of privacy notices that, over time, would get increasingly long and dense. The interest due on this policy decision has been accruing.

    That policy indecision facilitated nearly two generations of digital and economic growth – new products and services, the rise of new brands and jobs – all fueled by personalized advertising. It made possible many new business models and, in some cases, disinformation and manipulation. Eventually it focused policymaker attention on the policy debt. That first generation of observational technologies became not just a means for facilitating advertising; it also became central to how things actually work in the interconnected world in which we live. Smart cars, medical devices, cybersecurity, fraud detection, communications, and the whole Internet of Things were and are dependent on observation. How is a digital ecosystem fueled by observation supported without paying this debt?
    Many academics and others have given this observation a name, “surveillance capitalism,” which suggests simple solutions to paying down the policy debt’s accrued interest. Terms like data minimization, do-not-track, and do-not-sell become part of the policy vocabulary. But simple answers rarely work. As Professor Dan Solove points out in his new article, “Data is What Data Does: Regulating Use, Harm and Risk Instead of Sensitive Data,” legislation based on use, harm, and future harm in the form of risk is complicated, and regulating on the proposition that some data is more sensitive than other data is simple but not effective.

    Some would say that the EU GDPR was designed to create a pathway to paying the policy debt, but the essential nature of observation, and the new technologies it facilitates, have made implementation of the plain language of the GDPR, and of GDPR-inspired legislation in other regions, less than optimal. The GDPR was intended to be risk-based, but it did not define the risks that are to be considered as organizations manage digital processes.

    Data protection and privacy speak to three tasks: assure a space not subject to observation, where private thoughts and family life might prosper; allow people to define themselves and not be defined completely by their digital waste; and produce fair outcomes when data is processed. Does the risk-based approach place the emphasis on personal controls over observation and processing – data subject rights – or on the fairness of outcomes? Both sides of that equation are important, but risk management requires prioritizing one over the other. The IAF’s work on “risk of what” has led us to understand that visualizing risk is not a matter of one outcome versus data subject rights, but rather depends on a stakeholder’s first impression of what is most at issue. Stakeholders go beyond the data subject and the controller and include parties impacted by the processing who are not active participants.
    So, a risk-based policy system must be stakeholder-based. There is a great deal of evidence that privacy regulators are doubling down on data subject rights, with a narrow focus on one stakeholder: the data subject. Recent cases have focused on narrowing legitimate legal bases, requiring transparency with the conflicting values of simplicity and completeness, and applying necessity tests that reach to the legitimacy of business processes.

    So, as we celebrate privacy week, let’s spend a minute thinking about the policy debt. We might think about new policy models that embrace the complexity of the digital age, consider all stakeholders’ interests, and make sure the policy debt is paid in a fashion that is equitable to all in the many roles they play: data subject, patient, employee, citizen, student, shareholder, pensioner, etc.

    [1] Technical Debt https://www.productplan.com/glossary/technical-debt/

