How To Implement A Holistic Approach To User Data Privacy
When a company collects, stores and analyzes information, that information should not be used, shared, retained or disposed of outside the agreed purposes for which it was originally obtained. This is where data privacy strategies come into play, helping to strengthen data ethics and security policies.
The key steps in protecting sensitive data are automating visibility, contextualizing data, controlling access policies and implementing ongoing monitoring to identify vulnerabilities and risks before they become breaches.
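As a minimal sketch, the four steps above can be modeled as a pipeline: discover assets, attach context, enforce an access policy, then monitor the results. Every name here (the store layout, the `data_steward` role, the sensitivity labels) is an illustrative assumption, not a product API.

```python
# Hypothetical pipeline: discover -> contextualize -> control access -> monitor.
# All names and rules are illustrative assumptions.

def discover(store):
    """Step 1: automate visibility -- enumerate data assets."""
    return list(store.keys())

def contextualize(asset):
    """Step 2: attach context such as a sensitivity label."""
    return {"name": asset, "sensitivity": "high" if "pii" in asset else "low"}

def allowed(user_role, asset_ctx):
    """Step 3: access policy -- only a privileged role reads sensitive data."""
    return asset_ctx["sensitivity"] != "high" or user_role == "data_steward"

def monitor(events):
    """Step 4: ongoing monitoring -- surface denied access attempts for review."""
    return [e for e in events if not e["granted"]]

store = {"pii_customers": None, "public_docs": None}
assets = [contextualize(a) for a in discover(store)]
events = [{"asset": a["name"], "granted": allowed("analyst", a)} for a in assets]
print(monitor(events))  # the analyst's denied attempt on pii_customers
```

In a real deployment each step is a product capability rather than a function, but the ordering matters: access decisions are only as good as the discovery and context that feed them.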
Support a zero-trust approach to data management with an integrated suite of capabilities, including automatically created and securely isolated data copies that can close cybersecurity gaps in on-premises or hybrid cloud deployments.
Automate governance
Providing data protection and privacy at scale requires organizations to set up a governance framework so data is both accessible and protected. A data fabric architecture provides the methods your organization needs to automate data governance and privacy and maintain resilience no matter what tomorrow brings.
Adopting a holistic approach to data protection requires cooperation among departments and business groups. In short, this applies to any group with a stake in collecting, processing, using and managing personally identifiable information (PII), intellectual property, trade secrets and other types of confidential information.
Discovering, identifying and classifying your sensitive data is the critical first step in this process, but it also needs to be repeatable and agnostic of technology or geography. Once data is discovered, it can be classified (identified and grouped) based on specific patterns and algorithms. This gives IT professionals the ability to make more informed decisions about security, data sharing, data access, digital transformation, cloud migration and remediation prioritization. When data discovery and classification are followed by risk analysis, the resulting security foundation is both comprehensive and grounded in reality: risk analysis helps IT teams understand the sensitivity of data and then rank its level of risk. These capabilities also help organizations enforce data sovereignty and meet data privacy and security regulations such as GDPR, CCPA, the Payment Card Industry Data Security Standard (PCI DSS) and HIPAA.
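The pattern-based classification described above can be sketched in a few lines. The patterns and risk labels below are simplified assumptions; production scanners use far richer detectors and validation (for example, checksums on card numbers).

```python
import re

# Illustrative detectors only -- real classification engines combine
# patterns with validation logic and machine learning.
PATTERNS = {
    "email":       (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "medium"),
    "ssn":         (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "high"),
    "credit_card": (re.compile(r"\b(?:\d[ -]?){13,16}\b"), "high"),
}

def classify(text):
    """Return the sensitive-data classes found in a text snippet,
    each paired with an illustrative risk level."""
    findings = []
    for label, (pattern, risk) in PATTERNS.items():
        if pattern.search(text):
            findings.append((label, risk))
    return findings

print(classify("Contact jane@example.com, SSN 123-45-6789."))
```

Ranking the findings by their risk level is then a simple sort, which is the essence of the risk analysis step: classification tells you *what* the data is, risk ranking tells you *where to act first*.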
As described in Section III. Privacy & Security, data protection requires a holistic approach to system design that incorporates a combination of legal, administrative, and technical safeguards. To begin, ID systems should be underpinned by legal frameworks that safeguard individual data, privacy, and user rights. Many countries have adopted general data protection and privacy laws that apply not only to the ID system, but to other government or private-sector activities that involve the processing of personal data. In accordance with international standards on privacy and data protection (see Box 8), these laws typically have broad provisions and principles specific to the collection, storage and use of personal information, including:
Privacy-enhancing technologies (PETs). Requirements to use technologies that protect privacy (e.g., the tokenization of unique identity numbers) by eliminating or reducing the collection of personal data, preventing unnecessary or undesired processing of personal data, and facilitating compliance with data protection rules.
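The tokenization of unique identity numbers mentioned above can be sketched with a keyed hash: downstream systems receive a stable token they can join on, while the raw number stays behind the key boundary. The key handling here is an assumption for illustration; in practice the key lives in a key-management system, not in source code.

```python
import hmac
import hashlib

# Illustrative only: in production this key comes from a key-management
# system and is rotated, never hard-coded.
SECRET_KEY = b"rotate-me-in-a-key-management-system"

def tokenize(identity_number: str) -> str:
    """Deterministic, non-reversible token for a unique identity number.

    The same input always yields the same token (so records can still be
    linked), but the original number cannot be recovered without the key.
    """
    return hmac.new(SECRET_KEY, identity_number.encode(), hashlib.sha256).hexdigest()

token = tokenize("1985-01-23-4567")
print(token)
```

Deterministic tokenization preserves linkability across datasets, which is sometimes itself a privacy risk; schemes with per-recipient keys or random tokens plus a vault trade linkability for stronger unlinkability guarantees.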
In general, personal information should be lawfully obtained (usually through freely given consent) for a specific purpose, and should not be used for unauthorized surveillance or profiling by governments or third parties, or for unconnected purposes without consent (unless otherwise required under the law). Finally, users should have certain rights over data about them, including the ability to obtain and correct erroneous records, and access to mechanisms for seeking redress to secure these rights.
Policymakers and courts have struggled to strike the appropriate balance between protecting the privacy of registrants and supporting criminal investigations. One approach is to apply the same rules that govern other forms of search and seizure in the country in question, such as a requirement that a warrant be obtained. This may be beneficial where a balance between personal privacy and the public interest has already been struck. For further discussion and citations on this issue in scholarly work and the media, see the IDEEA tool.
In order to secure identity-enabled experiences and prove they take user privacy seriously, organizations must embrace the challenges that land in their laps. Implementing data privacy regulations, such as GDPR or the California Consumer Privacy Act (CCPA), should be viewed as an opportunity, not a hindrance.
Eve Maler, interim CTO at identity and access management platform ForgeRock, is a progressive proponent of privacy, user consent and IAM innovation. She directs the User-Managed Access (UMA) standards initiative and co-invented the SAML and XML standards. Her passion for privacy is apparent in her four-step strategy to improve and uphold data privacy and protection.
Here, Maler explains how organizations can use the opportunity to approach privacy in a holistic way to the benefit of the user, the institution and the future of user data privacy in the digital age.
Eve Maler: In the modern era, we've got a new view on what data privacy needs to be -- it's a far cry from the EU Data Protection Directive of 1995. Today, it involves building a pyramid of requirements and privacy. Data protection is the first layer of this pyramid -- the baseline. The next layer is data transparency -- enterprises are expected to tell people why they want data. The third and final layer of data privacy is data control. This is where we start to have a new business model. The business model needs to give people control over their own lives.
Maler: They must present themselves as trustworthy. This can be accomplished by extending authority to individual users in appropriate ways and by reinforcing transparency by protecting the data. We have innovated user consent and access management in the world of open banking, for example. Open banking in the U.K. specifically started to innovate in what it calls 'strong customer authentication.' It requires strong patterns of user authentication -- beyond just the password.
These are the vicissitudes of regulatory changes -- complications come with the territory. I think there is pressure on us to federalize privacy laws, as opposed to state-by-state laws. GDPR was the result of trying to create a digital single market. It sought to increase individual empowerment. Perhaps the best success of GDPR was that enterprises started to feel an imperative to do data inventories and improve data hygiene controls.
I think we will start to see the same thing in the U.S. Because we lack a digital single market in the U.S. around privacy laws, we are suffering. We may be used to doing things differently for California markets -- gasoline is one example -- but now that CCPA and GDPR have been implemented and organizations have to do that for personal data, it is seen as inefficient. I think we are really under pressure to create better regulatory efficiencies.
We also believe identity relationship management is crucially important to the future of data privacy. For example, in the healthcare industry, regulations are increasing pressure on organizations to reduce what they call 'information-blocking' or 'data-blocking maneuvers.' There are attempts to ensure healthcare providers do not stop patients from accessing their healthcare information. Enabling people to share information with other parties is important in healthcare. The future of identity is about relationships.
The digital era has created unprecedented opportunities to conduct business and deliver services over the Internet. Nevertheless, as organizations collect, store, process and exchange large volumes of information in the course of addressing these opportunities, they face increasing challenges in the areas of data security, maintaining data privacy and meeting related compliance obligations.
This article presents an overview of the Data Governance for Privacy, Confidentiality and Compliance (DGPC) framework developed by Microsoft to assist organizations in creating a data governance program that addresses all three objectives in a holistic manner.1 In particular, this discussion focuses on the risk management portion of the DGPC framework.
The next step is to define guiding principles and policies that create the appropriate context in which to meet these requirements. Finally, the organization should identify threats against data security, privacy and compliance in the context of specific data flows; analyze the related risks; and determine appropriate control objectives and control activities.
Each row in the matrix depicts a stage in the information life cycle. The first four columns each represent a technology domain, while the far-right column represents the manual control activities that must take place to meet the requirements of the four data privacy and confidentiality principles at each stage of the information life cycle. The four principles form the basis of questions that will be asked for every cell of the matrix.
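A minimal sketch of how such a matrix might be encoded for a gap-analysis exercise is shown below. The stage and domain names are illustrative assumptions, not the official DGPC terminology; the point is the structure: one cell per (life-cycle stage, column), each cell holding the answers and gaps for the four principle questions.

```python
# Hypothetical encoding of a risk/gap analysis matrix: rows are
# information life-cycle stages, columns are four technology domains
# plus a manual-controls column. Names are illustrative only.
STAGES = ["collect", "update", "process", "delete"]
COLUMNS = ["infrastructure", "identity_access", "information_protection",
           "auditing_reporting", "manual_controls"]

# Each cell records whether the four principle questions were satisfied
# and any gaps found while walking a data flow through that cell.
matrix = {(stage, col): {"principles_met": None, "gaps": []}
          for stage in STAGES for col in COLUMNS}

# Example: log a gap discovered for one data flow.
matrix[("collect", "identity_access")]["gaps"].append(
    "no consent check before enrollment data is stored")

print(len(matrix))  # 4 stages x 5 columns = 20 cells
```

Filling every cell forces the analysis to be exhaustive: a data flow cannot skip a life-cycle stage or a control domain without leaving a visibly empty cell behind.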
Identifying these threat types offers a starting point for organizations to assess their data flows and consider how assumptions about privacy, confidentiality and compliance may change when a flow crosses a trust boundary, such as during transitions between life-cycle phases.
A program based on the Data Governance for Privacy, Confidentiality and Compliance framework complements existing security standards and control frameworks by providing a holistic approach to identifying data-flow-specific threats against data privacy, security and compliance and by addressing residual risks in effective and efficient ways.6