Data Protection Archives | Calligo (https://www.calligo.io/insights/data-protection/)

Navigating the EU’s proposed Artificial Intelligence Act: What Organisations Need to Know
Published 12 March 2024 | https://www.calligo.io/insights/glossary/eu-proposed-artificial-intelligence-act/


The EU AI Act (the “AI Act”) is the world’s first comprehensive AI law. The Act lays down a harmonised legal framework for the development, supply, and use of AI products and services in the EU.  

To whom does the AI Act apply? 

The legal framework will apply to all AI systems impacting people in the EU, regardless of where systems are developed or deployed. 

When will the AI Act take effect? 

The AI Act is currently expected to enter into force in Q2-Q3 2024, with different obligations then taking effect in stages. 

Understanding the AI Act’s Objectives

The draft AI Act seeks to achieve a set of specific objectives:  

  • Ensuring that AI systems placed on the EU market are safe and respect existing EU law; 
  • Ensuring legal certainty to facilitate investment and innovation in AI; 
  • Enhancing governance and effective enforcement of EU law on fundamental rights and safety requirements applicable to AI systems; and  
  • Facilitating the development of a single market for lawful, safe, and trustworthy AI applications and preventing market fragmentation.  

AI Act: different rules for different risk levels 

The new rules establish obligations for providers and users depending on the level of risk posed by the artificial intelligence. While many AI systems pose only minimal risk, all systems still need to be assessed to determine which tier applies.

 
1. Unacceptable risk 

Unacceptable-risk AI systems are those considered a threat to people, and they will be banned.

They include: 

  • Cognitive behavioural manipulation of people or specific vulnerable groups: for example, voice-activated toys that encourage dangerous behaviour in children. 
  • Social scoring: classifying people based on behaviour, socio-economic status, or personal characteristics. 
  • Biometric identification and categorisation of people. 
  • Real-time and remote biometric identification systems, such as facial recognition. 

Some exceptions may be allowed for law enforcement purposes. “Real-time” remote biometric identification systems will be allowed in a limited number of serious cases, while “post” remote biometric identification systems, where identification occurs after a significant delay, will be allowed to prosecute serious crimes and only after court approval. 

2. High risk 

AI systems that negatively affect safety or fundamental rights will be considered high risk and will be divided into two categories: 

1) AI systems that are used in products falling under the EU’s product safety legislation. This includes toys, aviation, cars, medical devices and lifts. 

2) AI systems falling into specific areas that will have to be registered in an EU database: 

  • Management and operation of critical infrastructure 
  • Education and vocational training 
  • Employment, worker management and access to self-employment 
  • Access to and enjoyment of essential private services and public services and benefits 
  • Law enforcement 
  • Migration, asylum and border control management 
  • Assistance in legal interpretation and application of the law. 

 
All high-risk AI systems will be assessed before being put on the market and also throughout their lifecycle. 

3. General purpose and generative AI

Generative AI, like ChatGPT, would have to comply with transparency requirements:

  • Disclosing that the content was generated by AI. 
  • Designing the model to prevent it from generating illegal content. 
  • Publishing summaries of copyrighted data used for training. 

High-impact general-purpose AI models that might pose systemic risk, such as the more advanced AI model GPT-4, would have to undergo thorough evaluations and any serious incidents would have to be reported to the European Commission. 

4. Limited risk 

Limited-risk AI systems should comply with minimal transparency requirements that allow users to make informed decisions. Users should be made aware when they are interacting with AI; having interacted with an application, they can then decide whether to continue using it. This includes AI systems that generate or manipulate image, audio or video content, for example deepfakes.
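The four tiers above lend themselves to a simple lookup. A minimal, purely illustrative sketch in Python follows; the example use cases, tier mappings and obligation summaries are assumptions for illustration, not classifications taken from the Act itself:

```python
from enum import Enum

class RiskTier(Enum):
    UNACCEPTABLE = "prohibited outright"
    HIGH = "conformity assessment before market entry and throughout lifecycle"
    LIMITED = "transparency duties, e.g. disclosing that the user is interacting with AI"
    MINIMAL = "no new obligations beyond existing law"

# Hypothetical mapping of example use cases to tiers -- illustrative only,
# not a legal classification under the Act.
EXAMPLE_USE_CASES = {
    "social scoring system": RiskTier.UNACCEPTABLE,
    "CV-screening tool for recruitment": RiskTier.HIGH,
    "customer-service chatbot": RiskTier.LIMITED,
    "spam filter": RiskTier.MINIMAL,
}

def obligations(use_case: str) -> str:
    """Return a one-line summary of the tier and its headline obligation."""
    tier = EXAMPLE_USE_CASES[use_case]
    return f"{use_case}: {tier.name} risk -> {tier.value}"

print(obligations("social scoring system"))
# -> social scoring system: UNACCEPTABLE risk -> prohibited outright
```

In practice the classification depends on detailed legal criteria, but a register of AI use cases tagged by tier is a common first step in compliance planning.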

Opportunities 

Ethical Leadership: Organisations that prioritise ethical AI practices and demonstrate a commitment to responsible innovation can enhance their reputation and build trust with consumers, employees, and regulators. By aligning with the principles of the AI Act, organisations can position themselves as leaders in ethical AI deployment. 

Innovation and Differentiation: The AI Act promotes regulatory sandboxes and real-world testing, providing opportunities for organisations to innovate and develop AI solutions in a controlled environment. Companies that invest in compliance and develop AI systems that meet the AI Act’s standards can differentiate themselves in the market and gain a competitive edge.

Market Expansion: Compliance with the AI Act allows organisations to access the European market with confidence, as they demonstrate adherence to regulatory requirements and respect for fundamental human rights. This opens opportunities for expansion and growth in a region that values ethical AI practices.

Talent Acquisition: Companies that invest in talent acquisition and training to support compliance with the AI Act can attract top-tier professionals with expertise in AI governance, ethics, and regulatory compliance. Building a skilled workforce capable of navigating the complexities of AI regulation is essential for long-term success.

The AI Act represents a real opportunity for organisations that are looking to leverage the power of AI. However, there are also some threats that business leaders need to consider.

Threats

Compliance Costs: The AI Act imposes significant compliance costs on organisations, including overhead expenses related to risk assessments, governance frameworks, and regulatory reporting. Companies that fail to allocate sufficient resources to compliance may face financial strain and operational challenges.

Fines and Penalties: Non-compliance with the AI Act can result in substantial fines ranging from €7.5 million to €35 million, or a percentage of global annual turnover, whichever is higher, depending on the violation. Organisations that neglect the AI Act’s requirements or underestimate the severity of regulatory violations risk severe financial penalties that could affect both their bottom line and their reputation.
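The penalty mechanics pair a fixed cap with a turnover percentage, with the higher of the two applying to companies. A simplified illustration follows; the top-tier figures used (€35 million or 7% of worldwide turnover) are the widely reported amounts from the proposal and should be checked against the final text:

```python
def max_fine_eur(fixed_cap_eur: int, turnover_pct: float,
                 global_turnover_eur: int) -> float:
    """Applicable ceiling: the fixed amount or the stated percentage of
    worldwide annual turnover, whichever is higher."""
    return max(fixed_cap_eur, global_turnover_eur * turnover_pct / 100)

# Hypothetical company with EUR 2bn worldwide annual turnover,
# assessed against the reported top tier (EUR 35m or 7%):
turnover = 2_000_000_000
print(max_fine_eur(35_000_000, 7, turnover))  # -> 140000000.0 (7% exceeds the fixed cap)
```

For a smaller company, say €100 million turnover, 7% is only €7 million, so the €35 million fixed cap would be the applicable ceiling instead.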

Operational Disruption: Implementing robust governance and oversight measures to ensure compliance with the AI Act may require operational adjustments and process changes. Organisations that fail to adapt their operations to meet the AI Act’s standards may experience disruption and inefficiencies that hinder productivity and competitiveness.

Reputational Damage: Violations of the AI Act’s ethical standards or failures to comply with regulatory requirements can lead to reputational damage and loss of consumer trust. Organisations that are perceived as prioritising profit over ethics or disregarding fundamental human rights may face backlash from stakeholders and damage to their brand reputation. 

Conclusion  

In conclusion, while the AI Act presents opportunities for organisations to demonstrate ethical leadership, drive innovation, and access new markets, it also poses significant threats in terms of compliance costs, fines, operational disruption, and reputational damage. By proactively addressing these challenges and investing in compliance, organisations can navigate the regulatory landscape successfully and leverage AI technologies responsibly for long-term growth and sustainability.

For more information on Calligo’s Data Ethics and Governance solutions and AI solutions, visit https://www.calligo.io

Data Transformation Predictions for 2024 – Calligo Data Leaders Roundtable
Published 6 March 2024 | https://www.calligo.io/insights/beyond-data-podcast/data-leaders-roundtable-2024-predictions/


In this lively debate you will hear from Calligo’s Practice Leads as they discuss their key takeaways from 2023 and their data predictions for 2024 and beyond.

Topics discussed include:

  • Regulation of AI, including the EU AI Act
  • AI hallucinations and AI bias
  • Data governance and data fines
  • Dashboard fatigue
  • Data ROI

Data Sovereignty Unveiled – Balancing Rights, Privacy, and Innovation
Published 10 July 2023 | https://www.calligo.io/insights/beyond-data-podcast/beyond-data-episode-data-sovereignty-unveiled/


In this episode of the Beyond Data podcast series, Tessa Jones (Calligo’s Chief Data Scientist) and Peter Matson (ML Solution Architect) are joined by Martin Hoskin, Chief Technologist at VMware and Advisory Board Member for the Centre for Data Ethics & Innovation. In this enlightening discussion, we delve into the concept of data sovereignty and its implications for ethical data use, as well as explore how federated learning offers a promising solution to the challenges we face. 

Understanding Data Sovereignty

Data sovereignty encompasses the notion of data residency, access control, and governance. The dominance of American cloud providers, subject to U.S. laws, raises concerns about data privacy and security, particularly in the European context. For certain organizations, like government agencies and defense suppliers, data sovereignty becomes a critical factor. VMware has introduced a program to certify partners as Sovereign, ensuring data storage, processing, and governance are specified, differentiating them from major hyperscale cloud providers. 

The Challenge of Data Sharing

Data sovereignty also touches upon the ethical dilemma of sharing data for legitimate purposes like law enforcement investigations. Striking a balance between data privacy and the greater good is complex. For instance, the case of Apple’s cloud security raises questions about when governments should access personal data to combat serious crimes. 

Federated learning emerges as a promising solution to data sharing challenges. This approach enables entities to collaboratively train machine learning models without sharing raw data. Instead, local models are trained on separate datasets, and only aggregated model updates are shared with a central server. This preserves privacy and protects sensitive data, making it suitable for applications like fraud detection in the banking industry. 
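The round trip described above (local training on private data, then server-side aggregation of model updates) can be sketched in a few lines. The following is a toy federated-averaging example with a simple logistic model; all names, the two-client setup, and the synthetic data are illustrative assumptions, not any particular production implementation:

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """Train a logistic model locally by gradient descent; raw data stays put."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1 / (1 + np.exp(-X @ w))       # sigmoid
        grad = X.T @ (preds - y) / len(y)      # logistic-loss gradient
        w -= lr * grad
    return w

def federated_average(global_w, client_data, rounds=10):
    """Each round: clients train on their private data; the server
    averages the returned weights. Only parameters leave the clients."""
    for _ in range(rounds):
        updates = [local_update(global_w, X, y) for X, y in client_data]
        global_w = np.mean(updates, axis=0)
    return global_w

rng = np.random.default_rng(0)
# Two "banks", each holding a private dataset it cannot share
clients = []
for _ in range(2):
    X = rng.normal(size=(100, 3))
    y = (X[:, 0] + X[:, 1] > 0).astype(float)  # a shared underlying pattern
    clients.append((X, y))

w = federated_average(np.zeros(3), clients)
```

After training, the averaged model has learned the pattern common to both datasets even though neither client ever saw the other's records; real deployments add weighting by dataset size, secure aggregation and differential privacy on top of this basic loop.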

Experimenting with Federated Learning

The Centre for Data Ethics & Innovation (CDEI) conducted an experiment using federated learning for government-provided services. The CDEI set up two data sets: one for detecting fraud in financial transactions using SWIFT data, and another for studying the spread of COVID-19. The experiment highlighted the complexities of sharing data, including obtaining government buy-in and ensuring data anonymization to protect privacy.

While federated learning is ingenious, it comes with its own set of challenges. Concerns arise about the aggregator potentially being reverse-engineered to extract sensitive information, although the scale of data involved in real-world applications may make such reverse engineering considerably more difficult.

As data continues to play a critical role in various industries, addressing data sovereignty and privacy concerns remains paramount. Federated learning offers a way to enable collaboration without compromising data privacy. However, continuous innovation is necessary to tackle challenges like reverse engineering and fully realize the potential benefits of this approach. 

Ethical Considerations in AI and Data Technology

The conversation takes a broader turn, exploring the intersection of AI, data, and ethics. AI development should consider risks, probabilities, and potential biases to build robust and ethical systems. Ethical implications of sharing genetic data and the responsibility of pharmaceutical companies in handling such information are discussed. 

Regulating AI Ethics and the Divide between Academia and Industry

The need for clear regulations to define and enforce ethical standards in AI and data technology is acknowledged. Balancing philosophical academic perspectives with industry practicality becomes essential as AI progresses toward stronger AI with self-learning capabilities. 

Navigating Legal Frameworks and Data Sharing in Healthcare

Enforcing ethical standards and regulations on a global scale, especially with rogue states, poses challenges. Collaboration through global forums, like Gaia-X, can facilitate trust, data security, and individual interpretations of frameworks. Standardized data-sharing frameworks and data portability regulations can address data-sharing challenges in healthcare.

Autonomous Weapons and the Role of Global Forums

The ethical challenges of deploying AI in autonomous weapons, especially in making life and death decisions, raise profound moral dilemmas. The hosts stress the importance of engaging in public discourse and involving the global community to shape AI and robotics’ future. 

The Impact of Social Media on Data Privacy

The podcast concludes with a discussion on the influence of social media on data privacy and the ethical considerations surrounding its use. Addressing the impact on young minds and the potential implications on decision-making, including voting rights for 16- and 17-year-olds, is highlighted. 

In conclusion, data sovereignty, AI ethics, and federated learning are crucial components of an evolving data landscape. Ethical considerations must be at the forefront of AI development and data sharing to ensure responsible and equitable data-driven futures. By embracing ethical practices and fostering interdisciplinary collaboration, we can harness the potential of AI while respecting individual rights and privacy. Establishing global forums and transparent public discussions will play a pivotal role in shaping the future of AI and robotics in a manner that benefits humanity as a whole. 

Listen to the full episode on Spotify.

The benefits of outsourced Data Protection Officer as a Service
Published 21 February 2023 | https://www.calligo.io/insights/glossary/the-benefits-of-outsourced-data-protection-officer-as-a-service/


As the world becomes increasingly digital and cloud-based, the importance of data protection and privacy has become paramount for all organizations. One key aspect of ensuring compliance with data protection laws and regulations is the appointment of a Data Protection Officer (DPO).

However, appointing a DPO internally can present several challenges, including conflicts of interest and a lack of specialized skills. That is where Data Protection Officer as a Service (DPOaaS) comes in.

Sidestep potential conflict of interest

One of the main reasons organizations appoint external DPOs is to sidestep the potential conflict of interest that arises when a DPO is appointed internally. Supervisory Authorities are becoming increasingly strict about this issue, and a conflict of interest can be seen as a punishable breach. For example, CIOs and CISOs are responsible for the collection, storage, and protection of data, which can prevent them from objectively scrutinizing their own processes.

Similarly, Heads of Legal and In-House Counsel are tasked with defending the organization’s interests, while a DPO is required to represent the data subject. Heads of Compliance, who are responsible for determining how data is processed, may also be unable to impartially assess its adherence to legal obligations.

By outsourcing your DPO to a specialized service provider, such as Calligo, you can sidestep these conflicts of interest and ensure your organization’s compliance and data safety. Outsourcing your DPO is also faster and more cost-effective than hiring one internally.

10x as many DPO vacancies as there are qualified individuals

There are currently 10x as many DPO vacancies as there are qualified individuals, making hiring processes long and expensive. Outsourcing your DPO allows for flexible resourcing, as the role is often not a full-time position. It also gives you access to a wider set of skills, including technical, legal, and information security expertise, at a far lower cost than recruiting each of these specialists individually.

The Calligo Privacy Team is a specialized team of experienced and qualified professionals with deliberately diverse career backgrounds and deep subject matter knowledge. They are committed to ensuring adherence to global data protection laws without compromising the ambitions and goals of your clients. The team is highly qualified, holding certifications from the IAPP (International Association of Privacy Professionals), which are among the world’s most trusted and respected certifications in data privacy. These cover privacy laws and regulations and the practical operations to apply and deploy them successfully.

The Calligo Privacy Team also brings diversity in terms of industry experience. By operating in varied domains, the team’s expertise is sector-transferable, keeping your knowledge as relevant as possible. In an increasingly complex landscape, the team is uniquely placed to support you in the nuances of different data protection and privacy regulations, across any sector and jurisdiction. The team has supported industries such as global manufacturing, global franchise fast food brands, financial, software as a service platform providers, energy, government, charities, and service providers.

In summary, data protection and privacy are crucial for all organizations in the digital age. However, appointing an internal Data Protection Officer (DPO) can be challenging due to potential conflicts of interest and a lack of expertise. DPO as a Service (DPOaaS) provides a solution by outsourcing the role to a specialized service provider, avoiding conflicts of interest and providing access to a wider set of skills at a lower cost. The Calligo Privacy Team is a highly qualified team of experienced professionals with diverse backgrounds and certifications in data privacy, committed to ensuring global data protection compliance. The team has a proven track record of supporting various industries, keeping knowledge relevant and up to date.

Let the team help you fulfill your legal obligation to appoint a suitable Data Protection Officer, while also serving as an internal advisor, representative, and liaison for your organization.

Learn more about Calligo’s Data Protection Officer as a Service

Why data-ambitious organizations need more than a Chief Data Officer (CDO)
Published 4 February 2022 | https://www.calligo.io/insights/glossary/why-data-ambitious-organizations-need-more-than-a-chief-data-officer-cdo/

The rise of the CDO

The potential value of data – if used optimally – is unquestioned.

In recent years, there has been a clear acceleration in the number of organizations keen to not only better understand their data’s potential, but also govern it more rigorously, structure it more usefully and use it more creatively.

And so, they appoint a Chief Data Officer (CDO) to drive this change.

This person – the business hopes – will “take hold of the data problem”, pulling sources and siloes together to create clarity, drive automation, place data and insights into the hands of the front line, and improve business performance and customer satisfaction.

Discussing Client Ambition

When discussing these ambitions with our clients, the excitement and optimism is clear. But what is often missed, or at best over-simplified, is the need to execute safely.

Managing the security risk to the organization is a fundamental part of a CDO’s remit. Depending on the organizational structure, it is usually shared with or delegated to a dedicated CISO or equivalent.

Similarly, compliance with industry regulations and certifications such as ISO and SOC comes under the governance aspect of the CDO role (again, often shared with or delegated to the CISO).

But what about Data Privacy?

CDOs and data privacy

In the pursuit of these ambitious data goals, while the CDO and/or CISO handle security and compliance, who will manage the privacy-related risks to the organization? And the risk to the data subjects?

  • What data is personally-identifiable, and therefore subject to data privacy laws?
  • Where is this data received from and held?
  • How retrievable is it?
  • How is it used?
  • Will personal data be exposed to machine learning or automated decision-making?
  • When and how is personal data shared?
  • Or disposed of?

In tackling these questions, some organizations believe the CDO can also perform the Data Protection Officer (DPO) role, or have one report to them or to the CISO. Others appoint a Chief Privacy Officer, believing it to be the same as a DPO, or a “DPO+”. Others ignore the need for privacy oversight altogether.

None of these answers are wise. Some are even illegal and can result in penalties.

The truth is, most data-ambitious organizations require all three roles. Without them, data safety is jeopardized and the company is at risk of non-compliance, breaches, inefficiency and missed opportunity.

But how the remits are best defined and structured is often a mystery.

Below is a guide to the three pertinent roles – Chief Data Officer (CDO), Chief Privacy Officer (CPO) and Data Protection Officer (DPO) – outlining why each role is essential for every data-ambitious organization, plus their differences, inter-relationships, boundaries and overlaps.


Who you need

The Chief Data Officer (CDO)

Responsible for using data to best effect. The basis of this is data governance – its stewardship, consolidation, structure, management and distribution, but also the security and compliance risk it presents. On top of this lies innovation and how it can be most profitably exploited, whether through automation, analysis or data science.

The Chief Privacy Officer (CPO)

This role sits within the overall CDO responsibility. This role adds the perspective of privacy compliance to the CDO function, specifically in terms of any action’s risk to the company. As such, they will lead on the construction of the privacy programme, its roll-out and training and any necessary assessments.

The Data Protection Officer (DPO)

Represents the data subject within the organization. They oversee activities from data processing, assessments and employee training to ensure that none of them conflict with data subjects’ privacy rights, and as such must maintain independence from activities and reporting lines. While perhaps not technically required within your organization (for instance if you are not a public body, do not systematically process personal data as a core activity, or are not processing ‘large volumes’ of sensitive data), it is nonetheless a firmly recommended role for any data-ambitious organization with any degree of use of personal data.

Can these roles be combined into single individuals?

The CDO and CPO can be the same person, and arguably should be, to ensure that the entirety of data safety (security and privacy) forms the foundation of all data use and governance, and to reduce the risk of accidental non-compliance or painful retrofitting of compliance requirements.

The DPO and CDO (and/or CPO) must never be the same person, as it would create a punishable conflict of interest. They should not even be in the same reporting structure. The DPO’s role is to independently monitor and question all activities, strategic policies and objectives, which means they need the platform to challenge every level of the organization.

The risk of getting this wrong

Risk of unethical / non-compliant data processing

Our data privacy experts have often seen overenthusiasm and ambition innocently leading to personal data being misused. Without anyone overseeing the privacy risk to the data subject (DPO) or even the business (CPO), and a focus only on security, then organizations can easily overstep.

Missed opportunity

DPOs and CPOs are often mistaken for naysayers, as they too often focus on limiting what can be done with data and curtailing the ambition of the CDO. In fact, the best DPOs and CPOs will support the CDO’s objectives, by suggesting innovative approaches to data use that balance ambition with risk.

Delays

If privacy is not a foundation on which data ambitions are built, then it will either be forgotten or retrofitted. The former creates risk of breaches, while the latter creates delays. Projects that lay privacy on top, rather than being designed with it in mind from the outset, risk needing costly redesign and rebuilding.

Conflict of interest

A DPO has to be independent of the day-to-day processes of data management, including its receipt, use, treatment and security. This rules out the job titles that are classically handed this second role, such as CIOs and Heads of Compliance, combinations that regulators are now punishing.

The Chief Data Officer (CDO)

Remit unique to this position:

Data governance

Ranging from data’s structure and architecture to its management and ongoing quality assurance, accurate and efficient data governance is the foundation stone of all data initiatives. Data siloes, untidy or incomplete data and inconsistent data structures are the principal barriers to data ambitions.

Security-related risk to company

Clearly overlapping with the above, the CDO is required to identify where the ambitions for data’s structure, storage and use will create security and regulatory compliance risk. Working with the CISO – who may be alongside or within the CDO’s team – these risks then need to be mitigated comprehensively, and without obstructing operations.

Innovation / Data Science & Insights

This is the principal reason for the appointment of a CDO: using data creatively to further the aims of the organization as a whole. Building on the groundwork of data governance and security, this may be through automation, analytics, visualizations, machine learning or other forms of AI. Projects may be intended for internal efficiency, or the development of new products and services, but one truth remains at every initiative’s core: using data more intelligently.

The Chief Privacy Officer (CPO)

Remit unique to this position:

Privacy-related risk to company

While the CDO handles the security-related risk, the CPO looks specifically at personally-identifiable data, how well protected it is and how ethically and compliantly it is used. This includes determining which regulations the organization’s activities fall within the scope of, and ensuring the various obligations are all addressed.

Clearly, this responsibility overlaps with the CDO’s security-related remit, and requires the cooperation of the CISO, as a lot (though not all) of a privacy-focused risk assessment is based in typical security technical and organizational measures (TOMs). As such, the CPO role may well be part of the CDO’s, if the individual has the relevant privacy skills.

Devise & deploy the privacy programme

This is the tactical implementation of the above. It involves the creation of policies and processes that will protect personal data in every department, by every user and with every data interaction, and specifically on an ongoing basis.

Unlike many other areas of compliance, data privacy requires continuous management and oversight. A breach of ISO compliance requirements on a given day is unlikely to jeopardise completing the next audit’s requirements and maintaining certification. In contrast, a single breach of data privacy requirements could result in customer dissatisfaction, reports to regulators, and potentially fines and irreparable brand damage. As such, the deployment of the privacy programme must ensure continuous protection.

Data Protection Officer

Remit unique to this position:

Privacy-related risk to data subjects

This is the crux of the DPO role. A Data Protection Officer is one of the few senior roles that categorically does not serve the interests of the organization, but those of third parties – arguably the only one. It is this unusual perspective that requires them to be independent of the mechanics of the organization, and that underpins all other responsibilities.

Oversight

The DPO is responsible for continuously monitoring all data processing activities and independently assessing their adherence to the GDPR and any other relevant legislation. Any faults or risks found are then the responsibility of the CPO and/or CDO to remedy, working alongside any relevant departmental head.

Internal audit

Part of the Oversight role above will include regular internal audits of data processing activities. An initial gap analysis will establish a baseline of compliance, while subsequent periodic audits will showcase the evolving privacy maturity of the organization, plus any persistent weaknesses.

Liaison with authorities and data subjects

DPOs also act as a conduit for all communications with supervisory authorities and data subjects. They may do this proactively, for example securing approval from authorities on the legitimacy of any new and unusual data processing initiatives. DPOs will also handle the communications with any data subjects in the case of Data Subject Requests.

The Shared Remits

Shared remit: CDO & DPO

Automated decision-making

This is a crucial overlap. For many data-ambitious organizations, especially those in consumer services such as banking, telecoms or utilities, there will be a drive to use automation or machine learning to systematize interactions with customers based on the data on them as individuals. These may include the pricing and terms offered to them, which would mean that automated decisions are being made that have a legal or similarly significant effect – which is specifically limited by the GDPR and many other privacy regulations that followed in its footsteps.

This is therefore a classic example of a situation where the CDO and the DPO would have to work together to ensure that the project is legitimately designed and executed, and is highly indicative of why the DPO cannot be the same person or even be in the same reporting structure as the CDO. The CDO’s project needs to be able to be objectively critiqued and perhaps stopped by an independent DPO.

Shared remit: CDO & CPO

Ethical Data Impact Assessments (EDIAs)

EDIAs are modern supplements to the pre-existing Data Protection Impact Assessment (DPIA), and are effectively documented evidence of the scrutiny required above in instances of Automated Decision-making.

While not specifically required by privacy legislation or guidance – as a DPIA is – the sort of rigour they encompass is. As mentioned above, references are found in the GDPR and many other pursuant regulations. The extra scrutiny is recommended because of the deliberate removal of human oversight from processes, and therefore the risk of the inadvertent removal of understanding, proportionality, fairness and even values.

For a DPIA, a DPO and a CPO (see below) will collaborate on mitigating the risks to data subjects – hence the DPO’s involvement.

An EDIA’s extra considerations beyond a DPIA focus on accountability, transparency, necessity and sustainability. These are more technical, strategic and concerned with personal rights including but also beyond privacy, such as the right to not be discriminated against.

The CDO’s input will therefore cover the technical and strategic sides, while the CPO is best placed to review the technology’s ethical use. In truth, this is not a perfect fit. But there are few alternatives. A DPO’s role is to monitor activity through a strict lens of protecting data subjects’ privacy rights – and arguably their independence means their role can never be to perform assessments, only to review. Legal counsel is concerned with the application of the codified law, not the wider topic of ethics. Compliance roles are similarly used to implement specific rules and standards.

Upholding ethics is different by its nature, and not typically a nominated role within organizations, but a CPO is arguably the closest fit, not least because they lead the completion of DPIAs, on which EDIAs are based.

Shared remit: CPO & DPO

Training employees

This is part of the CPO’s deployment of the overall privacy programme, but requires the involvement of the DPO because of their responsibility for monitoring internal compliance. Acting on behalf of data subjects, the DPO will check the suitability and comprehensiveness of the training programme – in essence confirming that, should the training be satisfactorily completed (the CPO’s responsibility to ensure), data subjects’ rights are protected.

Data Protection Impact Assessments (DPIAs)

These tools identify any potential risks that may arise from processing personal data, allowing the organization to minimise and negate them in advance. They are a key requirement for demonstrating adherence to GDPR and most other privacy regulations, and should be completed for every way in which an organization processes data.

They are the CPO’s responsibility to perform, though as with the Training above, the DPO provides oversight to ensure data subjects’ rights are protected. The DPO will advise the CPO on whether a DPIA is necessary in any given situation, how it should be performed, what measures can legitimately be put in place to negate any risks identified, and whether the ultimate decision on whether the processing is permitted is correct.

This process and shared responsibility applies equally to other privacy adherence tools such as Legitimate Interest Assessments (LIAs), where the CPO is responsible for performing the duty, while the DPO ensures their completion and verifies their outcomes.

Data Subject Access Requests (DSARs)

Some of the most common instances of CPOs and DPOs having to collaborate are on DSARs. These are especially frequent in industries with high volumes of consumer interaction, such as retail, utilities, telecoms and retail banking. A CPO will be responsible for performing the DSAR – for example, verifying the identity of the data subject and collecting the relevant data – while the DPO will be responsible for overseeing the process, approving the data to be shared, ensuring deadlines are met and handling communications with the data subject.
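As an illustrative aside (not part of the original article): the statutory deadline the DPO polices comes from GDPR Article 12(3) – a response within one month of receipt, extendable by two further months for complex or numerous requests. A minimal Python sketch of that calendar arithmetic, with the helper names invented for the example:

```python
from datetime import date

def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping to the last valid day of the target month."""
    y, m = divmod(d.month - 1 + months, 12)
    y, m = d.year + y, m + 1
    for day in (d.day, 30, 29, 28):  # clamp: 31 Jan + 1 month -> 28/29 Feb
        try:
            return date(y, m, day)
        except ValueError:
            continue
    raise AssertionError("unreachable: day 28 is always valid")

def dsar_deadline(received: date, extended: bool = False) -> date:
    """GDPR Art. 12(3): one month to respond, plus two further months if extended."""
    return add_months(received, 3 if extended else 1)

print(dsar_deadline(date(2024, 1, 31)))        # 2024-02-29 (leap-year clamp)
print(dsar_deadline(date(2024, 3, 15), True))  # 2024-06-15
```

Real DSAR tooling would also track identity verification, the data collected and the communications log; this sketch covers only the deadline rule.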

The Universal Responsibilities

Data Quality

All three Data Officers have a responsibility – or at least a vested interest – in maintaining the continuous quality of all the organisation’s data.

  • For a CDO, this is of course a principal strategic objective. Better use of data relies on data sources being cleansed for interrogation, and probably integrated under common data models to allow for deeper insights. But without continuous data governance – the process by which data quality is preserved – then interrogation becomes impossible, and integrations fall apart.
  • Data quality requires common rules – defined and upheld ultimately by the CDO – for how data is collected and stored; agreed responsibilities for how it is maintained and kept complete, credible, useful and clean; and a clear vision for how it may be used.
  • The CPO and DPO will also have involvement in this, and vested interests in its performance. How and where the CDO decides to store data will need to adhere to data residency and sovereignty requirements. Data privacy regulations routinely give data subjects a Right to Accuracy, where every reasonable step must be taken to rectify data inaccuracies or erase data if no longer correct. And of course, without complete, clean and credible data, then DSARs cannot be accurately performed, and DPIAs and other typical processes cannot be conducted or verified easily.

In fact, DPIAs even include a specific question on this:

“Are you satisfied that the personal data processed is of good enough quality for the purposes proposed? If not, why not?”

Of course, the easiest way for Data Quality to serve all three Data Officers’ needs is to base the organization’s Data Quality framework on the principles of Privacy by Design & Default.
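The “common rules” idea above can be made concrete as machine-checkable rules per field. A hypothetical sketch (the field names and rules below are invented for the example; production frameworks add profiling, lineage and reporting on top):

```python
import re
from datetime import date

# Agreed, CDO-defined rules: each field maps to a predicate it must satisfy.
RULES = {
    "email":        lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "")),
    "full_name":    lambda v: bool(v and v.strip()),
    "consent_date": lambda v: isinstance(v, date) and v <= date.today(),
}

def quality_issues(record: dict) -> list:
    """Return the fields of a record that fail the agreed quality rules."""
    return [field for field, ok in RULES.items() if not ok(record.get(field))]

record = {"email": "a.smith@example.com", "full_name": "A. Smith",
          "consent_date": date(2023, 5, 1)}
print(quality_issues(record))  # []
```

Failures on fields like the hypothetical `consent_date` feed directly into the Right to Accuracy obligations described above.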

Contracts

While the above is a strategic imperative that requires all three Data Officers’ involvement, this is a tactical overlap.

  • Contracts with new suppliers, partners, and potentially customers that inherently involve the processing of personal data create responsibilities for CDOs, DPOs and CPOs alike.
  • A CDO needs to ensure that the contract and the mechanics of the engagement will not undermine or contradict any element of data governance. For example, if the new contract is with a new cloud services provider, can the provider support any ISO, SOC or PCI obligations? If the contract is with a new CRM, is the data structure consistent with any pre-existing common data model and how will data quality and accuracy be maintained? And in all cases, what security measures are in place to protect data from internal and external threats?
  • Meanwhile, a CPO will be concerned with whether the contract is in line with the organization’s privacy obligations. To use the example of the new cloud provider again, will data residency obligations be met? Or for new SaaS platforms, where will data be stored and are the correct cross-border data transfer mechanisms such as Standard Contractual Clauses (SCCs) in place?
  • Finally, a DPO’s role in a contract scenario is to review the legitimacy of the decisions made above, and verify that the privacy of data subjects’ personal data will not be jeopardised – regardless of whether the organization is a controller or a processor in the given scenario.

The Core Lessons

  • All three roles – CDO, CPO, DPO – are probably required in your organization; even where a DPO is not strictly mandated, appointing one is advisable.
  • The CDO can also be the CPO, but the DPO must be independent.
  • The CDO defines the strategy and is responsible for the vision of what is to be accomplished with your organization’s data. This will include its structure, security, governance, maintenance and creation of value.
  • The CPO is responsible for ensuring that the implementation of this strategy will not put the organization at any privacy-related risk, and is tasked with mitigating any risk with a defined and well-executed privacy programme.
  • The DPO is the representative of the data subject within the organization, and is primarily responsible for overseeing the activities and ensuring no rights are or could be infringed.
  • The more fundamental or complex the operation (such as data quality or intelligent data use), the more likely it is to require all three roles.
  • Putting privacy – and better yet, total data safety – at the heart of every data initiative and interaction will make it more likely that every role’s agendas are equally met.

Does your DPO have a Conflict of Interest? https://www.calligo.io/insights/glossary/does-your-dpo-have-a-conflict-of-interest/ Mon, 20 Dec 2021 15:30:47 +0000

What is a DPO?

Unlike many other areas of compliance, data privacy adherence is not something that can be audited once and then presumed to continue for the foreseeable future.

Data is the most voluminous, mobile, essential and potentially dangerous asset any business owns. It is created, deleted and interacted with constantly, often in new ways by new individuals.

A point in time audit is simply not suitable for continuous oversight of how data is treated.

It is this unavoidable truth that led the GDPR’s legislators to require organizations that process the most data, and/or the most sensitive data, to ensure that the interests of the data subject are continually and adequately represented in any and all data processing. Hence the mandatory appointment of a Data Protection Officer (DPO).

Under Article 37, a DPO is mandatory if:

  • You are a public authority or body
  • You are an organisation whose core activities consist of processing operations that require regular and systematic monitoring of data subjects on a large scale (e.g. online behaviour tracking)
  • You engage in the processing of large volumes of special category data, or data related to criminal offences and convictions

The DPO’s tasks are outlined in Article 39 of the GDPR as:

  • To inform and advise the business and its employees of their GDPR obligations.
  • To monitor and audit compliance with the GDPR and the business’ data processing policies, including the assignment of responsibilities, awareness-raising and training of staff.
  • To manage data protection impact assessments, and monitor their outcomes.
  • To cooperate with and serve as the contact point for Supervisory Authorities.

Appointing a DPO internally

Many mandated businesses have dutifully appointed their DPO. Seeking to avoid the expense, time and difficulty of hiring a new head, they have distilled the requirements and responsibilities to their essence and found a person internally who:

  • Understands the way the company ingests and uses data
  • Has the standing and breadth of involvement in the business to appreciate every data workflow
  • Is experienced in the administrative, legalistic and monitoring sides of compliance
  • Is senior and credible enough – as the GDPR requires – to interact with, advise and perhaps argue with the highest levels of the business

This seems suitable. The rights and interests of the data subjects appear to be best protected by a person who has this experience and background, and who can monitor the organization’s activities and ensure their adherence to the rules and the sentiment of GDPR, such as the CIO, CISO, Head of Compliance, Head of Legal, even the CEO.

These organizations seem to be acting entirely in good faith. After all, Article 38(6) even allows the DPO role to be a secondary role on top of day-to-day operations.

But they have forgotten an underlying principle of the GDPR: the DPO must be independent.

Expecting someone who also has responsibility for the management, oversight, strategy or security of data and how it is processed (i.e. a data controller) to also scrutinise, critique and object to those same processes on behalf of data subjects creates a conflict of interest.

It is like asking students to mark their own homework. As much as they may strive to remain impartial, they have their own obligations, objectives and interests that prevent complete and undeniable impartiality.

No matter how ethically they may think they act, it represents a compliance failure.

The danger

And regulators are hot on this. Most Supervisory Authorities, including the UK’s Information Commissioner’s Office (ICO), have issued specific guidance on how to avoid conflicts of interest. While this proactive support shows that the SAs intend to help businesses avoid this error, the flipside is that they will not tolerate failure.

Indeed, fines are now being issued to firms that overstep, intentionally or otherwise. A prime example is the €50,000 penalty for a Belgian telecoms operator whose DPO was also their Head of Compliance, responsible for the compliance, risk management and audit functions. A dispassionate, independent review of their data protection processes from the data subject’s perspective rather than the business’s was deemed impossible.

Some examples of roles often asked to also take on the DPO role

  • CIOs
    who define the IT strategy, including where data resides, how it is accessed and who by, and on which platforms.
  • CISOs
    who build security strategies that prioritize certain measures or defending against certain cybersecurity threats.
  • COOs and CEOs
    who have responsibility and/or influence over how data is processed, for what purpose and through what tools.
  • Heads of legal
    who balance the interests of the organization against what is permissible or possible under the law.
  • Heads of compliance
    who balance the organization’s needs and operations with the requirements of various regulatory frameworks.
  • Heads of departments
    E.g. marketing and HR, who determine how data is processed within their teams in order to meet their objectives.

The whole point of the DPO is to stand apart from the interests of the business and be the voice of the data subject.

How can any of these roles – all of which put the interests of the business first – be compatible with a second role that expects them to demand the business undertake specific actions to protect the interests of the data subject, or even to spot the need for additional actions? External perspective is often key.

Should you outsource your DPO?

A company must appoint a DPO who is free to operate independently. There should be no pressure from management, or risk of insufficient perspective on data-centric processes or strategies that may jeopardize the continuous privacy of personal data.

If you suspect your current internal DPO appointment is putting your GDPR adherence at risk, then you should consider making a change soon.

Reasons for considering outsourcing the DPO role:

  • Guarantees impartiality
    Appointing an external party is specifically permitted under the GDPR, due to the person’s ability to avoid conflicts of interest, act dispassionately and challenge senior management more easily.
  • Greater accuracy
    An external DPO is likely to perform better than an internally-appointed DPO who may be restricted by the working practices of the business or by not wishing to undermine wider objectives.
  • Wider skillsets
    The better tier of outsourced DPO services bring not only legal expertise, but also data security and technology, plus experience across numerous jurisdictions and data privacy frameworks.
  • A show of trust
    It shows data subjects and Supervisory Authorities that you take the privacy of data seriously, and are not willing to take dangerous short cuts to adherence.
  • Faster to appoint
    Some try to hire a dedicated DPO, but find they are in high demand and short supply – some reports cite 1 candidate for every 10 open roles – with many roles taking over a year to fill.
  • Significant savings
    Because of how rare suitably qualified people are, they often command a premium salary. Outsourcing the role is far more cost-efficient, and tends to bring wider skillsets.

How Calligo can help

Calligo’s expert and highly-qualified data privacy consultants, who each have a unique mix of legal, technical and infosecurity expertise, are ideally suited to serve as your outsourced Data Protection Officer.

Our DPO as a Service clients range from SME to the largest enterprises, span every sector, multiple geographies and privacy regulations, and process some of the most sensitive categories of data.

Our experts provide ongoing monitoring and audits of the collection and processing of personal data, plus staff training to ensure our clients’ total and ongoing protection. They also represent your organization to both data subjects and Supervisory Authorities.

To find out more about our Data Protection Officer as a Service, click the button below and speak to our expert Data Privacy Consultants.

UPDATE 6: The Data Privacy Periodic Table https://www.calligo.io/insights/glossary/update-6-the-data-privacy-periodic-table/ Wed, 20 Jan 2021 11:05:38 +0000

Once again, to mark Data Privacy Day (or Data Protection Day in Europe), we have released a new update to the Data Privacy Periodic Table – our industry-renowned open project to create a regularly updated digestible guide to the confusing world of data privacy.

 

This is its sixth update. Roughly two and a half years after its launch in September 2018, what have we learned?


Firstly, that the open nature of the project is crucial. The contributions and observations that we receive on suitable additions and changes are often massively insightful and points of fascinating debate. We could not continue to deliver this resource without your input, so thank you.

Contact Sophie Chase-Borthwick here to add your own comments

 

Secondly, that judging by the eagerness with which it is consumed, shared and commented upon, data privacy remains just as confusing, daunting and multi-faceted an area of business as it was in 2018 – if not more so. The speed with which the industry moves, the passion with which its rights are defended or demanded, and the external forces that impact its application all contribute to a fascinating but at times overwhelming area of business.

And finally, the degree of change that we announce with each update highlights perhaps the most important observation: privacy has become one of the keystones of modern business practice. It is an unavoidable duty of practically every organization, and one that requires very specialist skills and experience.
 

So, what does this update include? 

Amongst others, new legislation, how geo-politics has impacted our industry, and commentary on effective implementation.

In other words, a mix of topics that serve to prove the points above: data privacy is perhaps the business area most heavily-scrutinised by governments and customers alike, and therefore the most susceptible to change.


Element #116: “CCPA II” becomes CPRA

The California Privacy Rights Act of 2020 (CPRA), also known as “CCPA II” or “CCPA 2.0” passed in November 2020.

It will take effect on 1st January 2023, though breaches committed in 2022 will be enforceable from January 2023. Importantly, it does not replace the pre-existing CCPA. Instead, the CCPA will be incorporated into the new CPRA in 2023 and will remain in effect in the meantime.

As a result, we are keeping CCPA in the “Core Legislation” section, and CPRA in the “Future Developments” section until it takes effect.

 

Some of the principal modifications the CPRA introduces are:


“Do not sell” expanded to “Do not share”: Under CCPA, consumers are able to opt-out of businesses selling their personal data, but the CPRA expands this to ‘sharing’, giving consumers control over who their data may be shared with, particularly advertisers, even if there is no form of transaction involved.

An important side point is that consumers under 16 years old are required to opt-in to the sale and sharing of data, with consumers under 13 requiring parental consent to do so.


New consumer right: the ‘Right to Correct Inaccurate Personal Information’ (or the right to rectification)


A new sub-category of personal information, “sensitive personal information”: Those familiar with the GDPR will recognise many of the categories, though will also note that this is a broader set of criteria. Sensitive Personal Information brings with it additional duties for businesses processing it and additional consumer rights to limit its use. It includes:

  • Government-issued identifiers, such as social security numbers and passport numbers
  • Financial information, including account details and credit card information
  • Precise geolocation data
  • Biometrics
  • Data on racial or ethnic origin
  • Data on religious or philosophical beliefs
  • Union membership information
  • The contents of any mail, email or text messages

 

The latter is a very interesting inclusion given the overall context of the US government’s powers to retrieve data from technology companies where it would assist national security investigations.


A new regulator: The California Privacy Protection Agency is the first US government agency at either federal or state level to have been created with the sole purpose of protecting individuals’ data privacy. Previously, disputes had been handled by the Federal Trade Commission and state Attorneys General under “unfair or deceptive trade practices.” As a signal of California’s intent and need to enforce data privacy regulations appropriately, the establishment of the regulator sits outside the CPRA’s overall timeline: it is required to assume rulemaking and enforcement authority from the California Attorney General no later than 1st July 2021.

We have previously commented on the US federal law, Children’s Online Privacy Protection Act (COPPA), and included it as a specific element in the Future Developments section. This was because of the debate over consent and types of data introduced by the PROTECT Kids Bill, introduced in 2020. The main areas of intended reform were the ages at which consent could be given, and the types of data requiring protection.

CPRA addresses these concerns in substantial part through a combination of the opt-in requirement in the ‘Do not share’ modification and the new sub-category of personal information. Although the CPRA is only a state law, we have removed COPPA from the Future Developments section – with the caveat that we intend to keep a watchful eye on how, or whether, this attention to minors is echoed in future US state legislation.

 

 

For more commentary on the privacy landscape in the US today, and how it may change in the near future, take a look at another article written for Data Privacy Day 2021: The Top 3 data privacy requests for the new US administration


Element #112: COPPA is replaced with CPPA

In place of COPPA, we have added a new piece of national legislation to the Periodic Table: Canada’s Consumer Privacy Protection Act (CPPA), which was introduced in November of last year, only days after the similarly-abbreviated CPRA was passed in California. As Sophie Chase-Borthwick, our VP of Data Privacy & Ethics, often comments, “privacy really needs some different characters in its acronyms.”

In fact, the similarity was recognised by Navdeep Bains, the Canadian Minister of Innovation, Science and Industry, who at the announcement of CPPA also stated Canada’s law would be stronger than California’s.

 

Canada’s principal data privacy legislation, PIPEDA, has been fully enforceable since 2004, having received Royal Assent in 2000 and coming into force in stages from 2001. It was the foundation stone of a data protection regime that was considered robust enough for EU Data Adequacy to be awarded in 2002. CPPA will now amend and replace PIPEDA in order to create an even more stringent environment.

These reforms are important. With the arrival of GDPR in 2016, and the tendency for seemingly every piece of new national data privacy legislation to mimic its provisions, the misalignments between PIPEDA and GDPR have become starker and starker, raising questions about Canada’s Data Adequacy rating, especially while the UK’s adequacy application is under such scrutiny (more on this later).

So while there is currently no published timeline for its completion and enforcement, there is clear urgency to protect Canada’s practical status with the EU and its overall data privacy reputation on the global stage.

Some of the key improvements (each of which has clear echoes of GDPR principles, and in some cases, even GDPR language):

 

Consent: CPPA would effectively update the consent thresholds of PIPEDA and CASL from implied to “GDPR-strength” explicit consent (in most scenarios). This is one of the key objectives of CPPA, as informed, affirmative and freely-given consent has become a prerequisite for strong data privacy laws (again, more on this later).

Related to this, legitimate interests for processing that allow consent requirements to be bypassed are also formally defined for the first time in Canadian data privacy law.


Enforcement: The Office of the Privacy Commissioner of Canada (OPC) can currently investigate complaints and make recommendations, but it cannot hand down fines or other punitive measures. The CPPA would give the OPC power to order the immediate cessation of data processing, and to deliver fines of up to 3% of global revenue or C$10 million, rising to 5% of global revenue or C$25 million for the most serious offences. As a reminder, GDPR’s fine framework has a maximum of €20 million (approx. C$31 million), or up to 4% of total global turnover.

right-arrow

Privacy management programs: Canadian organizations will be required to design and maintain privacy management programs that show how the organization will protect personal information, including the creation of policies and procedures, employee training and complaint processes.


Automated decision-making: Similar to GDPR, the CPPA addresses the use of personal data in automated decision-making. CPPA creates a right for Canadians to demand explanations of how automated decisions were made. In contrast, GDPR requires specific consent to be given to allow algorithmic decision-making to take place. This will be a particularly interesting area of discussion as the Bill progresses, as what constitutes ‘transparency’ vs a company’s IP will doubtless be a hot topic.

 

Data portability/mobility: The CPPA will create a right for Canadian individuals to transfer their data from one organization to another.


Right to erasure: Individuals will also be able to require organizations to delete data held on them and to withdraw their consent for its use.

 


Element #56, Consent, becomes #4

You cannot miss consent as an underlying trend in the article above. There is a clear determination within new major data privacy legislation to protect and define the requirement for businesses to obtain individuals' specific, informed and voluntary consent to process their data. On this basis, we felt – as did several of the industry figures with whom we regularly discuss this resource – that Consent's previous position in the Periodic Table undersold its importance.

 

We have therefore moved it up the "Lawful Justifications for processing" column to position 4. It's important to note, however, that our original statement when the Periodic Table was first announced in 2018 remains true: all six of these legal bases are equally valid and powerful. The movement of Consent simply reflects its noteworthiness, not its superiority over the others.

 


Element #115: Brexit

Simply put, it’s still there. While Brexit may be formally concluded, from a data privacy perspective, it is still very much a live issue.

 

To save repetition, and because the issue requires fuller explanation than can be given here, we have created a dedicated resource on the Brexit situation, here.

 


How Brexit impacts your data strategy

The data leader’s guide to preparing their data environment for Brexit + a visual guide that shows how Brexit impacts GDPR obligations


Element #67: Project Management replaced with Infosecurity

In reviewing the Periodic Table with some of our industry peers, we have decided to remove Project Management as a key skill.

This is because privacy should not be considered a “project”. To do so insinuates that privacy is a temporary initiative, with a clear end – which is the antithesis of a correct privacy mindset. While the initial set-up of a privacy programme can absolutely be considered a project, we have collectively agreed that organizations ought to be beyond this starting point now, and instead, privacy should be “live”.

This is, of course, not always true in reality, as many organizations are still regrettably coming to privacy late, but we were nonetheless keen to make sure the Periodic Table did not propagate the myth of privacy being "a quick fix" rather than a culture to live by.

In its place, we have added “Infosecurity expertise”, pulling this out as its own required skillset, distinct from “Technical Knowledge” in Element #65, and essential for accurate, effective, coordinated and resilient privacy implementation.

The post UPDATE 6: The Data Privacy Periodic Table appeared first on Calligo.

The social engineering tactics everyone needs to be aware of https://www.calligo.io/insights/glossary/the-social-engineering-tactics-everyone-needs-to-be-aware-of/ https://www.calligo.io/insights/glossary/the-social-engineering-tactics-everyone-needs-to-be-aware-of/#respond Fri, 02 Oct 2020 08:30:00 +0000 https://www.calligo.io/the-social-engineering-tactics-everyone-needs-to-be-aware-of/ Read about the increase in IT security and social engineering attacks targeting businesses during COVID-19 and how to protect your business against them.

The post The social engineering tactics everyone needs to be aware of appeared first on Calligo.

In recent weeks, we have seen an increase in the number of phishing attempts made to businesses as cybercriminals take advantage of the coronavirus (COVID-19) pandemic. It has become so prolific – and successful – that numerous IT security firms and law enforcement agencies, including the FBI, have released warnings.

The most common attack has been, as always, in the form of an email. Most are preying on users’ concern and thirst for information, as content posing as Coronavirus health advice, educational content or financial relief encourages them to click on links and download/open Word documents and PDFs. If these are clicked on or opened, malware or ransomware infects the device and compromises the network.

Despite the increased deployment of security technology – such as anti-virus, anti-malware, anti-ransomware and anti-SPAM tools – combined with strict processes, 85% of organizations still reported phishing and social engineering attacks in the last 12 months, according to Accenture Security's 2019 Cost of Cybercrime report.

This is because a business's biggest IT security weakness, no matter what controls are in place, is its employees. And during these bizarre times, the threat your workforce poses has never been greater.

Widespread and long-term working from home creates additional security threats that most businesses are unprepared for, making it a perfect hunting ground for phishing attempts:

 Persistent and unavoidable reliance on unsecured home networks
 Likely use of employees’ own devices
 Greater difficulty of verifying email instructions in person
 The difficulty of continuous reinforcement of the security threats
 Natural human susceptibility

It’s a lethal combination.

The secret is to educate your team on how social engineering works, and what to be mindful of – not just in terms of the recent COVID-19 threats, but also more widely.  

Social engineering – What does this mean?

Social engineering is the use of psychological manipulation to convince and trick people into providing confidential and/or personal information. The tactic also involves sending links or documents – in emails, text messages and across social media – that, when clicked on, can infect users' devices or entire networks with malware or ransomware.

Types of Social Engineering:
 
Phishing:

Phishing attempts are one of the most common types of social engineering attacks. This is where cybercriminals use increasingly convincing communications such as an email or SMS message, and make it appear to come from an employee, a supplier, or even a financial institution.

These messages typically require you to click a link to either an infected page or a website impersonating a well-known brand that requests you to "log in" (see typosquatting below). They can also include malicious attachments such as Word, Excel or PDF files and encourage the user to download or open them. Successful attacks often inject malware or ransomware into an organization's network, crippling business operations and finances.

For example, Travelex and Garmin both suffered ransomware attacks earlier this year and are still feeling their impact. The damage would have been minimal had proper IT security practices and processes been in place, along with ongoing employee security awareness training. You can read more about these attacks, plus how to prevent them, here.

SMiShing:

SMiShing uses text messaging or messaging apps such as WhatsApp to encourage users to click on malicious links and give away personal information. Recently there has been a rise in SMiShing attacks spoofing government agencies, healthcare bodies and financial institutions, offering information regarding the COVID-19 pandemic.

However, SMiShing attempts can also look like they have come from utility providers, online retail organizations and payment apps.

 
Whaling:

A whaling attack is a form of phishing designed to look like it has come from a senior member of an organization. It targets high-profile individuals or company executives, and aims to steal sensitive information, gain access to systems or request a financial transaction. It can arrive via email, phone call or text message, and is often referred to as CEO fraud.

 
Vishing:

Vishing is a voice-based phishing attack, often involving someone posing as an executive of the organization, or as a contact from a known partner or supplier, requesting financial payments or information. The caller often sounds angry, irritated or panicked, creating a stressful situation that makes the employee more likely to comply.

  
Baiting:

Baiting often pretends to offer something appealing such as free downloads, or for example, offering free healthcare advice about COVID-19. This is also known as “clickbait”. 

 
Typosquatting:

Typosquatting is when a cybercriminal registers domains with URLs similar to those of well-known organizations, relying on users making typos and errors when typing in the URL. Unfortunately, these fraudulent sites can look so authentic that they request login and payment details, or install malware onto a device simply by the user landing on the page.
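To illustrate why near-miss domains are so easy to overlook by eye, here is a minimal sketch (ours, not a production control; the allowlist of "known good" domains is a placeholder) that flags domains similar to, but not exactly matching, a known brand:

```python
# Sketch of typosquat detection: compare a link's domain against
# known-good domains and treat near-misses as suspicious.

from difflib import SequenceMatcher

KNOWN_GOOD = {"paypal.com", "microsoft.com", "calligo.io"}  # placeholder list

def looks_typosquatted(domain: str, threshold: float = 0.85) -> bool:
    """True if the domain resembles, but is not exactly, a known brand."""
    if domain in KNOWN_GOOD:
        return False  # an exact match is legitimate
    return any(
        SequenceMatcher(None, domain, good).ratio() >= threshold
        for good in KNOWN_GOOD
    )

print(looks_typosquatted("paypa1.com"))  # True  - one character swapped
print(looks_typosquatted("paypal.com"))  # False - exact match
```

Real anti-phishing tools use far richer signals (homoglyphs, domain age, certificates), but the core idea – "almost, but not quite, the real domain" – is exactly what users are being asked to spot by hand.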

 
Social Media:

Social Media is increasingly being used for up-to-date news, providing cybercriminals with a platform to set up fake accounts that promote "clickbait" posts, often masquerading as news, healthcare or financial advice.

Additionally, with more people documenting their personal lives on social media such as Facebook, Instagram and Twitter – and unknowingly giving away personal information in the process – it becomes easy for hackers to use these platforms to find the answers to password-reset security questions, such as the names of people's relatives and pets.

 
How do I protect myself and my business from social engineering?

Here are a few tips on how users can avoid and combat social engineering attacks:

Do not open any links or attachments in emails from untrusted sources.
Be vigilant when opening any attachments, even when the email appears to be from someone you know. If you’re unsure, ask them.
Hover over a URL to verify it before clicking, and check for typos or wrong domains. If you're still unsure, do not click on it!
If an email looks like it’s coming from someone you know but is asking for valuable company information or for financial transactions, usually with urgency, double-check the email address and verify this with a phone call to the sender.
Do not be fooled by “clickbait” offers!
Be wary of social media – how much personal information are you giving away? Don’t be tempted to click on links offering discounts or advice and news.
Ensure you use trusted media outlets and official healthcare websites to look for the latest news, information and advice.
Always use strong passwords or passphrases.
Don’t be afraid to ask questions and report anything that looks suspicious.
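The "hover to verify" advice can also be expressed in code. The sketch below (our illustration; the allowlisted hosts are placeholders) checks the hostname a link actually points to, rather than the text displayed in the message – the same distinction your eye should make when hovering:

```python
# Sketch: trust is decided by the real hostname in the href,
# not by the display text of the link.

from urllib.parse import urlparse

TRUSTED_HOSTS = {"www.calligo.io", "login.microsoftonline.com"}  # placeholder

def link_is_trusted(url: str) -> bool:
    """Check the real hostname of a link against an allowlist."""
    host = urlparse(url).hostname or ""
    return host.lower() in TRUSTED_HOSTS

# The display text may say "microsoft.com", but the href tells another story:
print(link_is_trusted("https://login.micros0ftonline.com/reset"))  # False
print(link_is_trusted("https://www.calligo.io/insights/"))         # True
```

Email gateways apply the same principle at scale; for an individual, the takeaway is simply that the hostname in the status bar, not the blue underlined text, is what decides where you end up.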
How Calligo can help

Calligo’s award-winning IT Managed Services includes IT Security services that address all three pillars of IT security and keep your business continuously protected from all attack types.

Our IT Security Services include:

Strategic security consultancy
Anti-virus, anti-malware, anti-ransomware and anti-SPAM
Security audits
Patch management
Penetration testing
Employee cybersecurity awareness training
Back-up & disaster recovery
Multi-Factor Authentication

Step-by-step guide to Schrems II and Privacy Shield’s invalidation https://www.calligo.io/insights/glossary/step-by-step-guide-to-schrems-ii-and-privacy-shields-invalidation/ https://www.calligo.io/insights/glossary/step-by-step-guide-to-schrems-ii-and-privacy-shields-invalidation/#respond Tue, 21 Jul 2020 09:19:09 +0000 https://www.calligo.io/step-by-step-guide-to-schrems-ii-and-privacy-shields-invalidation/ Our VP of Data Privacy has written a step-by-step guide to Schrems II & Privacy Shield’s invalidation, and what it means for your privacy obligations

The post Step-by-step guide to Schrems II and Privacy Shield’s invalidation appeared first on Calligo.

Data Privacy News: Step-by-step guide to Schrems II and Privacy Shield’s invalidation, and what it means for you

Last Thursday, the Court of Justice of the EU (CJEU), the European Union’s top court, struck down the EU-US data sharing agreement, Privacy Shield, technically known as the EU-US Data Protection Shield.

The case known as Data Protection Commissioner v Facebook Ireland and Maximillian Schrems (also referred to as Schrems II) ruled that the data sharing agreement between the EU and the US, Privacy Shield, is not suitable as it does not provide adequate protection for EU citizens’ personal data when stored in the United States.

Schrems II

“The Court of Justice invalidates Decision 2016/1250 on the adequacy of the protection provided by the EU-US Data Protection Shield”

The above quote is the opening statement of the official press release from the CJEU regarding the case. Whilst the sentence appears simple enough, its ramifications are far more serious, essentially putting thousands of businesses at risk of breaching GDPR.

Privacy Shield was one of the few mechanisms under GDPR by which EU personal data could be transferred to the US. With its immediate invalidation, the more than 5,300 organizations that relied on it must now find a new, safe way to transfer data.

The history behind the Schrems II ruling

June 2013: National Security Agency (NSA) whistleblower Edward Snowden discloses information regarding PRISM, a US government surveillance programme that collected data from some of America's biggest tech companies, including Facebook, Google and Apple.

June 2013: In light of Edward Snowden's disclosures, Max Schrems files a complaint with the Irish Data Protection Commission regarding Safe Harbor, the agreement that preceded Privacy Shield and was used to transfer EU citizens' data to the US. Schrems argues that by collecting his personal data and transferring it to the US for processing, Facebook is exposing him to mass surveillance, which is illegal under the EU's Charter of Fundamental Rights.

June 2014: The Irish High Court refers the case, now known as the "Safe Harbor decision" or "Schrems I", to the CJEU ("Max Schrems v. Data Protection Commissioner").

October 2015: The CJEU rules in Schrems' favour and invalidates Safe Harbor, as it does not offer EU citizens adequate protection of their personal data against mass surveillance programmes in the US.

October–December 2015: With the same motive as in 2013, i.e. resenting the potential exposure of his personal data to mass surveillance, Schrems files a second complaint with the Irish Data Protection Commission regarding the use of EU Standard Contractual Clauses ("Data Protection Commissioner v Facebook Ireland and Maximillian Schrems", later also referred to as "Schrems II").

July 2016: The EU-US Privacy Shield is adopted as a mechanism for EU data transfers to the US, replacing Safe Harbor.

October 2017: The Irish High Court refers the Schrems II case to the CJEU.

May 2018: On 25 May 2018, Europe's new data protection framework, the General Data Protection Regulation (GDPR), comes into force.

July 2019: The first hearing in the Schrems II case takes place at the CJEU.

December 2019: The CJEU Advocate General publishes his opinion on the Schrems II case.

16 July 2020: The CJEU announces its judgement on the case, immediately invalidating Privacy Shield but upholding data transfers via SCCs.

What did they say and why?

The important thing to note is that this decision was not based on business practices within the US, but in fact, on the surveillance and the regulatory climate within the USA.

As the U.S. Chamber of Commerce Executive Vice President and Head of International Affairs states, "…[the case] focuses not on commercial uses of data, but on concerns over potential government access."

There were two main rulings re Privacy Shield:

1. US law enforcement agencies' surveillance is not "limited to what is strictly necessary" – the EU standard. Therefore, any EU personal data transferred to the US under Privacy Shield is additionally – and unacceptably – exposed to surveillance. In fact, the judgement also revealed that, strictly, US law states that surveillance of non-US citizens need only be "as tailored as feasible".
2. Protection of EU citizens' privacy rights in the US is too weak. Neither EU member states, nor the US Ombudsman (set up to help EU citizens make any case), have either the authority or the practical ability to enforce GDPR in the US.

Considering these findings, it hardly comes as a surprise that the result came in as it did.

“In the light of all of the foregoing considerations, it is to be concluded that the Privacy Shield Decision is invalid.”

There was then an additional key ruling on Standard Contractual Clauses:

3. Standard Contractual Clauses remain valid, though with a caveat that both the data "exporter" and "importer" must review whether the destination country offers a level of protection equivalent to that of the EU, and in particular what data access rights the country's authorities may have.

Given the surveillance and regulatory climate of the US, and the judgement also actively encouraging Supervisory Authorities to strike down any SCCs where the guarantees within them are not upheld or capable of being upheld, it is unclear for how long SCCs will survive as a recognised legitimate mechanism.

Unsurprisingly, guidance is expected soon from the EU and Supervisory Authorities; in the meantime, SCCs remain an entirely legitimate data transfer mechanism.

What does this mean in practice?

If your business transfers EU data subjects’ data to the US, you may need to take certain steps to ensure continued compliance with the GDPR.

Circumstances include:

  • US-based organizations receiving data from EU customers
  • Moving data internally within your organisation, for example from EU regional office to US HQ
  • Using US suppliers for EU service delivery
  • …and plenty more

If any of these or similar circumstances apply to you, we have set out below some “what if…?” scenarios to help guide your next steps.

1. What if I am Privacy Shield-certified?

Privacy Shield may be insufficient, but it is still in operation. The US Department of Commerce, which administers the programme, has stated that it will

“continue to administer the Privacy Shield program…[and] today’s decision does not relieve participating organizations of their Privacy Shield obligations.”

Therefore, if you are Privacy Shield-certified, you must maintain this certification unless you formally withdraw from the scheme.

However, on top of this certification, you will now have to implement another mechanism for the lawful transfer of EU Personal Data to the US.

2. What if I only rely on Privacy Shield to transfer personal data from Europe to the US?

The Judgement has determined that Privacy Shield does not offer suitable protections for the transfer of EU Personal Data to the US. This means that you must put in place one of the following mechanisms with immediate effect, and then update your data sharing policies and documentation to reflect the change.

Standard Contractual Clauses

This is likely to be the most common mechanism relied on for transferring personal data to the US.

SCCs are contract articles pre-approved by the EC for use by organisations performing international transfers of EU personal data. They create the necessary obligations – beyond those of typical GDPR compliance clauses found in many supplier contracts – for how the data should be handled by the receiving party (in this case, based in the US).

However, given the uncertainty over SCCs’ future usefulness, this risk ought to be entered on your risk register.

Binding Corporate Rules

Binding Corporate Rules (legal mechanisms that allow multinational companies to transfer EU personal data to entities outside Europe) would likely be suitable for protecting EU Personal Data moving to the US. However, these require Supervisory Authority approval and take months, if not years, to finalise. They are therefore unlikely to be a viable option unless your business already has Binding Corporate Rules in place.

If you are in the process of putting in place Binding Corporate Rules that cover transfers to entities outside Europe, but these are not yet approved, then you will still have to utilise another mechanism pending their approval – most likely, Standard Contractual Clauses.

Derogations

There are limited situations in which transfers of personal data to the US may be permitted without any formal mechanism in place. You should obtain legal advice if you are intending to rely on a derogation, as their application is very limited.

Consent

If you do not believe you will be able to put Standard Contractual Clauses in place and none of the other mechanisms apply, you should obtain the consent of your European data subjects to any transfer of their personal data to the US.

Please note that this consent must still comply with GDPR requirements – i.e. it must be freely given, specific, informed, and unambiguous.

3. What if I already have Standard Contractual Clauses or Binding Corporate Rules in place?

Standard Contractual Clauses and Binding Corporate Rules continue to be recognised as an appropriate safeguard for personal data transfers outside Europe.

Note from our experts on SCCs

Technically, Standard Contractual Clauses only cover transfers from European controllers to non-European processors/controllers. However, the general consensus has historically been that they will not be challenged if used in relation to transfers from European processors to non-European sub-processors/controllers, although that may change with the new judgement.

 

Note too that some Data Processing Agreements may even expressly require the non-European based processor to put Standard Contractual Clauses in place with their non-European sub-processors. You should, nonetheless, get legal advice on whether Standard Contractual Clauses would be enforceable in these circumstances.

Calligo designs continuous safety, privacy, and protection into every business data use, ensuring that every action is legal, ethical, and meaningful.

Find out more about our Data Privacy Services and how our experts in data privacy, data security and technology can build and support your data privacy programme by clicking below, or alternatively, contact the team directly, here.


The 5 Top Questions Around Data Privacy https://www.calligo.io/insights/glossary/the-5-top-questions-around-data-privacy/ https://www.calligo.io/insights/glossary/the-5-top-questions-around-data-privacy/#respond Wed, 11 Mar 2020 15:36:00 +0000 https://www.calligo.io/the-5-top-questions-around-data-privacy/ Data privacy is one of the most discussed topics in business circles. Read our guide to the most commonly asked data privacy and data security questions.

The post The 5 Top Questions Around Data Privacy appeared first on Calligo.

Businesses can no longer ignore the importance of data privacy. In this blog post, we round up the questions our data privacy services team hear most often, combined with data from Google Trends, to reveal the most prevalent concerns and areas of most confusion.

Q1: What is the significance of the CCPA, and when does it come into effect?

The California Consumer Privacy Act (CCPA) is a bill aimed at increasing the privacy rights and consumer protection for residents of California, United States of America. The bill was signed into law on June 28, 2018 and became effective on January 1, 2020.

The aims of the Act include allowing individuals to

  • Find out if, and what, personal data has been collected about them.
  • Find out if their personal data is sold or disclosed to a third party.
  • Find out who their personal data has been sold or disclosed to.
  • Put a stop to the sale of their personal data.
  • Access their personal data.
  • Ask a business to delete any personal information they have about them.
  • Not be discriminated against for upholding their privacy rights.

The significance of the CCPA is threefold.

Firstly, the companies that are in scope are inherently larger ones. The data privacy law affects all companies that serve California residents and meet any of the following criteria:

  • Exceed $25 million in annual revenue.
  • Buy, receive, sell or share the personal data of at least 50,000 consumers, households or devices.
  • Collect more than half their yearly revenues from selling personal data.

It’s important to note that the law applies to any companies that “serve California residents”, meaning that the companies affected can be located anywhere in the world as long as they provide their services in California.
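The scope test above is an "any of the criteria" check: meeting a single threshold is enough to bring a business serving California residents into scope. A minimal sketch (our illustration; the function name is hypothetical):

```python
# Sketch of the CCPA applicability test: a business serving California
# residents is in scope if ANY one of the three criteria is met.

def ccpa_applies(annual_revenue_usd: float,
                 records_held: int,
                 share_of_revenue_from_data_sales: float) -> bool:
    """True if any of the CCPA scope criteria is satisfied."""
    return (
        annual_revenue_usd > 25_000_000          # exceeds $25m revenue
        or records_held >= 50_000                # data on 50,000+ people
        or share_of_revenue_from_data_sales > 0.5  # >50% revenue from data sales
    )

print(ccpa_applies(10_000_000, 60_000, 0.0))  # True - record count alone suffices
print(ccpa_applies(5_000_000, 1_000, 0.1))    # False - no criterion met
```

Note the asymmetry with revenue-only tests: a small business holding a large customer database can fall within scope even while far below the revenue threshold.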

Secondly, the law is significant because of its jurisdiction. Some of the world’s largest companies (Google, Apple, Disney) are based in California, and their handling of sensitive data will now be under intense scrutiny.

Thirdly, it is currently “the nation’s most far-reaching online privacy law and a potential model for other states”, according to the Washington Post. This means that while its impact and enforcement will be closely monitored for other states to follow, the very fact that it will likely create disparate data privacy laws from state to state may accelerate the ongoing conversation about the need for a federal data privacy law to avoid data privacy becoming a blocker to business. If you’d like to find out more about CCPA or how we can help you, click here.

Q2: Who has been fined under GDPR so far?

Since the General Data Protection Regulation (GDPR) came into force on May 25, 2018, businesses across the world serving European citizens have been held to new standards of data handling.

The steep financial penalties possible under GDPR have provided an incentive for companies, big and small, to introduce new policies and infrastructure to ensure their ongoing adherence to GDPR. Many have also taken the decision to contract GDPR-qualified experts and Data Protection Officers to help them navigate this difficult change. However, not all businesses implemented the necessary changes in time, and some have thus faced heavy fines from their respective supervisory authorities. These businesses include:

  • Google – fined €50m for a ‘lack of transparency, inadequate information and lack of valid consent regarding ads personalisation’ according to the French data regulator CNIL.
  • TIM – Telecom Provider – fined €27,802,946 for unlawful data processing and a non-compliant aggressive marketing strategy, among other unlawful data collection processes.
  • Austrian Post – fined €18,000,000 for using customer data, including ages and addresses, to calculate the probability of which political party they might support, before selling this information to third parties.

…but there have also been far smaller fines handed to SMEs, undermining the argument that the Supervisory Authorities are only targeting large corporates. Examples include a €9,000 fine for a Spanish business that was using video surveillance of its employees without consent, a similar fine for a Cypriot government agency that allowed the police access to personal data without sufficient safeguards, and an €18,000 fine for a Swedish school that used facial recognition to monitor attendance but did not provide suitable opt-out processes.

Q3: Who does GDPR apply to?

A common misunderstanding is that GDPR only applies to companies with offices or employees in countries belonging to the European Union. In fact, GDPR is designed to protect EU data subjects from unacceptable uses of their data, whether the company holding that data is based in the EU or not.

The real test is whether a business is offering services to the EU market, or is monitoring an EU data subject's behaviour within the EU. If so, then its activities fall within the scope of GDPR regardless of its geographical location.

"Offering services to the EU market" is admittedly unclear and open to misinterpretation. To help, the European Data Protection Board (EDPB) has provided some example indicators of which territories an organisation is targeting, including:

  • Accepted currencies for payments
  • Languages of marketing materials
  • The locations where services can and cannot be shipped to

Q4: Why is data privacy important?

Data privacy is one of the fastest growing business issues on the planet, encompassing businesses of all shapes and sizes across every industry. Data has never been a more powerful or valuable commodity, and the proper handling of data (consent, notice, and regulatory obligations) is becoming increasingly regulated.

This is because the issue of data privacy has become a highly emotive and sensitive topic for data subjects, as the uses of data become more and more adventurous, personalised and at times, intrusive.

In fact, the importance of data privacy lies, for many, in its morality; keeping private data safe is seen as the 'right thing to do'. Data ethics dictates that individuals should have agency over how their data is used – including how well it is protected, how much is given away, under what circumstances and for how long – much like physical property.

For data-intensive businesses, this has had some dramatic effects on their data regimes and has, in some cases, even restricted their business models, such as curtailing the free use of automation or the collection and exploitation of data for marketing purposes.

Nevertheless, data privacy also brings massive opportunity. If data privacy is done right – or more specifically, if privacy by design is rolled out – then there are significant opportunities that come from a better understanding of the condition, location, source, use, importance and sensitivity of every piece of data.

By making your data well structured, visible and based on firm ethical and regulatory grounding, you can be more confident in your authority to use it and apply it to achieve your goal. The applications of data are endless, and if privacy is implemented by design then the business can leverage it in automation and machine learning experiments that improve marketing, sales and general business operations.

Q5: What is Privacy By Design?

Privacy by Design is a concept designed to guide businesses into becoming more proactive regarding data privacy. Built on seven principles, the concept sets the standards for how data privacy should be built into projects, processes and everyday activities. These seven principles are:

  • Proactively anticipating privacy-invasive events.
  • The maximum degree of privacy should be delivered by default.
  • Privacy should be incorporated from initial designs rather than added retrospectively.
  • Data privacy should not come at the expense of full functionality.
  • IT security across the entire lifecycle, from data collection through to storage and eventual deletion.
  • Transparency at all times: all stakeholders should be informed of how data will be processed, stored and erased.
  • Data subjects should be given every opportunity to uphold their privacy rights.

Privacy by Design is important because it is not simply a framework to aspire to, but rather a necessary guideline for complying with privacy laws such as GDPR and CCPA. Public bodies like the ICO mandate that data privacy be upheld to the highest degree at every stage of a project, or face heavy financial penalties.

By incorporating these seven principles, businesses can ensure that they are treating their data subjects legally, fairly and ethically. Whether you are building a new IT system for storing personal data, developing policies that have privacy implications or looking to share data more actively with third-parties, Privacy by Design ensures that you remain privacy compliant from the very start.
