Data Governance Archives | Calligo
https://www.calligo.io/insights/data-governance/
Building value through data

Navigating the EU’s proposed Artificial Intelligence Act: What Organisations Need to Know
https://www.calligo.io/insights/glossary/eu-proposed-artificial-intelligence-act/
Tue, 12 Mar 2024


The EU AI Act (the “AI Act”) is the world’s first comprehensive AI law. The Act lays down a harmonised legal framework for the development, supply, and use of AI products and services in the EU.  

To whom does the AI Act apply? 

The legal framework will apply to all AI systems impacting people in the EU, regardless of where systems are developed or deployed. 

When will the AI Act take effect? 

The AI Act is currently expected to enter into force in Q2-Q3 2024, with different obligations then taking effect in stages. 

Understanding the AI Act’s Objectives 

The draft AI Act seeks to achieve a set of specific objectives:  

  • Ensuring that AI systems placed on the EU market are safe and respect existing EU law; 
  • Ensuring legal certainty to facilitate investment and innovation in AI; 
  • Enhancing governance and effective enforcement of EU law on fundamental rights and safety requirements applicable to AI systems; and  
  • Facilitating the development of a single market for lawful, safe, and trustworthy AI applications and preventing market fragmentation.  

AI Act: different rules for different risk levels 

The new rules establish obligations for providers and users depending on the level of risk posed by the AI system. All AI systems must be assessed, even though many pose only minimal risk. 

1. Unacceptable risk 

AI systems considered a threat to people fall into the unacceptable-risk category and will be banned. 

They include: 

  • Cognitive behavioural manipulation of people or specific vulnerable groups: for example, voice-activated toys that encourage dangerous behaviour in children. 
  • Social scoring: classifying people based on behaviour, socio-economic status, or personal characteristics. 
  • Biometric identification and categorisation of people. 
  • Real-time and remote biometric identification systems, such as facial recognition. 

Some exceptions may be allowed for law enforcement purposes. “Real-time” remote biometric identification systems will be allowed in a limited number of serious cases, while “post” remote biometric identification systems, where identification occurs after a significant delay, will be allowed to prosecute serious crimes and only after court approval. 

2. High risk 

AI systems that negatively affect safety or fundamental rights will be considered high risk and will be divided into two categories: 

1) AI systems that are used in products falling under the EU’s product safety legislation. This includes toys, aviation, cars, medical devices and lifts. 

2) AI systems falling into specific areas that will have to be registered in an EU database: 

  • Management and operation of critical infrastructure 
  • Education and vocational training 
  • Employment, worker management and access to self-employment 
  • Access to and enjoyment of essential private services and public services and benefits 
  • Law enforcement 
  • Migration, asylum and border control management 
  • Assistance in legal interpretation and application of the law. 

All high-risk AI systems will be assessed before being put on the market and throughout their lifecycle. 

3. General purpose and generative AI 

Generative AI, like ChatGPT, would have to comply with transparency requirements: 

  • Disclosing that the content was generated by AI. 
  • Designing the model to prevent it from generating illegal content. 
  • Publishing summaries of copyrighted data used for training. 

High-impact general-purpose AI models that might pose systemic risk, such as the more advanced AI model GPT-4, would have to undergo thorough evaluations and any serious incidents would have to be reported to the European Commission. 

4. Limited risk 

Limited risk AI systems should comply with minimal transparency requirements that would allow users to make informed decisions. After interacting with the applications, the user can then decide whether they want to continue using it. Users should be made aware when they are interacting with AI. This includes AI systems that generate or manipulate image, audio or video content, for example deepfakes. 
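The four tiers above lend themselves to a simple first-pass triage step in an AI system inventory. The sketch below is purely illustrative: the tier names follow the Act, but the attribute names and the `classify` helper are hypothetical simplifications, and no such triage substitutes for a proper legal assessment.

```python
from dataclasses import dataclass

# Risk tiers defined by the AI Act, from most to least restricted.
UNACCEPTABLE = "unacceptable"   # banned outright
HIGH = "high"                   # conformity assessment + EU database registration
LIMITED = "limited"             # transparency obligations
MINIMAL = "minimal"             # no specific obligations

@dataclass
class AISystem:
    # Hypothetical attributes, for illustration only.
    does_social_scoring: bool = False
    is_safety_component: bool = False      # e.g. toys, medical devices, lifts
    used_in_listed_area: bool = False      # e.g. employment, law enforcement
    interacts_with_people: bool = False    # e.g. chatbots, deepfakes

def classify(system: AISystem) -> str:
    """Rough first-pass triage; not a substitute for legal review."""
    if system.does_social_scoring:
        return UNACCEPTABLE
    if system.is_safety_component or system.used_in_listed_area:
        return HIGH
    if system.interacts_with_people:
        return LIMITED
    return MINIMAL

# A CV-screening tool falls in a listed area (employment), so: high risk.
print(classify(AISystem(used_in_listed_area=True)))  # → high
```

Checking categories in this order matters: a system matching more than one description takes the most restrictive tier that applies.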

Opportunities 

Ethical Leadership: Organisations that prioritise ethical AI practices and demonstrate a commitment to responsible innovation can enhance their reputation and build trust with consumers, employees, and regulators. By aligning with the principles of the AI Act, organisations can position themselves as leaders in ethical AI deployment. 

Innovation and Differentiation: The AI Act promotes regulatory sandboxes and real-world testing, providing opportunities for organisations to innovate and develop AI solutions in a controlled environment. Companies that invest in compliance and develop AI systems that meet the AI Act’s standards can differentiate themselves in the market and gain a competitive edge. 

Market Expansion: Compliance with the AI Act allows organisations to access the European market with confidence, as they demonstrate adherence to regulatory requirements and respect for fundamental human rights. This opens opportunities for expansion and growth in a region that values ethical AI practices. 

Talent Acquisition: Companies that invest in talent acquisition and training to support compliance with the AI Act can attract top-tier professionals with expertise in AI governance, ethics, and regulatory compliance. Building a skilled workforce capable of navigating the complexities of AI regulation is essential for long-term success. 

The AI Act represents a real opportunity for organisations that are looking to leverage the power of AI. However, there are some threats that business leaders also need to consider. 

Threats 

Compliance Costs: The AI Act imposes significant compliance costs on organisations, including overhead expenses related to risk assessments, governance frameworks, and regulatory reporting. Companies that fail to allocate sufficient resources to compliance may face financial strain and operational challenges. 

Fines and Penalties: Non-compliance with the AI Act can result in substantial fines, tiered up to €35 million or 7% of global annual turnover (whichever is higher) for the most serious violations, down to €7.5 million or 1% for lesser breaches. Organisations that neglect the AI Act’s requirements or underestimate the severity of regulatory violations risk severe financial penalties that could impact their bottom line and reputation. 
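For illustration, the penalty structure works as a "whichever is higher" rule at each tier. The figures below reflect the penalty tiers of the Act as commonly reported; the `max_fine` helper itself is hypothetical and ignores refinements such as the lower caps available to SMEs.

```python
# AI Act penalty tiers: (fixed cap in EUR, share of global annual turnover).
# "Whichever is higher" applies within each tier.
TIERS = {
    "prohibited_practice":   (35_000_000, 0.07),  # e.g. social scoring
    "other_obligation":      (15_000_000, 0.03),  # most other breaches
    "incorrect_information": (7_500_000,  0.01),  # misleading regulators
}

def max_fine(tier: str, global_turnover_eur: float) -> float:
    """Maximum fine for a violation tier (illustrative only)."""
    fixed_cap, turnover_share = TIERS[tier]
    return max(fixed_cap, turnover_share * global_turnover_eur)

# A company with EUR 2bn turnover: 7% of turnover exceeds the EUR 35m cap.
print(f"{max_fine('prohibited_practice', 2_000_000_000):,.0f}")  # → 140,000,000
```

The practical consequence: for large companies the turnover percentage, not the fixed cap, determines exposure.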

Operational Disruption: Implementing robust governance and oversight measures to ensure compliance with the AI Act may require operational adjustments and process changes. Organisations that fail to adapt their operations to meet the AI Act’s standards may experience disruption and inefficiencies that hinder productivity and competitiveness. 

Reputational Damage: Violations of the AI Act’s ethical standards or failures to comply with regulatory requirements can lead to reputational damage and loss of consumer trust. Organisations that are perceived as prioritising profit over ethics or disregarding fundamental human rights may face backlash from stakeholders and damage to their brand reputation. 

Conclusion  

While the AI Act presents opportunities for organisations to demonstrate ethical leadership, drive innovation, and access new markets, it also poses significant threats in terms of compliance costs, fines, operational disruption, and reputational damage. By proactively addressing these challenges and investing in compliance, organisations can navigate the regulatory landscape successfully and leverage AI technologies responsibly for long-term growth and sustainability. 

For more comprehensive information on Calligo’s Data Ethics and Governance solutions, visit https://www.calligo.io

For more information on Calligo’s AI solutions, visit https://www.calligo.io

Data Transformation Predictions for 2024 – Calligo Data Leaders Roundtable
https://www.calligo.io/insights/beyond-data-podcast/data-leaders-roundtable-2024-predictions/
Wed, 06 Mar 2024


In this lively debate you will hear from Calligo’s Practice Leads as they discuss their key takeaways from 2023 and their data predictions for 2024 and beyond.

Topics discussed include:

  • Regulation of AI, including the EU AI Act
  • AI hallucinations & AI bias
  • Data governance and data fines
  • Dashboard fatigue
  • Data ROI

Lie Machines – The global fight against misinformation
https://www.calligo.io/insights/data-insights/lie-machines-the-global-fight-against-misinformation/
Wed, 14 Jun 2023


Exorcizing the ghost in the machine

In this latest podcast in our ‘Beyond Data’ series, Tessa Jones (Calligo’s Chief Data Scientist) and Peter Matson (Data Science Practice Lead) talk with Oxford University’s Professor Philip Howard about the threats posed to democracy by technology, specifically in the shape of Lie Machines.

Fact or fiction? Microtargeting with lie machines

In this age of social media, chatbots and AI it’s never been easier for individuals to share their opinions. Instant communication to, and engagement with, a global audience is now commonplace, and it seems there’s no need to let facts get in the way of a good angle. As Mark Twain, or maybe Winston Churchill, or most probably Jonathan Swift famously said, “a lie can travel halfway around the world whilst the truth is still putting on its shoes.” A great example in itself of the ease with which misunderstandings and misappropriations can become canon.

In this vein, Professor Howard has spent years studying the mechanisms by which opinion, behavior and values can be manipulated and misdirected by lie machines:

“Lie machines are large, complex mechanisms made up of people, organizations, and social media algorithms that generate theories to fit a few facts, while leaving you with a crazy conclusion easily undermined by accurate information. By manipulating data and algorithms in the service of a political agenda, the best lie machines generate false explanations that seem to fit the facts.”

– Lie Machines: How to Save Democracy from Troll Armies, Deceitful Robots, Junk News Operations, and Political Operatives

We find lie machines in all types of countries and governing structures. They share common elements – political actors produce the lies, social media firms distribute them, and paid consultants market them. High profile examples of the effectiveness of the lie machine include the UK’s Brexit campaign, and Trump’s electioneering – in both cases patently untrue ‘facts’ and arguments were targeted at key voters by disinformation networks, troll farms and lie machines. Algorithms direct individuals towards ever-more insular sources and extreme content:

 “A healthy, public-facing algorithm might occasionally introduce another credible source…  we know the platforms play around with this stuff, especially during elections in the US”

Controlled by bad actors and forming a global ecosystem of lie development and propagation, these lie machines spread their tendrils across every social media platform, moving out from Facebook as new outlets develop.

Computational propaganda

Lie machines have evolved and finessed themselves as technology advances. Instead of stealing the photos, social media handles and biographies of real people, AI now generates new pictures and personas and thus evades technology platforms’ troll-spotting software.

Spreading propaganda far and wide, with a convincing voice, the lie machine:

  • Has a profound effect on society, with a scale that is difficult to quantify
  • Is perfectly engineered to target human vulnerabilities, reducing critical thinking
  • Deliberately misrepresents and appeals to emotions and prejudices, using our cognitive biases to bypass rational thought and create echo chambers
  • Is vague and unknowable – what training data was used for large language models? (Professor Howard postulates that every Gmail sent over the last 25 years may have been scraped, along with content from junk news sites)

Doing better – where does the onus sit? User or developer?

When it comes to developing processes to combat the lie machine, there’s no one legislation or guiding principle that works. We must always consider the regional and cultural context of both data and users. Research can’t necessarily be amalgamated or directly compared from different regions and countries – for example, we know that the placebo effect is always greater in US medical studies. To date, technology has not always built in cultural nuances in how people use words, with intent and meaning lost in translation – the majority of network takedown orders are for sites that are not in English.

Wherever there is human input, there are behavioral differences that make it much more difficult to apply common rules:

“People who manage cookies are above average in terms of their knowledge of technology, so these people are generally more purposeful in terms of how they set up their news feeds and where they go for information”

The huge amount of disinformation spread around Covid and the resulting vaccination campaign demonstrates how potent the lie machine is. It doesn’t need to convince people its argument is right, all that is required is to introduce enough doubt, to highlight there is a chance of harm. After all:

“If everybody really understood probability, nobody would ever buy a lottery ticket”

Balance the field – breaking the lie machines

Professor Howard believes that whilst we are justified in our concern about the threats to democracy, the principles behind the lie machine can be harnessed for good – promoting topics that are in the public interest and generating democratic discourse:

“I am cynical, but not fatalistic”

He describes the steps we can take to break the lie machines:

  • Public policy oversight, founded in ongoing public data capture and analysis
  • Designing social media to highlight emerging consensus, rather than heated conflict – machine learning can amplify common ground
  • Setting election guidelines to create more opportunities for civic expression
  • Giving journalists, civic groups and researchers access to all the public opinion data that is currently in the hands of the technology firms
  • Ensuring that the big data collected by technology platforms is added to public archives

The answer is more social media, not less. But it needs to serve society much better.

IPIE – bringing down the lie machine

Professor Howard has recently launched a new program, creating an independent scientific body to foster global cooperation in safeguarding the online information environment. The International Panel for the Information Environment (IPIE) will assess the scope of the misinformation crisis, analyze its effects on our societies and the planet itself, and propose solutions. Featuring data scientists and engineers alongside neuroscientists and sociologists, IPIE hopes to be the beginning of a global effort to save our common information environment.

Watch the podcast for yourself below to hear more from Professor Philip Howard about the power of the lie machine, and crucially, to learn how we can use it for the collective good.

Professor Philip Howard is a social scientist with expertise in technology, public policy and international affairs. He is Director of Oxford University’s Programme on Democracy and Technology, a Statutory Professor at Balliol College, and he is affiliated with the Departments of Politics and Sociology. Currently, he is also a Visiting Fellow at the Carr Center for Human Rights at Harvard University’s Kennedy School.

Making complex data available for the benefit of society
https://www.calligo.io/insights/data-insights/making-complex-data-available-for-the-benefit-of-society/
Mon, 15 May 2023


In Calligo’s latest Beyond Data podcast, Tessa Jones (Chief Data Scientist) is joined by Dr Ellie Graeden, Research Professor (Center for Global Health Science and Security) at Georgetown University. Here we explore some of the episode’s highlights:

  • The inherent conflict of private data and the public good
  • Protecting individual rights within federated learning
  • The importance of effective communication and a common language
  • Designing systems and policies that work together
  • Focusing regulation on outcomes, not creating data siloes

At societal level, poor communication costs lives

Transitioning data across and between departments and data systems has historically been fraught with problems – who owns it? Who pays for it? Is it understandable and translatable into meaningful and actionable insights for the end user? 

Having worked extensively in disaster response, Dr Graeden has seen first-hand the potentially life-threatening issues that can arise when government departments’ data platforms produce incompatible outputs:

  • If 20,000 people need water, how many pallets need to be shipped?
  • If 10,000 electricity meters have been knocked out by a hurricane, how many people need feeding?

In such scenarios, identifying individuals amongst population-level data is crucial if the help provided is to be sufficient.

“We have to be able to really effectively move and communicate and share data that are relevant, in ways that they can get used by people all across the system”

Of course, any data system design should ensure privacy and protection for personal data. ‘Big data’ is still relatively new, and as such more powerful and widespread regulatory controls are now being introduced, although the US still does not have consistent requirements for how data should be handled. Fundamentally, meeting a population’s needs today, and planning for them tomorrow, requires the data of individual people to be analysed. Personal data must be shared quickly, effectively and all the while protecting individual rights. Data system design must therefore:

  • Include all players
  • Consider cultural constraints
  • Keep out bias
  • Ensure the right words and phrases are used
  • Focus on the ‘so what’, why does it matter?

“Every single thing we experience can be captured as data”

Even the most mundane moments in our daily lives leave a digital footprint; we shed data everywhere. But when does ‘my’ data become public, or the property of the software developer or the service provider? VR headsets collect ephemeral data that is analysed and applied for that one end user, but if that data is assumed to fall under GDPR, the potential to use it for positive outcomes is severely limited. For example, should authorities be notified if content viewed and generated is illegal or harmful? And what if the device can detect that the user is having a stroke – is that data classified as ‘health’ data? Can it be used to alert the individual to their medical emergency without contravening legislation? What if your mouse clicks can detect the early stages of Parkinson’s? Should you, could you, be told?

“If you’re treating this data as health data, then they have a very different set of regulatory constraints. HIPAA isn’t going to regulate those because it’s not a health care provider or a health insurer”

Piercing the veil

The conflict between personal protection and public good is everywhere, and Dr Graeden believes that some new data laws will create problems for federated learning. Legislation has clear boundaries (speed limits, blood alcohol levels) whereas science deals in spectrums, probabilities and unknowns.

Deleting an individual’s personal data from the model breaks the system, contradicting what regulators are trying to achieve. The solution is to prioritize outcomes, not processes – it doesn’t matter whether you write the rules with a pen and paper, or with AI, as long as you write the rules. Expanding the framework by setting gradients of data availability affords protection for individuals, whilst making data available that informs better decision making for public bodies.

“Data is nothing more, nothing less, than an abstract description of our world. A useful and powerful language that can tell us things that other languages don’t”

Data can no longer exist in siloes if it’s to be useful to society

There is now a healthy global appetite for the discussion around data, thanks in the main to two recent developments:

  • Covid gave us huge amounts of data about mortality levels, vaccination rates, hospitalisation trends – all of which were in the public consciousness every day
  • AI and ChatGPT – articles and debates about the pros and cons are everywhere, discussion is not just in the scientific community

The key challenges now for data scientists are expectation management and communication – we need to be clear about aims and specific about context, as well as knowing what to leave out to avoid overwhelm and misunderstanding. Unfortunately, scientists are not always great communicators (using complex terminology and detail, rather than common parlance and generalization) as Covid demonstrated:

  • Did having a vaccine mean you wouldn’t get sick? Or just less sick?
  • ‘Everyone should wear a mask’ became ‘wear a mask if you can’. This was due to limited supply, but it appeared that the science was not clear

“The scientific approach means you never have an answer… we are trained as scientists to focus on the fact that we don’t know”

In fact, the only answer is that the right data, used consistently and communicated clearly, will always allow us to be prepared, not reactive. To make decisions for the public good that protect every individual.

You can find out more about the common language of privacy in our Rosetta Stone eBook.

You can also watch Tessa’s fascinating podcast with Dr Graeden below.

Why data-ambitious organizations need more than a Chief Data Officer (CDO)
https://www.calligo.io/insights/glossary/why-data-ambitious-organizations-need-more-than-a-chief-data-officer-cdo/
Fri, 04 Feb 2022

The rise of the CDO

The potential value of data – if used optimally – is unquestioned.

In recent years, there has been a clear acceleration in the number of organizations keen to not only better understand their data’s potential, but also govern it more rigorously, structure it more usefully and use it more creatively.

And so, they appoint a Chief Data Officer (CDO) to drive this change.

This person – the business hopes – will “take hold of the data problem”, pulling sources and siloes together to create clarity, drive automation, place data and insights into the hands of the front line, and improve business performance and customer satisfaction.

Discussing Client Ambition

When discussing these ambitions with our clients, the excitement and optimism is clear. But what is often missed, or at best over-simplified, is the need to execute safely.

Managing the security risk to the organization is a fundamental part of a CDO’s remit. Depending on the organizational structure, it is usually shared with or delegated to a dedicated CISO or equivalent.

Similarly, compliance with industry regulations and certifications such as ISO and SOC comes under the governance aspect of the CDO role (again, often shared with or delegated to the CISO).

But what about Data Privacy?

CDOs and data privacy

In the pursuit of these ambitious data goals, while the CDO and/or CISO handle security and compliance, who will manage the privacy-related risks to the organization? And the risk to the data subjects?

  • What data is personally-identifiable, and therefore subject to data privacy laws?
  • Where is this data received from and held?
  • How retrievable is it?
  • How is it used?
  • Will personal data be exposed to machine learning or automated decision-making?
  • When and how is personal data shared?
  • Or disposed of?

In tackling these questions, some organisations believe the CDO can also perform the Data Protection Officer (DPO) role, or have one report into them or the CISO. Others appoint a Chief Privacy Officer, thinking they are the same as a DPO, or a “DPO+”. Others ignore the need for privacy oversight altogether.

None of these answers are wise. Some are even illegal and can result in penalties.

The truth is, most data-ambitious organizations require all three roles. Without them, data safety is jeopardised and the company is at risk of non-compliance, breaches, inefficiency and missed opportunity.

But how the remits are best defined and structured is often a mystery.

Below is a guide to the three pertinent roles – Chief Data Officer (CDO), Chief Privacy Officer (CPO) and Data Protection Officer (DPO) – outlining why each role is essential for every data-ambitious organization, plus their differences, inter-relationships, boundaries and overlaps.

Who you need

The Chief Data Officer (CDO)

Responsible for using data to best effect. The basis of this is data governance – its stewardship, consolidation, structure, management and distribution, but also the security and compliance risk it presents. On top of this lies innovation: how data can be most profitably exploited, whether through automation, analysis or data science.

The Chief Privacy Officer (CPO)

This role sits within the overall CDO responsibility, adding the perspective of privacy compliance to the CDO function, specifically in terms of any action’s risk to the company. As such, the CPO will lead on the construction of the privacy programme, its roll-out and training, and any necessary assessments.

The Data Protection Officer (DPO)

Represents the data subject within the organization. They oversee activities from data processing, assessments and employee training to ensure that none of them conflict with data subjects’ privacy rights, and as such must maintain independence from activities and reporting lines. While perhaps not technically required within your organization (for instance if you are not a public body, do not systematically process personal data as a core activity, or are not processing ‘large volumes’ of sensitive data), it is nonetheless a firmly recommended role for any data-ambitious organization with any degree of use of personal data.

Can these roles be combined into single individuals?

The CDO and CPO can be the same person, and arguably should be, to ensure that the entirety of data safety – security and privacy – forms the foundation of all data use and governance, reducing the risk of accidental non-compliance or painful retrofitting of compliance requirements.

The DPO and CDO (and/or CPO) must never be the same person, as it would create a punishable conflict of interest. They should not even be in the same reporting structure. The DPO’s role is to independently monitor and question all activities, strategic policies and objectives, which means they need the platform to challenge every level of the organization.

The risk of getting this wrong

Risk of unethical / non-compliant data processing

Our data privacy experts have often seen overenthusiasm and ambition innocently lead to personal data being misused. With a focus only on security, and no one overseeing the privacy risk to the data subject (the DPO’s role) or to the business itself (the CPO’s), organizations can easily overstep.

Missed opportunity

DPOs and CPOs are often mistaken for naysayers, as they too often focus on limiting what can be done with data and curtailing the ambition of the CDO. In fact, the best DPOs and CPOs will support the CDO’s objectives, by suggesting innovative approaches to data use that balance ambition with risk.

Delays

If privacy is not a foundation on which data ambitions are built, then it will either be forgotten or retrofitted. The former creates risk of breaches, while the latter creates delays. Projects that lay privacy on top, rather than being designed with it in mind from the outset, risk needing costly redesign and rebuilding.

Conflict of interest

A DPO has to be independent of the day-to-day processes of data management, including its receipt, use, treatment and security. This rules out the job titles classically given this second role, such as CIOs and Heads of Compliance – appointments that regulators are now punishing.

The Chief Data Officer (CDO)

Remit unique to this position:

Data governance

Ranging from data’s structure and architecture to its management and ongoing quality assurance. Accurate and efficient data governance is the foundation stone of all data initiatives. Data siloes, untidy or incomplete data and inconsistent data structures are the principal barriers to data ambitions.

Security-related risk to company

Clearly overlapping with the above, the CDO is required to identify where the ambitions for data’s structure, storage and use will create security and regulatory compliance risk. Working with the CISO – who may be alongside or within the CDO’s team – these risks then need to be mitigated comprehensively, and without obstructing operations.

Innovation / Data Science & Insights

This is the principal reason for the appointment of a CDO: using data creatively to further the aims of the organization as a whole. Building on the groundwork of data governance and security, this may be through automation, analytics, visualizations, machine learning or other forms of AI. Projects may be intended for internal efficiency, or the development of new products and services, but one truth remains at every initiative’s core: using data more intelligently.

The Chief Privacy Officer (CPO)

Remit unique to this position:

Privacy-related risk to company

While the CDO handles the security-related risk, the CPO looks specifically at personally-identifiable data, how well protected it is and how ethically and compliantly it is used. This will include determining which regulations’ scope the organization’s activities fall within, and ensuring that each of the resulting obligations is addressed.

Clearly, this responsibility overlaps with the CDO’s security-related remit, and requires the cooperation of the CISO, as much (though not all) of a privacy-focused risk assessment is based on typical technical and organizational security measures (TOMs). As such, the CPO role may well be part of the CDO’s, if the individual has the relevant privacy skills.

Devise & deploy the privacy programme

This is the tactical implementation of the above. It involves the creation of policies and processes that will protect personal data in every department, by every user and with every data interaction, and specifically on an ongoing basis.

Unlike many other areas of compliance, data privacy requires continuous management and oversight. A breach of ISO compliance requirements on a given day is unlikely to jeopardise completing the next audit’s requirements and maintaining certification. In contrast, a single breach of data privacy requirements could result in customer dissatisfaction, being reported to regulators and potentially fines and irreparable brand damage. As such, the deployment of the privacy programme must ensure continuous protection.

Data Protection Officer

Remit unique to this position:

Privacy-related risk to data subjects

This is the crux of the DPO role. A Data Protection Officer is one of the few senior roles that categorically serves not the interests of the organization but those of third parties – arguably the only one. It is this unusual perspective that requires the DPO to be independent of the mechanics of the organization, and that underpins all other responsibilities.

Oversight

The DPO is responsible for continuously monitoring all data processing activities and independently assessing their adherence to the GDPR and any other relevant legislation. Any faults or risks found are then the responsibility of the CPO and/or CDO to remedy, working alongside any relevant departmental head.

Internal audit

Part of the Oversight role above will include regular internal audits of data processing activities. An initial gap analysis will establish a baseline of compliance, while subsequent periodic audits will show the evolving privacy maturity of the organization, plus any persistent weaknesses.

Liaison with authorities and data subjects

DPOs also act as a conduit for all communications with supervisory authorities and data subjects. They may do this proactively, for example securing approval from authorities on the legitimacy of any new and unusual data processing initiatives. DPOs will also handle the communications with any data subjects in the case of Data Subject Requests.

The Shared Remits

Shared remit: CDO & DPO

Automated decision-making

This is a crucial overlap. For many data-ambitious organizations, especially those in consumer services such as banking, telecoms or utilities, there will be a drive to use automation or machine learning to systematize interactions with customers based on the data held on them as individuals. These decisions may include the pricing and terms offered, which would mean automated decisions are being made with a legal or similarly significant effect – something specifically restricted by Article 22 of the GDPR and by many of the privacy regulations that followed in its footsteps.

This is therefore a classic example of a situation where the CDO and the DPO would have to work together to ensure that the project is legitimately designed and executed, and is highly indicative of why the DPO cannot be the same person as the CDO, or even sit in the same reporting structure. The CDO’s project must be open to objective critique – and perhaps a halt – by an independent DPO.

Shared remit: CDO & CPO

Ethical Data Impact Assessments (EDIAs)

EDIAs are modern supplements to the pre-existing Data Protection Impact Assessment (DPIA), and are effectively documented evidence of the scrutiny required above in instances of Automated Decision-making.

While not specifically required by privacy legislation or guidance – as a DPIA is – the sort of rigour they encompass is. As mentioned above, references are found in the GDPR and many other pursuant regulations. The extra scrutiny is recommended because of the deliberate removal of human oversight from processes, and therefore the risk of the inadvertent removal of understanding, proportionality, fairness and even values.

For a DPIA, a DPO and a CPO (see below) will collaborate on mitigating the risks to data subjects – hence the DPO’s involvement.

An EDIA’s extra considerations beyond a DPIA focus on accountability, transparency, necessity and sustainability. These are more technical, strategic and concerned with personal rights including but also beyond privacy, such as the right to not be discriminated against.

The CDO’s input will therefore cover the technical and strategic sides, while the CPO is best placed to review the technology’s ethical use. In truth, this is not a perfect fit. But there are few alternatives. A DPO’s role is to monitor activity through a strict lens of protecting data subjects’ privacy rights – and arguably their independence means their role can never be to perform assessments, only to review. Legal counsel is concerned with the application of the codified law, not the wider topic of ethics. Compliance roles are similarly used to implement specific rules and standards.

Upholding ethics is different by its nature, and not typically a nominated role within organizations, but a CPO is arguably the closest fit, not least because they lead the completion of DPIAs, on which EDIAs are based.

Shared remit: CPO & DPO

Training employees

This is part of the CPO’s deployment of the overall privacy programme, but requires the involvement of the DPO because of their responsibility for monitoring internal compliance. Acting on behalf of data subjects, the DPO will check the suitability and comprehensiveness of the training programme, in essence confirming that, should the training be satisfactorily completed (the CPO’s responsibility to ensure), data subjects’ rights are protected.

Data Protection Impact Assessments (DPIAs)

These tools identify any potential risks that may arise from processing personal data, allowing the organization to minimise and negate them in advance. They are a key requirement for demonstrating adherence to GDPR and most other privacy regulations, and should be completed for every way in which an organization processes data.

They are the CPO’s responsibility to perform, though as with the Training above, the DPO is required to provide an oversight role to ensure data subjects’ rights are protected. They will advise the CPO on whether a DPIA is necessary in any given situation, how it should be performed, what measures can legitimately be put in place to negate any risks identified, and whether the ultimate decision on whether the processing is permitted is correct.

This process and shared responsibility applies equally to other privacy adherence tools such as Legitimate Interest Assessments (LIAs), where the CPO is responsible for performing the duty, while the DPO ensures their completion and verifies their outcomes.

Data Subject Access Requests (DSARs)

Some of the most common instances of CPOs and DPOs having to collaborate are DSARs. In some industries these are frequent, especially those with high volumes of consumer interaction such as retail, utilities, telecoms and retail banking. A CPO will be responsible for the performance of the DSAR – for example, verifying the identity of the data subject and collecting relevant data – while the DPO will be responsible for overseeing the process, approving the data to be shared, ensuring deadlines are met and handling communications with the data subject.
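As a purely illustrative sketch – not a statement of any organization’s actual process – the deadline side of DSAR handling could be tracked in code. The one-month response window, extendable by two further months for complex requests, comes from GDPR Article 12(3); the month arithmetic below is deliberately naive.

```python
from datetime import date

def add_months(d: date, months: int) -> date:
    # Naive month arithmetic, clamping to the 28th to stay valid in short months
    month_index = d.month - 1 + months
    year = d.year + month_index // 12
    month = month_index % 12 + 1
    return date(year, month, min(d.day, 28))

def dsar_deadline(received: date, complex_request: bool = False) -> date:
    """One month to respond under GDPR Art. 12(3); up to two further months
    where the request is complex (and the data subject must be informed)."""
    return add_months(received, 3 if complex_request else 1)

print(dsar_deadline(date(2024, 3, 15)))        # → 2024-04-15
print(dsar_deadline(date(2024, 3, 15), True))  # → 2024-06-15
```

In practice a DPO would track these dates per request, alongside the approvals and communications described above.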

The Universal Responsibilities

Data Quality

All three Data Officers have a responsibility for – or at least a vested interest in – maintaining the continuous quality of all the organisation’s data.

  • For a CDO, this is of course a principal strategic objective. Better use of data relies on data sources being cleansed for interrogation, and probably integrated under common data models to allow for deeper insights. But without continuous data governance – the process by which data quality is preserved – then interrogation becomes impossible, and integrations fall apart.
  • Data quality requires common rules – defined and upheld ultimately by the CDO – for how data is collected and stored; agreed responsibilities for how it is maintained and kept complete, credible, useful and clean; and a clear vision for how it may be used.
  • The CPO and DPO will also have involvement in this, and vested interests in its performance. How and where the CDO decides to store data will need to adhere to data residency and sovereignty requirements. Data privacy regulations routinely give data subjects a Right to Accuracy, where every reasonable step must be taken to rectify data inaccuracies or erase data if no longer correct. And of course, without complete, clean and credible data, then DSARs cannot be accurately performed, and DPIAs and other typical processes cannot be conducted or verified easily.

DPIAs in fact even include the specific question:

“Are you satisfied that the personal data processed is of good enough quality for the purposes proposed? If not, why not?”

Of course, the easiest way for Data Quality to serve all three Data Officers’ needs is to base the organization’s Data Quality framework on the principles of Privacy by Design & Default.
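To make the idea concrete, a fragment of such data-quality rules could be sketched in code. The record fields, thresholds and rule names below are purely hypothetical – real rules would come from the CDO’s data governance framework.

```python
from datetime import date

def check_completeness(rec: dict):
    """Flag empty or missing values - incomplete data undermines DSARs and DPIAs."""
    missing = [k for k, v in rec.items() if v in (None, "")]
    return f"missing fields: {missing}" if missing else None

def check_freshness(rec: dict, today: date = date(2024, 1, 1), max_age_days: int = 365):
    """Flag records not re-verified recently - supports the Right to Accuracy."""
    age = (today - rec["last_verified"]).days
    return f"not verified for {age} days" if age > max_age_days else None

record = {"email": "a@example.com", "postcode": "", "last_verified": date(2022, 6, 1)}
issues = [i for i in (check_completeness(record), check_freshness(record)) if i]
print(issues)  # → ["missing fields: ['postcode']", 'not verified for 579 days']
```

Each failed rule becomes an item for the responsible data owner to remedy, which is exactly the kind of evidence the DPIA question above asks for.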

Contracts

While the above is a strategic imperative that requires all three Data Officers’ involvement, this is a tactical overlap.

  • Contracts with new suppliers, partners, and potentially customers that inherently involve the processing of personal data create responsibilities for CDOs, DPOs and CPOs alike.
  • A CDO needs to ensure that the contract and the mechanics of the engagement will not undermine or contradict any element of data governance. For example, if the new contract is with a new cloud services provider, can the provider support any ISO, SOC or PCI obligations? If the contract is with a new CRM, is the data structure consistent with any pre-existing common data model and how will data quality and accuracy be maintained? And in all cases, what security measures are in place to protect data from internal and external threats?
  • Meanwhile, a CPO will be concerned with whether the contract is in line with the organization’s privacy obligations. To use the example of the new cloud provider again, will data residency obligations be met? Or for new SaaS platforms, where will data be stored and are the correct cross-border data transfer mechanisms such as Standard Contractual Clauses (SCCs) in place?
  • Finally, a DPO’s role in a contract scenario is to review the legitimacy of the decisions made above, and verify that the privacy of data subjects’ personal data will not be jeopardised – regardless of whether the organization is a controller or a processor in the given scenario.

The Core Lessons

  • All three roles – CDO, CPO, DPO – are probably required in your organization; even where a DPO is not strictly mandated, one is nonetheless advisable.
  • The CDO can also be the CPO, but the DPO must be independent.
  • The CDO defines the strategy and is responsible for the vision of what is to be accomplished with your organization’s data. This will include its structure, security, governance, maintenance and creation of value.
  • The CPO is responsible for ensuring that the implementation of this strategy will not put the organization at any privacy-related risk, and is tasked with mitigating any risk with a defined and well-executed privacy programme.
  • The DPO is the representative of the data subject within the organization, and is primarily responsible for overseeing the activities and ensuring no rights are or could be infringed.
  • The more fundamental or complex the operation (such as data quality or intelligent data use), the more likely it is to require all three roles.
  • Putting privacy – and better yet, total data safety – at the heart of every data initiative and interaction will make it more likely that every role’s agendas are equally met.

The post Why data-ambitious organizations need more than a Chief Data Officer (CDO) appeared first on Calligo.

]]>
https://www.calligo.io/insights/glossary/why-data-ambitious-organizations-need-more-than-a-chief-data-officer-cdo/feed/ 0
Does your DPO have a Conflict of Interest? https://www.calligo.io/insights/glossary/does-your-dpo-have-a-conflict-of-interest/ https://www.calligo.io/insights/glossary/does-your-dpo-have-a-conflict-of-interest/#respond Mon, 20 Dec 2021 15:30:47 +0000 https://www.calligo.io/insights/does-your-dpo-have-a-conflict-of-interest/ What is a DPO? Unlike many other areas of compliance, data privacy adherence is not something that can be audited once and then presumed to continue for the foreseeable future. Data is the most voluminous, mobile, essential and potentially dangerous asset any business owns. It is created, deleted and interacted with constantly, often in new […]

The post Does your DPO have a Conflict of Interest? appeared first on Calligo.

]]>
What is a DPO?

Unlike many other areas of compliance, data privacy adherence is not something that can be audited once and then presumed to continue for the foreseeable future.

Data is the most voluminous, mobile, essential and potentially dangerous asset any business owns. It is created, deleted and interacted with constantly, often in new ways by new individuals.

A point in time audit is simply not suitable for continuous oversight of how data is treated.

It is this unavoidable truth that led the GDPR legislators to require organizations that process the most data, and/or the most sensitive data, to ensure that the interests of the data subject are continually and adequately represented in any and all data processing. Hence, the mandated requirement for the Data Protection Officer (DPO).

Under Article 37, DPOs are a mandated requirement if:

  • You are a public authority or body
  • You are an organisation whose core activities consist of processing operations that require regular and systematic monitoring of data subjects on a large scale (e.g. online behaviour tracking)
  • You engage in the processing of large volumes of special category data, or data related to criminal offences and convictions
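The Article 37 criteria above amount to a short checklist, which could be sketched (purely illustratively – a real determination needs legal advice) as:

```python
from dataclasses import dataclass

@dataclass
class Organisation:
    is_public_authority: bool
    core_activity_is_systematic_monitoring: bool   # e.g. online behaviour tracking at scale
    processes_special_category_data_at_scale: bool # incl. criminal offence/conviction data

def dpo_mandated(org: Organisation) -> bool:
    """Return True if any Article 37 trigger applies (illustrative only)."""
    return (
        org.is_public_authority
        or org.core_activity_is_systematic_monitoring
        or org.processes_special_category_data_at_scale
    )

# Example: a retailer whose core business tracks online behaviour at scale
retailer = Organisation(False, True, False)
print(dpo_mandated(retailer))  # → True
```

Note that even where every flag is False, the article above argues a DPO remains advisable for any data-ambitious organization.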

The DPO’s tasks are outlined in Article 39 of the GDPR as:

  • To inform and advise the business and its employees of their GDPR obligations.
  • To monitor and audit compliance with the GDPR and the business’ data processing policies, including the assignment of responsibilities, awareness-raising and training of staff.
  • To manage data protection impact assessments, and monitor their outcomes.
  • To cooperate with and serve as the contact point for Supervisory Authorities.

Appointing a DPO internally

Many mandated businesses have dutifully appointed their DPO. They have consciously sought to avoid the expense, time and difficulty of hiring a new head, distilled the requirements and responsibilities to their raw essences, and found a person internally who:

  • Understands the way the company ingests and uses data
  • Has the standing and breadth of involvement in the business to appreciate every data workflow
  • Is experienced in the administrative, legalistic and monitoring sides of compliance
  • Is senior and credible enough – as the GDPR requires – to interact with, advise and perhaps argue with the highest levels of the business

This seems suitable. The rights and interests of the data subjects appear to be best protected by a person who has this experience and background, and who can monitor the organization’s activities and ensure their adherence to the rules and the sentiment of GDPR, such as the CIO, CISO, Head of Compliance, Head of Legal, even the CEO.

These organizations seem to be acting in totally good faith. After all, Article 38(6) even allows the DPO role to be a secondary role on top of day-to-day operations.

But they have forgotten an underlying principle of the GDPR: the DPO must be independent.

Expecting someone who also has responsibility for the management, oversight, strategy or security of data and how it is processed – i.e. someone acting for the data controller – to also scrutinise, critique and object to those same processes on behalf of data subjects creates a conflict of interest.

It is like asking students to mark their own homework. As much as they may be obliged to remain impartial, they have their own obligations, objectives and interests that prevent them from being completely and undeniably impartial.

No matter how ethically they may think they act, it represents a compliance failure.

The danger

And regulators are alert to this. Most Supervisory Authorities, including the UK’s Information Commissioner’s Office (ICO), have issued specific guidance on how to avoid conflicts of interest. While this proactive support shows that the SAs intend to help businesses avoid making this error, the flipside is that it also means they will not tolerate failure.

Indeed, fines have started to be handed to firms who overstep, intentionally or otherwise. A prime example is the €50,000 penalty for a Belgian telecoms operator whose DPO was also their Head of Compliance, responsible for the compliance, risk management and audit functions. Dispassionate, independent review of their data protection processes from the data subject’s perspective rather than the business’s was deemed impossible.

Some examples of roles often asked to also take on the DPO role

  • CIOs
    who define the IT strategy, including where data resides, how it is accessed and who by, and on which platforms.
  • CISOs
    who build security strategies that prioritize certain measures or defending against certain cybersecurity threats.
  • COOs and CEOs
    who have responsibility and/or influence over how data is processed, for what purpose and through what tools.
  • Heads of legal
    who balance the interests of the organization against what is permissible or possible under the law.
  • Heads of compliance
    who balance the organization’s needs and operations with the requirements of various regulatory frameworks.
  • Heads of departments
    E.g. marketing and HR, who determine how data is processed within their teams in order to meet their objectives.
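Purely as an illustration of the principle, the incompatibility could be expressed as a simple check – the role list is indicative, not a legal test, and any real assessment must consider reporting lines and actual influence over processing:

```python
# Roles that decide the purposes or means of processing (illustrative list)
CONFLICTING_ROLES = {"CIO", "CISO", "COO", "CEO", "Head of Legal",
                     "Head of Compliance", "Head of Marketing", "Head of HR"}

def dpo_has_conflict(other_roles: set[str], reports_to: str) -> bool:
    """A DPO who also holds a conflicting role - or reports into one -
    cannot independently represent data subjects."""
    return bool(other_roles & CONFLICTING_ROLES) or reports_to in CONFLICTING_ROLES

print(dpo_has_conflict({"Head of Compliance"}, "CEO"))  # → True
print(dpo_has_conflict(set(), "Board (direct)"))        # → False
```

The second case reflects the common recommendation that a DPO report directly to the highest level of management rather than through an operational function.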

The whole point of the DPO is to stand apart from the interests of the business and be the voice of the data subject.

How can any of these roles – all of which put the interests of the business first – be compatible with a second role that expects them to demand the business undertake specific actions to protect the interests of the data subject? Or even to spot the need for additional actions? An external perspective is often key.

Should you outsource your DPO?

A company must appoint a DPO who is free to operate independently. There should be no pressure from management, or risk of insufficient perspective on data-centric processes or strategies that may jeopardize the continuous privacy of personal data.

If you suspect your current internal DPO appointment is putting your GDPR adherence at risk, then you should consider making a change soon.

Reasons for considering outsourcing the DPO role:

  • Guarantees impartiality
    Appointing an external party is specifically permitted under the GDPR, as such a person can avoid conflicts of interest, act dispassionately and often challenge senior management more easily.
  • Greater accuracy
    An external DPO is likely to perform better than an internally-appointed DPO who may be restricted by the working practices of the business or by not wishing to undermine wider objectives.
  • Wider skillsets
    The better tier of outsourced DPO services bring not only legal expertise, but also data security and technology, plus experience across numerous jurisdictions and data privacy frameworks.
  • A show of trust
    It shows data subjects and Supervisory Authorities that you take the privacy of data seriously, and are not willing to take dangerous short cuts to adherence.
  • Faster to appoint
    Some try to hire a dedicated DPO, but find they are in high demand and short supply – some reports suggest one candidate for every ten open roles, with many taking over a year to fill.
  • Significant savings
    Because of how rare suitably qualified people are, they often command a premium salary. Outsourcing the role is far more cost-efficient, and tends to bring wider skillsets.

How Calligo can help

Calligo’s expert and highly-qualified data privacy consultants, who each have a unique mix of legal, technical and infosecurity expertise, are ideally suited to serve as your outsourced Data Protection Officer.

Our DPO as a Service clients range from SME to the largest enterprises, span every sector, multiple geographies and privacy regulations, and process some of the most sensitive categories of data.

Our experts provide ongoing monitoring and audits of the collection and processing of personal data, plus staff training to ensure our clients’ total and ongoing protection. They also represent your organization to both data subjects and Supervisory Authorities.

To find out more about our Data Protection Officer as a Service, click the button below and speak to our expert Data Privacy Consultants.


]]>
https://www.calligo.io/insights/glossary/does-your-dpo-have-a-conflict-of-interest/feed/ 0
Data privacy programmes deliver more than privacy adherence https://www.calligo.io/insights/glossary/data-privacy-programmes-deliver-more-than-privacy-adherence/ https://www.calligo.io/insights/glossary/data-privacy-programmes-deliver-more-than-privacy-adherence/#respond Fri, 21 May 2021 13:05:37 +0000 https://www.calligo.io/data-privacy-programmes-deliver-more-than-privacy-adherence/ Examples of data privacy programmes delivering more than privacy adherence, such as reduced costs, new revenues, greater customer trust & new markets

The post Data privacy programmes deliver more than privacy adherence appeared first on Calligo.

]]>
Reduced costs, new revenue streams, greater customer trust and new markets

The best data privacy programmes are granular.

They assess the root of every data source, the nuances of every data use and the specifics of every way in which data is stored and shared.

From that granular visibility, liabilities can be identified and appropriate remedies put in place that carefully balance the demands of the data subjects with the needs of the business.

Without such an exact approach, any privacy programme is paper-thin. Literally.

Policies and documentation do not make a data privacy programme. It has to be lived.

And this privacy-driven visibility of the entire data environment – every source, dataset, workflow and exit point – does even more for a business. It delivers a host of additional benefits beyond simple avoidance of sanctions, ranging from commercial opportunity to innovation.


Below are four examples of Calligo Data Privacy Services customers who have used their data privacy programmes’ increased visibility of their data to achieve greater commercial benefits.

Customer Trust

One of our Data Privacy customers is a SaaS CRM provider that routinely handles special category personal data. Data safety and responsibility are non-negotiables for this vendor.

The granularity of Calligo’s data privacy programme led to the design and delivery of a Data Privacy by Design and Default initiative that put data privacy at the beginning of every aspect of development, minimizing risk to existing and even emerging regulations, without sacrificing time to market.

This strong privacy posture has given the customer the confidence to not just claim it is a data-responsible provider, but to even differentiate itself among its competition based on its heightened capabilities.

Re-discovered lost revenue

While working with a global fast-food franchisor, we went into the deep detail of every way in which franchisees shared their customers’ data with the overall franchise organisation. In so doing, we created a data workflow map, showing the routes that personal data took and the liabilities that may be created.

These data workflows coincidentally triggered and overlapped with various invoicing processes. In tracing them through, the franchisor discovered broken processes that were costing tens of thousands of dollars in lost invoicing opportunities – revenue that would never have been rediscovered without the data privacy programme.

Shadow IT resolved, reducing risk, inefficiency and costs

We were also able to show that same customer the amount of risk that their teams’ widespread use of Shadow IT was creating. This ranged from unauthorised data-sharing mechanisms to ungoverned SaaS tools for individual departments, most of which were handling personal data. A familiar story for many businesses.

Not only was the customer able to put in place governance that allowed the safe continued use of some of the tools, but they were also able to spot where the use of others was sacrificing short term efficiency for longer-term inefficiency across the wider organisation. The customer acted positively and assessed why these tools were being introduced and what a better approach may be that could support the entire organization’s needs. This resulted in optimized processes and reduced costs.

Access to markets

One of the most common reasons for businesses wishing to formalise their data privacy approaches is so that they can confidently expand their geographical reach. With data privacy protections in place in more than 130 jurisdictions around the world, a robust and adaptable privacy programme is fast becoming a necessity for any ambitious business.

Similarly, many Data Privacy Services customers have used their deep understanding of their data to quickly attain the data security and privacy certifications required to start doing business with new industries. Healthcare, legal and financial services, and many more, all have their own industry-specific requirements, and the capabilities that granular data privacy programmes provide often account for substantial proportions of those frameworks.


]]>
https://www.calligo.io/insights/glossary/data-privacy-programmes-deliver-more-than-privacy-adherence/feed/ 0
How to design Data Safety into your cloud https://www.calligo.io/insights/glossary/how-to-design-data-safety-into-your-cloud/ https://www.calligo.io/insights/glossary/how-to-design-data-safety-into-your-cloud/#respond Fri, 21 May 2021 11:55:48 +0000 https://www.calligo.io/how-to-design-data-safety-into-your-cloud/ What is Data Safety, why is it important, and how do you go about designing into the foundations of your data environment?

The post How to design Data Safety into your cloud appeared first on Calligo.

]]>
What is Data Safety, why is it important, and how do you go about designing it into the foundations of your data environment?

When you see the phrase “Data Safety”, the chances are you think of Data Security. Most people do.

What is far less likely is that you think of the other two pillars of Data Safety: Data Privacy and Data Governance.

Clearly, all three pillars overlap. But Data Security seems to attract the most media attention, the most scrutiny and the most attention among business data leadership. In fact, when you compare the worldwide relative volumes of searches for the three terms, it shows an almost spookily even distribution:

[Chart: Google Trends – worldwide search volumes for “Data Security”, “Data Privacy” and “Data Governance”]

And yet, when you consider the typical data lifecycle, all three pillars have an equally vital role in the protection of data at every single stage.

A simplistic – and by no means exhaustive – example…

Data is created / received
  • Data Security: Threat assessment
  • Data Privacy: Right to Object; Right to Rectification; Authority to receive
  • Data Governance: Suitable administration and custodianship

Data is hosted
  • Data Security: Encryption
  • Data Privacy: Transparent and suitable location
  • Data Governance: Suitable administration and custodianship; Backup and archival

Data is processed
  • Data Security: Appropriate use; Appropriate user
  • Data Privacy: Data subject consent
  • Data Governance: Industry regulations

Data is relocated
  • Data Security: Suitable destination
  • Data Privacy: Transparency with data subject; Data residency
  • Data Governance: Suitable destination

Data is shared
  • Data Security: Appropriate and verified recipient – not a malicious actor
  • Data Privacy: Appropriate and verified recipient – transparency with data subject
  • Data Governance: Appropriate and verified recipient – industry regulations

Data is lost
  • Data Security: Duty to report
  • Data Privacy: Duty to report
  • Data Governance: Backup and disaster recovery

As has been said about Data Security for decades, the only way to ensure robust and continuous Data Safety with every interaction is to design it into the fabric of your data workflows. It is, after all, well known that neither security, privacy nor governance can be applied as an afterthought – they have to be built into a business’s operations from the ground up: every process the data flows through, every person who interacts with it and, yes, every technology on its journey.

And there is no technology more crucial to data’s journey through a business than your cloud environment. Your cloud sets the tone for how your data is treated.

How can Data Safety become part of my cloud DNA?

We asked our Chief Information Security Officer, Mark Herridge, for his guidance on how to make sure that your cloud environment sets the right tone for how your data is treated throughout the business.

Data Safety in your cloud environment

| Principle | Guidance |
| --- | --- |
| Shift ‘Data Safety’ left | Include security, privacy and governance considerations early in the procurement process, rather than adding them in the final stages of development. |
| Own Your Data | All data requires an owner. Assign owners who understand the datasets and the current and potential value they hold for your business, and make them responsible for defining each dataset’s Data Safety requirements. |
| Classify and Tag | Assign a sensitivity hierarchy to all your data, and keep security context with data whenever it moves between systems and services, so that its Data Safety is maintained. |
| Lifecycle | Set a lifecycle that determines when data is no longer needed and can be retired, so that stale data does not linger, unnecessarily increasing your risk profile, consuming cost and potentially skewing decisions. |
| Location and Legislation | Know where all your data is stored, why it is stored there, and which local data protection laws apply. |
| Redefine your architecture | Design your architecture around the benefits the cloud offers. Don’t redeploy the same architecture you use in your legacy environments, as your previous Data Safety measures may be inappropriate to the cloud or outdated. |
| Control Alignment | Check the alignment between your security controls and your cloud provider’s, establish where responsibilities lie, and identify and address any gaps. |
| Monitor and Manage Vendor Risk | Ensure the provider complies with relevant regulations, proactively monitor the service, identify any sub-services the provider uses, and review the provider’s third-party audits. |
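
Several of these principles – data ownership, classification and tagging, lifecycle, and location – can be made concrete as metadata that travels with each dataset. Below is a minimal, hypothetical sketch: the level names, field names and example values are invented for illustration and not taken from any particular platform.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical sensitivity hierarchy, strictest first.
LEVELS = ["restricted", "confidential", "internal", "public"]

@dataclass
class DatasetRecord:
    """Data Safety metadata that should travel with a dataset between systems."""
    name: str
    owner: str              # Own Your Data: every dataset has an accountable owner
    classification: str     # Classify and Tag: one of LEVELS
    residency: str          # Location and Legislation: where the data is stored
    created: date
    retention_days: int     # Lifecycle: when the data can be retired

    def is_stale(self, today: date) -> bool:
        """True once the retention window has elapsed and the data should be retired."""
        return today > self.created + timedelta(days=self.retention_days)

def merge_classification(a: str, b: str) -> str:
    """When datasets are combined or moved together, the stricter label wins."""
    return a if LEVELS.index(a) <= LEVELS.index(b) else b

record = DatasetRecord(
    name="crm-contacts",
    owner="sales-ops@example.com",
    classification="confidential",
    residency="eu-west-1",
    created=date(2021, 1, 1),
    retention_days=365,
)
print(record.is_stale(date(2022, 6, 1)))               # True – past retention
print(merge_classification("internal", "restricted"))  # restricted
```

In practice these fields would live in a data catalogue and be enforced by policy, but even a registry this simple makes ownership, retention and residency auditable.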

“Data safety really does entail security, privacy and governance. They go hand-in-hand, you can’t focus fully on one, and not the others – they are both supportive of and reliant on each other.”

Mark Herridge
Chief Information Security Officer, Calligo

The two key takeaways are simple: Data Safety must not be treated as synonymous with Data Security, and the entirety of Data Safety must be written into the fundamentals not only of your cloud environment’s design, but also how data is interacted with from it.

Find out more about Data Safety and the commercial benefits it can deliver to your organization.

The post How to design Data Safety into your cloud appeared first on Calligo.

]]>
https://www.calligo.io/insights/glossary/how-to-design-data-safety-into-your-cloud/feed/ 0
Data Privacy Update: Virginia Consumer Data Protection Act (VCDPA) https://www.calligo.io/insights/glossary/data-privacy-update-virginia-consumer-data-protection-act-vcdpa/ https://www.calligo.io/insights/glossary/data-privacy-update-virginia-consumer-data-protection-act-vcdpa/#respond Mon, 12 Apr 2021 14:33:42 +0000 https://www.calligo.io/data-privacy-update-virginia-consumer-data-protection-act-vcdpa/ Virginia passed its own privacy law - Virginia Consumer Data Protection Act (VCDPA) - giving consumers more control over the use of their data. Learn more

The post Data Privacy Update: Virginia Consumer Data Protection Act (VCDPA) appeared first on Calligo.

]]>
And so it continues. Last month, Virginia passed its own privacy law, the Virginia Consumer Data Protection Act (VCDPA), adding fuel to the fire over a US federal privacy law, and introducing new complexities for businesses operating in or addressing the US market.

It will take effect on January 1, 2023 (the same day as California’s CPRA, which amends the current CCPA) and was passed in record-breaking time: less than two months, and by an overwhelming majority.

Such was its speed and simplicity that bills in many other states – including Colorado, Connecticut and Minnesota – are actively mimicking some of its provisions.

Theoretically, this active copycatting will limit the ongoing differences between state laws, but this of course remains to be seen.

So what are the similarities and the differences that you need to be aware of?

It’s best we focus only on what has actually been passed: CCPA (and where necessary, CPRA), GDPR and of course now Virginia’s VCDPA.

The next likely additions will be New York State, though it is some way off, or Washington State, though its bill seems engulfed in controversy and lacking big-tech backing, amid fears of opening the floodgates to class action lawsuits. But much may change in their provisions between now and their implementation.

Observations

In short, CCPA, VCDPA and GDPR all overlap, but in different ways.

Any two of the three have substantial differences. But taken as a group, the areas of overlap – both philosophical and practical – are increasing.

For instance, more and more core rights and requirements are reappearing:

  • Universal rights:
    • Right to Access;
    • Right to Rectification;
    • Right to Deletion;
    • Right to Data Portability;
    • Right to Object to Data Processing;
  • Privacy Notices explaining what PII is collected, what is done with it & why
  • Appropriate Security Measures
  • Concept of Special Category data – although definitions vary
  • Controller / Processor concepts (if not the exact same name) and requirements for binding contracts between them
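
Because these core rights recur across the frameworks, a single intake process can route requests to common handlers, with framework-specific variations layered on top. The sketch below is purely illustrative – the handler names and response strings are invented for this example, not drawn from any law or library.

```python
# Hypothetical handlers – names and behaviour are illustrative only.
def handle_access(subject_id: str) -> str:
    return f"export data for {subject_id}"

def handle_rectification(subject_id: str) -> str:
    return f"open correction case for {subject_id}"

def handle_deletion(subject_id: str) -> str:
    return f"schedule erasure for {subject_id}"

def handle_portability(subject_id: str) -> str:
    return f"package machine-readable copy for {subject_id}"

def handle_objection(subject_id: str) -> str:
    return f"halt processing for {subject_id}"

# One dispatch table can serve GDPR, CCPA and VCDPA requests alike,
# because the core rights overlap; jurisdiction-specific rules
# (deadlines, identity verification, exemptions) would wrap these handlers.
RIGHTS = {
    "access": handle_access,
    "rectification": handle_rectification,
    "deletion": handle_deletion,
    "portability": handle_portability,
    "objection": handle_objection,
}

def handle_request(right: str, subject_id: str) -> str:
    if right not in RIGHTS:
        raise ValueError(f"unsupported right: {right}")
    return RIGHTS[right](subject_id)

print(handle_request("deletion", "subject-42"))  # schedule erasure for subject-42
```

The design choice here is the point made above: build once for the common ground, then vary per jurisdiction only where the frameworks genuinely differ.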

But it is the differences that create confusion and difficulty.

What does this tell us?

Businesses have to date focused mainly or even solely on GDPR adherence, even if their activities bring them into the scope of CCPA, VCDPA or even other international laws.

As can be seen, this is not altogether a bad thing – the overlaps in the core principles and the nuances of the differences mean that:

A. Focusing on GDPR means the core universal rights and basic measures and requirements of most other legal frameworks will be addressed.

B. By pursuing solid, continuous and genuine GDPR adherence, there is tangible evidence of consumers’ rights being considered and respected, which goes a long way with authorities when the nuanced differences of other frameworks are not fully met.

However, there is still substantial risk in not appreciating local responsibilities. Local regulators exist to protect their own consumers and their own local rights, so while “honest efforts” will likely be accepted as an excuse in the early days, leniency will not last forever.

Businesses must start taking an intelligent approach to their liabilities, building a global privacy program that identifies the common ground across all relevant frameworks, and also introduces variations in data handling processes, internal and external policies and even company-wide strategy as soon as borders are crossed in any way.

It may be a lot to ask, but the granular visibility of data workflows and interactions that such a programme requires can deliver significant benefits, including greater brand trust, closed security gaps and even process efficiency.

And until the fabled federal law arrives – which it surely must – it is utterly necessary. After all, more states are coming with their own laws that, while sharing plenty of similarities, will inevitably bring more individuality: Florida, Colorado, New York, Connecticut, Washington, Oklahoma, Ohio and Minnesota.

For more commentary on the future of data privacy, take a look at the Periodic Table of Data Privacy: an industry-renowned project that seeks to keep privacy professionals and business leaders up to date and informed on the practical application of data privacy.


The post Data Privacy Update: Virginia Consumer Data Protection Act (VCDPA) appeared first on Calligo.

]]>
https://www.calligo.io/insights/glossary/data-privacy-update-virginia-consumer-data-protection-act-vcdpa/feed/ 0
The Data Privacy ‘To Do List’ for the new US administration https://www.calligo.io/insights/glossary/the-data-privacy-to-do-list-for-the-new-us-administration/ https://www.calligo.io/insights/glossary/the-data-privacy-to-do-list-for-the-new-us-administration/#respond Wed, 20 Jan 2021 11:25:54 +0000 https://www.calligo.io/the-data-privacy-to-do-list-for-the-new-us-administration/ If privacy were to be on the agenda for the new US administration, what would be the most beneficial data privacy initiatives that ought to be introduced?

The post The Data Privacy ‘To Do List’ for the new US administration appeared first on Calligo.

]]>

A new administration in the most influential economy in the world triggers new hopes and expectations in every industry. But if major change were to be on the agenda, what would be the most beneficial, transformative, impactful or prudent new data privacy initiatives that the new US administration ought to introduce?

A federal privacy law

The obvious – and trickiest – first area for the new administration is a federal privacy law.

It is a conversation that appears every time a state introduces its own bill or law, as their frequent arrival only highlights the absence of anything federal.

The good news is that the various laws and bills in progress at the moment have largely the same motivations and aim to safeguard the same core rights of the individual.

The bad news is that their construction and provisions overlap by only roughly 95%. Why is this bad news? Because the remaining 5% of difference is lethal.

5% of difference multiplied across dozens of states’ own laws has the potential to create gargantuan complexity. Frankly, untenable complexity.

Europe recognised this danger. Pre-Brexit, its 28 member states – despite a long history of, and reliance on, co-operation – each had their own un-co-ordinated data privacy legal framework, and chaos ensued. GDPR was essential to solve a real and present problem.

The US should learn from this. It has, after all, its own longstanding experience of how state-by-state commerce rules can create difficulties and additional expense, and law-making and enforcement are notoriously tricky. Imagine what happens with data flows, which have a foot in both the business and legal camps. And yet the US is seemingly walking willingly into a bizarre situation, hopelessly applying an “it’s OK, we’ve crossed the state lines” attitude to technology and data.

But aside from the apparent madness of the situation, what could be the practical consequences of the new administration not focusing on a federal privacy law?

  • Innovation and development would be inhibited, as it simply becomes too hard to be ambitious.
  • Data subjects would lose faith in their data privacy rights and their ability to hold organizations to account, as those rights would be practically impossible to keep track of.
  • The confusion and practical difficulty of multiple state laws would likely lead to mass non-adherence, whether deliberate or otherwise, undermining not only the individual laws but also the principles of data privacy as a whole.
Filling the gap left by Privacy Shield

When Privacy Shield was struck down in July 2020, the ruling gave no indication of what businesses that had, in good faith, been accredited under it ought to do instead – especially as Standard Contractual Clauses were also called into question.

For more background on Privacy Shield and the “Schrems II” case that struck it down, visit our step-by-step guide here

The ruling focused instead on US surveillance practices and their incompatibility with the EU’s data privacy requirements, and the toothlessness of the US Ombudsman to enforce EU data subjects’ rights in the US. Both are important areas in need of redress.

But with the status quo deemed unsuitable, six months on businesses are still unclear how to build a framework through which they can legitimately transfer EU data into the US without relying on SCCs, which have themselves already suffered warning shots. After all, SCCs are recognised as not covering all scenarios, and many US companies are simply incapable of adopting the extra measures that the EDPB requires.

Of course, offering business advice is not the court’s job. But the previous administration did not pick up the ball and announce even a pathway to a solution, so it must fall to the new administration to do so – and fast.

UPDATE: On the first day of the new administration, Christopher Hoff was appointed as Deputy Assistant Secretary for Services at the Department of Commerce. Hoff’s key role is to oversee discussions with the European Commission on a new framework to protect transfers of personal information between Europe and the US. Hoff is a seasoned privacy professional, having chaired APEC’s Cross Border Privacy Rules Panel and served as Chief Privacy Officer at Huron. By placing a privacy expert in a broad international trade role, the new administration is signalling a reassuring appreciation of privacy’s centrality to international trade.
Catch up with technology

Data privacy almost universally categorises all personal data as the same: any data that can be used to identify an individual is deemed personal data.

There may be varying levels of sensitivity that bring with them their own protection requirement, but fundamentally, all personal data is to be treated the same.

But while this casts a wide net of protection – which is good in many ways – in practice it does not work.

Location data, for example, is a broad category. Is the data from your mobile phone – which can record your whereabouts, typical journeys, patterns and deviations – as sensitive as the data your smart vacuum cleaner processes? Technically, they are the same category of data, carry the same level of sensitivity and require the same degree of protection and oversight.

This creates excessive regulatory burdens and hampers innovation.

But it is easily solved: recognise that the sensitivity of personal data can be classified not only by its technical category but also by its potency.
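
One way to picture category-plus-potency classification is as a simple additive score. This is a hypothetical sketch: the categories, weights and values are invented for illustration and are not drawn from any law or regulatory guidance.

```python
# Hypothetical base sensitivity per technical category of personal data.
CATEGORY_BASE = {"contact": 1, "location": 2, "health": 3}

# Potency: how much real-world exposure this particular source creates.
POTENCY = {"low": 0, "medium": 1, "high": 2}

def sensitivity(category: str, potency: str) -> int:
    """Effective sensitivity = technical category plus source potency."""
    return CATEGORY_BASE[category] + POTENCY[potency]

# Same technical category, very different real-world exposure:
phone_gps = sensitivity("location", "high")  # a phone tracks journeys and habits
vacuum_map = sensitivity("location", "low")  # a vacuum maps one floor plan
print(phone_gps, vacuum_map)  # 4 2
```

Under a scheme like this, the phone’s location trail would attract stricter obligations than the vacuum’s floor plan, even though both are “location data” in the technical sense.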

The US administration – as the government whose states are creating and debating the most privacy laws, and that oversees some of the largest technology organizations in the world – has an opportunity to address the proliferation of data-hungry organizations, control their appetites, while also appreciating the true variety of personal data beyond simple technical classifications.

To do so would not only earn the simultaneous approval of ‘big tech’ and small innovators, but also legislators, policymakers and privacy professionals who labour under this absurdity every day. And it would lay the foundations for the most modern and up to date privacy framework in the world.

For more commentary on the future of data privacy, take a look at the Periodic Table of Data Privacy: an industry-renowned project that seeks to keep privacy professionals and business leaders up to date and informed on the practical application of data privacy.

The post The Data Privacy ‘To Do List’ for the new US administration appeared first on Calligo.

]]>
https://www.calligo.io/insights/glossary/the-data-privacy-to-do-list-for-the-new-us-administration/feed/ 0