By Shaundra Watson, Venkatesh Krishnamoorthy, Hadrien Valembois, Tomoko Naoe, Joseph Whitlock, Thomas Boué, Kate Goodloe, Wong Wai San, and Aaron Cooper.
For Data Privacy Week, BSA’s global policy team looks at the top tech issues outside of privacy that could impact privacy professionals around the world and throughout the year. Sharing insights from Brussels, New Delhi, Singapore, Tokyo, and Washington, BSA’s team highlights digital policy areas — from AI to cybersecurity to trade — that privacy professionals should watch in 2023.
- What AI Regulations Should Privacy Professionals Pay Attention to Around the World?
- India Is Among Several Countries Focused on Regulating Non-personal Data. Why Are Governments Interested in Applying Privacy-Style Safeguards to Non-personal Data?
- What Practical Issues Arise When Regulations Attempt to Govern ‘Data’ Broadly?
- How Are Policymakers Focused on Supporting Privacy-Protective Data Transfers?
- What Digital Economy Agreements Should Privacy Professionals Focus on in 2023?
- Privacy and Security Regulations Often Intertwine With Political Goals. How Is This Playing Out in Europe?
- When Governments Need to Access Data in the Course of Criminal Investigations, It Raises a Series of Privacy Concerns. How Are Policymakers Looking at Those Issues?
- Cybersecurity Is a Frequent Focus for Governments in the Asia Pacific Region. How Do Cybersecurity Measures in APAC Overlap With Privacy Regulations? Which Ones Should Privacy Professionals Focus on?
- What’s the Trend to Watch in 2023?
What AI Regulations Should Privacy Professionals Pay Attention to Around the World?
Shaundra Watson: As more companies begin using AI tools, their privacy teams are likely to be at the forefront of understanding related legal obligations and identifying best practices.
AI regulations are getting global attention. The proposed EU AI Act is the most comprehensive effort to regulate AI to date, setting out requirements for certain high-risk AI systems, including pre-market conformity assessments, post-market monitoring requirements, and transparency obligations. Beyond the EU, India’s telecom regulator is developing AI fairness metrics, while South Korea and Brazil have AI bills pending that would require AI systems to respect user privacy.
In the United States, policymakers have been active on the federal, state, and local levels, often in ways that combine AI and privacy. For example, Congress, the New York state legislature, and the Washington, DC Council are considering legislation that implicates both privacy and AI issues. In California, forthcoming privacy regulations will govern access and opt-out rights for automated decision-making, while state privacy laws coming into effect this year in Colorado, Connecticut, and Virginia all create rights to opt out of certain types of automated profiling.
Because many of these laws and regulations are not interoperable, they pose a challenge to developing compliance approaches that satisfy multiple jurisdictions. Another challenge is that the laws themselves may trigger privacy concerns, for instance through creating obligations to share sensitive information with government agencies or other third parties. These privacy-related issues may increase as the AI policy landscape continues to evolve.
― Shaundra Watson is Director, Policy at BSA, based in Washington, DC.
India Is Among Several Countries Focused on Regulating Non-personal Data. Why Are Governments Interested in Applying Privacy-Style Safeguards to Non-personal Data?
Venkatesh Krishnamoorthy: India has been considering privacy legislation for more than a decade. But in a surprise to many, policymakers published a draft Personal Data Protection (PDP) Bill in 2019 that addressed both personal data — the type of data typically covered by privacy and data protection laws worldwide — and “non-personal data,” which appeared to include all data that is not personal. Under the 2019 bill, the government could require companies to provide non-personal data to “enable better targeting of delivery of services or formulation of evidence-based policies.” This might include, for example, a ride-hailing service being required to share data with the government to better target road repairs and traffic rerouting, or a large e-commerce platform being required to share anonymized data with small local companies to help them improve products and services for their communities.
Establishing one regulatory framework to govern both personal and non-personal data creates a range of concerns because those efforts have different policy goals. In proposing non-personal data regulation, policymakers in India have stressed their aim to create economic value from data. But efforts to monetize non-personal data may run contrary to efforts to protect the privacy and security of that data, which are long-standing goals of privacy legislation designed to safeguard personal data. Regulating both types of data through the same measures can conflate those objectives — and fail to achieve either goal.
In December 2022, the Government of India published a new Digital Personal Data Protection Bill that focuses on personal data and does not regulate non-personal data. However, India’s policy landscape is evolving rapidly, and non-personal data issues may resurface in the planned Digital India Act (DIA), which would replace the Information Technology Act of 2000. That creates an opportunity for policymakers to address non-personal data in a manner that focuses on their broader goals, such as unlocking innovation in the digital ecosystem through the promotion of voluntary data-sharing mechanisms.
― Venkatesh Krishnamoorthy is Country Manager ― India at BSA, based in New Delhi.
What Practical Issues Arise When Regulations Attempt to Govern ‘Data’ Broadly?
Hadrien Valembois: Europe’s privacy “big bang” kicked off with the General Data Protection Regulation (GDPR), a legal framework that took effect in 2018 to protect the personal data of individuals in the EU — even beyond EU territory. Since then, the EU has adopted several additional pieces of data-related legislation, including the Regulation on the Free Flow of Non-Personal Data, the Open Data Directive, and the Data Governance Act, among others. These instruments have highlighted the thin line between personal and non-personal data — and the practical issues that arise when legislation attempts to govern data broadly.
The EU Data Act, currently under negotiation, puts a spotlight on this challenge. Aimed at “unlocking the EU data economy,” the draft Data Act is Europe’s general legislative framework for non-personal data. Yet several of its provisions attempt to regulate situations where personal and non-personal data are closely intertwined. A chapter on cloud switching, designed to ensure a customer can “switch” its data from one cloud service provider to another, is a perfect example: in practice, such data includes mixed sets of both personal data (e.g., a customer’s address or phone number) and non-personal data (e.g., inventory data showing the number of items bought from an online store in a month, or information about a bank’s quarterly capital flows). Because the Data Act covers both sets of data, it blurs the line between where the Data Act stops and the GDPR starts — creating uncertainty over which rules take precedence and when.
― Hadrien Valembois is Senior Manager, Policy — EMEA at BSA, based in Brussels.
How Are Policymakers Focused on Supporting Privacy-Protective Data Transfers?
Tomoko Naoe: Japan assumed the G7 presidency this year and is planning to advance the Data Free Flow with Trust (DFFT) concept, which encourages countries to promote data transfers while providing safeguards to build trust. Japan launched DFFT in 2019 as a major international data flow initiative during its G20 presidency that year, and the UK and Germany further advanced it through roadmaps and an action plan during their recent G7 presidencies.
While world leaders rightfully acknowledge that cross-border data transfers offer enormous value in driving innovation, raising employment, and rebuilding economies, there has also been an increasing trend in some countries to impose data localization requirements and restrict cross-border data transfers.
To support policies that help instill digital trust, BSA established the Global Data Alliance (GDA) — a cross-industry coalition of companies committed to high standards of data responsibility. Launched in 2020, the GDA represents a wide range of industries, including advanced manufacturing, aerospace, automotive, financial services, health, supply chain, and telecommunications. The GDA’s Cross-Border Data Policy Principles encourage policymakers to ensure that rules affecting cross-border data transfers do not result in arbitrary or unjustifiable discrimination or disguised trade restrictions. It is particularly important that, where restrictions do apply, an array of data transfer mechanisms supporting accountability models for personal data protection remains available to foster responsible data transfer practices.
Last month, efforts to promote interoperability and transparency among different national systems culminated in the OECD’s adoption of the Declaration on Government Access to Personal Data Held by Private Sector Entities, which recognizes a shared set of principles for trusted government access to personal data. The EU and the United States are also making strides toward implementing a new transatlantic data privacy framework, with the European Commission publishing a draft adequacy determination in December. This year, Japan intends to propose an Institutional Arrangement for Partnership to bring policy experts, academia, and businesses together to collaborate on projects that put DFFT into practice. Together, these efforts demonstrate the significance that governments worldwide are placing on data policies that maximize the benefits of the digital economy.
― Tomoko Naoe is Director, Policy — Japan at BSA, based in Tokyo.
What Digital Economy Agreements Should Privacy Professionals Focus on in 2023?
Joseph Whitlock: Digital economy agreement (DEA) negotiations with a focus on privacy and cross-border data have accelerated following the 2019 Japan-US Digital Trade Agreement. Today, over 100 countries are negotiating DEAs to support digital trust and responsibility while safeguarding the ability to access information and transfer data across borders for the benefit of diverse sectors and communities. This includes negotiations in the World Trade Organization and the Indo-Pacific Economic Framework (IPEF), as well as other DEA negotiations led by Australia, Japan, Singapore, the EU, the UK, and the United States.
Privacy professionals should pay attention to these negotiations for three reasons. First, many DEAs require their signatories to adopt or maintain legal frameworks to protect personal information based on principles found in the OECD Privacy Guidelines, including accountability, limitations on the collection and use of data, and purpose specification. Second, these agreements promote greater integration within regional and global innovation ecosystems, as countries agree to refrain from imposing data restrictions that impede the cross-border exchange of knowledge, technical know-how, scientific research, and other information. Finally, DEAs can help spread the benefits of digital transformation across sectors and communities: 75 percent of the value of cross-border data accrues to companies in sectors such as manufacturing, agriculture, and logistics. SMEs, in particular, benefit from cross-border market opportunities yet lack the resources of global corporations to navigate myriad national data transfer barriers.
In DEA negotiations in 2023, we will see the strengthening of a policy consensus that reflects longstanding norms of international economic law regarding the cross-border movement of goods, services, investment, and data. This consensus — reflected in the GDA Cross-Border Data Policy Principles — reaffirms:
- The freedom to adopt measures necessary to achieve legitimate policy objectives, including privacy;
- The renunciation of discrimination against non-national persons, products, or services;
- The commitment to minimize trade-restrictive effects; and
- Due consideration of principles of compatibility and interoperability with trading partner laws.
Today’s DEAs promote the cross-border adoption of robust, interoperable privacy frameworks, while prohibiting unnecessary or discriminatory digital barriers that impede access to information. Reflecting a growing commitment to digital trust among like-minded democracies, DEAs can and should promote responsible cross-border data policies that champion privacy, cybersecurity, scientific progress, and economic opportunity for all.
― Joseph Whitlock is Director, Policy at BSA and Executive Director of the Global Data Alliance, based in Washington, DC.
Privacy and Security Regulations Often Intertwine With Political Goals. How Is This Playing Out in Europe?
Thomas Boué: In Europe, ongoing efforts to increase cybersecurity levels and privacy protections are being derailed by political considerations that risk undermining both goals.
A political push toward greater protectionism in Europe is currently playing out in a draft European Cybersecurity Certification Scheme for Cloud Services (EUCS). Initially intended as a technical instrument, the latest draft of the EUCS includes several requirements that have little to do with security. One rule, for example, would impose corporate ownership requirements on Cloud Service Providers (CSPs) under which only an EU-headquartered company meeting specific conditions on (foreign) ownership and voting rights could hold “effective control” over the CSP — and here, effective control means the mere “possibility” of influencing decisions, not its actual exercise. Other rules would impose localization requirements for data storage and processing.
The problem is that these so-called sovereignty requirements in the EUCS insert political goals into what should be a purely technical, outcomes-focused document. In the scheme’s current form, most businesses, organizations, and even governments in Europe would no longer be able to procure state-of-the-art tech tools, including cybersecurity services, ultimately hampering their ability to operate and keep their systems secure — never mind compete. The draft rules would also put troves of personal data at risk, as data transfer restrictions and localization requirements increase cyber risk.
Europe must work together with like-minded partners, including industry and governments, to advance its values-based approach to digital policymaking and ensure that cybersecurity and privacy, not politics, remain on top of the agenda.
― Thomas Boué is Director General, Policy — EMEA at BSA, based in Brussels.
When Governments Need to Access Data in the Course of Criminal Investigations, It Raises a Series of Privacy Concerns. How Are Policymakers Looking at Those Issues?
Kate Goodloe: Policymakers are looking at these issues in at least two ways.
First, governments worldwide recognize the importance of ensuring that law enforcement agencies can seek digital information in a manner that respects important safeguards for privacy and data protection.
Within the European Union, we’ve seen recent progress toward adopting a final e-Evidence proposal, which would create a framework for law enforcement authorities in one EU Member State to seek data in another Member State. Globally, the United States has also entered into a series of negotiations under the CLOUD Act, a US law that creates a framework for the United States to enter into agreements with other countries governing access to digital information. Importantly, CLOUD Act agreements can include commitments to the rule of law, due process, privacy, civil liberties, and human rights protections, raising the level of protection reflected in government access standards.
Second, policymakers are also focused on ensuring that government requests for digital information do not remain secret forever. In the United States, the House of Representatives passed a bill that would have reformed gag orders issued under the Electronic Communications Privacy Act of 1986 (ECPA), creating higher thresholds for issuing gag orders and a statutory limit on how long they could last. Although that measure did not become law, the issue will likely confront policymakers again in the new Congress.
― Kate Goodloe is Managing Director at BSA, based in Washington, DC.
Cybersecurity Is a Frequent Focus for Governments in the Asia Pacific Region. How Do Cybersecurity Measures in APAC Overlap With Privacy Regulations? Which Ones Should Privacy Professionals Focus on?
Wong Wai San: In the Asia Pacific region, there are many examples where privacy concepts and rules appear in laws and regulations that are not strictly about privacy but about other issues like cybersecurity.
In 2016, China enacted the Cybersecurity Law (CSL), which includes measures on personal information and restrictions on data transfers out of China. Notably, the CSL conflates personal information with “important” information that may be critical for national security. As a result, agencies more focused on cybersecurity, including the Cyberspace Administration of China (CAC), oversee implementation rather than agencies dedicated to consumer privacy protection. In 2021, China enacted the Personal Information Protection Law (PIPL) — but it too was drafted and is implemented by the CAC and continues to treat personal information through a lens of national security rather than consumer privacy protection.
Vietnam followed China’s example by enacting the Law on Cybersecurity (LOCS) in 2018. Like China’s CSL and PIPL, the LOCS includes personal information protections and requires certain organizations to store personal data in Vietnam. As in China, Vietnam’s approach to personal information protection is heavily influenced by agencies in charge of national security — in this case, the Ministry of Public Security (MPS). And as in China, we expect Vietnam to adopt legislation specifically focused on personal information protection, with those efforts also driven by the MPS.
This pattern of embedding privacy rules in laws and regulations not directly focused on personal information protection occurs throughout APAC and other regions. Privacy professionals must be aware of these overlapping laws — including how they intersect with more straightforward privacy regulations and which rules take precedence — even though these laws may fall outside their traditional areas of responsibility.
― Wong Wai San is Senior Manager, Policy — APAC at BSA, based in Singapore.
What’s the Trend to Watch in 2023?
Aaron Cooper: As BSA’s team from around the world has highlighted, data privacy is increasingly linked to other policy areas and objectives, including cybersecurity, AI, data transfers, law enforcement, and non-personal data. In many cases, concepts familiar to privacy professionals can be exported to these emerging, adjacent areas — such as the use of risk management frameworks or distinctions between companies with different roles in handling data, like the longstanding distinction between controllers and processors in privacy laws worldwide.
The trend to watch in 2023 is whether governments use regulations in one substantive area to address a separate policy objective — or regulate in one area (e.g., cybersecurity) without considering the impact on another (e.g., privacy). These mismatched or siloed approaches often lead to confusion and ultimately do not advance either goal. But when governments take a holistic approach to data policy, clearly defining objectives and using the appropriate risk-based tools to address them, the resulting laws can address the problem of bad actors while maximizing the potential of digital transformation.
― Aaron Cooper is Vice President, Global Policy at BSA.