Data Privacy and Cybersecurity: Crisis Avoidance and Management Strategies
Lawmakers’ and regulators’ focus on data privacy and cybersecurity continues to intensify across the globe. In Latin America, penalties and sanctions for non-compliance with the Brazilian General Data Protection Law (LGPD) became enforceable in August 2021, Panama’s new Law No. 81 came into effect in 2021, and draft laws pending in Argentina, Bolivia and Chile are further examples of this trend.
Even as companies contend with new challenges from remote workforces and the pressures of maintaining business as usual during the challenging conditions of the covid-19 pandemic, they face increasingly stringent legal obligations to carefully handle corporate data. With regard to personal data, companies must comply with privacy protections in a broad range of areas such as initial data collection; daily data usage in the ordinary course; the transfer of personal data to vendors, acquirers or other third parties; and the sending of personal data to other countries.
These data privacy requirements go hand in hand with escalating requirements on data security. Under the laws of many jurisdictions, corporate cybersecurity programmes must now be at a certain level of substantive adequacy – often defined as ‘reasonable’ or ‘appropriate’ security. These mandates generally apply both to personal data and to other important corporate data, such as intellectual property and financial information. Companies are also increasingly required to disclose breaches of data security to regulators, affected individuals, counterparties and others.
As the way business is conducted changes in response to the covid-19 pandemic, the cyber risks companies face – and what would be considered reasonable security to protect against those risks – are also evolving. For example, what was reasonable when many sensitive meetings were face-to-face may not be reasonable now that almost all communication is remote. Companies’ increasing reliance on work-from-home technology also creates new opportunities for hackers, who have found that remote workers often make easier targets for phishing.
Importantly, data privacy and cybersecurity are not just legal issues. They are crucial to the trust between a company and its customers and other stakeholders. People want to know that the companies they do business with, work for or invest in will handle data with care. Missteps in privacy and cybersecurity, therefore, can create a crisis with the potential to cut deeply into a company’s reputation and balance sheet.
A prominent example of the long-lasting impacts of a breach is Equifax, one of the leading credit reporting agencies. In 2017, when Equifax disclosed a data breach affecting the personal data of nearly 150 million Americans, USA Today reported on the incident under the headline, ‘Equifax image is battered by data breach as consumers feel violated’. Years later, the financial cost of the breach has continued to climb, and Equifax has spent nearly US$2 billion to resolve dozens of government investigations and private lawsuits.
We discuss below some of the key legal requirements that apply around the globe, starting with a focus on Latin America, and strategies for reducing legal and reputational risks related to data management. Because the applicability of many data protection laws depends on where the data subject lives and not necessarily where the company collecting or using the data is located, a broad understanding of global laws is valuable. Many best practices can help mitigate the risks that may materialise into a crisis, but the bottom line is simple: prepare, prepare, prepare. Bad data events do happen to good companies. It is best to assume that such bad events, in time, will happen to yours. Companies are thus well advised to be ready to respond vigorously and transparently, with a focus on maintaining that all-important trust.
Argentina
Key privacy and cybersecurity laws
Argentina enacted the Personal Data Protection Law Number 25,326 (PDPL) in October 2000. Since 2003, Argentina has been recognised by the European Commission as a jurisdiction providing an adequate level of data protection.
In May 2019, the Agency of Access to Public Information issued Resolution 4/2019 setting out guidelines for the interpretation and application of data protection law in Argentina. The resolution provides guidance on consent (including consent of minors), automated data processing, data dissociation, right of access to personal data collected through surveillance and biometric data. Prior to the 2019 presidential election, the Agency of Access to Public Information also issued guidance, by way of Resolution 86/2019, to confirm that political opinions are considered sensitive personal information.
Key obligations of companies
Companies processing personal data must register their database or other data storage system with the Argentine Personal Data Protection Agency. Personal data cannot be processed beyond the purpose for which it was collected. Companies are obligated to ensure the accuracy of the personal data they process. Prior to processing personal data, companies must provide notice to and obtain consent from data subjects. The PDPL also requires companies to enact measures to guarantee the security and confidentiality of personal data that they hold and process.
Key rights of data subjects
Data subjects in Argentina have the right to request information from data controllers and receive access to certain of their personal information. Data subjects can also request the correction, modification or suppression of personal information stored by data controllers.
There is not currently a breach notification obligation in Argentina.
Transfer of personal data requires the consent of the data subject and is prohibited unless the receiving country provides an adequate level of protection. In 2018, the Agency of Access to Public Information promulgated Provision 159/2018, the Guidelines and Basic Contents of Binding Corporate Rules for International Data Transfers. Similar to the use of such rules for data transfers out of the European Union, a company’s adoption of these model rules allows for the transfer of personal data from Argentina to a country that Argentina deems not to have an adequate level of protection.
Brazil
Key privacy and cybersecurity laws
In July 2019, Brazil amended its data protection law, the LGPD, which was originally passed in August 2018. After several delays, the LGPD’s effective date was widely expected to be postponed yet again. But in a surprise move on 27 August 2020, the Brazilian Senate officially declined to further postpone the law, and the main provisions of the LGPD took effect on 18 September 2020, when the Brazilian President signed Conversion Bill 34/2020.
The LGPD was inspired by the European Union’s General Data Protection Regulation (GDPR). The GDPR increasingly serves as a global model for data protection legislation. While the LGPD is not as extensive as the GDPR, it shares many similarities. The LGPD applies to all processing of personal data by private entities if the data is collected or processed in Brazil, or if the processing is for the purpose of offering or providing goods or services in Brazil. As amended, the law created the National Data Protection Authority (ANPD), which will be responsible for overseeing personal data protection compliance and implementing and enforcing sanctions. On 26 August 2020, the President published Decree No. 10,464, which will establish the ANPD once its executive director is appointed.
Administrative sanctions under the LGPD came into effect in August 2021. Companies operating in Brazil, or that collect personal data from Brazilian data subjects, should therefore review their compliance with the LGPD, including by reviewing privacy policies, implementing security measures, updating procedures (including breach notification procedures) and identifying agreements that involve cross-border transfers from Brazil. Brazilian litigants wasted no time in making use of the law: the first civil action against a Brazilian company for alleged LGPD violations was announced only four days after the law took effect.
Key obligations of companies
The LGPD establishes 10 principles applicable to all data processing in Brazil, key among them that all processing must be ‘for legitimate, specific and explicit purposes of which the data subject is informed’. Other key principles include limiting processing to the minimum necessary, free access and transparency to data subjects, and an obligation to ensure accuracy of data. Companies are also required to establish security measures to protect personal data and to appoint a data protection officer.
Key rights of data subjects
The new law vests the data subject with the ownership rights to the subject’s personal data, and grants the subject the right to obtain access to and correction of personal data and to revoke consent to process his or her personal data.
The LGPD creates a data breach notification obligation. Companies must notify both the Brazilian authorities and data subjects of any ‘security incident that may create risk or relevant damage to the data subjects’. This notification must be completed within a reasonable period and contain a description of the incident, the information involved, the measures taken to protect the data, the risks related to the incident and the measures taken to mitigate the effects.
The LGPD prohibits the cross-border transfer of personal data unless such transfer falls within a limited number of enumerated exceptions. Exceptions include where the receiving country or organisation provides a level of data protection comparable to the LGPD or the data subject has provided specific consent for the transfer ‘distinct from other purposes’.
Chile
Key privacy and cybersecurity laws
Data privacy and cybersecurity in Chile are regulated by the Law for the Protection of Private Life (PDPL) of 1999.
Key obligations of companies
Companies are required to provide notice to and receive consent from data subjects prior to processing their personal information, unless otherwise permitted by law. Personal data can be used only for the purpose for which it was collected.
Key rights of data subjects
Data subjects have the right to object to a company’s use of their personal data, and to request the modification or deletion of that data.
There is not currently a general breach notification obligation in Chile. Financial institutions regulated by the Superintendence of Banks and Financial Institutions (SBIF) do have regulatory obligations – updated as recently as August 2018 – that require reporting any incident that affects business continuity, the entity’s funds or other resources, the quality of the entity’s services, or the image of the entity. The SBIF has stated that it expects these reports to be made within 30 minutes – an extraordinarily short window during a high-pressure situation.
Under certain circumstances, where an incident affects the continuity of client services or the security of clients’ personal data, the affected institution may also be required to report the incident to its clients. Client notifications must be made in a timely manner; there is no fixed deadline.
There are no regulations on the transfer of data within Chile or across borders.
Colombia
Key privacy and cybersecurity laws
Colombia enacted Statutory Law No. 1581, which regulates data privacy and security, in 2012. The Law applies to personal data processed in Colombia or where a foreign processor is subject to Colombian legislation. The Law establishes eight principles for interpretation and application:
- legality of data processing;
- legitimate purpose for processing;
- freedom for data subjects to control their personal data;
- accuracy of data;
- transparency in processing;
- limitation of access to those with authorisation;
- security of personal data; and
- confidentiality of personal data.
The Ministry of Trade, Industry and Tourism has enacted regulations pursuant to the law.
Key obligations of companies
Companies must provide notice to and obtain consent from the data subject prior to or simultaneously with the collection of his or her personal data, except where the data is publicly accessible. Before processing, companies must develop privacy policies available to data subjects, which must inform data subjects of their rights under the law. At the request of the Superintendence of Industry and Trade, companies must be able to demonstrate that they have implemented appropriate and effective measures to comply with the law.
Key rights of data subjects
Data subjects have the right to access, at no charge, their personal data held by data controllers. Data subjects also have the right to request the updating, rectification or suppression of personal data held by companies to ensure the accuracy of the data.
There is no obligation to notify data subjects of a breach in Colombia, but data owners and processors must notify the Data Protection Authority of security violations where there is a risk to the administration of data subjects’ information.
Transfer of personal data to other jurisdictions generally is prohibited where the receiving jurisdiction does not provide an adequate level of protection. Transfer can nonetheless be made where the data subject has provided his or her express consent. Further, consent is not required for the transfer of personal data from a data controller to an overseas data processor, where there is a contract in place that complies with Article 25 of Decree 1377.
Mexico
Key privacy and cybersecurity laws
Mexico enacted the Federal Law on the Protection of Personal Data Held by Private Parties in 2010. The government has also issued regulations pursuant to the Law, which came into effect in 2011; privacy notice guidelines, which came into effect in 2013; and parameters for self-regulation, which came into effect in 2014. The Law applies to all data processing in Mexico, including when processing is done outside of Mexico on behalf of a Mexican data processor.
Key obligations of companies
Mexican law requires that all personal data must be collected and processed fairly and lawfully. Further, personal data must be collected only for specified, explicit and legitimate purposes, and the amount of data collected may not be excessive relative to the purposes for which it was collected. Companies must take reasonable steps to ensure that the personal data in their databases is accurate and kept only for the time necessary to effectuate the legitimate purpose for which the data was collected. Companies must also appoint a personal data officer or department and establish risk-based security measures at least as robust as those used to protect the company’s own data.
Key rights of data subjects
Individuals in Mexico have the right to access and correct personal data, oppose the processing of personal data and revoke consent to the processing of personal information. Individuals also retain the right to be notified prior to consenting to the processing of personal data.
Mexico requires breach notification to affected data subjects where the incident materially affects the property or individual rights of a subject. The notification must include information regarding the nature of the breach, the personal data compromised, recommendations to the data subject to protect his or her interest, corrective actions implemented by the company and a method for data subjects to obtain further information.
Consent is generally required to transfer personal data across borders, and privacy notices in Mexico must inform data subjects when companies intend such a transfer. The transfer cannot exceed the scope of the disclosure in the privacy notice, and the receiving company must follow Mexican data privacy law.
Peru
Key privacy and cybersecurity laws
Data privacy and cybersecurity in Peru are regulated by the Law on the Protection of Personal Data (DPL), which was enacted in 2011. The Peruvian government issued the Security Policy on Information Managed by Databanks of Personal Data in 2013. In February 2021, Peru’s financial services and insurance regulator issued cybersecurity regulations that came into effect on 1 July 2021, except for a few sections that are slated to come into force on 1 July 2022.
Key obligations of companies
Companies may collect personal data only by lawful methods, and only with notice to and consent from data subjects for the collection and processing. Data processing must be proportional, and not excessive, to the legitimate purpose of collection. Companies must work to ensure the accuracy of the data they collect and process, and must implement the security measures necessary to protect personal data. All personal data must be given an adequate level of protection. Additionally, financial institutions and insurance companies regulated by the Peruvian Superintendent of Banking, Insurance and Pension Fund Administrators (SBS) must comply with specific data security requirements set out in the SBS’s cybersecurity regulations published on 23 February 2021 (the SBS Regulations). The SBS Regulations came into force on 1 July 2021 and require regulated entities to develop cybersecurity programmes with reference to specific minimum security measures provided in the Regulations. The Regulations also assign specific responsibilities to boards of directors, management and risk committees, and require companies to obtain prior authorisation from the SBS before contracting with any third parties in other countries to process data.
Key rights of data subjects
The rights granted to data subjects under the DPL include:
- the right of access to their personal data;
- the right to be informed of the purpose of collection and how the personal data will be processed;
- the right to request the correction of personal data;
- the right to oppose the processing of personal data; and
- the right to refuse to provide personal data.
The DPL also grants data subjects the ability to pursue legal claims against companies that violate their data privacy rights.
Companies must provide notification to data subjects of ‘any incident that significantly affects their property or their moral rights’. Such notification must include a description of the incident, the personal data affected, information for the data subject on how to mitigate the potential damage and the remediation steps taken by the company. The breach notification obligation was echoed in January 2020 through Emergency Decree No. 007-2020, which confirmed that public and private entities acting as digital service providers must report to the data protection authorities when a digital security incident involving personal data occurs. Additionally, companies regulated by the SBS must also notify the SBS of any cybersecurity incident that has (or has the potential for) a significant adverse impact on the company as soon as possible, and carry out a forensic analysis of the incident that it must make available to the Superintendent.
The transfer of personal data outside of Peru is generally allowed as long as the destination country provides adequate data protection measures. If the destination country does not provide adequate protection, transfer may still occur where the receiving party agrees to comply with the DPL, where the transfer is necessary pursuant to a contractual relationship with the data subject, or with the data subject’s informed and express consent.
Panama
Key privacy and cybersecurity laws
In March 2019, Panama enacted Law No. 81, which took effect in March 2021. On 28 May 2021, regulations in the form of Executive Decree No. 285 were approved. The general principle of personal data protection is enshrined in Articles 29, 42, 43 and 44 of Panama’s Constitution. This summary sets out the requirements to comply with Law No. 81.
Key obligations of companies
Companies must obtain consent from the data subject, who must be informed of the proposed use. Consent must be recorded in a clear and easily accessible manner so that it may be traced back to the data subject. Companies may store data in a secure database for a maximum of seven years. The regulations promulgated through Executive Decree No. 285 also require that companies provide notice to the data subject when collecting his or her information that includes, among other things:
- the purpose for which the data is being collected;
- the basis for the company’s processing of the data;
- the categories of personal data collected;
- whether the company intends to transfer the data to another jurisdiction; and
- the term of retention or criteria that will be used to determine the term of retention.
Companies must also implement technical and organisational measures sufficient to guarantee the confidentiality, integrity, availability and resilience of systems that process personal data. When determining what security measures would be appropriate, companies should consider factors such as the nature of the data, number of data subjects, possible consequences stemming from a security violation and the costs of implementing particular security measures.
Key rights of data subjects
Information may be collected only with the prior consent of the data subject. There are limited exceptions where companies may process an individual’s data without his or her consent, including where necessary for a commercial relationship, for medical emergencies, for statistical or scientific purposes or where there is a legitimate interest pursued by the data controller. Data subjects have an ongoing right to access, modify, change or remove their personal information. Decree No. 285 lays out the procedures by which a data subject may exercise these rights and how a company must respond to any such requests.
In the event of a data breach, companies must inform data subjects within 72 hours. Companies are not required to register with the National Authority for Transparency and Access to Information (ANTAI), but any breach notified to data subjects should also be reported to ANTAI.
Cross-border transfers are permitted where the receiving country has data protection standards comparable to Panama’s and the transferring company takes all necessary steps to protect the personal data being transferred. There are a few exceptions to this requirement: (1) where the data subject consents to the transfer; (2) where the transfer is required for the performance of a contract; (3) for banking or stock exchange transfers; and (4) as required by law in compliance with Panama’s international treaty obligations.
Uruguay
Key privacy and cybersecurity laws
Data protection in Uruguay is governed by Law No. 18,331, passed in August 2008. Uruguay began modernising its existing data protection legislation with the approval of Law No. 19,670 in October 2018 and with Decree 64/2020 in February 2020. On 16 September 2021, the Data Protection Authority (DPA) of Uruguay published Resolutions Nos. 023/021 and 041/021, which contain updated guidance on international data transfers in the wake of the Schrems II decision.
Key obligations of companies
All companies holding databases of personal information in Uruguay must register with the Uruguayan data protection authority and record:
- the categories of data;
- how the data is collected and processed;
- details of the data controller;
- the storage location;
- retention period;
- security measures;
- codes of conduct;
- international data transfers; and
- how rights to access, update and delete personal data can be exercised.
If a company processes data outside of Uruguay, the database must still be registered with the authorities where processing activities are offered in connection with goods or services targeting Uruguay or where required by contract or international law. The register should be updated every three months. Private companies that process data on a large scale (i.e., the data of 35,000 or more individuals) must appoint a data protection officer to monitor compliance.
Key rights of data subjects
Data subjects have the right to access their own personal data, the right to rectify inaccurate records, and the right to update or delete their data. Companies must correct, update or delete personal data on request and without charge.
Decree 64/2020 introduces a mandatory breach notification to the Uruguayan data protection authority within 72 hours of becoming aware of a security breach.
Under Law No. 18,331, international transfers of personal data are permitted where the receiving country provides an adequate level of data protection. Resolution No. 023/021 removed the United States from the list of countries deemed suitable for international data transfers from Uruguay. Going forward, transfers to the United States will need to be justified through the exceptions to Law No. 18,331’s general prohibition on international data transfers, such as the consent of those whose data is being transferred or contractual clauses requiring appropriate protections. Resolution No. 041/021 includes guidance on drafting those contractual clauses. Companies that previously relied on the adequacy of the level of protection in the United States have six months to justify transfers of data to the DPA of Uruguay in accordance with the new requirements.
Bolivia
Bolivia does not currently have a specific data protection law, though there is a general right to privacy in the country’s Constitution. Two draft data protection laws are pending before the Legislative Assembly that, if passed, will apply to all individuals and legal entities processing data in Bolivia. The draft laws would require companies that carry out regular data processing to appoint a data protection officer. They may also require the express, written consent of the data subject, and would require organisations to implement security measures to protect personal data, maintain its confidentiality, and allow the Agency for the Protection of Personal Data (APP) to inspect and verify data records. If the draft laws are passed, data controllers will be permitted to transfer personal data internationally where the receiving country has adequate data protection laws as required by the APP, the exporter offers sufficient guarantees that the personal data will be safeguarded, the parties’ contracts contain sufficient clauses (as validated by the APP) to protect the data, or where the transfer is specifically authorised by the APP.
China
With the enactment in 2021 of the Data Security Law and the Personal Information Protection Law, effective in September and November 2021, respectively, China continues to be one of the most active countries in expanding data privacy and cybersecurity regulation, building on past years’ efforts.
China’s E-Commerce Law came into effect on 1 January 2019. The Law requires registration by e-commerce vendors operating in China. It further reiterates e-commerce operators’ obligation to comply with Chinese personal data protection regulations, including providing customers with procedures allowing them to correct, erase or enquire about their personal data.
The Chinese Cybersecurity Law, which took effect in 2017, imposes substantive requirements on ‘network operators’ as well as ‘providers of network products and services’ to ensure that they secure their data and adopt appropriate incident-response plans and contingency measures in the event of a data security incident. Moreover, the Cybersecurity Law places enhanced obligations on operators of ‘critical information infrastructures’, including data localisation and submission to a state security review prior to the procurement of network products and services. Since the enactment of the Cybersecurity Law, the Chinese government has published many draft guidelines to assist companies in complying with the Law.
In March 2020, the National Information Security Standardisation Technical Committee released an amendment to the current Personal Information Security Specification (the Specification), which came into force on 1 October 2020. The Specification is a set of voluntary best practices for businesses operating in China, intended to set a baseline reference for regulatory bodies in China to use when evaluating how companies protect personal information. The Specification emphasises that a data subject’s consent is required to collect, transfer, share and disclose data, and that data should not be retained beyond the minimum necessary period. It also enumerates a data subject’s rights to his or her data, which are similar to global standards (e.g., rights to access, delete or rectify data). The Specification further suggests substantive best practices for organisations, including incident-response planning (e.g., mock incident exercises) and preparations for notifying individuals in the event of a data breach, and adds specific requirements for biometric data, such as facial recognition.
The 2019 Draft Security Assessment Measures for the Export of Personal Data (the 2019 Draft Measures), published by the Cyberspace Administration of China for comment, apply to cross-border transfers of personal data from China. The 2019 Draft Measures extend the requirement for data localisation from critical information infrastructure operators to all network operators, and require network operators to pass on certain data protection obligations to their recipients through contracts and other binding agreements and to retain records of data transfers for five years. At present, the Cybersecurity Law does not require an overseas operator to designate local representatives to address concerns from the authorities or data subjects. If enacted, the 2019 Draft Measures could introduce an obligation on overseas institutions that collect personal information from domestic users in the course of their business activities to appoint representatives to fulfil their legal and compliance responsibilities within China.
On 13 April 2020, the Cyberspace Administration of China, together with 11 other Chinese government agencies, published the Measures for the Review of Cybersecurity, which detail the procedures that operators of critical information infrastructures must follow under the Cybersecurity Law when procuring network products and services that may affect state security. Pursuant to the Measures, operators of critical information infrastructures must proactively submit to a security review if they determine that their procurement may pose state security risks. The review process may take 55 to 120 business days.
On 10 June 2021, the Chinese National People’s Congress passed the Data Security Law, which was released for public comment in June 2020. The Data Security Law took effect in September 2021 and covers all data activities within the territory of China, as well as data activities outside of China that may harm China’s national security, public interests or the rights of Chinese persons. Under the Data Security Law, ‘data activities’ are broadly defined and include the collection, storage, use, processing, transmission, provision and disclosure of data. The Data Security Law establishes a ‘tiered protection regime’ for data security, under which data is categorised according to its importance and the potential harm that a breach would pose to China’s economic and social development, national security, and public and private interests. Each region and department is granted the authority to catalogue such ‘important data’ and to regulate it according to its classification. The Data Security Law also introduces the concept of ‘core state data’, which is subject to stricter regulation than ‘important data’ because of its relevance to China’s national security, the lifeline of the national economy, the livelihood of Chinese citizens and major public interests. Key data security obligations imposed on data processors include:
- establishing internal governance and control regimes, including instituting accountability and oversight structures with designated personnel in charge;
- conducting periodic training on data security;
- adopting technical measures for data protection;
- taking remedial actions upon the discovery of risks or vulnerabilities, and occurrence of data breaches, promptly notifying users, and reporting the breaches to the relevant departments; and
- conducting periodic risk assessment and submitting a report to the relevant departments.
Non-compliance with the Data Security Law may lead to criminal liabilities or administrative penalties, including the issuance of a warning letter, confiscation of illegal profits, an order for the rectification of misconduct, fines of up to 10 million yuan for entities and up to 1 million yuan for individuals, suspension of business, or revocation of business licences.
On 20 August 2021, the Standing Committee of the National People’s Congress passed the Personal Information Protection Law (PIPL), which took effect on 1 November 2021. With the PIPL, widely described as China’s version of the GDPR, China joins the global movement toward greater restriction on the processing of personal information. Like the GDPR, the PIPL has extraterritorial effect, applying to the processing outside China of the personal information of natural persons within China in any of the following circumstances:
- where the purpose is to provide products or services to natural persons within China;
- where the processing involves analysis or evaluation of activities of natural persons within China; or
- any ‘other circumstances as provided by laws or regulations’.
Foreign entities engaged in such processing are also required to establish a domestic agent or designated representative within China to be responsible for matters related to personal information, whose contact information must be reported to relevant authorities.
The PIPL imposes new restrictions on the transfer of data outside of China, which will likely be a challenge for global companies doing business in China. Export of personal data outside of China is permitted only when numerous requirements, including legal authorisation, notification and consent, and a risk assessment, are completed.
The PIPL also introduces new data breach notification requirements. As a general rule, in the event of actual or possible leakage, distortion or loss of personal information, personal information processors are required to ‘immediately take remedial measures’ and to notify ‘authorities performing duties related to protecting personal information and relevant individual(s)’. Personal information processors are not required to notify the relevant ‘individual(s)’ if the ‘remedial measures can effectively avoid the harm’ caused by the breach, unless the relevant regulator disagrees and orders otherwise. The PIPL is silent on what ‘remedial measures’ are or what measures would be considered effective.
In addition to potential civil and criminal liabilities, the PIPL provides severe penalties for non-compliance and violation, including:
- warnings, correction orders for misconduct, confiscation of illegal profits, and orders to suspend or shut down services;
- up to 1 million yuan fine for relevant entities and up to 100,000 yuan fine for responsible individuals;
- in serious cases, a fine up to 50 million yuan or 5 per cent of the prior year’s turnover, as well as orders for suspension or shutdown of business operations, or revocation of business licence for relevant entities, and a fine ranging from 100,000 yuan to 1 million yuan plus additional restrictions for relevant individuals; and
- publicly disclosed negative records in a credit report.
As recent actions by the Cyberspace Administration of China involving ride-hailing app Didi Chuxing demonstrate, non-monetary consequences can be imposed swiftly and with little notice.
On 20 January 2020, Hong Kong’s Constitutional and Mainland Affairs Bureau issued a discussion paper to propose amendments to the existing Personal Data (Privacy) Ordinance (PDPO), which was enacted in 1996 and overhauled in 2012. Recent major personal data breach incidents exposed significant gaps in the current law, including the absence of a mandatory requirement to report data breaches and inadequate penalties to deter violations. Currently, breach notifications are made on a voluntary basis, but the discussion paper proposes a mandatory breach notification regime that would require a data user to report breaches as soon as practicable and, in any event, within five business days. The discussion paper also proposes requiring data users to formulate a retention policy, recognising that the risk of a data breach increases the longer the data is held. Under the PDPO, the maximum penalty for non-compliance with an enforcement notice is a fine of HK$50,000 and imprisonment for two years on first conviction. The discussion paper proposes the introduction of a fine linked to annual turnover, which would bring Hong Kong closer in line with sanctions under the EU’s GDPR.
In May 2020, Singapore’s Ministry of Communications and the Personal Data Protection Commission launched a public consultation on a new bill proposing key amendments to the existing 2012 Personal Data Protection Act. The bill was read for the first time in Singapore’s parliament in October 2020, passed in November 2020 and came into force in February 2021. The amendments aimed to ensure the existing Act keeps pace with technological advances and global developments in data protection legislation. The amended Act imposes a mandatory data breach notification obligation for the first time in Singapore and increases the maximum penalties for violations of the Act to the greater of 10 per cent of annual gross turnover in Singapore or S$1 million. The amendments also expand deemed consent as a legal basis for the collection, use and disclosure of personal data to include contractual necessity and consent by notification, and introduce legitimate interests and business improvement as alternatives to consent. These amendments provide organisations with greater flexibility in how they use data. The amendments also introduce the right for individuals to request that an organisation transmit a copy of their personal data to another organisation. In conjunction with the amended Personal Data Protection Act, the Personal Data Protection Commission issued the Personal Data Protection (Notification of Data Breaches) Regulations 2021 and the Personal Data Protection Regulations 2021, both of which came into effect on 1 February 2021. The former provides a list of the types of personal data that, if compromised, would be deemed to result in significant harm to individuals, and a list of what notifications must include. The latter regulations address requests for access to and correction of data, deemed consent by notification, and transfer of data outside of Singapore.
UK and Europe
On 25 May 2018, the GDPR took effect across the European Union (including the nations of the European Economic Area). The GDPR imposes substantial privacy and security requirements, which apply to companies with ‘establishments’ in Europe. But the GDPR also applies to companies around the world – including in Latin America – that target or monitor EU citizens.
Regarding privacy, EU data subjects enjoy significant rights to receive robust notice upfront of how their personal data will be used. EU data subjects also now have the right to access, correct and even delete their personal data that is held by companies. Companies, in turn, face tough requirements to process personal data only for the limited purposes that the GDPR permits. The GDPR also limits the ability of companies to transfer personal data outside the European Union.
The GDPR is best known as a privacy regulation, but it also has a significant cybersecurity component. The Regulation mandates that companies maintain substantive cybersecurity protections at a level ‘appropriate’ to the risk of harm if the data were compromised. Companies are also required to disclose certain data breaches to data protection authorities and, in certain circumstances, to the affected individuals. Disclosure to the relevant authority is generally due within 72 hours – a short time frame that makes clear the importance of being prepared to respond to an incident.
Since the GDPR came into effect, EU supervisory authorities have demonstrated their willingness to use their newfound enforcement powers aggressively, imposing hefty fines on companies including Google and the Italian telecommunications operator TIM. The operational cost to companies is illustrated by the fine imposed on a Swedish-headquartered data analytics firm for failing to mail privacy notices to over 6 million people, despite the fact that the cost of that mailing would have exceeded the company’s turnover for the year. The most substantial fines thus far have related to companies’ alleged over-collection, misuse or misconfiguration of data without any breach. The maximum administrative fine that may be imposed is the higher of €20 million or 4 per cent of the company’s global annual turnover in the preceding year.
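The two-pronged cap works as a simple calculation: whichever of the fixed floor and the turnover percentage is higher governs. The sketch below is illustrative only (the function name is our own, not drawn from any statute or library):

```python
def gdpr_max_admin_fine(worldwide_annual_turnover_eur: float) -> float:
    """Ceiling for the most serious GDPR administrative fines: the higher of
    EUR 20 million or 4 per cent of the preceding financial year's
    worldwide annual turnover."""
    return max(20_000_000.0, 0.04 * worldwide_annual_turnover_eur)

# A company with EUR 1 billion in turnover faces a ceiling of EUR 40 million,
# while a company with EUR 100 million in turnover is still exposed to the
# full EUR 20 million floor.
```

The practical consequence is that the turnover prong only bites for companies with worldwide turnover above €500 million; for everyone else, the €20 million floor is the operative ceiling.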
On 16 July 2020, in Data Protection Commission v. Facebook Ireland and Schrems (commonly known as Schrems II), the Court of Justice of the European Union (CJEU) invalidated the European Commission’s adequacy determination regarding the EU–US Privacy Shield, and cast substantial doubt over European Commission-approved standard contractual clauses (SCCs) for cross-border transfers of personal data. In general, the GDPR allows for transfers to non-EU countries through approved channels, which, broadly speaking, ensure that EU personal data that arrives at non-EU destinations will continue to be protected by privacy standards approximate to those of the GDPR. Historically, two transfer mechanisms on which companies could rely for EU–US data transfers were the Privacy Shield and SCCs that parties could include in their own contracts.
In the Schrems II decision, the CJEU determined that the Privacy Shield did not sufficiently safeguard EU personal data once it leaves the EU. The CJEU further found that the SCCs can constitute a lawful basis for the transfer of personal data to a jurisdiction without an adequacy decision if the recipient is in a jurisdiction that affords the data subject ‘a level of protection essentially equivalent to that guaranteed within the EU’; but given the holding relating to the Privacy Shield, it seems unlikely that the United States would be considered such a jurisdiction.
In June 2021, the European Commission adopted two new sets of SCCs – one for use between controllers and processors and one for the transfer of personal data to third countries. Companies subject to the GDPR that export data have until 27 December 2022 to transition their existing arrangements to the new SCCs (or their own set of clauses that meet the requirements of Article 28 of the GDPR). Companies may no longer validly use the old SCCs in new agreements. The new SCCs reflect the changes to data protection law brought in by the GDPR in 2018, as the old SCCs were not updated at the same time, and also incorporate the judgment in Schrems II, which set out the need for organisations to look to the local laws of the destination country when transferring personal data to third countries in reliance on SCCs.
The United Kingdom left the EU on 31 January 2020. The United Kingdom enacted its own Data Protection Act in 2018 to implement the GDPR at the national level, and the GDPR is retained in domestic law as the UK GDPR. This means that data protection legislation in the United Kingdom will continue to be largely consistent with the GDPR, though any amendments to the GDPR after the end of the transition period will not automatically apply in the United Kingdom, and, as such, the new EU SCCs do not automatically apply in the UK. In May 2021, the UK Information Commissioner’s Office (ICO) announced that it will produce its own version of the SCCs for use under the UK GDPR, and a draft for consultation was published in August 2021. In June 2021, the European Commission published two adequacy decisions for transfers of personal data to the UK under the GDPR and Law Enforcement Directive, confirming that the EU accepts the UK’s data regime as substantially equivalent to the EU’s regime and allowing personal data to flow freely from the EU and EEA to the UK until June 2025. In 2024, the European Commission will start work to decide whether to extend the adequacy decisions for a further period up to a maximum of four years.
In the United States, the California Consumer Privacy Act (CCPA), a significant consumer privacy statute, took effect on 1 January 2020 and became enforceable on 1 July 2020. The CCPA applies to for-profit companies of all kinds and governs the collection, use and disclosure of the personal information of California residents. Most notably, the CCPA requires companies to allow consumers to opt out of the sale of their personal data. Covered companies also are required to give consumers extensive notice of how their data will be handled. Individual consumers have broad rights to compel companies to provide access to their data, and to correct or delete it – similar to the GDPR.
California’s Office of Administrative Law approved final regulations under the CCPA on 14 August 2020. Among the changes in the final regulation was a broadening of the definition of ‘personal information’ to include IP addresses that could be linked to a consumer or household regardless of whether a particular business actually links or is capable of linking the IP address to a consumer. The regulation also imposes additional annual reporting requirements on any business that ‘knows or reasonably should know’ that it buys, receives, sells or shares, for commercial purposes, the personal information of 10 million or more consumers in a calendar year.
Although the scope of the CCPA is still quite broad, it has a number of important exemptions. For example, the CCPA exempts protected health information (PHI) governed by the federal Health Insurance Portability and Accountability Act (HIPAA). The statute also exempts entities governed by HIPAA (including business associates) ‘to the extent they maintain . . . patient information in the same manner’ as HIPAA requires. The amendments also exempt ‘personal information collected, processed, sold or disclosed pursuant to’ the federal Gramm-Leach-Bliley Act, the California Financial Information Privacy Act and the federal Driver’s Privacy Protection Act of 1994. These exemptions are not complete safe harbours – some of a company’s uses may not fall within them. The CCPA also has important exemptions for employees’ and job applicants’ personal information and personal data obtained in the context of M&A due diligence; through the passage of the California Privacy Rights Act (CPRA) discussed below, these exemptions are currently set to expire on 1 January 2023 unless the legislature acts again.
The CCPA is focused primarily on data privacy, but also has a security component. The CCPA grants consumers the right to sue and receive generous money damages in the event of a data breach. To date, there have been at least 98 cases brought alleging violation of the CCPA since it went into effect, many of them not specifying what, other than falling victim to a breach, the company did wrong. A separate, pre-existing California statute also requires companies to take ‘reasonable’ cybersecurity measures to protect personal data.
In November 2020, California voters approved the CPRA. The CPRA is essentially a set of amendments to the CCPA. The CPRA defines a new category of ‘sensitive personal information’. This category includes data elements such as social security number, ethnic origin and religious beliefs. Consumers now have the right to ‘limit the use of [their] sensitive personal information to that use which is necessary to perform the services or provide the goods reasonably expected by an average consumer who requests such goods or services’. That appears to mean, among other things, no use of sensitive personal information for marketing or analytics. The CPRA explicitly broadens the CCPA to restrict companies ‘sharing’ – not just ‘selling’ – personal data. The CPRA becomes enforceable on 1 January 2023. Consumers’ data requests, though, will relate back one year. This means a request made after the CPRA’s effective date may require searching for, disclosing, correcting or deleting data going back as far as 1 January 2022.
On 7 July 2021, Colorado’s Protect Personal Data Privacy Act (ColoPPA) was signed into law and will go into effect on 1 July 2023. The requirements for businesses under the ColoPPA are largely similar to those prescribed by the CCPA, as amended by the CPRA, and the data protection law in Virginia – the Virginia Consumer Data Protection Act (VCDPA) (see below). For example, similar to the CCPA, the ColoPPA provides consumers the right to access, correct or delete their personal data, and enables them to obtain a portable copy of their data in certain circumstances. It also obligates businesses to allow consumers to opt out of the processing of their personal data for targeted advertising as well as for sale. Going beyond the CCPA, the ColoPPA also permits consumers to opt out of data processing for the purpose of profiling resulting in legal consequences (defined as ‘the provision or denial of financial or lending services, housing, insurance, education enrolment or opportunity, criminal justice, employment opportunities, health-care services, or access to essential goods or services’). The law further requires businesses to provide a universal opt-out mechanism by 1 July 2024 that consumers can use to opt out of the processing, sale or disclosure of their personal information. The ColoPPA affords enhanced rights and imposes additional restrictions – including the need for opt-in consent – on the processing of sensitive data, which is defined as data that reveals ‘racial or ethnic origin, religious beliefs, a mental or physical health condition or diagnosis, sex life or sexual orientation, or citizenship or citizenship status’ or ‘genetic or biometric data that may be processed for the purpose of uniquely identifying an individual’ or ‘personal data from a known child’.
In addition to the privacy rights it enshrines for consumers, the ColoPPA also requires businesses to conduct regular data protection assessments for each of their activities that involve personal data and present a heightened risk of harm to consumers, including targeted advertising or profiling, selling personal data and processing sensitive data. Companies also need to make these assessments available to the Colorado Attorney General upon request.
The ColoPPA does not contain a private right of action. The Colorado Attorney General and district attorneys have exclusive authority to bring enforcement actions. Before doing so, they must give notice of the violation and allow a 60-day cure period; this cure-period requirement will expire on 1 January 2025.
On 4 March 2021, Virginia became the second US state, after California, to adopt a comprehensive privacy law: the VCDPA. The VCDPA will take effect on 1 January 2023.
The VCDPA confers various GDPR-like data subject rights to consumers, including that of access, correction, deletion, portability and opt out of processing personal data used for targeted advertising, sale or profiling ‘in furtherance of decisions that produce legal or similarly significant effects concerning the consumer’. Akin to the GDPR, but going beyond the CCPA, controllers must also undertake ‘data protection assessments’ that evaluate the benefits of data processing against the risks to the consumer. This includes the processing of personal data used for profiling when there is a ‘reasonably foreseeable risk’ that such profiling will lead to discriminatory impact; economic, reputational or actual harm; and invasions of privacy. This requirement will likely cover a substantial amount of targeted advertising and AI activities.
Finally, it is worth noting that the Virginia Attorney General has exclusive authority to bring an action against violating parties. After a 30-day cure period, it may seek an injunction or a civil penalty of up to US$7,500 for each violation.
In mid-2019, the US state of New York enacted the Stop Hacks and Improve Electronic Data Security Act (the SHIELD Act), which created new substantive requirements of ‘reasonable’ cybersecurity. The SHIELD Act also expanded the definition of personal information in New York’s data breach notification requirements.
The SHIELD Act requires any person or business that owns or licences the computerised personal information of any New York resident to ‘develop, implement and maintain reasonable safeguards to protect the security, confidentiality, and integrity of the private information including, but not limited to, disposal of data’. The law does not precisely define ‘reasonable’ security, but offers some guidance on minimum expectations. It appears this is intended to be an evolving standard, which will likely become more stringent over time, as the collective definition of an objectively appropriate cybersecurity programme evolves to match developing threats.
An entity is deemed compliant with New York’s new ‘reasonableness’ standard if it is subject to, and compliant with, certain other cybersecurity regimes: the federal Gramm-Leach-Bliley Act; the federal HIPAA data security standards; the New York Department of Financial Services’ (DFS) Cybersecurity Regulation (DFS Part 500); or any other data security rules and regulations promulgated by the federal or New York state government.
In July 2020, the DFS brought its first enforcement action for alleged violations of DFS Part 500, which imposed stringent cybersecurity regulations on financial institutions beginning in 2017. In its statement of charges, the DFS asserted that the subject company had failed to perform an adequate risk assessment; failed to maintain proper access controls; failed to provide adequate security training for cybersecurity employees; and failed to encrypt certain non-public information. The company in question also allegedly failed to remediate a vulnerability that exposed the sensitive personal information of thousands of individuals, despite identifying the issue during penetration testing. The violations carry potential penalties of up to US$1,000 each. In its press release, the DFS asserted that each instance of non-public information accessed by an unauthorised person constitutes a separate violation.
Both the SHIELD Act and DFS Part 500 have national and global implications, including in Latin America, because financial institutions from around the US and the world do business in New York under licence from the DFS and own and licence New York residents’ information.
Other states and the federal government
All 50 US states, the District of Columbia, Puerto Rico, the Virgin Islands and Guam now have breach notification laws, many of which have recently been strengthened.
There is still no general breach notification requirement at the federal level in the United States. Notice of certain breaches involving healthcare data is required under the federal HIPAA statute. Financial institutions regulated under the federal Gramm-Leach-Bliley Act are subject to regulatory agency guidance instructing them that they should give notice of breaches. Perhaps the biggest decision for companies going forward will be whether to adopt a highest common denominator compliance approach – that is, voluntarily treating the data of all consumers regardless of location as if they were subject to all of these increasingly stringent privacy laws.
Following the 7 May 2021 ransomware attack on Colonial Pipeline, an American oil pipeline servicing the US southeast, there has been a flurry of regulatory and legislative activity for critical infrastructure entities at the federal level. The Biden Administration has tasked the Cybersecurity and Infrastructure Security Agency (CISA) with establishing cross-sector cybersecurity goals for critical infrastructure and directed CISA to work with sector-specific regulators to create coordinated cybersecurity standards for each sector. Simultaneously, members of the US Senate and House of Representatives have proposed – but at the time of writing not yet passed – a number of bills that would create federal cybersecurity standards and incident reporting requirements for critical infrastructure entities.
Best practices for risk reduction and crisis management
This section focuses on how companies may seek to prevent and ultimately respond to data breaches in a way that both meets any legal disclosure obligations and preserves trust with their stakeholders. While the guidance here is focused on security breaches, it also applies in large measure to privacy breaches that are unrelated to security issues.
Substantive cybersecurity measures
As noted, the emerging global law of cybersecurity typically requires that a company’s security programme be ‘appropriate’ to the risk (e.g., GDPR) or ‘reasonable’ (e.g., California law and HIPAA), or uses similar terminology. Notably, the GDPR and the LGPD both require ‘technical and organisational’ measures – meaning that the cybersecurity programme must include a combination of policies and procedures, such as a well-tested incident-response plan (discussed below), alongside strong technical protections (e.g., encryption of sensitive data).
Collectively, this means that cybersecurity is not simply the domain of technical experts. The required level of protection is risk-based and should contemplate the sensitivity of the data in question, the risk of harm if a given data set were compromised, whether best practices as recognised by the technical community are in place, and whether the cybersecurity programme is regularly evaluated and improved based on the evolving threat profile.
Certain measures have already been so widely embraced by the security community that they would be part of almost any ‘appropriate’ or ‘reasonable’ cybersecurity programme. As noted, encryption of data, both at rest and in transit, is required by New York’s DFS cybersecurity regulation. So is the use of multifactor authentication – that is, the use of both a password and a second entry credential, such as a short-term code transmitted to the user by text message, to access an account.
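The ‘short-term code’ used as a second factor is most often a time-based one-time password (TOTP), standardised in RFC 6238. Purely as an illustration of the mechanics (the function name is our own), the algorithm fits in a few lines of standard-library Python:

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, for_time: int, step: int = 30, digits: int = 8) -> str:
    """Time-based one-time password per RFC 6238 (HMAC-SHA1 variant)."""
    # The moving factor is the number of 30-second steps since the Unix epoch.
    counter = struct.pack(">Q", for_time // step)
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): read 4 bytes at an offset taken from the
    # low nibble of the last digest byte, and mask off the sign bit.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: at Unix time 59, the shared secret
# b"12345678901234567890" yields the 8-digit code "94287082".
```

Because the code changes every 30 seconds, a phished password alone is not enough to access the account – which is the property regulators have in mind when treating multifactor authentication as a baseline control.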
Threat vectors and best practices are constantly evolving, as is the technical community’s understanding of what are ‘reasonable’ or ‘appropriate’ security measures – as well as the law’s understanding. Companies should thus encourage strong communication among their information security, legal and compliance teams. This will help companies recognise and respond to new technical standards as they begin to shape into legal standards.
The incident-response plan
Good preparation begins with having a written incident-response plan (IRP). Strong IRPs have a number of recognised elements.
The IRP should identify all the key teams within a company that are essential to cross-functional incident response. Typically, the IRP will assign primary leadership roles to the information security team and the legal team. Other teams with key roles include the C-suite, communications (including media relations and social media), risk, human resources and government relations. The privacy team and the information technology team – to the extent these are separate from information security – generally should be included as well.
The IRP should identify the specific personnel members who will form the company’s incident response team (IRT). Each business unit should have both a primary and backup person designated. Contact details for each person should be listed, including business contact information, personal email addresses and mobile phone numbers that can be used if corporate systems are compromised.
The IRP also should identify key external resources that may be engaged in an incident. The list of key external resources should typically include:
- external counsel;
- at least one external forensic vendor;
- law enforcement;
- relevant regulatory agencies;
- the company’s insurance broker and carriers;
- key members of the company’s external board of directors; and
- a crisis communications consultant or vendor who can handle large-scale mailings to affected customers or shareholders.
Once again, both work and non-work contact details should be included. Many of these stakeholders should not just be listed in the IRP but should be engaged in its preparation and testing, so that they are aware of the role they would play in a breach.
External counsel and the external forensic consultant should be brought together with other key IRT members ahead of any breach so that they can all become familiar with the company’s relevant systems, policies, procedures and personnel. As the saying goes, ‘do not meet your team for the first time on the day of the game’. External counsel have a key role to play in ensuring that legal requirements are met and that the legal privilege applicable to the work of the IRT is protected to the maximum extent possible under local law.
The IRP should spell out a process for classifying incidents according to their severity and the degree of certainty regarding the facts. There is usually an early period during which an incident is suspected but not yet confirmed. It is usually best that a smaller ‘core team’ take charge of evaluating potential incidents and responding to less severe incidents. The broader cross-functional team should be engaged to help respond to larger incidents once the facts are confirmed or if there is an extended period of uncertainty.
The IRP should provide a process for responding to confirmed incidents. There should be clear pre-defined roles for each IRT member. Someone should be designated to chair the IRT and to keep a record of its work. Key documents that are likely to be needed as part of a breach response – such as notices to regulators, to affected data subjects and to the press – should be drafted in advance and appended to the IRP, with blanks left for the facts specific to a given incident.
The IRP should be as short and clear as possible. The goal is to have IRT members actually rely on and utilise the IRP in the event of a crisis. The longer and more complicated the IRP is, the greater the chance that people will simply disregard it.
Testing the IRP
A well-written IRP and a well-defined IRT are essential to strong incident response, but they can be ineffective if they are not also well tested. Incident-response simulation drills, known informally as ‘tabletop’ exercises, have become an important part of many corporate cybersecurity programmes.
The best tabletops are prepared with an eye towards the specific facts and circumstances of the company. Certain personnel (often external counsel or forensic consultants) are designated to prepare the tabletop scenario, in isolation from the participants in the tabletop. This ensures that participants are responding during the drill without prior knowledge of the ‘facts’.
On the day of the drill, the members of the IRT (or whatever business units are part of the drill) gather in a room, or via teleconference or video link. The person responsible for guiding the drill then announces the ‘facts’, revealing additional facts periodically as the drill proceeds. Tabletops can last anywhere from a couple of hours to a whole day.
Over the course of the tabletop, the moderator announces a series of new factual revelations according to a stated timeline: ‘it is Tuesday at 10am, and the hacker just did X; now we assume it is Thursday at 2pm, and law enforcement just announced Y’ and so on. With each factual revelation, different participants are called on to state what they would do, and how and with whom they would communicate. There is active discussion between all participants throughout.
The results of the tabletop are often processed in two stages. Before people leave the room at the close of the drill, they step out of the role-playing format and have an immediate discussion about the lessons learnt from the drill. Afterward, thoughts are collected from participants in a more systematic manner, and the lessons learnt are incorporated in the form of revisions to the IRP.
Responding to an actual incident
With a well-tested IRP in place, a company is prepared to respond to an actual incident:
- the IRP is activated and the IRT is periodically brought together at a set time and place. As the facts are confirmed, necessary notifications begin to go out – to civil regulators, data subjects, the press, the board, employees and other stakeholders;
- technical measures are implemented to protect the company’s systems, for example, by cleansing malware from infected computers, or backup systems are activated to circumvent a ransomware attack that has disabled main systems;
- a careful record is kept of all key incident response steps, with one or more IRT members specifically designated to act as the secretary or archivist of the process;
- if criminal activity is suspected, the company makes a decision as to whether and how to engage with law enforcement;
- evidence that may be needed to document the events is carefully retained. For example, any cleansing of infected computers is conducted by the information security team or outside forensic experts in consultation with counsel and law enforcement, so that evidence necessary for subsequent investigations and legal proceedings is preserved; and
- all participants in the breach response process are cautioned to communicate with care. Secure communication channels should be used until it is certain that intruders are not present on company systems.
As the days and weeks go by, the crisis atmosphere will begin to recede. Immediate forensic and communications measures are completed. The company can then begin to engage in a ‘lessons learnt’ exercise. This involves going beyond the purging of infected computers to consider and address any more systemic weaknesses identified by the breach. Longer-term remedial measures in a large company can easily take months or even years to complete. A ‘lessons learnt’ exercise specific to the work of the IRT is often useful as well, and can lead to positive improvements to the IRP.
The importance of communications in minimising legal and reputational harm cannot be overstated. The guidance here is simple: companies survive breaches best when they communicate early, clearly, accurately and tersely. There is an understandable wish to deny or minimise a cybersecurity problem, rather than admit embarrassing facts. At the other extreme, there can be a temptation to state the details with great precision, to encourage the impression that the company is fully in command of the situation. But cybersecurity incidents often do not lend themselves to either approach. Cyber forensics take time, and the facts are rarely clear at first.
Accordingly, an early statement along the lines of ‘we are aware of suspicious activity, we are investigating and we will post updates as we know more’ will often be most consistent with the facts. A company that denies the problem, or that prematurely states uncertain facts as if they were definitive, may then have to issue corrective statements as the facts change. This can create the impression that the company is not candid or competent. That, in turn, tends to create reputational damage and increases the chances of tough legal scrutiny from regulators and courts. As legal requirements for prompt breach disclosure grow, clear and careful early communication becomes ever more important.
Impact of covid-19 on data breaches
In countries impacted by covid-19, the introduction of lockdown measures, the use of new virtual communication platforms and the increased numbers of employees working from home have multiplied the cyber threats facing many companies. In particular, companies around the world have seen a dramatic rise in phishing attempts and resulting security incidents, including the deployment of ransomware and the exfiltration of sensitive data.
Cybersecurity and protection of personal data should remain a priority for companies even during these challenging times. Companies should ensure that employees are provided with encrypted devices and do not use personal email accounts for professional purposes, and they should review, update and distribute their confidentiality, cyber hygiene and IT policies. To minimise the threat posed by phishing attacks, companies should regularly train employees on how to spot a potential phishing email, ensure that they use consistent formatting in emails, bolster their anti-virus software and firewalls, and protect their networks against ransomware.
As companies begin to reopen their physical spaces, it is also becoming common to collect significant new amounts of personal information from employees and visitors. New procedures include temperature checks, health questionnaires, and the use of various apps and devices to track health and location. It is prudent to check these new practices against the standards of the privacy laws of the countries where the practices are used. Privacy law, for example, may call for additional disclosure of data collection and processing, while security law may call for additional measures to protect the data.
Increased adoption of artificial intelligence technologies
The global market for artificial intelligence (AI) is poised to skyrocket over the next decade. Companies are increasingly embedding AI technology in their products and services to automate complex tasks, solve problems and learn from new data at scale. AI can also improve the efficiency of both human and physical capital, for example by reducing worker fatigue or predicting which machinery will need maintenance before problems occur.
AI adoption in both the public and private sectors is proceeding rapidly in Latin America, as in the rest of the world. According to Accenture, AI has the potential to add up to one percentage point in annual economic growth to the economies of Argentina, Brazil, Chile, Colombia and Peru through 2035. AI is already deployed across key sectors – from chatbots in banking and retail to sentiment recognition in hiring and autonomous drills in mining. Latin America’s AI adoption has been fuelled not only by the digitisation of sectors such as healthcare, but also by regulatory changes that make it easier to collect and share data. For example, in the fintech space, open banking reforms have encouraged portability of customer data and transaction histories, enabling start-ups to innovate and compete with established banks.
Attempts to regulate AI in Latin America are still in the early stages and have focused largely on the adoption of national AI strategies or ethical frameworks. In 2018, Mexico became the first country in Latin America – and among the first 10 countries worldwide – to publish a national AI strategy, which included a focus on developing adequate governance frameworks. Mexico also published a set of principles and a ‘risk assessment tool’ to facilitate the ethical and responsible use of autonomous systems in its federal government. More recently, Brazil, Argentina, Chile and Uruguay likewise have embarked on the process of developing their own national AI strategies and frameworks. Argentina, Brazil, Chile, Colombia, Costa Rica, Mexico and Peru also adhere to the OECD’s Principles on Artificial Intelligence, which aim to ‘promote AI that is innovative and trustworthy and that respects human rights and democratic values’.
In the absence of binding laws and regulations, the burgeoning growth of AI in Latin America has effectively been left to data protection and consumer protection laws to regulate. Many countries are currently looking to the European Commission, which has announced its intention to develop a comprehensive regulatory framework for AI. In February 2020, the European Commission published a white paper outlining a series of possible oversight mechanisms for certain ‘high-risk’ AI applications or sectors, such as transportation, healthcare and energy. If the European Union ultimately adopts a sweeping approach to AI regulation similar to the GDPR, this could serve as a template for other jurisdictions in the future.
Legal requirements concerning cybersecurity and data privacy are continuing to multiply in the Americas and around the globe. As they do, global standards are emerging for what a corporate cybersecurity and data privacy programme should look like in the ordinary course and for how to respond when things go wrong.
History and the law provide this simple message: companies that prepare for the worst will respond the best. The key is to have a robust suite of cybersecurity and data privacy measures designed to reduce the chances of a crisis, accompanied by a robust plan for incident response when a crisis inevitably hits. That plan should be practical, business-friendly, cross-functional, written clearly and compactly, and well-tested. Above all, response plans should be designed to preserve and build trust through clear, prompt and careful communication and action followed by effective long-term remediation.
 Jeremy Feigelson and Andrew M Levine are partners, Johanna Skrzypczyk is counsel and H Jacqueline Brehmer, Mengyi Xu, Hilary Davidson and Michael Bloom are associates at Debevoise & Plimpton LLP.
 Lei Geral de Proteção de Dados (Law No. 13,709/2018). Unofficial English translation available at https://iapp.org/resources/article/brazilian-data-protection-law-lgpd-english-translation/.
 English translation available at http://www.jus.gob.ar/media/3201023/personal_data_protection_act25326.pdf.
 European Commission Decision C (2003) 1731 of 30 June 2003.
 Agencia de Acceso a la Información Pública (Resolución 4/2019) http://servicios.infoleg.gob.ar/infolegInternet/anexos/315000-319999/318874/norma.htm.
 Agencia de Acceso a la Información Pública (Resolución 86/2019) http://servicios.infoleg.gob.ar/infolegInternet/anexos/320000-324999/323901/norma.htm.
 Chapter IV, Article 21.
 Chapter II, Article 4.
 Chapter II, Article 4.
 Chapter II, Article 5.
 Chapter II, Article 9.
 Chapter III, Articles 13–14.
 Chapter III, Article 16.
 Chapter II, Article 12.
 Law No 13.853/2019, available at http://www.planalto.gov.br/ccivil_03/_ato2019-2022/2019/lei/L13853.htm.
 Decree No. 10,464, available at http://www.in.gov.br/en/web/dou/-/decreto-n-10.474-de-26-de-agosto-de-2020-274389226.
 MPDFT ajuíza 1ª ação civil pública com base na LGPD, 22 September 2020, http://www.mpdft.mp.br/portal/index.php/comunicacao-menu/sala-de-imprensa/noticias/noticias-2020/12384-mpdft-ajuiza-1-acao-civil-publica-com-base-na-lgpd.
 LGPD Article 6.
 LGPD Article 17.
 LGPD Article 18.
 LGPD Article 8.
 LGPD Article 48.
 LGPD Article 33.
 PDPL Article 4.
 PDPL Article 9.
 PDPL Article 3.
 Ley Estatutaria 1581 de 2012, available in Spanish at https://www.alcaldiabogota.gov.co/sisjur/normas/Norma1.jsp?i=49981.
 Law Title I, Article 2.
 Law Title II, Article 4.
 Decree No. 1377/2013. English translation available at https://iapp.org/media/pdf/knowledge_center/DECRETO_1377_DEL_27_DE_JUNIO_DE_2013_ENG.pdf.
 Decree Chapter II, Article 5.
 Decree Chapter III, Articles 13-15.
 Decree Chapter III, Article 26.
 Decree Chapter IV, Article 21.
 Decree Chapter IV, Article 22.
 Law Title VI, Articles 17(n) and 18(k).
 Law Title VIII, Article 26.
 Decree, Article 25.
 Text in Spanish: http://www.diputados.gob.mx/LeyesBiblio/pdf/LFPDPPP.pdf.
 Law Chapter II, Article 11.
 Law Chapter II, Article 13.
 Law Chapter II, Article 11.
 Law Chapter IV, Article 30.
 Law Chapter II, Article 19.
 Chapter III, Articles 22–27.
 Chapter II, Article 8.
 Regulations Chapter II, Articles 12-14.
 Chapter II, Article 20.
 Regulations Chapter II, Article 65.
 Chapter V, Article 36.
 Ley de Protección de Datos Personales, available in Spanish at https://leyes.congreso.gob.pe/Documentos/Leyes/29733.pdf; English translation available at https://www.huntonprivacyblog.com/wp-content/uploads/sites/28/migrated/Peru%20Data%20Protection%20Law%20July%2028_EN%20_2_.pdf.
 Resolución SBS No. 504-2021, available in Spanish at https://busquedas.elperuano.pe/download/url/aprueban-el-reglamento-para-la-gestion-de-la-seguridad-de-la-resolucion-no-504-2021-1929393-1/.
 DPL, Articles 4, 6.
 DPL, Articles 5, 18.
 DPL, Article 7.
 DPL, Article 8.
 DPL, Article 9.
 DPL, Article 11.
 Resolución SBS No. 504-2021.
 DPL, Article 19.
 DPL, Article 18.
 DPL, Article 20.
 DPL, Article 22.
 DPL, Article 21.
 DPL, Article 10.
 Directorial Resolution No. 019-2013-JUS/DGPDP, Section 188.8.131.52.
 Decreto de Urgencia No. 007-2020, available in Spanish at https://busquedas.elperuano.pe/normaslegales/decreto-de-urgencia-que-aprueba-el-marco-de-confianza-digita-decreto-de-urgencia-n-007-2020-1844001-2/.
 Resolución SBS No. 504-2021, Article 15.
 Law No. 81, available in Spanish at https://www.antai.gob.pa/reglamentan-ley-81-de-proteccion-de-datos-personales.
 Decree No. 285, Article 14.
 Decree No. 285, Article 36.
 Decree No. 285, Chapter II.
 Law No. 18.331 and Decree No. 414/009.
 Available in Spanish at https://legislativo.parlamento.gub.uy/htmlstat/pl/leyes/Ley19670.pdf.
 Available in Spanish at https://www.gub.uy/unidad-reguladora-control-datos-personales/institucional/normativa/resolucion-n-23021 and https://www.gub.uy/unidad-reguladora-control-datos-personales/institucional/normativa/resolucion-n-41021.
 Resolution No. 023/21.
 China's Data Security Law, available in Chinese at http://www.npc.gov.cn/npc/c30834/202106/7c9af12f51334a73b56d7938f99a788a.shtml; unofficial English translation available at https://www.chinalawtranslate.com/en/datasecuritylaw/.
 id., Article 21.
 id., Article 27.
 id., Article 30.
 id., Articles 44–52.
 China's Personal Information Protection Law, available in Chinese at http://www.npc.gov.cn/npc/c30834/202108/a8c4e3672c74491a80b53a172bb753fe.shtml. See Debevoise's analysis 'China Passes the Personal Information Protection Law' (1 September 2021), available at https://www.debevoise.com/insights/publications/2021/08/china-passes-the-personal-information-protection.
 id., Article 3.
 id., Article 53.
 id., Article 38.
 id., Article 39.
 id., Articles 55(4), 56.
 id., Article 57.
 id., Article 66.
 id., Article 67.
 See Tracy Qu and Zhou Xin, 'China Takes Didi off app stores two days after Beijing announces cybersecurity review', South China Morning Post (4 July 2021), https://www.scmp.com/tech/big-tech/article/3139786/china-takes-didi-app-stores-two-days-after-beijing-announces.
 Announcement (Amendments to the Personal Data Protection Act (PDPA) Take Effect From 1 February 2021) dated 29 January 2021, available at https://www.pdpc.gov.sg/news-and-events/announcements/2021/01/amendments-to-the-personal-data-protection-act-take-effect-from-1-february-2021.
 Personal Data Protection (Notification of Data Breaches) Regulations 2021, available at https://sso.agc.gov.sg/SL-Supp/S64-2021/Published/20210129?DocDate=20210129.
 Personal Data Protection Regulations 2021, available at https://sso.agc.gov.sg/SL/PDPA2012-S63-2021?DocDate=20210129.
 Case No. C-311/18 (16 July 2020), available at http://curia.europa.eu/juris/document/document.jsf?text=&docid=228677&pageIndex=0&doclang=en&mode=req&dir=&occ=first&part=1&cid=9714885.
 European Commission, 'Standard contractual clauses for controllers and processors in the EU/EEA' (4 June 2021), https://ec.europa.eu/info/law/law-topic/data-protection/publications/standard-contractual-clauses-controllers-and-processors/.
 European Commission, 'Standard contractual clauses for international transfers' (4 June 2021), https://ec.europa.eu/info/law/law-topic/data-protection/international-dimension-data-protection/standard-contractual-clauses-scc/standard-contractual-clauses-international-transfers_en.
 European Commission Implementing Decision, C(2021) 4800 (28 June 2021), available at https://ec.europa.eu/info/sites/default/files/decision_on_the_adequate_protection_of_personal_data_by_the_united_kingdom_-_general_data_protection_regulation_en.pdf.
 European Commission Implementing Decision, C(2021) 4801 (28 June 2021), available at https://ec.europa.eu/info/sites/default/files/decision_on_the_adequate_protection_of_personal_data_by_the_united_kingdom_law_enforcement_directive_en.pdf.
 Department of Justice, Title 11, Division 1, Chapter 20. California Consumer Privacy Act Regulations (14 August 2020), available at http://www.oag.ca.gov/sites/all/files/agweb/pdfs/privacy/oal-sub-final-text-of-regs.pdf.
 In Re First American Title Insurance Co., No. 2020-0030-C (21 July 2020) http://www.dfs.ny.gov/system/files/documents/2020/07/ea20200721_first_american_notice_charges.pdf.
 National Security Memorandum on Improving Cybersecurity for Critical Infrastructure Control Systems (28 July 2021); EO 14028, Executive Order on Improving the Nation's Cybersecurity.
 Cybersecurity Incident Notification Act of 2021, S. 2407, 117 Cong. (2021-2022); Sanction and Stop Ransomware Act, S. 2666, 117th Cong. (2021-2022), National Defense Authorization Act for Fiscal Year 2022, H.R. 4350, 117th Cong. (2021-2022).
 See Armen Ovanessoff and Eduardo Plastino, 'How Artificial Intelligence Can Drive South America's Growth', Accenture Research (2017), available at http://www.accenture.com/_acnmedia/pdf-48/accenture-ai-south-america.pdf?la=es-la.
 See PYMNTS.com, 'All Eyes Are on LATAM Open Banking' (24 July 2020), available at http://www.pymnts.com/bank-regulation/2020/all-eyes-are-on-latam-open-banking/.
 Enrique Zapata, Estrategia de Inteligencia Artificial MX 2018, México Digital Blog (22 March 2018), available at http://www.gob.mx/mexicodigital/articulos/estrategia-de-inteligencia-artificial-mx-2018.
 See Innova MX, 'Guía de análisis de impacto para el desarrollo y uso de sistemas basadas en inteligencia artificial en la APF' (28 November 2018), available at http://www.gob.mx/innovamx/articulos/guia-de-analisis-de-impacto-para-el-desarrollo-y-uso-de-sistemas-basadas-en-inteligencia-artificial-en-la-apf.
 See OECD.AI Policy Observatory Dashboard, Brazil Formal Consultations for a National Artificial Intelligence Strategy, available at https://oecd.ai/dashboards/policy-initiatives/2019-data-policyInitiatives-25303; OECD.AI Policy Observatory Dashboard, Argentina Artificial Intelligence National Plan, available at https://oecd.ai/dashboards/policy-initiatives/2019-data-policyInitiatives-24309; OECD.AI Policy Observatory Dashboard, Chile Artificial Intelligence Working Plan, available at https://oecd.ai/dashboards/policy-initiatives/2019-data-policyInitiatives-24840; OECD.AI Policy Observatory Dashboard, Uruguay Data Science and Machine Learning Roadmap, available at https://oecd.ai/dashboards/policy-initiatives/2019-data-policyInitiatives-26480.
 See OECD Press Release, 'Forty-two countries adopt new OECD Principles on Artificial Intelligence' (22 May 2019), available at http://www.oecd.org/going-digital/forty-two-countries-adopt-new-oecd-principles-on-artificial-intelligence.htm.
 See European Commission, White Paper on Artificial Intelligence – A European Approach to Excellence and Trust (19 February 2020), available at https://ec.europa.eu/info/sites/info/files/commission-white-paper-artificial-intelligence-feb2020_en.pdf.