GDPR vs. PIPL: 4 Differences

Joe Stanganelli Managing Editor, TechBeacon

On November 1, 2021, the jungle of privacy compliance got thicker. That's when the Personal Information Protection Law (PIPL) went into effect in China.

As seems to be the case with almost every new privacy law these days (e.g., the California Consumer Privacy Act), PIPL has drawn comparisons to the EU's General Data Protection Regulation (GDPR)—itself in effect since May 25, 2018.

In this case, the comparisons are apt. Both are broad data-stewardship schemes with substantial potential penalties and long-arm jurisdiction. As with GDPR in the EU, PIPL extends its applicability globally on the basis of both where the data is handled and where the individual whose data is being handled is located. In any given case, if either of those prongs falls within China's borders, PIPL may apply—regardless of the data wrongdoer's location.

Still, these laws are not identical—and their differences go beyond minor technical facets. While those differences are numerous, here are four of the more notable ones.

1) PIPL defines "sensitive information" differently.

Not all personal information is created equal, and long-arm privacy laws generally recognize this fact. Thus, "sensitive personal information" ("SPI") typically receives a greater level of specific legal protection than do other kinds of personal information.

GDPR doesn't use the exact term SPI, but its "special categories of personal data" serve much the same role and are held to higher standards for processing purposes. Article 9 of GDPR carves out limited circumstances under which data concerning any of the following may be processed:

  • Race/ethnicity
  • Politics
  • Religion
  • Philosophy
  • Trade-union membership
  • Genetics
  • Biometric data used for the purposes of uniquely identifying someone
  • Health
  • Sex life
  • Sexual orientation

PIPL, too, creates additional protections for SPI. PIPL's umbrella of data that qualifies as SPI is seemingly broader than GDPR's special carveouts, however. Article 28 of PIPL offers a broad definition of SPI: "personal information that, once leaked or illegally used, may easily cause harm to dignity [or] grave harm to personal or property security[.]"

Article 28 goes on to list some examples of such information; as with GDPR, PIPL offers enhanced protections for personal information involving biometrics, religion, and health. 

PIPL also offers other specific examples of SPI:

  • Data related to a person's "specially-designated status,"
  • Data related to a person's financial accounts, and
  • Location-tracking data.

While not all of GDPR's specific carveouts are listed among PIPL's examples of SPI, those categories of data may still qualify as SPI under PIPL because of how broadly the law defines the term.

That said, there is one additional category of data that PIPL specifically categorizes as SPI.

2) PIPL treats the handling of minors' personal information differently.

PIPL also specifically defines any and all personal information of a minor under the age of 14 as SPI. Processing this type of data requires still more heightened protections under PIPL, including that the personal-information handler obtain a parent's or guardian's consent and that the handler "formulate specialized personal-information handling rules" for minors' data.

On this point, GDPR's burdens are simultaneously stricter and less strict. For the purposes of processing personal information, GDPR categorizes a minor as someone under the age of 16—although it allows individual EU Member States to set this bar as low as 13. On the other hand, GDPR does not automatically characterize all minors' personal data as SPI. GDPR merely requires parental or guardian consent to process a minor's personal data where consent is otherwise required.
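
To make those age thresholds concrete, here is a minimal sketch of how a consent workflow might encode the difference. The constants and function names are hypothetical illustrations, not anything prescribed by either law, and the sketch glosses over the nuance that GDPR's parental-consent rule matters only where consent is the lawful basis for processing.

    # A minimal, hypothetical sketch of the minor-age thresholds described above.
    # Nothing here is an official API; names and defaults are illustrative only.

    PIPL_MINOR_AGE = 14          # PIPL: all personal information of a child under 14 is SPI
    GDPR_DEFAULT_MINOR_AGE = 16  # GDPR default; Member States may lower this, but not below 13
    GDPR_FLOOR_MINOR_AGE = 13

    def requires_guardian_consent(age: int, regime: str,
                                  member_state_age: int = GDPR_DEFAULT_MINOR_AGE) -> bool:
        """Return True if a parent's or guardian's consent is needed before processing."""
        if regime == "PIPL":
            return age < PIPL_MINOR_AGE
        if regime == "GDPR":
            threshold = max(GDPR_FLOOR_MINOR_AGE, min(member_state_age, GDPR_DEFAULT_MINOR_AGE))
            return age < threshold
        raise ValueError(f"unknown regime: {regime}")

    def is_spi_under_pipl(age: int) -> bool:
        """PIPL treats any personal information of a child under 14 as sensitive."""
        return age < PIPL_MINOR_AGE

In a Member State that has lowered its bar to 13, requires_guardian_consent(13, "GDPR", member_state_age=13) returns False, while the same 13-year-old's data still requires parental consent and SPI-grade handling under PIPL.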

This is not the only place where GDPR is stricter than PIPL in some respects while less strict in others.

3) PIPL has different standards for government oversight of risk-assessment processes.

Under GDPR, data controllers must conduct a data-protection impact assessment ("DPIA") of their data-processing plans in circumstances "likely to result in a high risk to the rights and freedoms of natural persons"—such as where new technology, mass surveillance, and/or particular categories of SPI are concerned. A DPIA must specifically outline:

  • A "systematic description of the envisaged processing operations;"
  • A similar description of the purposes of the data-processing operations, including the data controller's legitimate interest;
  • The "necessity and proportionality" of said data-processing operations;
  • The risks those operations present to data subjects' rights and freedoms; and
  • The "measures envisaged" to address and mitigate those risks.

Generally, the data controller may conduct a DPIA independently. There are a couple of exceptions to this general rule, however. A guideline document released by an EU advisory body prior to GDPR coming into effect makes clear that data controllers must consult with the supervisory authority:

  1. "Whenever the data controller cannot find sufficient measures to reduce the risks to an acceptable level," or
  2. When EU Member State law so directs vis-à-vis data processing related to the public interest.

PIPL requires that personal-information handlers conduct a personal-information protection impact assessment ("PIPIA") in similar (though not identical) circumstances. Unlike with DPIAs under GDPR, however, PIPL does not require the organization to consult a regulatory authority if the PIPIA identifies risks that the organization cannot remediate.

On the other hand, consider cross-border transfers of personal information. Notably, PIPL treats data transfers out of China as inherently risky activities that automatically trigger the need for a PIPIA. In these cases, while PIPL may not mandate direct government supervision of the PIPIA itself, it generally mandates oversight in the form of a separate "security assessment organized by the State cybersecurity and informatization department." This assessment takes place after the organization conducts a self-assessment of outbound data-transfer risk and submits a report on that assessment to regulators.

GDPR, too, provides for security assessments of data-handling activities related to personal information—though in general, and not only for cross-border data transfers. These assessments, however, may be self-conducted.

Still, neither self-conducted security assessments under GDPR nor self-conducted PIPIAs under PIPL mean that a subject organization can rest on its laurels. The penalties for noncompliance under both frameworks can be severe.

But one framework's penalties may be more severe than the other's.

4) PIPL is potentially much harsher.

In the roughly two-year period between the EU's adoption of GDPR in 2016 and the day it went into effect, the business world panicked. Yes, GDPR was a sweeping legal framework claiming universal reach. But even companies long accustomed to EU regulation were concerned—because of the potential penalties.

Prior to GDPR coming into effect, the maximum penalties for personal-data-related blunders were tame by today's standards. But GDPR raised the stakes. For the worst infractions, violators may face a maximum fine of €20 million or 4% of total global annual revenue—whichever is greater. Corporate monoliths and midsize firms alike took notice, while small enterprises fretted over the spectre of bankruptcy.

PIPL goes yet further. Companies that perpetrate a "grave" violation of PIPL may face a fine of up to 5% of their total global annual revenue (or, if greater, ¥50 million—equal to about €7.26 million). If that extra percentage point isn't scary enough, such fines do not appear to be the whole of the financial exposure: in addition, any and all of a PIPL violator's "unlawful income" may be subject to forfeiture. To make matters worse for those who cross the line, Chinese authorities may also suspend or revoke a violator's business license in China.
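
For a sense of the arithmetic, here is a rough sketch of the two headline fine ceilings. The revenue figure and exchange rate are placeholders, the function names are made up for illustration, and the math ignores forfeiture of unlawful income and any other sanctions.

    def gdpr_max_fine_eur(global_annual_revenue_eur: float) -> float:
        """GDPR top tier: the greater of EUR 20 million or 4% of global annual revenue."""
        return max(20_000_000, 0.04 * global_annual_revenue_eur)

    def pipl_max_fine_cny(global_annual_revenue_cny: float) -> float:
        """PIPL 'grave' violations: the greater of CNY 50 million or 5% of global annual revenue."""
        return max(50_000_000, 0.05 * global_annual_revenue_cny)

    revenue_eur = 2_000_000_000   # hypothetical company with EUR 2 billion in global revenue
    cny_per_eur = 7.0             # placeholder exchange rate
    revenue_cny = revenue_eur * cny_per_eur

    print(f"GDPR ceiling: EUR {gdpr_max_fine_eur(revenue_eur):,.0f}")  # EUR 80,000,000
    print(f"PIPL ceiling: CNY {pipl_max_fine_cny(revenue_cny):,.0f}")  # CNY 700,000,000 (about EUR 100,000,000)

At that revenue level, the gap between the two ceilings is driven by the extra percentage point: roughly €80 million under GDPR versus the equivalent of about €100 million under PIPL, before any forfeiture of unlawful income is added on top.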

As much as businesses have to lose for PIPL violations, individual employees and officers may have even more on the line. PIPL creates personal liability for individuals: employees "directly responsible" for the gravest of violations may face fines of up to ¥1 million.

They may also sustain lasting damage to their livelihoods (and possibly their way of life itself). After being found directly responsible for a PIPL violation, an employee may be prohibited from holding management or privacy-officer roles "for a certain period". Even after that period, a record of the violation may follow the employee; Article 67 of PIPL provides that violations become part of an individual's social-credit file.

Accordingly, even if the immediate fines for data-handling violations under PIPL prove manageable, global enterprises—and their employees—potentially have much more at stake under PIPL than they do under GDPR.


 
