LinkedIn Report pursuant to Article 3, Subsection (g)(vii) of Regulation (EU) 2021/1232

Pursuant to Article 3, Subsection (g)(vii) of Regulation (EU) 2021/1232, LinkedIn provides the following report on its data processing activities in connection with the use of technology to detect child sexual abuse material (“CSAM”), specific to LinkedIn Number-Independent Interpersonal Communications Services (“NI-ICS”), for July 14 through December 31, 2021.
1. Type and volumes of data processed during the reporting period:
Over 8 million files originating from the European Union (“EU”).
2. Specific ground relied on for the processing pursuant to Regulation (EU) 2016/679:
Regulation (EU) 2021/1232 and public interest under GDPR Article 6(1)(e).
3. The ground relied on for transfers of personal data outside the European Union pursuant to Chapter V of Regulation (EU) 2016/679, where applicable:
Varies based on the transfer; grounds include standard contractual clauses under GDPR Article 46(2)(c).
4. Number of cases of online child sexual abuse identified:
31 files originating from the EU were confirmed as CSAM during the reporting period, all of which were reported to the National Center for Missing and Exploited Children (“NCMEC”).
5. Number of cases in which a user has lodged a complaint with the internal redress mechanism or with a judicial authority, and the outcome of such complaints:
None.
6. Number and ratios of errors (false positives) of the different technologies used:
LinkedIn uses hash-matching technology (including PhotoDNA) to detect known CSAM imagery. Hash-matching technology works by using a mathematical algorithm to create a unique signature (known as a “hash”) for digital images and videos. The technology then compares the hashes generated from user-generated content (“UGC”) with hashes of reported (known) CSAM, in a process called “hash matching.” LinkedIn then applies manual review to hash-matched content to confirm whether it is CSAM, to monitor the performance of CSAM detection tools, and to support our moderation processes.

Using the above process, 75 files originating from the EU were manually reviewed during the reporting period, 31 of which were confirmed as CSAM. The remaining 44 files, approximately 59% of those reviewed, were therefore false positives of the hash-matching technology.
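To make the matching step concrete, the following is a minimal sketch of hash matching, not LinkedIn's implementation: PhotoDNA is proprietary, so SHA-256 stands in for it here, and the names KNOWN_HASHES, hash_file, and flag_for_manual_review are hypothetical. A real perceptual hash such as PhotoDNA tolerates re-encoding and resizing, which an exact cryptographic hash does not.

```python
import hashlib

# Hypothetical set of hashes of known CSAM, in practice sourced from a
# vetted organization such as NCMEC. SHA-256 stands in for PhotoDNA,
# which is proprietary; a perceptual hash would match visually similar
# images, while an exact cryptographic hash matches identical bytes only.
KNOWN_HASHES: set[str] = set()

def hash_file(path: str) -> str:
    """Create a unique signature ("hash") for a file's contents."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def flag_for_manual_review(path: str) -> bool:
    """Return True when a file's hash matches a known hash. As described
    above, a match only queues the file for manual review; confirmation
    is a separate human decision."""
    return hash_file(path) in KNOWN_HASHES
```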
7. Measures applied to limit the numbers and ratios of errors (false positives) of the different technologies used:
See the response to item 6 regarding manual review.
8. The retention policy and data protection safeguards applied pursuant to Regulation (EU) 2016/679:
Where content is confirmed as CSAM, it is retained in a separate, specialized CSAM container for 90 days as required by law. After 90 days, the data is purged.
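As an illustration of the retention rule only, here is a minimal sketch of a 90-day expiry check, assuming a stored-at timestamp is recorded when content enters the specialized container; the report does not describe the container or purge mechanics, and the names below are hypothetical.

```python
from datetime import datetime, timedelta, timezone

# Retention window stated in item 8 of this report.
RETENTION_PERIOD = timedelta(days=90)

def must_purge(stored_at: datetime, now: datetime | None = None) -> bool:
    """Return True once confirmed content has been held in the
    specialized container for 90 days and is due to be purged."""
    now = now or datetime.now(timezone.utc)
    return now - stored_at >= RETENTION_PERIOD
```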
9. The names of organizations acting in the public interest against child sexual abuse with which data has been shared:
The National Center for Missing and Exploited Children (NCMEC).