The Bulgarian Data Protection Authority (CPDP) issued a detailed opinion on the use of facial recognition systems in retail stores. The request came from a private security company proposing an "enhanced video surveillance" system that would use facial recognition to identify individuals previously engaged in unlawful conduct, such as theft or aggression, within the premises of retail clients. The CPDP acknowledged that such processing involves biometric data, a special category of personal data under Article 9(1) GDPR that is subject to strict conditions. The company argued that Article 9(2)(g) GDPR (processing for reasons of substantial public interest under Union or Member State law) could justify the system, citing objectives such as protecting property and public order. The CPDP rejected this reasoning. It emphasized that the notion of "public interest" must be grounded in specific legal provisions ensuring that the processing is necessary, proportionate, and safeguards data subjects' rights, and it held that preventing petty crimes like shoplifting does not meet the threshold of "substantial public interest" justifying biometric data use under Article 9(2)(g) GDPR. The CPDP also underlined that none of the other exceptions under Article 9(2) GDPR apply, including consent, given the imbalance of power and the lack of freely given, explicit consent in the retail context. Furthermore, the Authority found the proposed processing incompatible with the principles of purpose limitation, data minimisation, and necessity under Article 5 GDPR. Importantly, the CPDP referenced EDPB Guidelines 3/2019 and the AI Act, stressing that facial recognition systems used in uncontrolled environments raise significant risks, including bias and discrimination.
The opinion concludes that the proposed use of facial recognition in stores is unlawful under the GDPR and may only be justified in high-security contexts, such as national security or critical infrastructure protection. #AI #biometrics #retail
Legal Frameworks Governing Biometric Security
Explore top LinkedIn content from expert professionals.
Summary
Legal frameworks governing biometric security are sets of laws and regulations that manage how sensitive biometric information—like fingerprints, facial scans, or voice patterns—is collected, stored, and used to protect privacy and prevent misuse. These rules require transparency, clear consent, and strong safeguards to ensure that personal biometric data is handled responsibly.
- Prioritize transparency: Inform individuals when biometric data is collected and clearly explain how it will be used, making privacy notices easy to understand.
- Secure explicit consent: Always obtain clear and voluntary permission before collecting or processing anyone's biometric data, especially in settings like retail or public institutions.
- Minimize data usage: Only collect the biometric information that is strictly necessary for your stated purpose and ensure it is protected through strong security measures.
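The transparency, consent, and minimization principles summarized above can be sketched as a small gatekeeper in code. This is a minimal illustration of the idea, not a compliance implementation; `BiometricIntake`, `ConsentRecord`, and all field names are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    subject_id: str
    purpose: str       # the stated, specific purpose disclosed to the individual
    granted_at: datetime
    explicit: bool     # consent must be explicit and freely given

class BiometricIntake:
    """Hypothetical gatekeeper: refuse biometric capture unless explicit,
    purpose-matched consent is on record (transparency + consent + purpose
    limitation, per the practices above)."""

    def __init__(self) -> None:
        self._consents: dict[str, ConsentRecord] = {}

    def record_consent(self, subject_id: str, purpose: str, explicit: bool) -> None:
        self._consents[subject_id] = ConsentRecord(
            subject_id, purpose, datetime.now(timezone.utc), explicit)

    def may_capture(self, subject_id: str, purpose: str) -> bool:
        c = self._consents.get(subject_id)
        # Purpose limitation: consent for one purpose does not cover another.
        return c is not None and c.explicit and c.purpose == purpose
```

The key design choice, mirroring the guidance, is that the default answer is "no": capture is only permitted when explicit consent exists for that exact purpose.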
Biometric data isn't like a password you can reset. Once it's exposed, it's gone forever. That's why laws like Illinois' Biometric Information Privacy Act (BIPA) are so strict, and why founders building with fingerprints, face scans, or voice recognition need to think about compliance from day one. I've put together a playbook especially for startups that explains: ➡️ The real risks of non-compliance (including penalties running into millions) ➡️ Why investors and users both care about how you handle sensitive data ➡️ Practical steps founders can take today: policies, consent, retention, and security ➡️ Industry examples from fintech, healthtech, HR tech, gaming, and more. At MOS | CyberDocs, we work with founders and teams to turn data privacy into a strength, not a hurdle. Done right, compliance builds trust, boosts credibility, and helps you scale with confidence. If you're building products that touch biometric data, this guide is a must-read. You can find it attached in this post. I'd love for you to follow me for more practical insights on data privacy and global compliance laws, and if you need support embedding privacy into your startup's DNA, feel free to reach out. #DataPrivacy #BiometricData #Compliance #Startups #PrivacyByDesign #MOS #CyberDocs #TrustByDesign
-
(PRIVACY - Use of biometric data) The Information and Privacy Commissioner of Ontario (IPC) recently issued a decision regarding the use of #FacialRecognitionTechnology. The facts are somewhat unusual: students alleged that "smart" vending machines appeared to use facial recognition technology, collecting facial images without consent or proper notice.
Key Issues Identified:
- Lack of meaningful consent from individuals whose biometric data was collected.
- Inadequate notice and transparency regarding the use of facial recognition.
- Failure to demonstrate necessity and proportionality in deploying such invasive surveillance tools.
Legal Findings: The IPC concluded that the university's actions contravened the Freedom of Information and Protection of Privacy Act (FIPPA), reinforcing that biometric surveillance must meet stringent legal and ethical standards, especially within public institutions. This decision is consistent with the analysis adopted by the federal privacy commissioner in the Cadillac Fairview decision. Legal professionals and privacy officers should take note of the IPC's emphasis on transparency, consent, and accountability when implementing AI-driven surveillance systems. #PrivacyLaw #Biometrics #FIPPA #OntarioIPC #DataProtection #Cybersecurity https://lnkd.in/gk4B4ewx
-
China Surprises World with Strict Ban on Facial Recognition Without Consent
New Regulations Emphasize Privacy, Consent, and Minimal Impact in Facial Recognition Use
In a move that caught global observers off guard, China has introduced sweeping regulations banning the use of facial recognition technology without explicit consent. The new rules, unveiled by the Cyberspace Administration and the Ministry of Public Security, establish a framework aimed at curbing the unchecked spread of surveillance technologies, mandating security, transparency, and necessity in their deployment. This regulatory pivot contrasts sharply with China's reputation as a leader in state surveillance, signaling a notable shift toward digital rights protections.
Key Elements of China's Facial Recognition Measures
- Consent and Purpose-Driven Use:
  - Facial recognition must serve a clearly defined and necessary purpose, minimizing intrusion on individual rights.
  - Any collection or processing of facial data must be based on the individual's informed consent.
- Principles for Use:
  - The technology must be used in a way that has the least impact on individuals' rights and interests.
  - Entities are obligated to notify users when their facial data is being collected and processed.
  - The new rules also mandate the implementation of strong data security and protection measures to safeguard collected biometric data.
- Scope and Enforcement:
  - The regulation applies to both public and private sector entities using facial recognition.
  - Misuse could lead to administrative penalties or other legal consequences, though enforcement mechanisms remain to be tested.
- Ethical and Legal Framing:
  - The guidance promotes an ethics-first approach to AI, embedding privacy and proportionality into the core legal standards governing its application.
  - This marks a significant attempt to legally balance technological advancement with individual civil rights.
Why This Matters: A Surprising Turn Toward Digital Privacy in a Surveillance Giant China’s new facial recognition rules are noteworthy not only for their content but for the country enacting them. Long viewed as a global leader in AI surveillance and biometric monitoring, China’s shift toward consent-based regulation and individual rights signals a recalibration of its tech governance strategy. While questions remain about how rigorously these rules will be enforced, the move sets a precedent that could influence global norms—particularly as Western democracies struggle to pass similarly clear regulations. It may also reflect China’s broader aim to align its digital policies with international standards, especially as AI technologies come under increasing global scrutiny.
-
Office of the Privacy Commissioner of Canada/Commissariat à la protection de la vie privée du Canada issues detailed guidance for the use of biometrics in a privacy-protective manner. It is a helpful framework for US-based companies using biometrics, in view of (1) the authorization-mandating US biometrics laws (like BIPA, CUBI, and the Colorado CPA); (2) the fact that biometric information/data is sensitive under US privacy laws and according to the FTC; and (3) the FTC Rite Aid decision (https://lnkd.in/emmvNkYa). It lays out considerations such as:
◆ Make sure your purpose is clear and legitimate
◆ Make sure the product you are choosing does what it is supposed to do
◆ Check accuracy
◆ Consider other alternatives and determine that this one is better
◆ Consider whether the biometric program or initiative's impact on privacy is proportional to the benefits gained
◆ Get consent (and it will usually need to be explicit) (in the US this would be GDPR-like consent unless an exception applies, or an opt-out/right to limit in CA)
◆ Use the minimum number of biometric characteristics needed
◆ Use verification over identification
◆ Seek to keep the template in the individual's control
◆ Don't extract secondary information without consent
◆ Limit retention
◆ Adopt technical, physical and organizational measures to protect access and use
◆ Document to ensure accountability
◆ Institute governance
◆ Provide transparency
https://lnkd.in/eZ6rAG43
-
In late April, the Colorado legislature passed a CPA amendment bill that adds protections for biometric data and identifiers. The bill will move to Governor Polis shortly for consideration. Here's what you need to know:
1️⃣ Broader Applicability. The bill amends the CPA, but its provisions apply more broadly (except for the right to access). The bill does not track the CPA's 100,000-consumer threshold requirement.
2️⃣ Employee Data. In a significant change, the bill applies protections to the collection of biometric identifiers from employees and prospective employees. Generally speaking, employers must obtain consent for the collection of such data, but there are several exemptions. Colorado is the first non-California state to extend its consumer data privacy law to employee data.
3️⃣ Consumers. The bill creates a number of obligations for the collection of consumers' biometric data and identifiers. We go through them at length in the below post, but in general they include obtaining consent, making disclosures as to how the information is used, numerous prohibitions including around the sale of such data, and information security requirements.
4️⃣ Right to Access. The bill creates a right to access information about a controller's collection of biometric data. This is not a verified request, but there also is no right to access the specific biometric data or identifiers. The right to access applies to a narrower set of companies, with applicability language that more closely aligns with the CPA's.
5️⃣ Written Policy and Retention. The bill requires controllers to create written policies regarding the retention and deletion of biometric data, as well as data breach information. The written guidelines apply to both consumer and employee data, but there are differences.
6️⃣ Rulemaking. The Colorado Attorney General has permissive (not mandatory) rulemaking authority.
7️⃣ Enforcement. The bill is enforceable by the Colorado Attorney General and district attorneys.
8️⃣ Effective Date. July 1, 2025.
9️⃣ More. This is a dense bill with a lot to unpack. There's a lot more detail and analysis in the below-linked article. Co-authored with Keith Ybanez, Owen Davis and Taylor Kay Lively.
-
Illinois Biometric Information Privacy Act (BIPA) Amendment via SB 2979 offers a "single recovery" liability framework (and therefore limits the astronomical liability!). Since Cothron v. White Castle Systems, Inc. (2023), it's no secret that the risk around collecting, using, and storing biometric information in Illinois has greatly increased for employers. In that decision, the Illinois Supreme Court ruled that a BIPA "violation" accrues EVERY time a person's biometric data is collected. For employers whose employees work five days a week and scan their fingerprints to clock in and out, this meant up to $1,000 of damages per scan, or up to $5,000 per scan for a willful violation. For an employee clocking in/out and also taking a lunch break, this could mean up to $4,000 per day. This was obviously extremely detrimental to Illinois businesses, especially given the gray area around whether "electronic signatures" would hold up in court as satisfying the consent requirements.
So, what changed? On May 16, 2024, the Illinois House of Representatives approved Senate Bill 2979, which amends the IL BIPA to limit damages to ONE violation per individual, versus EACH instance: "For purposes of subsection (d) of Section 15, a private entity that, in more than one instance, discloses, rediscloses, or otherwise disseminates the same biometric identifier or biometric information from the same person to the same recipient using the same method of collection in violation of subsection (d) of Section 15 has committed a single violation of subsection (d) of Section 15 for which the aggrieved person is entitled to, at most, one recovery under this Section regardless of the number of times the private entity disclosed, redisclosed, or otherwise disseminated the same biometric identifier or biometric information of the same person to the same recipient. (Source: P.A. 95-994, eff. 10-3)"
Other key highlights:
- This change is meant to be prospective, not retroactive (it would not apply to any pending BIPA cases).
- SB 2979 includes a new definition of "electronic signature" and states that the "written release" required to collect, disclose or share biometric data includes an electronic signature: "Electronic signature means an electronic sound, symbol, or process attached to or logically associated with a record and executed or adopted by a person with the intent to sign the record."
- The amendment takes effect immediately upon becoming law. The General Assembly has thirty days to send the bill to Governor J.B. Pritzker, who has sixty days to sign it.
I'll place the full text of the bill in the comments below. Pay attention to Sections 15(b) and 15(d). I believe this change aligns with the BIPA's original intent. For questions or comments, please leave them here or join the HRA of Greater Oak Brook for our legal update next week on June 11th! #EmploymentLaw #HumanResources #Compliance #biometric #Illinois #HR #BIPA #privacy #HRAoakbrook #SHRM
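The damages arithmetic above can be made concrete with a back-of-envelope calculation contrasting per-scan accrual (the Cothron reading) with single recovery (the SB 2979 reading). The $1,000 figure is the statutory amount for a negligent violation quoted in the post; the function names are hypothetical, and this is an illustration of the accrual math, not legal advice.

```python
PER_VIOLATION = 1_000  # statutory damages per negligent violation, per the post

def per_scan_exposure(employees: int, scans_per_day: int, workdays: int) -> int:
    """Cothron-style accrual: every scan is a separate violation."""
    return employees * scans_per_day * workdays * PER_VIOLATION

def single_recovery_exposure(employees: int) -> int:
    """SB 2979-style: at most one recovery per aggrieved person."""
    return employees * PER_VIOLATION

# One employee, four scans a day (in/out plus lunch), one day:
# per-scan accrual matches the post's "up to $4,000 per day" figure.
daily = per_scan_exposure(employees=1, scans_per_day=4, workdays=1)
```

For a 100-person workforce over a 250-day work year, the gap is stark: roughly $100M of theoretical exposure under per-scan accrual versus $100K under single recovery, which is exactly the "astronomical liability" the amendment targets.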