#Biometricdata
Posts tagged #Biometricdata on Bluesky
Spain fines Yoti €950,000 over biometric data and consent failures Spain's AEPD fined Yoti Ltd €950,000 for three GDPR violations involving biometric data, invalid consent, and excessive data retention in its age verification app.

FYI: Spain fines Yoti €950,000 over biometric data and consent failures #GDPR #BiometricData #DataPrivacy #Consent #DataProtection

Spain fines Yoti €950,000 over biometric data and consent failures Spain's AEPD fined Yoti Ltd €950,000 for three GDPR violations involving biometric data, invalid consent, and excessive data retention in its age verification app.

ICYMI: Spain fines Yoti €950,000 over biometric data and consent failures #GDPR #DataPrivacy #BiometricData #Consent #Yoti

Spain fines Yoti €950,000 over biometric data and consent failures Spain's AEPD fined Yoti Ltd €950,000 for three GDPR violations involving biometric data, invalid consent, and excessive data retention in its age verification app.

Spain fines Yoti €950,000 over biometric data and consent failures #Spain #Yoti #GDPR #BiometricData #DataPrivacy

Spain fines FC Barcelona €500,000 for failing biometric data protection assessment Spain's AEPD fines FC Barcelona €500,000 for an inadequate data protection impact assessment covering biometric facial and voice data of 143,000 members.

FYI: Spain fines FC Barcelona €500,000 for failing biometric data protection assessment #FCBarcelona #DataProtection #BiometricData #PrivacyLaws #AEPD

Spain fines FC Barcelona €500,000 for failing biometric data protection assessment Spain's AEPD fines FC Barcelona €500,000 for an inadequate data protection impact assessment covering biometric facial and voice data of 143,000 members.

Spain fines FC Barcelona €500,000 for failing biometric data protection assessment #DataProtection #PrivacyLaw #FCBarcelona #BiometricData #AEPD

Computerized identification and data insecurity in Pakistan | LSE Research YouTube video by LSE

🎥 In this new research video, Dr Asif Ali Akhtar reveals how #Pakistan’s high-tech biometric databases were actually built on colonial-era blueprints of control:
youtu.be/j_0F9UCHKHA?...

#Datafication #Cybersecurity #partofLSE #biometricdata #NADRA #surveillance #postcolonial

We read X's new privacy policy so you don't have to The company will now collect a lot more personal data and use it in new ways. We break it down for you.

X (Twitter) now collects biometric data, employment history, encrypted message metadata, and uses all content to train AI models, with vague policies on third-party sharing despite "free" access #XPrivacy #BiometricData #AITraining #TwitterDataGrab mashable.com/article/x-tw...


Navigating Biometric Data Laws? 🚨 Know your state's regulations! Illinois, Texas, and Washington lead the charge on specific biometric data statutes. #TechStartup #BiometricData #StateLaws


High-tech companies, keep an eye on emerging state legislation for employment-related biometric data. Manage your workforce data lawfully. #EmployeePrivacy #BiometricData #StartupPolicy

The Unsettling Invasion of AI in the Gaming World YouTube video by Jen Ophelia

Jen Ophelia: "The Unsettling Invasion of AI in the Gaming World" | #Gaming #GamingIndustry #GamingNews #News #GenAI #ArtTheft #PlagiarismLaundering #Ubisoft #EA #Krafton #CDProjektRed #Consent #Ownership #Exploitation #Control #BiometricData #Monetization #Artbreeder
www.youtube.com/watch?v=U_Fc...


There is no reason to let the USA own sensitive data concerning people living in the EU. #MAGA going too far. Our freedom is our own. Let it stay that way.

#biometricdata #Europe #USA #data #stopworldcontrol #freedomofthought #freedom #peaceofmind


🚨👀 Wegmans in NYC scans eyes, faces, and voices for "safety"—is this a fair trade-off for privacy? 🤔 Share your thoughts! #PrivacyMatters #Wegmans #BiometricData LINK

Disruption Network Lab: Techno-Policing & Civic Control - with Sonja Peteranderl and Matthias Monroy Upcoming Conference: 'Exposing Crimes is not a Crime: The Real-World Consequences of WikiLeaks', March 19–22 2025, Berlin - This episode: Techno-Policing & Civic Control With: Sonja Peteranderl (...

#Techno-Policing & Civic Control - #Podcast on Digital Policing, #Palantir, #PredictivePolicing, #Riskscoring, #BiometricData

with @matthimon.bsky.social & @crimewatch.bsky.social

disruptionlab.libsyn.com/techno-polic... @disruptionlab.bsky.social


Tell #Apple you don’t want #iOS26 because it requires your #biometricdata. G’wan! Ya know you want to!


Fuck iOS 26.1 and the AI it rode in on. I will not be updating my phone to that no matter how many times you remind me, Apple!🍏

No one needs my #biometricdata

Big Tech’s New Rule: AI Age Checks Are Rolling Out Everywhere

Large online platforms are rapidly shifting to biometric age assurance systems, creating a scenario where users may lose access to their accounts or risk exposing sensitive personal information if automated systems make mistakes.

Online platforms have struggled for decades with how to screen underage users from adult-oriented content. Everything from graphic music tracks on Spotify to violent clips circulating on TikTok has long been available with minimal restrictions. Recent regulatory pressure has changed this landscape. Laws such as the United Kingdom’s Online Safety Act and new state-level legislation in the United States have pushed companies including Reddit, Spotify, YouTube, and several adult-content distributors to deploy AI-driven age estimation and identity verification technologies. Pornhub’s parent company, Aylo, is also reevaluating whether it can comply with these laws after being blocked in more than a dozen US states.

These new systems require users to hand over highly sensitive personal data. Age estimation relies on analyzing one or more facial photos to infer a user’s age. Verification is more exact but demands that the user upload a government-issued ID, which is among the most sensitive forms of personal documentation a person can share online. Both methods depend heavily on automated facial recognition algorithms. The absence of human oversight or robust appeals mechanisms magnifies the consequences when these tools misclassify users. Incorrect age estimation can cut off access to entire categories of content or trigger more severe actions. Similar facial analysis systems have been used for years in law enforcement and in consumer applications such as Google Photos, with well-documented risks and misidentification incidents.

Refusing these checks often comes with penalties. Many services will simply block adult content until verification is completed. Others impose harsher measures. Spotify, for example, warns that accounts may be deactivated or removed altogether if age cannot be confirmed in regions where the platform enforces a minimum age requirement. According to the company, users are given ninety days to complete an ID check before their accounts face deletion.

This shift raises pressing questions about the long-term direction of these age enforcement systems. Companies frequently frame them as child-safety measures, but users are left wondering how long these platforms will protect or delete the biometric data they collect. Corporate promises can be short-lived. Numerous abandoned websites still leak personal data years after shutting down. The 23andMe bankruptcy renewed fears among genetic testing customers about what happens to their information if a company collapses. And even well-intentioned apps can create hazards. A safety-focused dating application called Tea ended up exposing seventy-two thousand users’ selfies and ID photos after a data breach. Even when companies publicly state that they do not retain facial images or ID scans, risks remain. Discord recently revealed that age verification materials, including seventy thousand IDs, were compromised after a third-party contractor called 5CA was breached.

Platforms assert that user privacy is protected by strong safeguards, but the details often remain vague. When asked how YouTube secures age assurance data, Google offered only a general statement claiming that it employs advanced protections and allows users to adjust their privacy settings or delete data. It did not specify the precise security controls in place. Spotify has outsourced its age assurance system to Yoti, a digital identity provider. The company states that it does not store facial images or ID scans submitted during verification. Yoti receives the data directly and deletes it immediately after the evaluation, according to Spotify. The platform retains only minimal information about the outcome: the user’s age in years, the method used, and the date the check occurred. Spotify adds that it uses measures such as pseudonymization, encryption, and limited retention policies to prevent unauthorized access. Yoti publicly discloses some technical safeguards, including use of TLS 1.2 by default and TLS 1.3 where supported.

Privacy specialists argue that these assurances are insufficient. Adam Schwartz, privacy litigation director at the Electronic Frontier Foundation, told PCMag that facial scanning systems represent an inherent threat, regardless of whether they are being used to predict age, identity, or demographic traits. He reiterated the organization’s stance supporting a ban on government deployment of facial recognition and strict regulation for private-sector use. Schwartz raises several issues. Facial age estimation is imprecise by design, meaning it will inevitably classify some adults as minors and deny them access. Errors in facial analysis also tend to fall disproportionately on specific groups. Misidentification incidents involving people of color and women are well documented. Google Photos once mislabeled a Black software engineer and his friend as animals, underlining systemic flaws in training data and model accuracy. These biases translate directly into unequal treatment when facial scans determine whether someone is allowed to enter a website.

He also warns that widespread facial scanning increases privacy and security risks because faces function as permanent biometric identifiers. Unlike passwords, a person cannot replace their face if it becomes part of a leaked dataset. Schwartz notes that at least one age verification vendor has already suffered a breach, underscoring material vulnerabilities in the system.

Another major problem is the absence of meaningful recourse when AI misjudges a user’s age. Spotify’s approach illustrates the dilemma. If the algorithm flags a user as too young, the company may lock the account, enforce viewing restrictions, or require a government ID upload to correct the error. This places users in a difficult position, forcing them to choose between potentially losing access or surrendering more sensitive data. Do not upload identity documents unless required, check a platform’s published privacy and retention statements before you comply, and use account recovery channels if you believe an automated decision is wrong. Companies and regulators must do better at reducing vendor exposure, increasing transparency, and ensuring appeals are effective.

Despite these growing concerns, users continue to find ways around verification tools. Discord users have discovered that uploading photos of fictional characters can bypass facial age checks. Virtual private networks remain a viable method for accessing age-restricted platforms such as YouTube, just as they help users access content that is regionally restricted. Alternative applications like NewPipe offer similar functionality to YouTube without requiring formal age validation, though these tools often lack the refinement and features of mainstream platforms.
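The minimal-retention scheme Spotify describes (keeping only the age in years, the method used, and the check date, never the image or ID) can be sketched as follows. This is an illustrative assumption of how such a record might look, not Spotify's or Yoti's actual implementation; the class and function names, the salted SHA-256 pseudonymization, and the salt handling are all hypothetical.

```python
import hashlib
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class AgeCheckRecord:
    """Minimal outcome record: only age, method, and date are retained."""
    pseudonym: str   # keyed hash of the user ID, not the ID itself
    age_years: int
    method: str      # e.g. "facial_estimation" or "id_document"
    checked_on: date

def pseudonymize(user_id: str, salt: bytes) -> str:
    """Replace the raw user ID with a salted hash so the stored record
    cannot be linked back without the separately held salt."""
    return hashlib.sha256(salt + user_id.encode("utf-8")).hexdigest()

# The facial image or ID scan would be evaluated by the verifier and
# discarded; only the minimal record below is kept.
SALT = b"example-secret-salt"  # hypothetical; stored apart from the records

record = AgeCheckRecord(
    pseudonym=pseudonymize("user-123", SALT),
    age_years=34,
    method="facial_estimation",
    checked_on=date(2025, 11, 1),
)
```

The point of the sketch is data minimization: if this record leaks, it reveals an age and a date but neither the biometric input nor a directly identifying account name.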

Big Tech’s New Rule: AI Age Checks Are Rolling Out Everywhere #AgeVerification #ArtificialIntelligence #Biometricdata


@1justech.bsky.social

#ICE forced
#BiometricData collection


DHS offers “disturbing new excuses” to seize kids’ biometric data, expert says https://arstechni.ca #DepartmentofHomelandSecurity #customsandborderprotection #biometricdata #voiceprints #DNAtesting #facescans #irisscans #Policy #cbp #DHS #ice


Palm payments sound futuristic until you realize Amazon now owns your vein map.

One breach, one subpoena, one partnership away from total ID tracking.

Break up Amazon. Protect your privacy.

#AmazonOne #DigitalPrivacy #BiometricData #Surveillance


ICE’s forced face scans to verify citizens is unconstitutional, lawmakers say https://arstechni.ca #customsandborderprotection #facialrecognition #biometricdata #onlineprivacy #facescans #Policy #ice

Colombia Shuts Down Worldcoin: Biometric Data Collected for $25 Deemed Illegal Colombia has ordered the immediate and permanent cessation of Worldcoin's operations in the country, citing widespread violations of personal data protection

Colombia Shuts Down Worldcoin: Biometric Data Collected for $25 Deemed Illegal

#biometricdata #Colombia #SuperintendenciadeIndustriayComercio #ToolsforHumanity #WorldFoundation


#DataProtection #PersonalData #GDPR #UKGDPR #TerritorialScope #MaterialScope #InternationalLaw #WebScraping #BehaviouralMonitoring #AI #ArtificialIntelligence #AITraining #NationalSecurity #LawEnforcement #Law #Legal #Regulation #Compliance #FacialRecognition #BiometricData #SpecialCategoryData

Facial Privacy: New Policy Push to Guard Biometric Data

A new policy push urges treating facial data as an inalienable right, demanding explicit consent and transparent use; the analysis was published in Oct 2025. Read more: getnews.me/facial-privacy-new-polic... #facialprivacy #biometricdata #privacy


#digitalID #biometricdata #travel #USA #EU #integrity (?)

"For Americans who value freedom and privacy, this is more than an inconvenience — it's a warning. Your biometric data, your identity, your face are now part of a system that is accountable to governments, not to you." The Doctors Appeal
