<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Biometric Update</title>
	<atom:link href="https://www.biometricupdate.com/feed" rel="self" type="application/rss+xml" />
	<link>https://www.biometricupdate.com</link>
	<description>Biometrics News, Companies and Explainers</description>
	<lastBuildDate>Sat, 16 May 2026 19:40:00 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	
<site xmlns="com-wordpress:feed-additions:1">66434804</site>	<item>
		<title>ID4Africa speakers urge legal identity inclusion for refugees, stateless persons</title>
		<link>https://www.biometricupdate.com/202605/id4africa-speakers-urge-legal-identity-inclusion-for-refugees-stateless-persons</link>
					<comments>https://www.biometricupdate.com/202605/id4africa-speakers-urge-legal-identity-inclusion-for-refugees-stateless-persons#respond</comments>
		
		<dc:creator><![CDATA[Ayang Macdonald]]></dc:creator>
		<pubDate>Sat, 16 May 2026 19:40:00 +0000</pubDate>
				<category><![CDATA[Biometrics News]]></category>
		<category><![CDATA[Features and Interviews]]></category>
		<category><![CDATA[ID for All]]></category>
		<category><![CDATA[birth registration]]></category>
		<category><![CDATA[civil registration]]></category>
		<category><![CDATA[digital identity]]></category>
		<category><![CDATA[ID4Africa]]></category>
		<category><![CDATA[ID4Africa 2026]]></category>
		<category><![CDATA[legal identity]]></category>
		<category><![CDATA[national ID]]></category>
		<category><![CDATA[SDG 16.9]]></category>
		<guid isPermaLink="false">https://www.biometricupdate.com/?p=342117</guid>

					<description><![CDATA[
		<img width="992" height="744" src="https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/16153209/id4africa-2026-legal-identity-stats.jpg" class="attachment-post-thumbnail size-post-thumbnail wp-post-image" alt="Stage at ID4AFRICA 2026 with a speaker at the podium; screens show a poster about belonging and a statelessness infographic, audience silhouettes in front." decoding="async" fetchpriority="high" srcset="https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/16153209/id4africa-2026-legal-identity-stats.jpg 992w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/16153209/id4africa-2026-legal-identity-stats-300x225.jpg 300w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/16153209/id4africa-2026-legal-identity-stats-150x113.jpg 150w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/16153209/id4africa-2026-legal-identity-stats-768x576.jpg 768w" sizes="(max-width: 992px) 100vw, 992px" />
		African governments must accelerate efforts to provide legal and digital identity to refugees and stateless populations, according to speakers at the <a href="https://www.biometricupdate.com/tag/id4africa-2026">2026 ID4Africa Annual General Meeting</a> in Abidjan.

Officials from identity agencies, civil registration authorities, UN organizations and development institutions called for more inclusive national ID systems, stronger legal safeguards and deeper integration between civil registration and foundational identity infrastructure.

The discussion comes as an estimated 4.4 million people worldwide remain stateless and roughly 800 million people still lack <a href="https://www.biometricupdate.com/202409/legal-identity-and-why-it-matters">legal identity</a>, despite years of efforts tied to the UN Sustainable Development Goal 16.9 target and the World Bank’s Identification for Development (<a href="https://id4d.worldbank.org/">ID4D</a>) agenda.

Speakers repeatedly emphasized integration between civil registration and vital statistics (CRVS) systems and national identity platforms, arguing that disconnected systems continue to leave vulnerable populations outside formal identity frameworks.

Dr Patrick Eba, Deputy Director at the UNHCR’s International Protection and Solutions Division, said governments should prioritize inclusion of refugees and stateless persons in national ID systems, safeguards against discrimination and misuse, accessible correction and redress mechanisms, and stronger integration between civil registration and identity systems.

“Universality cannot be just a word; it must work in practice,” Eba said.

Beyond inclusion, countries must also build safeguards into ID systems from the outset.

“They must protect against discrimination, feature robust governance structures, and ensure that data collection is fully informed and consensual. Equally important are accessible redress mechanisms. If individuals believe their information has been recorded incorrectly or under the wrong category, they must have a clear pathway to correction. These safeguards cannot be an afterthought; they must be embedded by design,” Eba stated.

The UNHCR official also emphasized the point on CRVS-ID integration: “ID and civil registration systems should be better integrated so that birth registration and identity enrollment can help identify stateless individuals and connect them to nationality pathways, whether through domestic recognition or consular referral to a country of linkage.”

Heads of civil registration authorities from countries including Chad, Mali, <a href="https://www.biometricupdate.com/202511/angola-begins-issuance-of-id-cards-to-newborns">Angola</a>, Cameroon, Senegal, Benin, Sierra Leone, Tanzania and Côte d’Ivoire outlined ongoing efforts to improve birth registration, connect CRVS systems with national ID platforms and extend legal identity coverage to marginalized populations.

Speakers said meaningful integration will require stronger governance frameworks, secure data systems and interoperability between <a href="https://www.biometricupdate.com/202409/linking-crvs-national-id-systems-transformative-for-govts-citizens">civil registration and national identity platforms</a>.

Bhaskar Mishra, a CRVS and legal identity specialist at UNICEF, said CRVS-ID integration has become an operational necessity rather than a policy aspiration. He warned that many countries continue to struggle because agencies operate in silos or adopt models poorly suited to local realities.

UNICEF is developing a practical implementation guide for CRVS-ID integration that could be released later this year, Mishra said.

Speakers warned that without stronger integration between civil registration, national identity and refugee protection systems, millions of vulnerable people will remain excluded from formal services, legal protections and participation in the digital economy.]]></description>
		
					<wfw:commentRss>https://www.biometricupdate.com/202605/id4africa-speakers-urge-legal-identity-inclusion-for-refugees-stateless-persons/feed</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">342117</post-id>	</item>
		<item>
		<title>Biometrics lawyer Dan Saeedi talks BIPA on Biometric Update Podcast</title>
		<link>https://www.biometricupdate.com/202605/biometrics-lawyer-dan-saeedi-talks-bipa-on-biometric-update-podcast</link>
					<comments>https://www.biometricupdate.com/202605/biometrics-lawyer-dan-saeedi-talks-bipa-on-biometric-update-podcast#respond</comments>
		
		<dc:creator><![CDATA[Joel R. McConvey]]></dc:creator>
		<pubDate>Sat, 16 May 2026 18:01:53 +0000</pubDate>
				<category><![CDATA[Biometric Update Podcast]]></category>
		<category><![CDATA[Biometrics News]]></category>
		<category><![CDATA[Features and Interviews]]></category>
		<category><![CDATA[Biometric Information Privacy Act (BIPA)]]></category>
		<category><![CDATA[biometrics]]></category>
		<category><![CDATA[Blank Rome]]></category>
		<category><![CDATA[lawsuits]]></category>
		<guid isPermaLink="false">https://www.biometricupdate.com/?p=342098</guid>

					<description><![CDATA[
		<img width="2048" height="1152" src="https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2025/09/12092524/face-iris-biometrics-scaled.jpg" class="attachment-post-thumbnail size-post-thumbnail wp-post-image" alt="" decoding="async" srcset="https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2025/09/12092524/face-iris-biometrics-scaled.jpg 2048w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2025/09/12092524/face-iris-biometrics-300x169.jpg 300w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2025/09/12092524/face-iris-biometrics-1024x576.jpg 1024w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2025/09/12092524/face-iris-biometrics-150x84.jpg 150w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2025/09/12092524/face-iris-biometrics-768x432.jpg 768w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2025/09/12092524/face-iris-biometrics-1536x864.jpg 1536w" sizes="(max-width: 2048px) 100vw, 2048px" />
		Dan Saeedi is a BIPA buster. The renowned Chicago attorney, CIPP/US, a partner and co-lead of the biometric privacy team at <a href="https://www.blankrome.com/">Blank Rome</a>, specializes in biometrics, class action litigation and privacy law. He’s handled over 50 class actions related to Illinois’ Biometric Information Privacy Act and other privacy laws, including the first BIPA case tried in Illinois federal court. Per his <a href="https://www.blankrome.com/people/daniel-r-saeedi/#overview">bio</a> for the firm, Saeedi “consistently delivers favorable outcomes for clients entangled in consumer and employee privacy disputes across various industries, including manufacturing, logistics, IT, gaming, marketing, healthcare, entertainment, and retail.”

Which is to say, if companies get hit with big-ticket biometric class action suits under <a href="https://www.biometricupdate.com/tag/bipa-biometric-information-privacy-act">BIPA</a>, Dan Saeedi can help.

“There are still filings, usually every day,” Saeedi says, “but not at the same frequency, and I do think that’s a function of the lower hanging fruit cases having been filed and dealt with. And now there’s more of a focus on potentially larger cases, maybe cases involving artificial intelligence, maybe cases involving new products, new technology deployments. It’s not as frequent, but it’s potentially more potent.”

The full conversation is available on the latest episode of the <em>Biometric Update Podcast</em>.

<iframe style="border: none; min-width: min(100%, 430px); height: 150px;" title="Ep. 45: BIPA-busting attorney Dan Saeedi talks biometric law" src="https://www.podbean.com/player-v2/?i=ifrbe-1ac5cf1-pb&amp;from=pb6admin&amp;share=1&amp;download=1&amp;rtl=0&amp;fonts=Arial&amp;skin=1b1b1b&amp;font-color=auto&amp;logo_link=episode_page&amp;btn-skin=3267a3" width="100%" height="150" scrolling="no" data-name="pb-iframe-player"></iframe>
<h2>Listen now: <a href="https://open.spotify.com/episode/2EeOXZURJxuJ5h3pmpbFwM?si=b480aa7a1c144af7">Spotify</a>, <a href="https://podcasts.apple.com/us/podcast/ep-45-bipa-busting-attorney-dan-saeedi-talks-biometric-law/id1809448800?i=1000768004499">Apple</a>, <a href="https://youtu.be/te_KSVVGOGk?si=OD71cxX3cek6k4jS">YouTube</a>, <a href="https://biometricupdatepodcast.podbean.com/e/ep-45-bipa-busting-attorney-dan-saeedi-talks-biometric-law/">Podbean</a></h2>
Runtime: 00:18:52]]></description>
		
					<wfw:commentRss>https://www.biometricupdate.com/202605/biometrics-lawyer-dan-saeedi-talks-bipa-on-biometric-update-podcast/feed</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">342098</post-id>	</item>
		<item>
		<title>World Bank, African DPAs outline formula for trusted digital identity, DPI</title>
		<link>https://www.biometricupdate.com/202605/world-bank-african-dpas-outline-formula-for-trusted-digital-identity-dpi</link>
					<comments>https://www.biometricupdate.com/202605/world-bank-african-dpas-outline-formula-for-trusted-digital-identity-dpi#respond</comments>
		
		<dc:creator><![CDATA[Chris Burt]]></dc:creator>
		<pubDate>Sat, 16 May 2026 18:00:21 +0000</pubDate>
				<category><![CDATA[Biometrics News]]></category>
		<category><![CDATA[Features and Interviews]]></category>
		<category><![CDATA[ID for All]]></category>
		<category><![CDATA[Benin]]></category>
		<category><![CDATA[biometrics]]></category>
		<category><![CDATA[data protection]]></category>
		<category><![CDATA[digital identity]]></category>
		<category><![CDATA[digital public infrastructure]]></category>
		<category><![CDATA[digital trust]]></category>
		<category><![CDATA[ID4Africa]]></category>
		<category><![CDATA[ID4Africa 2026]]></category>
		<category><![CDATA[Liberia]]></category>
		<category><![CDATA[World Bank]]></category>
		<guid isPermaLink="false">https://www.biometricupdate.com/?p=342108</guid>

					<description><![CDATA[
		<img width="2048" height="1139" src="https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2025/08/19111418/world-bank-scaled.png" class="attachment-post-thumbnail size-post-thumbnail wp-post-image" alt="" decoding="async" srcset="https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2025/08/19111418/world-bank-scaled.png 2048w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2025/08/19111418/world-bank-300x167.png 300w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2025/08/19111418/world-bank-1024x570.png 1024w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2025/08/19111418/world-bank-150x83.png 150w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2025/08/19111418/world-bank-768x427.png 768w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2025/08/19111418/world-bank-1536x855.png 1536w" sizes="(max-width: 2048px) 100vw, 2048px" />
		Trust has moved steadily to the center of the conversation around digital public infrastructure and identity at ID4Africa, and the <a href="https://www.biometricupdate.com/tag/id4africa-2026">2026 AGM</a> addressed the topic directly with a Friday presentation led by the World Bank.

The data lifecycle is not a familiar concept, let alone a well-understood one, for most people around the world. But any weakness within it can break the trust of an individual or the general public.
<h2>99 problems but user fraud ain’t one</h2>
World Bank Digital Development Specialist Prakhar Bhardwaj noted that Kenya, India and Uganda all have successful digital ID systems, but suffered legal or operational breakdowns due to weak safeguards in the data collection process.

He also walked through a hypothetical example of how trust is tested at each point in the data lifecycle. The rural farmer in the example suffers a range of problems, including unmatched biometrics and false allegations of fraud. She is isolated from the authorities that are supposed to help her, and not only loses trust in the hypothetical digital identity system herself, but also serves as a negative example for her neighbors.

His colleague Dr. Zhijun William Zhang identified six separate risks with storing identity data, from inconsistent vendor maturity through single points of failure, function creep, cryptographic gaps and weak insider threat controls to absent incident reporting and redress mechanisms. Cryptographic effectiveness also erodes over time.

Government databases in Bangladesh, the Philippines and Brazil have all suffered data breaches contributing to public mistrust, despite the last being considered a digitally mature nation.

Practical risks in data processing have already led to problems in the Netherlands, in the form of a biased fraud risk assessment algorithm, and in India in the form of function creep.

The growth of DPI also creates an ever-larger attack surface. Defenses must therefore be constantly enhanced to cover the new territory, Zhang says.

Oversight must therefore be in place at the organizational level, as well as by an independent party, which should handle redress. And stakeholders must collaborate, even and especially under stress.
<h2>DPAs and trust in DPI</h2>
Taylor Reynolds from the World Bank then moderated a panel featuring Lorpu Page, executive director of Liberia’s Independent Information Commission; Amouda Abou Seydou, advisor and rapporteur to Benin’s APDP; and Drudeisha Madhub, data protection commissioner for Mauritius.

Liberia’s <a href="https://www.liberianobserver.com/politics/house-reviews-privacy-data-protection-bill/article_6675211a-e8bc-43c6-8e04-e73cbf26278a.html">data protection bill</a> to empower the data protection commission is currently going through the country’s legislature.

Reynolds emphasized that while DPI is inherently data-intensive, data protection authorities are frequently under-resourced.

Weak trust is behind many data localization laws on the continent, Madhub noted, yet <a href="https://www.biometricupdate.com/202605/the-frontline-of-digital-identity-innovation-spans-the-global-south">weak enforcement</a> and regular non-compliance can result in the remedy worsening the harm. Digital forensics, cybersecurity, data protection and data governance must all go together, she says.

Benin set up the APDP to operate as one piece of the broader ecosystem in part to ensure its data protection principles are operational, Seydou says. Effective data protection implementation requires reliable supporting partners, Page added.

Businesses have been known to plead ignorance about data protection, which is why Mauritius mandated chief data officers. While the country is relatively advanced in its data protection regime, it is introducing amendments to its existing law to bring in administrative fines, which take less time than prosecution.

For data protection measures to increase trust, the public must also be aware of them, Seydou pointed out. Yet resource-constrained DPAs are not well positioned to run population-scale sensitization campaigns.

And people must trust the DPA in order to bring forward problems that can reduce trust in DPI and digital identity systems, said Page.

The proliferation of AI introduces another wrinkle, and an opportunity for organizations to use data in a way that was not even possible when it was collected. Benin has a specific policy for AI, according to Seydou, but also strict purpose limitations around data processing.

When compliance is breached, it does not need to result in a lengthy and expensive court battle. Madhub revealed that her office issues 3,000 enforcement actions a year, but 99 percent of them are resolved through an amicable dispute resolution process and so do not need to proceed to prosecution.

In Benin, a government liaison provides a bridge between the DPA and the rest of the government. As an example of how such cooperation should work, Seydou described his office’s interaction with the country’s electoral agency, which stripped extraneous information out of the electoral roll to make it compliant prior to publication.

The benefit of cooperation also extends to DPAs themselves. Mateo Garcia Silva, digital transformation consultant with the World Bank, and Rose Mosero, data protection and cybersecurity advisor to the East African Community (EAC), followed the panel with a discussion of how data protection authorities work together under regional and continental frameworks.

The EAC has created a body to allow the DPAs of its eight member states to share knowledge to build capacity, align their practices, and if necessary, collaborate on investigations.

Careful system architecture, cybersecurity and data protection that is both enforced and at least somewhat understood by the public are therefore all part of the formula for digital identity systems that deserve the trust of citizens and businesses alike.]]></description>
		
					<wfw:commentRss>https://www.biometricupdate.com/202605/world-bank-african-dpas-outline-formula-for-trusted-digital-identity-dpi/feed</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">342108</post-id>	</item>
		<item>
		<title>UK watchdog warns of legal risks as London police deploy LFR at protest</title>
		<link>https://www.biometricupdate.com/202605/uk-watchdog-warns-of-legal-risks-as-london-police-deploy-lfr-at-protest</link>
					<comments>https://www.biometricupdate.com/202605/uk-watchdog-warns-of-legal-risks-as-london-police-deploy-lfr-at-protest#respond</comments>
		
		<dc:creator><![CDATA[Masha Borak]]></dc:creator>
		<pubDate>Fri, 15 May 2026 17:27:24 +0000</pubDate>
				<category><![CDATA[Biometrics News]]></category>
		<category><![CDATA[Facial Recognition]]></category>
		<category><![CDATA[Law Enforcement]]></category>
		<category><![CDATA[biometric matching]]></category>
		<category><![CDATA[biometrics]]></category>
		<category><![CDATA[Biometrics and Surveillance Camera Commissioner]]></category>
		<category><![CDATA[false arrest]]></category>
		<category><![CDATA[law enforcement]]></category>
		<category><![CDATA[live facial recognition]]></category>
		<category><![CDATA[London Metropolitan Police]]></category>
		<guid isPermaLink="false">https://www.biometricupdate.com/?p=342074</guid>

					<description><![CDATA[
		<img width="1024" height="683" src="https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2024/09/03114134/essex-police-live-FRT.jpg" class="attachment-post-thumbnail size-post-thumbnail wp-post-image" alt="" decoding="async" loading="lazy" srcset="https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2024/09/03114134/essex-police-live-FRT.jpg 1024w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2024/09/03114134/essex-police-live-FRT-300x200.jpg 300w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2024/09/03114134/essex-police-live-FRT-150x100.jpg 150w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2024/09/03114134/essex-police-live-FRT-768x512.jpg 768w" sizes="auto, (max-width: 1024px) 100vw, 1024px" />
		London’s Metropolitan Police will deploy live facial recognition (LFR) technology at a protest for the first time this weekend, prompting warnings from the UK’s biometrics and surveillance watchdog that police could face legal action over misidentifications and infringements on fundamental rights.

On Wednesday, the Metropolitan Police announced that it would deploy LFR technology at the Unite the Kingdom march on Saturday, organized by far-right, anti-Islam activist Tommy Robinson. The facial recognition cameras will be deployed in London’s borough of Camden in an area likely to be used by those attending the event, according to James Harman, a Met deputy assistant commissioner.

The police force justified the deployment, saying it received intelligence indicating a likely “threat to public safety” from some of the attendees. Eleven “far-right agitators” have been banned from entering the UK, according to the government, including U.S. anti-Islam influencer Valentina Gomez, the BBC <a href="https://www.bbc.com/news/articles/c8r8vgnn655o">reports</a>.

Last year’s Unite the Kingdom protests resulted in violent clashes, with police noting that more than 50 suspects remained outstanding and unidentified after the event. The Met Police is also facing another large protest in London on the same day, the annual pro-Palestine “Nakba Day” march, as well as an FA Cup Final at Wembley, which is expected to bring thousands of fans across London. The police force plans to deploy 4,000 officers to these events.

Biometrics and Surveillance Camera Commissioner William Webster, however, warns that deploying live facial recognition could bring legal issues for the police. Misidentified citizens could bring cases against the police for infringing on fundamental rights such as privacy, freedom of movement and freedom of association.

“There’s no escaping that the technologies are not foolproof,” Webster <a href="https://uk.finance.yahoo.com/news/police-face-court-action-over-082620625.html">told</a> the Press Association. “They will make mistakes, and the risk is that every time a mistake is made, a police force will find themselves in a court of law.”

The Commissioner also called for establishing a legal framework to ensure clarity for law enforcement use of facial recognition. The UK currently relies on a patchwork of laws, including data protection regulation, human rights, and the common law.

“A legal framework will set out in very clear words what different rights are involved here, where the clash is, and how police forces can mitigate protecting all those rights,” says Webster. “It will make it very clear to police forces how they can use these technologies and it will provide the public with the confidence that the police forces are using these technologies appropriately.”

Work on a new legal framework has already begun: In December last year, the UK Home Office launched a 10-week <a href="https://www.biometricupdate.com/202512/uk-tucks-biometric-bias-reports-deep-into-police-facial-recognition-plan">public consultation</a> on the use of facial recognition in law enforcement.

Facial recognition regulation has also received support from King Charles, who outlined the government's law-making plans in the <a href="https://www.biometricupdate.com/202605/the-kings-speech-signals-that-digital-id-in-the-uk-is-a-go-again">King's Speech</a> on Wednesday. The speech called for the creation of a “single, expert regulatory body to provide independent advice and oversight.”

​Regulators <a href="https://www.biometricupdate.com/202605/uk-regulators-pan-patchwork-policy-for-law-enforcement-facial-recognition">seem to agree</a>. Commissioner Webster has previously argued that any legal framework for law enforcement use of biometrics and facial recognition should include clear lines and expanded oversight powers.

Former Biometrics and Surveillance Camera Commissioner Fraser Sampson says the current legal environment prioritizes flexibility to the point of undermining certainty. Sampson pledged his support for a UK <a href="https://www.biometricupdate.com/202605/certainty-vs-flexibility-does-the-uk-need-a-biometric-surveillance-act">Biometric Surveillance Act</a> in a <em>Biometric Update </em>column.

Biometrics Commissioner for Scotland Brian Plastow has also noted that facial recognition is “nowhere near as effective as the police claim it is.”]]></description>
		
					<wfw:commentRss>https://www.biometricupdate.com/202605/uk-watchdog-warns-of-legal-risks-as-london-police-deploy-lfr-at-protest/feed</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">342074</post-id>	</item>
		<item>
		<title>Age assurance debate arrives in Bangladesh</title>
		<link>https://www.biometricupdate.com/202605/age-assurance-debate-arrives-in-bangladesh</link>
					<comments>https://www.biometricupdate.com/202605/age-assurance-debate-arrives-in-bangladesh#respond</comments>
		
		<dc:creator><![CDATA[Joel R. McConvey]]></dc:creator>
		<pubDate>Fri, 15 May 2026 17:26:32 +0000</pubDate>
				<category><![CDATA[Age Assurance]]></category>
		<category><![CDATA[Biometrics News]]></category>
		<category><![CDATA[age verification]]></category>
		<category><![CDATA[Bangladesh]]></category>
		<category><![CDATA[biometrics]]></category>
		<category><![CDATA[children]]></category>
		<category><![CDATA[digital identity]]></category>
		<category><![CDATA[regulation]]></category>
		<guid isPermaLink="false">https://www.biometricupdate.com/?p=342026</guid>

					<description><![CDATA[
		<img width="2048" height="1365" src="https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/15111450/age-assurance-social-media-scaled.jpg" class="attachment-post-thumbnail size-post-thumbnail wp-post-image" alt="Person in a red striped shirt sits on the floor near a brick wall, looking at a smartphone with a charging cable plugged into a wall outlet nearby." decoding="async" loading="lazy" srcset="https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/15111450/age-assurance-social-media-scaled.jpg 2048w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/15111450/age-assurance-social-media-300x200.jpg 300w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/15111450/age-assurance-social-media-1024x683.jpg 1024w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/15111450/age-assurance-social-media-150x100.jpg 150w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/15111450/age-assurance-social-media-768x512.jpg 768w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/15111450/age-assurance-social-media-1536x1024.jpg 1536w" sizes="auto, (max-width: 2048px) 100vw, 2048px" />
		The dominoes continue to fall in the game of global online safety legislation targeting social media platforms. Bangladesh is weighing what restrictions on social media might look like in a local context, as the government plans the launch of a unique, digital ID-connected <a href="https://www.biometricupdate.com/202604/bangladesh-plans-digital-identity-wallet-system">digital wallet system</a> for all citizens, which has implications for age verification.

The debate has hit the pages of the Daily Star, which this week published an essay arguing that “online child safety needs age assurance, not age policing.” It illustrates the complexity of regulating massive platforms in different national contexts, where local culture and infrastructure factor into the conversation.
<h2><strong>Taking away social media means giving kids alternatives</strong></h2>
For Bangladeshis, the regulatory issue comes with the same concerns that worry privacy advocates everywhere. The Star <a href="https://www.thedailystar.net/rising-stars/news/social-media-ban-young-users-what-would-it-look-the-context-bangladesh-4144966">quotes</a> Meem Arafat Manab, a researcher on tech policy, who believes “enforcing a ban will require monitoring, which could result in excessive monitoring of online activity and raise concerns about data privacy that may prove to be unpopular with the public.”

He also raises an increasingly pressing concern, arguing that “the government does not have the necessary leverage over social media apps to force them to use IDs to verify age, as the social media companies aren’t local.” The jurisdictional breadth of online safety laws has been a <a href="https://www.biometricupdate.com/202604/4chan-wont-play-by-uk-age-check-rules-raising-question-about-enforcement-potency">sore point</a> for U.S. companies who don’t think they should be subject to foreign laws. But it is the <a href="https://www.biometricupdate.com/202605/meta-challenges-uk-online-safety-act-fines-tied-to-global-revenue">question of leverage</a> that matters most, as whole <a href="https://www.biometricupdate.com/202605/survey-shows-social-media-firms-ignoring-australias-minimum-age-law">nations</a> and <a href="https://www.biometricupdate.com/202604/governments-want-fast-action-on-social-media-age-checks-but-compliance-lags">continents</a> reckon with the question of how to make the world’s most powerful companies bend to their legislative will.

Other concerns accumulate: easy workarounds in the form of VPNs, restricted access to educational platforms, a lack of alternatives for youth in Bangladesh’s densely packed cities. Manab notes the lack of playgrounds and parks in Dhaka, the capital. “Many children don’t have the habit of reading or watching movies. Children need something to do, and so, they turn to social media. We need to give them more options.” More support, too, in the form of a strong social support system that can address mental health crises in youth.

Ultimately, the paper argues that “a direct transplantation of a foreign <a href="https://www.biometricupdate.com/202510/social-media-regulation-goes-viral-as-nz-eu-look-to-follow-australia">social media restriction</a> model into Bangladesh would likely face structural and social barriers.” It advocates for stronger digital literacy to “help children understand online risks, privacy concerns, cyberbullying and responsible engagement.” That puts the onus for keeping kids safe on educators, while offering no direct intervention – and fails to account for the ways in which social platforms purposefully design their products to manipulate kids.
<h2>Blocking URLs ineffective in wider digital context</h2>
Digital education alone is insufficient protection against predatory social platforms. But one does not simply shut down Facebook, either. In an opinion piece in the Daily Star, Khan Khalid Adnan and Azfar Adib suggest that “Bangladesh’s digital child protection policy still rests on a dangerously comforting illusion: that harmful online content can be managed by blocking websites.”

It’s a politically convenient approach, the <a href="https://www.thedailystar.net/opinion/views/news/online-child-safety-needs-age-assurance-not-age-policing-4172616">authors say</a>, “because it allows the state to appear decisive without confronting the actual architecture of digital harm.”

Sadly, “it is also a technically weak approach.”

In 2019, Bangladesh blocked 1,279 adult content websites in a would-be <a href="https://www.thedailystar.net/online/bangladesh-government-blocks-1279-more-pornographic-websites-1702015">war on porn</a>. But, the authors say, “a blocked URL does not protect anyone. Children do not experience the internet only through a list of prohibited websites. It happens through phones, feeds, games, livestreams, messaging apps, search results, advertising systems, influencer content, and increasingly, <a href="https://www.biometricupdate.com/202603/companion-chatbots-not-doing-enough-to-protect-kids-esafety-report">AI interfaces</a>. A policy designed for static websites is badly mismatched with a digital environment built around algorithmic exposure.”

A reactionary approach, they argue, has no long-term impact. “The state blocks after panic, prosecutes after harm, and announces crackdowns after public outrage. What it does not do is require platforms, <a href="https://www.biometricupdate.com/202604/app-stores-reveal-age-verification-estimation-methods-to-meet-singapore-requirements">app stores</a>, payment systems, gaming environments, and AI services to design age-appropriate access into their systems before harm occurs.”

“Bangladesh is not behind merely because it lacks a specific <a href="https://www.biometricupdate.com/202601/australian-age-assurance-law-prompts-removal-of-4-7m-underage-accounts">age assurance law</a>. It is behind because its regulatory instinct remains reactive, moralistic, and enforcement-heavy. Criminal law can punish an offender, but it cannot by itself stop a 12-year-old from entering an adult content site, joining an unsafe stranger chat, being nudged into gambling, or receiving self-harm content through recommendation systems.”
<h2>Bangladesh needs a coherent age assurance framework</h2>
Bangladesh’s online safety law, the <a href="https://www.researchgate.net/publication/398270759_The_Cyber_Security_Ordinance_2025_An_Unofficial_English_Translation_1">Cyber Security Ordinance 2025</a>, includes provisions to address online gambling, the sexual harassment of women and children online, and the recognition of internet access as a civic right. But, say Adnan and Adib, it does not have a coherent age assurance framework.

What’s needed is a serious framework that “defines platform duties, minimum age thresholds, verification standards, independent audits, appeal mechanisms, data minimisation rules, and penalties for negligent design.”

In effect, Bangladesh needs age assurance, “not crude age policing.” Digital trust is thin in Bangladesh, which was under military rule for 15 years and has seen its government slide <a href="https://www.humanrightsresearch.org/post/a-damaged-democracy-sheikh-hasina-s-authoritarian-rule-in-bangladesh">toward autocracy</a> since the mid-1990s. “Bangladesh’s history of digital regulation gives citizens every reason to fear that a child safety policy could become another surveillance instrument.”

That said, the authors note the availability of highly effective, privacy-preserving age assurance tools that provide only a yes or no answer to age queries. They believe Bangladesh should follow the lead of other nations like <a href="https://www.biometricupdate.com/202603/social-media-giants-not-properly-following-australia-age-check-rules-says-esafety">Australia</a> and the UK with a law that would “place legal duties on high-risk services, require privacy-preserving age checks for adult content and gambling, demand stronger protections in social media and gaming environments, and prohibit platforms from using children’s data to optimize addictive engagement.”

But first, Bangladesh has to give up on blocking websites. It “can either remain trapped in a censorship-based model that is easy to announce and easy to bypass, or it can build a rights-respecting age assurance regime that protects children without turning every citizen into a monitored subject,” the authors say. “The first option is familiar, ineffective, and politically lazy. The second is difficult, technical, and institutionally demanding – but necessary.”
<h2>Ordinance should impose duties of care on platforms</h2>
A paper by Mohammad Yamin Hoque of the <a href="https://baiust.ac.bd/">Bangladesh Army International University of Science and Technology</a> interrogates the Cyber Security Ordinance 2025, which criminalizes online child sexual abuse material (CSAM), revenge porn and sextortion, with harsher penalties for offenses against minors. The law, the <a href="https://recordoflaw.in/child-safety-in-bangladeshs-cyberspace-evaluating-the-cyber-security-ordinance-2025-and-the-path-forward/">author says</a>, “reveals gaps in platform accountability, victim support, and prevention.”

One critical weakness is that the Ordinance “imposes no duties on social media platforms, messaging services, or content hosts. International best practices increasingly recognize that intermediaries, not just end-users, must take responsibility for child safety.”

The paper includes several recommendations. Bangladesh could impose duties of care on platforms operating in the country, establish mandatory transparency reporting, mandate age assurance for child-accessed services, establish a Child Online Safety Division within the National Cyber Security Agency, and impose “meaningful financial penalties for non-compliance.” <a href="https://www.biometricupdate.com/202603/ofcom-loads-fines-on-4chan-for-failing-to-implement-age-assurance-under-osa">Ofcom in the UK</a> is cited as a useful regulatory model.

“The Cyber Security Ordinance 2025 provides a foundation,” it says. “Building upon it a comprehensive ecosystem for child online safety will determine whether Bangladesh’s digital future is one of opportunity or exploitation for its youngest citizens.”]]></description>
		
					<wfw:commentRss>https://www.biometricupdate.com/202605/age-assurance-debate-arrives-in-bangladesh/feed</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">342026</post-id>	</item>
		<item>
		<title>Et tu, browser? Security experts ring bell over browser fingerprinting</title>
		<link>https://www.biometricupdate.com/202605/et-tu-browser-security-experts-ring-bell-over-browser-fingerprinting</link>
					<comments>https://www.biometricupdate.com/202605/et-tu-browser-security-experts-ring-bell-over-browser-fingerprinting#respond</comments>
		
		<dc:creator><![CDATA[Joel R. McConvey]]></dc:creator>
		<pubDate>Fri, 15 May 2026 17:25:40 +0000</pubDate>
				<category><![CDATA[Biometrics News]]></category>
		<category><![CDATA[Surveillance]]></category>
		<category><![CDATA[behavioral analysis]]></category>
		<category><![CDATA[browser fingerprinting]]></category>
		<category><![CDATA[device fingerprinting]]></category>
		<category><![CDATA[digital identity]]></category>
		<guid isPermaLink="false">https://www.biometricupdate.com/?p=342084</guid>

					<description><![CDATA[
		<img width="2048" height="1365" src="https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2023/03/20155817/passwordless-authentication-scaled.jpg" class="attachment-post-thumbnail size-post-thumbnail wp-post-image" alt="" decoding="async" loading="lazy" srcset="https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2023/03/20155817/passwordless-authentication-scaled.jpg 2048w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2023/03/20155817/passwordless-authentication-300x200.jpg 300w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2023/03/20155817/passwordless-authentication-1024x683.jpg 1024w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2023/03/20155817/passwordless-authentication-150x100.jpg 150w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2023/03/20155817/passwordless-authentication-768x512.jpg 768w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2023/03/20155817/passwordless-authentication-1536x1024.jpg 1536w" sizes="auto, (max-width: 2048px) 100vw, 2048px" />
		Your web browser wants you to think it’s on your side. It’s your helpful window into the online universe, and if it logs your history, so be it: there’s always incognito mode, or clearing things out every once in a while. It does not wish you ill. That bit of code hidden in its pocket? Nothing to worry about.

Alas, says That Privacy Guy, “the browser you are using right now is almost certainly betraying you.”

The accusation comes in a <a href="https://www.thatprivacyguy.com/blog/the-beast-behind-the-browser/">post</a> in which the tech blogger aims to provide “a comprehensive, technically accurate and forensically useful reference covering every known client-side privacy vulnerability in Google Chrome.”

There are more of them than you might think. The Register has insights from privacy consultant Alexander Hanff, who notes that Chrome does not protect against browser fingerprinting, which combines browser and device characteristics to create a unique identifier.

Hanff <a href="https://www.theregister.com/security/2026/04/16/google-chrome-lacks-browser-fingerprinting-defenses/5229136">claims</a> “there are at least thirty distinct fingerprinting techniques that work in Chrome right now, today, as you read this. Not theoretical attacks from academic papers that might work under laboratory conditions – real, production techniques deployed on millions of websites to identify and track you without your knowledge or consent.”
<h2>Fonts, screen resolution, other small details combine for unique identifier</h2>
On a recent episode of the <em>Biometric Update Podcast</em>, Valentin Vasilyev, chief technology officer of Fingerprint, explains <a href="https://www.biometricupdate.com/202605/biometric-update-podcast-explores-identification-at-scale-using-browser-fingerprinting">how browser fingerprinting works</a>. Browsers, he says, “expose so much information that can be used potentially to identify devices that you can combine that information to have enough identification accuracy to uniquely identify browsers and <a href="https://www.biometricupdate.com/202504/incognia-id-transcends-traditional-device-fingerprinting-with-location-behavior-tech">mobile devices</a>.” Fingerprinting considers “things like screen resolution, fonts, the size of your dock, maybe, in macOS, and other things that are unique to your browser or your environment.”

On the device level, fingerprinting can scan CPU core count and available memory, screen resolution and display characteristics, timezone and language settings, battery status, audio configuration, storage capabilities, and other features.
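The combining step Vasilyev describes can be sketched in a few lines: hash many weak, commonly available signals together so that their combination, rather than any one attribute, becomes identifying. This is a minimal illustration only; the signal names and values below are hypothetical, and a real fingerprinter would gather them client-side via browser APIs.

```python
import hashlib
import json

def browser_fingerprint(signals: dict) -> str:
    """Combine browser/device attributes into one stable identifier.

    Each attribute alone is shared by millions of users; the full
    combination is often unique to a single browser installation.
    """
    # Serialize with sorted keys so identical signals always
    # produce the identical hash, across sessions and machines.
    canonical = json.dumps(signals, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Illustrative values only -- in practice these come from JavaScript
# APIs such as screen, navigator, and font enumeration.
signals = {
    "screen": "2560x1440",
    "timezone": "Asia/Dhaka",
    "language": "en-US",
    "cpu_cores": 8,
    "device_memory_gb": 16,
    "fonts": ["Arial", "Helvetica", "Noto Sans Bengali"],
}
print(browser_fingerprint(signals))
```

Note that clearing cookies changes nothing here: as long as the underlying signals stay the same, the derived identifier stays the same, which is why Google once argued fingerprinting “subverts user choice.”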

The company Fingerprint uses browser fingerprinting for fraud protection. But, says the Register, the technique itself poses a significant privacy risk. “A <a href="https://www.nature.com/articles/s41598-025-19950-3">study</a> published in Nature last October found that just knowing the four websites an individual visits the most – a behavioral fingerprint as opposed to a browser fingerprint – is enough to identify 95 percent of people.”

Google was initially opposed to fingerprinting, writing <a href="https://www.theguardian.com/technology/2024/dec/19/google-advertisers-digital-fingerprints-ico-uk-data-regulator">in 2019</a> that, “unlike cookies, users cannot clear their fingerprint, and therefore cannot control how their information is collected. We think this subverts user choice and is wrong.”

That opinion has clearly changed.

“Chrome ships almost no built-in anti-fingerprinting defenses,” says Hanff. “Let me say that again because it matters – Google’s browser, the most popular browser in the world, does essentially nothing to prevent websites from building a unique profile of your device.”

“The technologies described in this document are not theoretical – they are deployed at scale against billions of people every single day. Understanding them is the first step. Building the tools to detect and expose them is the next.”
<h2>LinkedIn embroiled in ‘BrowserGate’ scandal</h2>
<a href="https://www.biometricupdate.com/tag/linkedin">LinkedIn</a> is now part of the browser fingerprinting controversy, after an investigation alleged that the professional networking platform is deploying hidden browser scripts capable of scanning thousands of installed extensions and collecting detailed device data from users.

A recent edition of Cyber Security Hub Newsletter – published <a href="https://www.linkedin.com/pulse/linkedin-accused-extensive-browser-surveillance-pdfze/">on LinkedIn</a> – says the so-called “BrowserGate” report, published by a group claiming to represent commercial users, “accuses LinkedIn, owned by <a href="https://www.biometricupdate.com/companies/microsoft">Microsoft</a>, of engaging in large-scale browser fingerprinting that could expose sensitive corporate and personal information.”

“According to the report, LinkedIn injects concealed JavaScript into user sessions that actively probes browsers for installed extensions – tools that can range from productivity add-ons to enterprise sales software.” Analysis by cybersecurity outlet BleepingComputer “suggests the script checks for more than 6,200 browser extensions, a sharp increase from earlier findings in 2025, when roughly 2,000 extensions were reportedly targeted.”

“More recent public code repositories indicate a steady expansion of this detection capability, underscoring how rapidly the scope has grown.”

BrowserGate floats the idea that LinkedIn is using data for competitive intelligence, scanning for tools that directly compete with its own services – meaning the company could theoretically map which organizations rely on competing software.

LinkedIn rejects the allegations, and says the BrowserGate report “originates from an individual whose account was restricted for policy violations, including scraping.”
<h2>Friends, Romans, countrymen: give us your data</h2>
Many companies rely on browser-level signals to detect fraud, enforce policies, and protect digital platforms. Citibank, TD Bank, eBay, Equifax and Chick-fil-A are a few of the bigger names.

But the privacy concerns underscore “a broader tension in the modern internet: platforms seek to protect themselves from scraping, fraud, and misuse,” while “users and regulators demand <a href="https://www.biometricupdate.com/202605/california-nears-vote-on-social-media-age-checks-amid-privacy-clash">transparency and privacy</a> safeguards.”

“As browser fingerprinting techniques become more sophisticated, the line between security measures and surveillance continues to blur.” The fault, dear users, is not in ourselves, but in our czars.]]></description>
		
					<wfw:commentRss>https://www.biometricupdate.com/202605/et-tu-browser-security-experts-ring-bell-over-browser-fingerprinting/feed</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">342084</post-id>	</item>
		<item>
		<title>Suprema’s BioStation 3 Max supports on-device biometric credential storage</title>
		<link>https://www.biometricupdate.com/202605/supremas-biostation-3-max-supports-on-device-biometric-credential-storage</link>
					<comments>https://www.biometricupdate.com/202605/supremas-biostation-3-max-supports-on-device-biometric-credential-storage#respond</comments>
		
		<dc:creator><![CDATA[Joel R. McConvey]]></dc:creator>
		<pubDate>Fri, 15 May 2026 16:56:51 +0000</pubDate>
				<category><![CDATA[Access Control]]></category>
		<category><![CDATA[Biometrics News]]></category>
		<category><![CDATA[Facial Recognition]]></category>
		<category><![CDATA[access control]]></category>
		<category><![CDATA[biometric liveness detection]]></category>
		<category><![CDATA[biometric template protection]]></category>
		<category><![CDATA[biometrics]]></category>
		<category><![CDATA[multimodal biometrics]]></category>
		<category><![CDATA[Suprema]]></category>
		<guid isPermaLink="false">https://www.biometricupdate.com/?p=342065</guid>

					<description><![CDATA[
		<img width="1297" height="731" src="https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2022/04/19145759/suprema-biometric-access.png" class="attachment-post-thumbnail size-post-thumbnail wp-post-image" alt="" decoding="async" loading="lazy" srcset="https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2022/04/19145759/suprema-biometric-access.png 1297w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2022/04/19145759/suprema-biometric-access-300x169.png 300w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2022/04/19145759/suprema-biometric-access-1024x577.png 1024w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2022/04/19145759/suprema-biometric-access-150x85.png 150w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2022/04/19145759/suprema-biometric-access-768x433.png 768w" sizes="auto, (max-width: 1297px) 100vw, 1297px" />
		<a href="https://www.biometricupdate.com/companies/suprema-inc">Suprema</a> has launched BioStation 3 Max, a biometric access control terminal that combines AI-powered facial recognition, fingerprint authentication and hardened edge security for enterprise and critical infrastructure environments.

The terminal supports multiple authentication methods including face and fingerprint biometrics, RFID, mobile credentials, QR codes and PIN verification.

BioStation 3 Max features a 7-inch IPS touchscreen with integrated VoIP intercom functionality and carries IP65 and IK06 ratings for dust, water and impact protection.

The platform uses a dedicated neural processing unit (NPU) to accelerate biometric matching, with Suprema claiming facial authentication times of under 0.3 seconds, including for users in motion. Its dual 2MP visual and infrared cameras support liveness detection and anti-spoofing capabilities designed to detect presentation attacks. The terminal can scale to 100,000 enrolled users.

Security features include a hardened Linux operating system with Secure Boot, a CC EAL6+ certified secure element for credential protection and encrypted biometric templates. Device communications are protected with TLS encryption, while tamper detection triggers alerts if physical compromise is detected.

The platform also supports Template-on-Mobile and Template-on-Card architectures, allowing biometric templates to remain stored on user devices or smart cards instead of centralized servers.

“BioStation 3 Max defines the new standard for identity-first security,” says Hanchul Kim, CEO of Suprema Inc.

The launch aligns with broader industry trends identified in a <a href="https://www.biometricupdate.com/2026-biometric-physical-access-control-market-report-and-buyers-guide">recent biometric physical access control report</a> from Goode Intelligence and <em>Biometric Update</em>, which pointed to growing demand for AI-enabled authentication, anti-spoofing capabilities and privacy-preserving biometric architectures. <em>Biometric Update</em> will discuss the report findings in a live <a href="https://us02web.zoom.us/webinar/register/7817780955074/WN_hZqMX6F3SxK1SWzxXSyTzw#/registration">webinar on May 19</a>.]]></description>
		
					<wfw:commentRss>https://www.biometricupdate.com/202605/supremas-biostation-3-max-supports-on-device-biometric-credential-storage/feed</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">342065</post-id>	</item>
		<item>
		<title>NIST, Air Force move to sole-source biometric testing and monitoring contracts</title>
		<link>https://www.biometricupdate.com/202605/nist-air-force-move-to-sole-source-biometric-testing-and-monitoring-contracts</link>
					<comments>https://www.biometricupdate.com/202605/nist-air-force-move-to-sole-source-biometric-testing-and-monitoring-contracts#respond</comments>
		
		<dc:creator><![CDATA[Anthony Kimery]]></dc:creator>
		<pubDate>Fri, 15 May 2026 15:44:47 +0000</pubDate>
				<category><![CDATA[Biometric R&D]]></category>
		<category><![CDATA[Biometrics News]]></category>
		<category><![CDATA[Fingerprint Recognition]]></category>
		<category><![CDATA[biometric testing]]></category>
		<category><![CDATA[biometrics]]></category>
		<category><![CDATA[ELFT]]></category>
		<category><![CDATA[government purchasing]]></category>
		<category><![CDATA[NIST]]></category>
		<category><![CDATA[Oura]]></category>
		<category><![CDATA[procurement]]></category>
		<category><![CDATA[U.S. Air Force]]></category>
		<category><![CDATA[U.S. Government]]></category>
		<guid isPermaLink="false">https://www.biometricupdate.com/?p=342036</guid>

					<description><![CDATA[
		<img width="2048" height="1365" src="https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2025/01/29124037/fingerprint-cloud-service-scaled.jpg" class="attachment-post-thumbnail size-post-thumbnail wp-post-image" alt="" decoding="async" loading="lazy" srcset="https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2025/01/29124037/fingerprint-cloud-service-scaled.jpg 2048w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2025/01/29124037/fingerprint-cloud-service-300x200.jpg 300w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2025/01/29124037/fingerprint-cloud-service-1024x683.jpg 1024w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2025/01/29124037/fingerprint-cloud-service-150x100.jpg 150w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2025/01/29124037/fingerprint-cloud-service-768x512.jpg 768w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2025/01/29124037/fingerprint-cloud-service-1536x1024.jpg 1536w" sizes="auto, (max-width: 2048px) 100vw, 2048px" />
		The National Institute of Standards and Technology (NIST) and the U.S. Air Force Academy are pursuing separate sole-source contracts tied to biometric evaluation and biometric monitoring systems, underscoring growing government focus on trusted testing environments and controlled biometric data handling.

NIST <a href="https://sam.gov/workspace/contract/opp/3a7bd5d230e5472f883a0d3464694b5e/view">intends to negotiate</a> on a sole-source basis with Schwarz Forensic Enterprises Inc. for latent fingerprint ground truth determination support services.

The work would support NIST’s Evaluation of Latent Friction Ridge Technology program, known as <a href="https://www.biometricupdate.com/tag/elft">ELFT</a>, which is used to measure automated identification algorithm error rates.

NIST says its Information Technology Laboratory, Information Access Division, Image Group needs support to determine the ground truth source finger position for 3,000 latent impression images.

The contract would be tied to a body of friction ridge imagery NIST acquired as part of the Intelligence Advanced Research Projects Activity’s Nail-to-Nail Challenge.

Part of that data was released as NIST Special Database 302, but a critical portion was intentionally kept sequestered by NIST for biometric technology evaluations.

According to the notice, keeping that data unseen by technology evaluation participants is necessary to preserve confidence in benchmark results. Because the latent fingerprints requiring correlation came from larger regional photographs containing additional test prints, NIST says strict sequestration remains imperative.

The agency argues that introducing a new contractor would risk the integrity of the ELFT test corpus.

Schwarz Forensic Enterprises, according to NIST, has already been trusted with keeping both derived and raw data sequestered, and its certified latent print examiners and human subjects protection research experts were responsible for the initial data collection in the original work.

NIST says Schwarz is uniquely positioned because it possesses the raw context and Institutional Review Board-approved data needed to verify source determinations without requiring NIST to release raw, unredacted imagery to another third party. The agency says that release would further jeopardize the sequestered nature of the evaluation data.

Meanwhile, the Air Force Academy <a href="https://sam.gov/workspace/contract/opp/04a54c78473f4a07866f8f3e89ccfedc/view">says</a> it intends to award a firm-fixed-price sole-source contract to Ouraring Inc. for 50 wearable smart ring performance monitoring devices.

The devices would be used by cadets through the Dean of Faculty Department of Biology to measure biometric data and provide insights into human performance and sleep quality and quantity.

The requirement calls for rings that can measure blood oxygen levels, heart rate, heart rate variability, body temperature variation, movement and activity, respiratory rates, and sleep quality and quantity.

The devices must accommodate ring sizes 6 through 15, have rechargeable batteries lasting at least five days, and produce data that shows at least 75 percent agreement with polysomnography sleep lab testing.

The academy also requires an enterprise software solution that can collect and analyze data from all devices in a single instance for fleet research.

According to the notice, the devices must connect to the enterprise application rather than to individual users, allowing rings and associated software access to be transferred from person to person.

One of the more notable requirements is that the associated software must not have AI features or capabilities.

The Air Force Academy says market research found that standard commercial smart rings generally rely on AI or algorithmic processing, but that <a href="https://www.biometricupdate.com/tag/oura">Oura</a> is the only manufacturer capable of providing a specialized configuration through the “Oura Research Application” that creates a blinded user experience and disables prohibited AI features to comply with Air Force cybersecurity mandates.

The notice also says Oura is the only known source capable of providing mandatory ring sizes 14 and 15 while meeting the government’s polysomnography agreement standard.

The academy says it found no authorized third-party enterprise resellers capable of meeting the requirement without creating unacceptable supply chain risk.]]></description>
		
					<wfw:commentRss>https://www.biometricupdate.com/202605/nist-air-force-move-to-sole-source-biometric-testing-and-monitoring-contracts/feed</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">342036</post-id>	</item>
		<item>
		<title>AI fraud crackdown risks locking blind users out of biometric identity systems</title>
		<link>https://www.biometricupdate.com/202605/ai-fraud-crackdown-risks-locking-blind-users-out-of-biometric-identity-systems</link>
					<comments>https://www.biometricupdate.com/202605/ai-fraud-crackdown-risks-locking-blind-users-out-of-biometric-identity-systems#respond</comments>
		
		<dc:creator><![CDATA[Lu-Hai Liang]]></dc:creator>
		<pubDate>Fri, 15 May 2026 14:24:27 +0000</pubDate>
				<category><![CDATA[Biometrics News]]></category>
		<category><![CDATA[Facial Recognition]]></category>
		<category><![CDATA[Government Services]]></category>
		<category><![CDATA[accessibility]]></category>
		<category><![CDATA[AI fraud]]></category>
		<category><![CDATA[biometric liveness detection]]></category>
		<category><![CDATA[biometrics]]></category>
		<category><![CDATA[identity verification]]></category>
		<category><![CDATA[selfie biometrics]]></category>
		<guid isPermaLink="false">https://www.biometricupdate.com/?p=342015</guid>

					<description><![CDATA[
		<img width="2048" height="1363" src="https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/15102133/blind-selfie-biometrics-scaled.jpg" class="attachment-post-thumbnail size-post-thumbnail wp-post-image" alt="Woman wearing sunglasses and denim overalls sits on a park bench outdoors, talking on a mint green smartphone with a white cane resting at her side." decoding="async" loading="lazy" srcset="https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/15102133/blind-selfie-biometrics-scaled.jpg 2048w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/15102133/blind-selfie-biometrics-300x200.jpg 300w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/15102133/blind-selfie-biometrics-1024x681.jpg 1024w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/15102133/blind-selfie-biometrics-150x100.jpg 150w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/15102133/blind-selfie-biometrics-768x511.jpg 768w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/15102133/blind-selfie-biometrics-1536x1022.jpg 1536w" sizes="auto, (max-width: 2048px) 100vw, 2048px" />
		Government identity verification systems are increasingly locking blind and low-vision (BLV) Americans out of essential services as agencies deploy stricter biometric checks to counter AI-enabled fraud, according to a new study.

The research documents accessibility failures in both digital and in-person verification workflows used for programs such as Social Security, Medicare and disability benefits. Researchers argue the failures are not simply usability problems but structural security barriers that push users toward less secure verification channels.

In “<a href="https://arxiv.org/html/2604.28166v1">Essential, Yet Overlooked: Identity Verification Barriers for Blind and Low Vision People in Government Services</a>,” researchers analyzed 219 Reddit posts and conducted interviews with 16 blind and low-vision participants.

Participants said common biometric verification steps such as selfie-to-ID matching, liveness checks and document uploads were often impossible to complete independently because systems assumed visual interaction.

When automated tools failed, users were pushed toward workarounds such as sharing sensitive information with sighted helpers or relying on phone‑based verification, which many described as their only accessible option.

Researchers warn those fallback channels are increasingly vulnerable to AI-enabled attacks. Participants recounted real incidents of <a href="https://www.biometricupdate.com/202604/ai-voice-fraud-draws-new-congressional-scrutiny">AI‑generated voice impersonation</a>, deepfake‑enabled scams and social engineering attacks.

Several said they could no longer distinguish cloned voices from real ones, raising concerns about the security of telephone verification as agencies phase out alternative methods.

Biometrics emerged as both a necessary defence and a new point of exclusion. Participants supported stronger biometric checks and liveness detection to counter synthetic identity fraud, but said accessibility varied dramatically by implementation.

Systems with audio or haptic guidance such as Apple’s Face ID were usable, while visually guided workflows like ID.me’s facial recognition were effectively inaccessible. Some preferred fingerprint verification as a tactile modality, though legal concerns about compelled biometric unlocking tempered enthusiasm.

Participants also questioned whether biometrics themselves could withstand AI‑enabled spoofing and called for privacy‑preserving architectures, such as local secure enclaves that store biometric data on‑device rather than centrally.

Across the study, BLV users articulated clear expectations for identity systems that are both AI‑resilient and accessible. They called for multiple verification pathways rather than mandatory single‑modality checks, human‑assisted fallback options as a standard feature, accessible feedback at every step of the process, and reusable digital credentials to avoid repeated document submission.

The findings arrive as U.S. agencies tighten identity proofing in response to rising synthetic identity fraud. In early 2025, the Social Security Administration removed <a href="https://www.biometricupdate.com/202508/social-security-admin-eases-phone-id-verification-rules-but-confusion-persists">phone‑based verification</a> for many services and shifted users toward online biometric workflows or in‑person visits. These changes disproportionately harm BLV, elderly and rural populations, disability advocates argue.

The study concludes that accessibility must be treated as a core security requirement, not an afterthought. Without redesign, the authors warn, every new layer of biometric or AI‑driven verification risks deepening exclusion for the very populations government systems are meant to serve.]]></description>
		
					<wfw:commentRss>https://www.biometricupdate.com/202605/ai-fraud-crackdown-risks-locking-blind-users-out-of-biometric-identity-systems/feed</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">342015</post-id>	</item>
		<item>
		<title>Police use of AI ‘outrageous and unforgivable privacy invasion’ – say the police</title>
		<link>https://www.biometricupdate.com/202605/police-use-of-ai-outrageous-and-unforgivable-privacy-invasion-say-the-police</link>
					<comments>https://www.biometricupdate.com/202605/police-use-of-ai-outrageous-and-unforgivable-privacy-invasion-say-the-police#respond</comments>
		
		<dc:creator><![CDATA[Fraser Sampson]]></dc:creator>
		<pubDate>Fri, 15 May 2026 13:03:52 +0000</pubDate>
				<category><![CDATA[Biometrics News]]></category>
		<category><![CDATA[Law Enforcement]]></category>
		<category><![CDATA[Surveillance]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[biometric monitoring]]></category>
		<category><![CDATA[biometrics]]></category>
		<category><![CDATA[data privacy]]></category>
		<category><![CDATA[Fraser Sampson]]></category>
		<category><![CDATA[law enforcement]]></category>
		<category><![CDATA[London Metropolitan Police]]></category>
		<category><![CDATA[surveillance]]></category>
		<guid isPermaLink="false">https://www.biometricupdate.com/?p=341978</guid>

					<description><![CDATA[
		<img width="2048" height="1366" src="https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/15085341/met-police-scaled.jpg" class="attachment-post-thumbnail size-post-thumbnail wp-post-image" alt="Back of a Metropolitan Police officer in uniform standing on a city street, blue-and-white checkered patch visible" decoding="async" loading="lazy" srcset="https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/15085341/met-police-scaled.jpg 2048w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/15085341/met-police-300x200.jpg 300w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/15085341/met-police-1024x683.jpg 1024w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/15085341/met-police-150x100.jpg 150w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/15085341/met-police-768x512.jpg 768w, https://d1sr9z1pdl3mb7.cloudfront.net/wp-content/uploads/2026/05/15085341/met-police-1536x1024.jpg 1536w" sizes="auto, (max-width: 2048px) 100vw, 2048px" />
		<em>By Professor <a href="https://www.gov.uk/government/people/fraser-sampson" target="_blank" rel="noopener">Fraser Sampson</a>, former UK Biometrics &amp; Surveillance Camera Commissioner</em>

Condemnation of police forces deploying ‘opaque and untested’ surveillance tools is nothing new. A cursory online rummage will reveal almost daily coverage of public concerns about AI-enabled technology like facial recognition being piloted by police forces. But last week’s challenge to the latest covert deployment of new technology came from within policing itself.

<a href="https://metfed.org.uk/news/mpf-statement-on-use-of-ai-to-monitor-police-officers">Damning comments</a> from the Police Federation of England and Wales followed the revelation that the Metropolitan Police Service has been using covert AI-powered technology to monitor its officers’ movements, communications and data access. Almost 600 cases have reportedly been highlighted - 42 of them involving senior ranks. After what the General Secretary of the staff association called ‘an outrageous and unforgivable invasion of privacy’, 100 officers are now under investigation for gross misconduct, with another 30 being ‘flagged for suspicious behaviour’.

Police use of AI-enabled technology has two aspects. The first is law enforcement and other operational functions – the bit that gets the headlines. The second is tackling the more mundane administrative tasks shared by all big organisations with workforce, estate, logistics and finance issues. But, given their investigative powers and duties, where are the boundaries for the use of covert internal monitoring to catch rule-breakers? Is this ‘policing’? Intelligence gathering is a key police function, and the company supplying the software - <a href="https://www.biometricupdate.com/tag/palantir">Palantir</a> - is named after the magical stones used in Tolkien’s Lord of the Rings trilogy to gather intelligence, so perhaps there’s a clue. Either way, the covert internal deployment of AI-enabled technology in the police workplace matters for a few reasons.

First, the wider security case for using new technology in this way is compelling. The public expect high standards of conduct from the police and abuses to be rooted out. The same is true of other vital public services but the argument extends to many privately operated entities delivering critical functions. Conflicts in Ukraine and the Gulf are illustrating the fragility of our critical infrastructure. Mitigating threats to the water, transportation, food, energy, finance and communications sectors might equally <a href="https://www.biometricupdate.com/202502/biometrics-are-the-key-to-unlocking-nuclear-power">justify the use of internal biometric surveillance</a>.

Second, AI is in a perpetual <em>beta </em>state. Once you’ve bought the kit, you will discover that it can do other things equally well - and if it was procured with public funds, you’re duty-bound to maximise its value for money. With AI, function creep comes as standard. Against a backdrop of relentless spending pressures and resource challenges, it is no longer hypothetical to ask when police forces will use their facial recognition technology to flag employees suspected of pulling a sickie or interpreting working from home too liberally.

Third, there are convincing efficiency arguments for all organisations to use AI-powered tools in the workplace to monitor compliance with policy. Once they start to turbocharge their processing of employee records with AI, employers’ data will be invaluable in the investigation of crime or gathering of intelligence – will they share it when the police come and ask for it?

And fourth, if it’s OK for staff efficiency, the case for installing biometric technology in safety-critical functions like <a href="https://www.biometricupdate.com/202603/are-we-ready-for-biometric-tachographs">biometric tachographs</a> becomes irresistible.

Transferring the bots to HR was inevitable - they might be good at <a href="https://www.biometricupdate.com/202605/met-police-tout-arrests-crime-drop-from-permanent-lfr-camera-pilot">recognising wanted people</a> on the street, but their game-changing strength is in combing and combining diffuse datasets. When you need iterative crunching of vast amounts of live data, bots are your MVPs in <a href="https://dictionary.cambridge.org/dictionary/english/mvp">both senses</a> of the term. Weigh the cost of humans checking compliance across multiple layers of organisational policy against the automation of highly transactional processes, and handing it to AI becomes a no-brainer – which is why the same US company is helping the Ukrainian army process <a href="https://militarywatchmagazine.com/article/ai-giant-palantir-war-operating-system-ukraine">real-time intelligence data to improve the efficiency</a> of its strikes against Russian forces.

It’s interesting to see the police on the other side of this technology argument - hopefully no one will reach for the weary “if you’ve done nothing wrong you needn’t worry” platitude, but I wouldn’t bet on it. Thus far, the revelations only extend to the Metropolitan Police, and it remains to be seen how many other UK forces will follow.

Employers will rightly want to explore the benefits that AI can bring, but all workers - police and non-police alike - need <a href="https://www.biometricupdate.com/202605/certainty-vs-flexibility-does-the-uk-need-a-biometric-surveillance-act">safeguards and assurances</a>. Where and when, by whom and for what purposes can their biometric and related data be accessed? Boilerplate phrases limiting it to auditing ‘compliance with relevant policies’ will probably not be enough to assuage the fears of staff and their trade unions. The General Secretary of the Metropolitan Police Federation is asking: ‘where is the transparency… and the reassurance that the correct checks and balances are there?’

I have highlighted that this is what under-regulation looks like, and the police are now seeing it from the other end of the AI telescope. Once public watchdogs start turning their biometric technology inwards, people’s views of the need for clearer regulation may change. Some are already asking ‘if the police do that to each other, what chance have the rest of us got?’ While we wait to see, picture this: if you were to get a request from your employer to comment on a detailed analysis of your every meeting, journey and call, your hours worked, breaks taken and buildings visited over the last year, would you be in any position to respond? There’s a certain inequality of arms in gainsaying content produced by military-grade software - and if you’re unsure why that should be a concern, ask a <a href="https://www.postofficehorizoninquiry.org.uk/">UK sub-postmaster</a>.

What might this all look like in the future? That’s unclear, but the <a href="https://www.biometricupdate.com/202512/pandemic-surveillance-how-ai-will-police-the-next-global-health-crisis">next global pandemic</a> will show us just how far and fast the aggregated surveillance capability of employers and governments has evolved.
<h2>About the author</h2>
Fraser Sampson, former UK Biometrics &amp; Surveillance Camera Commissioner, is Professor of Governance and National Security at <a href="http://research.shu.ac.uk/centric" target="_blank" rel="noopener">CENTRIC</a> (Centre for Excellence in Terrorism, Resilience, Intelligence &amp; Organised Crime Research) and a non-executive director at <a href="https://www.biometricupdate.com/companies/facewatch">Facewatch</a>.]]></description>
		
					<wfw:commentRss>https://www.biometricupdate.com/202605/police-use-of-ai-outrageous-and-unforgivable-privacy-invasion-say-the-police/feed</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">341978</post-id>	</item>
	</channel>
</rss>
