<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Privacy &amp; Data Security Archives - LexBlog</title>
	<atom:link href="https://www.lexblog.com/privacy-data-security/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.lexblog.com/privacy-data-security/</link>
	<description>Legal news and opinions that matter</description>
	<lastBuildDate>Wed, 18 Jun 2025 20:41:08 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.8</generator>

<image>
	<url>https://www.lexblog.com/wp-content/uploads/2021/07/cropped-siteicon-32x32.png</url>
	<title>Privacy &amp; Data Security Archives - LexBlog</title>
	<link>https://www.lexblog.com/privacy-data-security/</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>GT’s Data Privacy &#038; Cybersecurity Practice Recognized in The Legal 500 United States 2025 Edition</title>
		<link>https://www.lexblog.com/2025/06/18/gts-data-privacy-cybersecurity-practice-recognized-in-the-legal-500-united-states-2025-edition/</link>
		
		<dc:creator><![CDATA[Greenberg Traurig, LLP]]></dc:creator>
		<pubDate>Wed, 18 Jun 2025 20:40:39 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/18/gts-data-privacy-cybersecurity-practice-recognized-in-the-legal-500-united-states-2025-edition/</guid>

					<description><![CDATA[Greenberg Traurig&#8217;s&#160;Data Privacy &#38; Cybersecurity Practice&#160;is recognized in&#160;The Legal 500&#160;United States 2025 edition for Media, technology and telecoms &#62; Cyber law (including data privacy and data protection). The group is praised as &#8220;a large, highly qualified, geographically-dispersed team&#8221; that supports clients across industries including e-commerce, financial services, health care, retail, and entertainment. The group is recognized...]]></description>
										<content:encoded><![CDATA[
<p>Greenberg Traurig&rsquo;s&nbsp;<a href="https://www.gtlaw.com/en/capabilities/data-privacy-cybersecurity" target="_blank" rel="noreferrer noopener">Data Privacy &amp; Cybersecurity Practice</a>&nbsp;is recognized in&nbsp;<em>The Legal 500</em>&nbsp;United States 2025 edition for <a href="https://www.legal500.com/rankings/ranking/c-united-states/media-technology-and-telecoms/cyber-law-including-data-privacy-and-data-protection/51354-greenberg-traurig-llp" target="_blank" rel="noreferrer noopener">Media, technology and telecoms &gt; Cyber law (including data privacy and data protection)</a>. The group is praised as &ldquo;a large, highly qualified, geographically-dispersed team&rdquo; that supports clients across industries including e-commerce, financial services, health care, retail, and entertainment. The group is recognized for its deep experience in privacy and security compliance across jurisdictions, with particular strength in cross-border data transfers.</p>



<p><em>The Legal 500</em>&nbsp;United States recognizes practice area teams and practitioners who are &ldquo;providing the most cutting edge and innovative advice to corporate counsel,&rdquo; according to the publisher. The recognitions are based on feedback from more than 300,000 clients worldwide, law firm submissions, and interviews with private practice lawyers, in addition to&nbsp;<em>The Legal 500</em>&rsquo;s&nbsp;independent research.</p>



<p><a href="https://www.gtlaw.com/en/news/2025/06/press-releases/188-greenberg-traurig-attorneys-46-practices-recognized-in-the-legal-500-united-states-2025-edition" target="_blank" rel="noreferrer noopener">Click here to see the full list of Greenberg Traurig attorneys and practices recognized in&nbsp;<em>The Legal 500</em>&nbsp;United States 2025 edition.</a></p>

]]></content:encoded>
					
		
		
		<source url='https://www.gtlaw-dataprivacydish.com/'>Data Privacy Dish</source>
<enclosure url='https://www.lexblog.com/wp-content/uploads/2025/06/data-privacy-legal-500-USA2025.jpg' type='image/jpeg' length='142171' />	</item>
		<item>
		<title>DOJ’s Data Security Program: Key Compliance Considerations for Impacted Entities</title>
		<link>https://www.lexblog.com/2025/06/18/dojs-data-security-program-key-compliance-considerations-for-impacted-entities/</link>
		
		<dc:creator><![CDATA[Eleanor M. Ross, Cassidy Kim and Olivia Bellini]]></dc:creator>
		<pubDate>Wed, 18 Jun 2025 20:22:03 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/18/dojs-data-security-program-key-compliance-considerations-for-impacted-entities/</guid>

					<description><![CDATA[DOJ’s new Data Security Program (DSP), effective April 8, 2025, imposes significant restrictions on U.S. government contractors and global companies that handle sensitive U.S. personal or government-related data. The DSP is currently subject to a 90-day initial enforcement period. After July 8, 2025, NSD will implement full enforcement of the DSP.]]></description>
										<content:encoded><![CDATA[
<p>On April 11, 2025, the DOJ&rsquo;s National Security Division (NSD) issued a&nbsp;<a href="https://www.justice.gov/opa/media/1396346/dl?inline" target="_blank" rel="noreferrer noopener">Compliance Guide, Implementation and Enforcement Policy</a>, and&nbsp;<a href="https://www.justice.gov/opa/media/1396351/dl" target="_blank" rel="noreferrer noopener">FAQs</a>&nbsp;for its Data Security Program (DSP), finalized pursuant to&nbsp;<a href="https://www.federalregister.gov/documents/2024/03/01/2024-04573/preventing-access-to-americans-bulk-sensitive-personal-data-and-united-states-government-related" target="_blank" rel="noreferrer noopener">Executive Order 14117</a>&nbsp;and&nbsp;<a href="https://www.govinfo.gov/content/pkg/FR-2025-01-08/pdf/2024-31486.pdf" target="_blank" rel="noreferrer noopener">28 C.F.R. Part 202</a>. The DSP is primarily designed to prevent certain cross-border data flows and transactions. Individuals and companies subject to the DSP are required to comply with new security requirements, reporting and recordkeeping duties, and due diligence rules.</p>



<p>The recently issued guidance makes evident NSD&rsquo;s intent to make the DSP an enforcement priority for this administration. Access to Americans&rsquo; bulk sensitive or personal data or U.S. government-related data increases the ability of countries of concern to engage in a wide range of malicious activities. The DSP is currently subject to a 90-day initial enforcement period, which is a limited enforcement window to give individuals and companies additional time to bring their transactions and processes into compliance with the DSP. After July 8, 2025, NSD will implement full enforcement of the DSP.</p>



<p><a href="https://www.gtlaw.com/en/insights/2025/6/dojs-data-security-program-key-compliance-considerations-for-impacted-entities" target="_blank" rel="noreferrer noopener"><strong>Click here to continue reading the full GT Alert.</strong></a></p>

]]></content:encoded>
					
		
		
		<source url='https://www.gtlaw-dataprivacydish.com/'>Data Privacy Dish</source>
<enclosure url='https://www.lexblog.com/wp-content/uploads/2025/06/Sign-at-the-United-States-Department-of-Justice-DOJ-in-Washington-DC-Shutterstock_2262073553.jpg' type='image/jpeg' length='189741' />	</item>
		<item>
		<title>Developments in Online Safety and Data Privacy for Minors</title>
		<link>https://www.lexblog.com/2025/06/18/developments-in-online-safety-and-data-privacy-for-minors/</link>
		
		<dc:creator><![CDATA[Jill Steinberg and Kelly McGlynn]]></dc:creator>
		<pubDate>Wed, 18 Jun 2025 18:23:46 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<category><![CDATA[Technology]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/18/developments-in-online-safety-and-data-privacy-for-minors/</guid>

					<description><![CDATA[There have been numerous developments in the online safety and data privacy space for minors in particular over the last few months. Here we cover some notable decisions in the federal courts and cases with nationwide implications in addition to final and pending legislative and regulatory action by the Federal government. Notable Court Decisions The...]]></description>
										<content:encoded><![CDATA[
<p>There have been numerous developments in online safety and data privacy, particularly for minors, over the last few months. Here we cover notable decisions in the federal courts and cases with nationwide implications, as well as final and pending legislative and regulatory action by the federal government.</p>



<h2 class="wp-block-heading"><strong>Notable Court Decisions</strong></h2>



<h4 class="wp-block-heading">The Salesforce Decision</h4>



<p>A recent decision by the Fifth Circuit held that a suit brought by sex trafficking victims against Salesforce for allegedly participating in a sex trafficking venture could move forward. The Court ruled that under certain circumstances, companies such as Salesforce that provide web-based business services to entities or individuals engaged in sex trafficking may be civilly liable as a beneficiary of a sex trafficking venture. The decision interprets Section 230 of the Communications Decency Act (&ldquo;Section 230&rdquo;), which generally protects web platform hosts from liability for content created by users. This is the most recent in a series of decisions limiting Section 230&rsquo;s protections for entities that fail to take measures to prevent the use of their services by criminal actors engaged in sex trafficking.</p>



<p>The plaintiffs in <em>Doe v. Salesforce</em> are a group of sex trafficking victims who were trafficked through Backpage.com (&ldquo;Backpage&rdquo;), a Craigslist-type platform notorious for its permissiveness and encouragement of sex trafficking advertisements. They seek to hold Salesforce civilly liable under 18 U.S.C. &sect; 1595, which creates a cause of action for victims against anyone who &ldquo;knowingly benefits ... from participation in a [sex trafficking] venture.&rdquo; Salesforce allegedly provided Backpage cloud-based software tools and related services, including customer relationship management support. The Plaintiffs allege that Salesforce was aware that Backpage was engaged in sex trafficking, citing, <em>inter alia</em>, emails between Salesforce employees and a highly publicized Congressional report that found that Backpage actively facilitated prostitution and child sex trafficking.</p>



<p>Salesforce moved for summary judgment, arguing that Section 230 served as a complete bar to liability. While courts have generally interpreted Section 230 broadly in dismissing claims against internet platform hosts that are premised on the ways in which others use those platforms, the statute has been increasingly under fire by legislators and courts alike. Lawmakers on both sides of the aisle have discussed amending or repealing Section 230 in recent years and courts have slowly chipped away at the broad immunity by interpreting the statute more narrowly. This trend has been especially stark in cases dealing with sex trafficking and child sexual abuse. The Fifth Circuit&rsquo;s decision in <em>Doe v. Salesforce</em> is a prime example of this, and a substantial step away from the breadth of protections afforded under earlier interpretations of Section 230.</p>



<p>The Fifth Circuit rejected a &ldquo;but-for test,&rdquo; which would shield a defendant if a cause of action would not have accrued without content created and posted by a third party. Salesforce advocated for what the court dubbed the &ldquo;only-link&rdquo; test, which would protect defendants when the only link between the defendant and the victims is the publication of third-party content. The Court rejected that argument, instead ruling that &ldquo;the proper standard is whether the duty the defendant allegedly violated derives from their status as a publisher or speaker or requires the exercise of functions traditionally associated with publication.&rdquo; The key question is whether the claim treats the defendant as a publisher or speaker. The Fifth Circuit found that the duty the plaintiffs alleged Salesforce breached was &ldquo;a statutory duty to not knowingly benefit from participation in a sex-trafficking venture.&rdquo; Because this duty is unrelated to traditional publishing functions, Section 230 does not serve as a shield. This decision underscores the need for companies to establish processes that will identify potential dangers of trafficking in, or in relation to, their businesses, including but not limited to the facilitation of trafficking using online platforms. Without proper safeguards, even businesses providing neutral tools and operations support may be held civilly liable for the harms the users of their services perpetrate.</p>



<h4 class="wp-block-heading">Garcia v. Character Technologies, Inc. et al.</h4>



<p>The mother of a fourteen-year-old boy largely defeated a motion to dismiss her lawsuit against Character Technologies, Google, Alphabet, and two individual defendants in connection with the suicide of her child. The plaintiff alleged that her son was a user of Character A.I., which the Court describes as &ldquo;an app that allows users to interact with various A.I. chatbots, referred to as &lsquo;Characters.&rsquo;&rdquo; The Court also describes these &ldquo;Characters&rdquo; as &ldquo;anthropomorphic; users interactions with Characters are meant to mirror interactions a user might have with another user on an ordinary messaging app.&rdquo; In other words, the app is intended to give, and does give, the user the impression that he is communicating with a real person. The plaintiff alleged that the app had its intended impact on her child; she asserted that her son was addicted to the app and could not go one day without communicating with his Characters, resulting in severe mental health issues and problems in school. When his parents threatened to take away his phone, he took his own life. The plaintiff filed suit asserting numerous tort claims, along with an alleged violation of Florida&rsquo;s Deceptive and Unfair Trade Practices Act and a claim for unjust enrichment.</p>



<p>In denying the motion to dismiss, Judge Anne Conway, District Court Judge for the Middle District of Florida, made several notable rulings. Among them, she found that the plaintiff had adequately pled that Google is liable for the &ldquo;harms caused by Character A.I. because Google was a component part manufacturer&rdquo; of the app, deeming it sufficient that the plaintiff pled that Google &ldquo;substantially participated in integrating its models&rdquo; into the app, which allegedly was necessary to build and maintain the platform. She also found that the plaintiff sufficiently pled that Google was potentially liable for aiding and abetting the tortious conduct because the amended complaint supported a &ldquo;plausible inference&rdquo; that Google possessed actual knowledge that Character&rsquo;s product was defective. The Court further found that the app was a product, not a service, and that Character A.I.&rsquo;s output is not speech protected by the First Amendment. The Court determined that the plaintiff had sufficiently pled all of her tort claims except her claim of intentional infliction of emotional distress, and allowed her claims under Florida&rsquo;s Deceptive and Unfair Trade Practices Act and her theory of unjust enrichment to proceed.</p>



<h4 class="wp-block-heading">New York v. TikTok</h4>



<p>In October 2024, the Attorney General for the State of New York filed suit against TikTok to hold it &ldquo;accountable for the harms it has inflicted on the youngest New Yorkers by falsely marketing and promoting&rdquo; its products. The following day, Attorney General James released a statement indicating that she was co-leading a coalition of 14 state Attorneys General each filing suit against TikTok for allegedly &ldquo;misleading the public&rdquo; about the safety of the platform and harming the mental health of children. Lawsuits were filed individually by each member of the coalition, and all allege that TikTok violated the law &ldquo;by falsely claiming its platform is safe for young people.&rdquo; The press release can be found <a href="https://ag.ny.gov/press-release/2024/attorney-general-james-sues-tiktok-harming-childrens-mental-health" target="_blank" rel="noreferrer noopener">here</a>.</p>



<p>The New York complaint includes allegations regarding the addictive nature of the app and its marketing and targeting of children, causing substantial mental health harm to minors. The complaint additionally includes allegations that TikTok resisted safety improvements to its app to boost profits, made false statements about the safety of the app for minors, and misrepresented the efficacy of certain safety features. The complaint asserts nine causes of action, including violations of New York law relating to fraudulent business conduct, deceptive business practices, and false advertising, along with claims asserting design defects, failure to warn, and ordinary negligence. In late May, Supreme Court Justice Anar Rathod Patel mostly denied TikTok&rsquo;s motion to dismiss in a brief order that did not include her reasoning, allowing the case to proceed.</p>



<h2 class="wp-block-heading"><strong>Federal Legislative and Regulatory Developments</strong></h2>



<h4 class="wp-block-heading">President Trump Signs the TAKE IT DOWN Act; The Kids Online Safety Act (KOSA) is reintroduced</h4>



<p>President Trump signed the &#8220;TAKE IT DOWN Act&#8221; on May 19, 2025. The bill criminalizes the online posting of nonconsensual intimate visual images of adults and minors and the publication of digital forgeries, defined as the intimate visual depiction of an identifiable individual created through various digital means that, when viewed as a whole, is indistinguishable from an authentic visual depiction. The statute also criminalizes threats to publish such images. The bill additionally requires online platforms, no later than one year from enactment, to establish clear processes by which individuals can notify companies of the existence of these images and to remove the images &#8220;as soon as possible, but not later than 48 hours&#8221; after receiving a request. The bill in its entirety can be found <a href="https://www.congress.gov/119/bills/s146/BILLS-119s146es.pdf" target="_blank" rel="noreferrer noopener">here</a>.</p>



<p>Also in May, the Kids Online Safety Act (KOSA) was reintroduced in the Senate by a bipartisan group of legislators. In connection with their announcement of the revised version of KOSA, Senators Blackburn and Blumenthal thanked Elon Musk and others at X for their partnership in modifying KOSA&rsquo;s language to &ldquo;strengthen the bill while safeguarding free speech online and ensuring it&rsquo;s not used to stifle expression&rdquo; and noted the support of Musk and X to pass the legislation by the end of 2025. In its May announcement, the senators noted that the legislation is supported by over 250 national, state and local organizations and further gained the support of Apple. KOSA provides that platforms &ldquo;shall exercise reasonable care in the creation and implementation of any design feature to prevent and mitigate&rdquo; listed harms to minors where those harms were reasonably foreseeable. Those harms include eating disorders, depressive and anxiety disorders, compulsive use, online harassment, and sexual and financial exploitation. It requires that platforms provide minors (and parents) with readily accessible and easy to use safety tools that limit communication with the minor and limit by default access to and use of certain design features by minors. The legislation further mandates reporting tools for users and the establishment of internal processes to receive and substantively review all reports. The current version of KOSA is lengthy and contains numerous additional mandates and notice requirements including third party audits and public reporting regarding compliance. The most recent version of KOSA can be found <a href="https://www.blackburn.senate.gov/services/files/ED1BAF20-1CE6-40FA-A4B1-3E98F034DD4F" target="_blank" rel="noreferrer noopener">here</a>.</p>



<h4 class="wp-block-heading">New COPPA Rule Takes Effect June 23, 2025</h4>



<p>The Federal Trade Commission (FTC) has amended the Children&rsquo;s Online Privacy Protection Rule (&ldquo;COPPA Rule&rdquo;), effective June 23, 2025. COPPA imposes obligations on entities operating online that collect the personal information of children under the age of thirteen. The new COPPA Rule seeks to address new challenges in the digital landscape.</p>



<p>Under the new COPPA Rule, the FTC will consider additional evidence in determining whether a website or online service is directed at children. COPPA applies wherever children under the age of thirteen are a website or service&rsquo;s intended or actual audience, and the FTC applies a multifactor test for assessing this. Under the new COPPA Rule, the FTC will now consider &ldquo;marketing or promotional materials or plans, representations to consumers or to third parties, reviews by users or third parties, and the age of users on similar websites or services.&rdquo; While the FTC has stated that this amendment simply serves to clarify how it analyzes the question of whether a website is child-directed (rather than acting as a change in policy), online operators should note that whether they are subject to COPPA depends in part on elements outside of their control&mdash;such as online reviews and the age of users of their peer websites and services.</p>



<p>The type of information protected by COPPA will also expand. COPPA mandates that websites and online services directed at children under the age of thirteen obtain verifiable parental consent before collecting, using, or disclosing any personal information from children. To date, this has included details like names, addresses, phone numbers, email addresses, and other identifiable data. The new COPPA Rule expands this definition to include biometric identifiers &ldquo;that can be used for the automated or semi-automated recognition of an individual, such as finger prints; handprints; retina patterns; iris patterns; genetic data, including a DNA sequence; voiceprints; gait patterns; facial templates; or faceprints[.]&rdquo; The definition will also include government identifiers such as social security or passport numbers, and birth certificates.</p>



<p>Data security requirements have also been enhanced. Operators subject to COPPA must maintain a written data security program, designate one or more employees to coordinate it, and conduct an annual assessment of risks. If they share any protected data with third parties, the disclosing party must ensure that the third party has sufficient capability and policies in place to maintain the data securely and within the bounds of COPPA regulations. Notably, the new COPPA Rule forbids indefinite retention of data, requiring that operators only retain protected information as long as is reasonably necessary to serve the purposes for which it was collected.</p>



<p>The new COPPA Rule contains a number of other policy changes, such as enhanced requirements for parental notice and control regarding the data collected, stored, and shared with third parties, new mechanisms for obtaining parental consent, and changes to an exception to the bar on collecting children&rsquo;s data without parental consent for the limited purpose of determining whether a user is a child under the age of thirteen.</p>



<p>Entities operating a business or service online that may be used by children under the age of thirteen&mdash;even where children are not the intended audience&mdash;should carefully review the new rule, and take steps to ensure they are in full compliance. The new rule underscores the FTC&rsquo;s continued interest in this space and its desire to take action against online services for practices it views as posing unacceptable risks to children&rsquo;s privacy and online safety.</p>



<h4 class="wp-block-heading">Senate Judiciary Committee, Subcommittee on Privacy, Technology, and the Law Holds Hearing on AI-Generated Deep Fakes</h4>



<p>On May 21, the Senate Judiciary Committee&rsquo;s Subcommittee on Privacy, Technology, and the Law held a hearing titled &ldquo;The Good, the Bad, and the Ugly: AI-Generated Deep Fakes in 2025.&rdquo; Witnesses included representatives of the Recording Industry Association of America, Consumer Reports, and YouTube, along with multi-award-winning musician Martina McBride. They all testified about the potential benefits of AI, but also about the potential harms to creators, including musicians, and the different but substantial harms to consumers. The witnesses discussed specific examples of the images and voices of both known and lesser-known innocent individuals being used to defraud and exploit others, impacting reputations and livelihoods. A representative from the National Center on Sexual Exploitation (NCOSE) also testified about the pervasive and harmful impact of deep fakes on adults and children when their images are used to create pornography, which is then spread worldwide and unchecked on the internet. All of the witnesses testified in support of the NO FAKES Act of 2025, a bipartisan bill and a complement to the TAKE IT DOWN Act. The language of the current legislation can be found <a href="https://www.congress.gov/119/bills/s1367/BILLS-119s1367is.pdf" target="_blank" rel="noreferrer noopener">here</a>. The bill currently provides for a civil cause of action, with a detailed penalty regime, for individuals who have their image or voice used without their permission, and protects online service providers from liability if those providers have systems in place to identify and address the publication and dissemination of deep fakes. The bill also provides a legal process for individuals to obtain information from providers regarding the source of the published materials. The current version additionally endeavors to preempt state law, stating that the &ldquo;rights established under this Act shall preempt any cause of action under State law for the protection of an individual&rsquo;s voice and visual likeness rights in connection with a digital replica, as defined in this Act, in an expressive work.&rdquo;</p>

]]></content:encoded>
					
		
		
		<source url='https://www.cyberadviserblog.com/'>CyberAdviser</source>
	</item>
		<item>
		<title>The Vermont Age-Appropriate Design Code Act: What You Need to Know</title>
		<link>https://www.lexblog.com/2025/06/18/the-vermont-age-appropriate-design-code-act-what-you-need-to-know/</link>
		
		<dc:creator><![CDATA[Odia Kagan]]></dc:creator>
		<pubDate>Wed, 18 Jun 2025 16:50:41 +0000</pubDate>
				<category><![CDATA[Featured Posts]]></category>
		<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/18/the-vermont-age-appropriate-design-code-act-what-you-need-to-know/</guid>

					<description><![CDATA[Vermont recently adopted the Vermont Age-Appropriate Design Code Act, which goes into effect on January 1, 2027.&#160;The law is enforceable by the Vermont Attorney General as an unfair or deceptive act or practice. The Attorney General&#8217;s Office may draft regulations. What are we discussing with clients? In more detail It applies if you: The law...]]></description>
										<content:encoded><![CDATA[
<p>Vermont recently adopted the Vermont Age-Appropriate Design Code Act, which goes into effect on January 1, 2027.&nbsp;The law is enforceable by the Vermont Attorney General as an unfair or deceptive act or practice. The Attorney General&#8217;s Office may draft regulations.</p>



<h2 class="wp-block-heading">What are we discussing with clients?</h2>



<ul class="wp-block-list">
<li>Note what is NOT publicly available data. That might impact interpretation of other laws with this term.</li>



<li>Note what IS &#8220;reasonably likely to be accessed by minors&#8221; for the same reason. It is a very broad term here.</li>



<li>The duty of care is vague (result-based) and thus a tall order to follow.</li>



<li>Note the legal obligation to have terms of use and community standards (a first?).</li>



<li>Note the very extensive transparency obligations. This is a privacy notice PLUS PLUS, with detail and retention terms for every feature and information on the algorithms.</li>
</ul>



<h2 class="wp-block-heading">In more detail</h2>



<h2 class="wp-block-heading">It applies if you:</h2>



<ul class="wp-block-list">
<li>Conduct business in Vermont.</li>



<li>Generate a majority of your annual revenue from online services.</li>



<li>Are reasonably likely to be accessed by a minor.</li>



<li>Determine the purpose and means of processing of personal data.</li>
</ul>



<h2 class="wp-block-heading">The law includes a lot of unique carve-outs from &#8220;publicly available information,&#8221; so the following is still personal data:</h2>



<ul class="wp-block-list">
<li>Biometric data collected without knowledge.</li>



<li>Information that is collated and combined to create a consumer profile.</li>



<li>Information that is made available for sale.</li>



<li>Genetic data.</li>
</ul>



<h2 class="wp-block-heading">Is the data &#8220;reasonably likely to be accessed by minors?&#8221; Consider:</h2>



<ul class="wp-block-list">
<li>Directed at children under COPPA.</li>



<li>The service or product is determined, based on (1) competent and reliable evidence regarding audience composition, (2) internal company research, or (3) what the operator knew or should have known, to be routinely accessed by an audience that is composed of at least&nbsp;TWO PERCENT minors.</li>



<li>Data minimization:&nbsp;Only collect/retain/share data that is necessary to provide the service/feature with which the covered minor is actively and knowingly engaged&nbsp;with additional limitation on algorithmic recommendations.</li>
</ul>



<h2 class="wp-block-heading">Minimum duty of care</h2>



<p>The use of the personal data, the design of an online service, product, or feature, or the content of the media viewed must not result in:</p>



<ul class="wp-block-list">
<li>Reasonably foreseeable emotional distress to a covered minor.</li>



<li>Reasonably foreseeable compulsive use of the online service, product, or feature by a covered minor.</li>



<li>Discrimination against a covered minor.</li>



<li>Default settings must be set to the highest levels of privacy, which includes blocking known adult users from seeing the minor&rsquo;s account or specific content, or from commenting or direct messaging, and not displaying the covered minor&rsquo;s location to other users.</li>
</ul>



<h2 class="wp-block-heading">Provide transparency, which includes:</h2>



<ul class="wp-block-list">
<li>Terms of use, privacy notice and community standards.</li>



<li>Detailed disclosure per feature, including data retention.</li>



<li>Purpose of each algorithmic recommendation system.</li>



<li>The inputs used by the algorithmic recommendation system and how those inputs influence recommendations.</li>
</ul>

]]></content:encoded>
					
		
		
		<source url='https://dataprivacy.foxrothschild.com/'>Privacy Compliance &amp; Data Security</source>
<enclosure url='https://www.lexblog.com/wp-content/uploads/2025/06/PUB_Privacy10_1998603458.jpg' type='image/jpeg' length='113211' />	</item>
		<item>
		<title>Oregon Extends Privacy Law to Specifically List Auto Makers</title>
		<link>https://www.lexblog.com/2025/06/18/oregon-extends-privacy-law-to-specifically-list-auto-makers/</link>
		
		<dc:creator><![CDATA[Liisa Thomas]]></dc:creator>
		<pubDate>Wed, 18 Jun 2025 16:36:37 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/18/oregon-extends-privacy-law-to-specifically-list-auto-makers/</guid>

					<description><![CDATA[In ongoing tweaks to state privacy laws, Oregon has amended its state privacy law to cover auto manufacturers. Specifically, those that process or control personal information that they get from a person&#8217;s use of a car. As most are aware, the law requires disclosures when collecting personal information, provision of rights to consumers (including the...]]></description>
										<content:encoded><![CDATA[
<p>In ongoing tweaks to state privacy laws, Oregon has amended its <a href="https://www.eyeonprivacy.com/2023/07/state-comprehensive-privacy-laws-beaver-state-makes-a-dozen/">state privacy law</a> to cover auto manufacturers. Specifically, those that process or control personal information that they get from a person&rsquo;s use of a car. As most are aware, the law requires disclosures when collecting personal information, provision of rights to consumers (including the ability to delete and port personal information), and limits on profiling among other things. While the Oregon law, like most state &ldquo;comprehensive&rdquo; laws, includes applicability thresholds, there are no thresholds for this new applicability to car manufacturers. The law is slated to go into effect in September of this year.</p>



<span id="more-3235182"></span>



<p><strong>Putting It Into Practice: This amendment demonstrates a growing concern by lawmakers and regulators around data collected in motor vehicles. We anticipate seeing similar developments in coming months.</strong></p>




]]></content:encoded>
					
		
		
		<source url='https://www.eyeonprivacy.com/'>Eye On Privacy</source>
<enclosure url='https://www.lexblog.com/wp-content/uploads/2025/06/Insurance-Blog-Image-Cars-660x283-1.png' type='image/jpeg' length='327796' />	</item>
		<item>
		<title>Connecticut Amends Privacy Law</title>
		<link>https://www.lexblog.com/2025/06/18/connecticut-amends-privacy-law/</link>
		
		<dc:creator><![CDATA[Sofia Reed, Gregory P. Szewczyk and Kelsey Fayer]]></dc:creator>
		<pubDate>Wed, 18 Jun 2025 16:18:24 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<category><![CDATA[Technology]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/18/connecticut-amends-privacy-law/</guid>

					<description><![CDATA[On June 11, 2025, Connecticut passed Senate Bill 01295 (SB 01295). If signed by the governor, SB 01295 will amend the existing Connecticut Data Privacy Act (CTDPA) in several important ways, with the amendments going into effect on July 1, 2026. Expanded Scope: In what is seen as a general trend, SB 01295 broadens the...]]></description>
										<content:encoded><![CDATA[
<p>On June 11, 2025, Connecticut passed <a href="https://www.cga.ct.gov/2025/ACT/PA/PDF/2025PA-00113-R00SB-01295-PA.PDF">Senate Bill 01295 (SB 01295)</a>.  If signed by the governor, SB 01295 will amend the existing Connecticut Data Privacy Act (CTDPA) in several important ways, with the amendments going into effect on July 1, 2026.</p>



<p><strong>Expanded Scope</strong>: In what is seen as a general trend, SB 01295 broadens the reach of the CTDPA by lowering applicability thresholds:&nbsp;the law will apply to organizations that control or process the personal data of 35,000 or more consumers, control or process any sensitive data, or engage in the sale of personal data. The bill also expands the definition of sensitive data, thereby increasing the number of covered entities.</p>



<p>Signaling another important trend, the amendment would remove the entity-level exemption for financial institutions under the Gramm-Leach-Bliley Act (GLBA), and instead only exempt data subject to the GLBA.  Notably, however, certain types of financial institutions may continue to enjoy entity-level exemptions.</p>



<p><strong>Stricter Regulations for Minors</strong>: Social media platforms and online services targeting minors (individuals under 18) would also be subject to heightened obligations and standards, including restrictions on processing minors&rsquo; personal data in connection with certain risks and automated decisions.</p>



<p><strong>Additional Changes</strong>: The amendments would also place additional responsibilities on data controllers, including those related to consumer rights requests, data protection assessments, and privacy notices and disclosures.</p>



<p class="has-text-align-center">***</p>



<p>Although this legislative season has not seen revolutionary new laws passed, amendments in states like Connecticut, Colorado, and Montana are important reminders that changes to existing laws can have significant impacts&ndash;both by broadening the scope of their application and by altering current compliance regimes.</p>

]]></content:encoded>
					
		
		
		<source url='https://www.cyberadviserblog.com/'>CyberAdviser</source>
<enclosure url='https://www.lexblog.com/wp-content/uploads/2025/06/1750198663-4333-2978-lxb_photoMAIq7eiPLEQlxb_photo-.jpg' type='image/jpeg' length='562250' />	</item>
		<item>
		<title>Privacy in the Big Sky State: Montana’s Consumer Privacy Law Gets Amended</title>
		<link>https://www.lexblog.com/2025/06/18/privacy-in-the-big-sky-state-montanas-consumer-privacy-law-gets-amended/</link>
		
		<dc:creator><![CDATA[Philip M. Duclos]]></dc:creator>
		<pubDate>Wed, 18 Jun 2025 12:20:30 +0000</pubDate>
				<category><![CDATA[Employment & Labor]]></category>
		<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/18/privacy-in-the-big-sky-state-montanas-consumer-privacy-law-gets-amended/</guid>

					<description><![CDATA[Montana recently amended its privacy law through Senate Bill 297, effective October 1, 2025, strengthening consumer protections and requiring businesses to revisit their privacy policies that apply to citizens of Montana. Importantly, it lowered the threshold for applicability to persons and businesses who control or process the personal data of 25,000 or more consumers (previously...]]></description>
										<content:encoded><![CDATA[
<p>Montana recently amended its <a href="https://www.workplaceprivacyreport.com/2023/05/articles/consumer-privacy/montana-passes-9th-comprehensive-consumer-privacy-law-in-the-u-s/">privacy law</a> through <a href="https://bills.legmt.gov/#/laws/bill/2/LC0372?open_tab=sum">Senate Bill 297</a>, effective <strong>October 1, 2025</strong>, strengthening consumer protections and requiring businesses to revisit their privacy policies that apply to citizens of Montana. Importantly, it lowered the threshold for applicability to persons and businesses that control or process the personal data of 25,000 or more consumers (previously 50,000), unless the controller uses that data solely for completing payments. For those who derive more than 25% of gross revenue from the sale of personal data, the threshold is now 15,000 or more consumers (previously 25,000).</p>



<p>With the amendments, nonprofits are no longer exempt unless they are set up to detect and prevent insurance fraud. Insurers are now similarly exempt.</p>



<p>When a consumer requests confirmation that a controller is processing their data, the controller may no longer disclose the data itself but must instead identify whether it possesses: (1)&nbsp;social security numbers, (2)&nbsp;ID numbers, (3)&nbsp;financial account numbers, (4)&nbsp;health insurance or medical identification numbers, (5)&nbsp;passwords, security questions, or answers, or (6)&nbsp;biometric data.</p>



<p>Privacy notices must now include: (1)&nbsp;personal data categories, (2)&nbsp;the controller&rsquo;s purpose in possessing personal data, (3)&nbsp;categories the controller sells or shares with third parties, (4)&nbsp;categories of third parties, (5)&nbsp;contact information for the controller, (6)&nbsp;an explanation of rights and how to exercise them, and (7)&nbsp;the date the privacy notice was last updated. Privacy notices must be accessible to and usable by people with disabilities and available in each language in which the controller provides a product or service. Any material changes to the controller&rsquo;s privacy notice or practices require notices to affected consumers and the opportunity to withdraw consent. Notices need not be Montana-specific, but controllers must conspicuously post them on websites, in mobile applications, or through whatever medium the controller uses to interact with customers.</p>



<p>The amendments further clarified information the attorney general must publicly provide, including an online mechanism for consumers to file complaints. Further, the attorney general may now issue civil investigative demands and need not issue any notice of violation or provide a 60-day period for the controller to correct the violation.</p>

]]></content:encoded>
					
		
		
		<source url='https://www.workplaceprivacyreport.com/'>Workplace Privacy, Data Management &amp; Security Report</source>
	</item>
		<item>
		<title>Agentic AI for Software Security: Eliminate More Vulnerabilities, Triage Less</title>
		<link>https://www.lexblog.com/2025/06/18/agentic-ai-for-software-security-eliminate-more-vulnerabilities-triage-less/</link>
		
		<dc:creator><![CDATA[Jodi Daniels]]></dc:creator>
		<pubDate>Wed, 18 Jun 2025 08:14:31 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<category><![CDATA[AI]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/18/agentic-ai-for-software-security-eliminate-more-vulnerabilities-triage-less/</guid>

					<description><![CDATA[]]></description>
										<content:encoded><![CDATA[]]></content:encoded>
					
		
		
		<source url='https://redcloveradvisors.com/feed/'>Red Clover Advisors Blog</source>
	</item>
		<item>
		<title>North Dakota Passes New Data Security Law for “Financial Corporations”</title>
		<link>https://www.lexblog.com/2025/06/17/north-dakota-passes-new-data-security-law-for-financial-corporations/</link>
		
		<dc:creator><![CDATA[Liisa Thomas, Kathryn Smith and James O&#039;Reilly*]]></dc:creator>
		<pubDate>Tue, 17 Jun 2025 16:24:09 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/17/north-dakota-passes-new-data-security-law-for-financial-corporations/</guid>

					<description><![CDATA[North Dakota recently passed a law establishing new rules for certain financial companies operating in the state &#8211; specifically &#8220;financial corporations.&#8221; The new obligations will take effect on August 1, 2025. They will apply to businesses that the North Dakota department of financial institutions regulates. Financial institutions (like banks and loan companies) and credit unions...]]></description>
										<content:encoded><![CDATA[
<p>North Dakota recently passed a <a href="https://legiscan.com/ND/bill/HB1127/2025">law</a> establishing new rules for certain financial companies operating in the state &ndash; specifically &ldquo;financial corporations.&rdquo; The new obligations will take effect on August 1, 2025. They will apply to businesses that the North Dakota department of financial institutions regulates. Financial institutions (like banks and loan companies) and credit unions are not regulated by that entity.</p>



<span id="more-3233997"></span>



<p>Under the new requirements, covered entities must create a written information security program and designate a person to oversee that program. Covered entities must base their information security programs on a written risk assessment that identifies risks to their customers&rsquo; information. The program must include breach response and reporting provisions for incidents that impact customer information. Covered entities will also have to periodically complete new risk assessments to evaluate their security measures and monitor the efficacy of the program.</p>



<p>The law also creates new rules for reporting data breaches. Namely, covered financial companies must notify the North Dakota Commissioner of the Department of Financial Institutions if there is a &ldquo;notification event.&rdquo; A notification event occurs when an unauthorized person accesses unencrypted customer information. If this event involves the information of at least 500 customers, the company must notify the Commissioner as soon as possible, but no later than 45 days after discovering the issue. The law states that a covered entity &ldquo;discovers&rdquo; an event as soon as any employee, officer, or agent of the corporation learns about it.</p>



<p><strong>Putting it Into Practice: Financial corporations regulated by the North Dakota department of financial institutions should take note of these changes and make updates as might be needed to their security program and incident response plan prior to August 1<sup>st</sup>.</strong></p>

]]></content:encoded>
					
		
		
		<source url='https://www.eyeonprivacy.com/'>Eye On Privacy</source>
<enclosure url='https://www.lexblog.com/wp-content/uploads/2025/06/Privacy-Security-Blog-Image-660x283-1.png' type='image/jpeg' length='281504' />	</item>
		<item>
		<title>Cybersecurity in the Era of Generative and Agentic AI: Six Observations</title>
		<link>https://www.lexblog.com/2025/06/17/cybersecurity-in-the-era-of-generative-and-agentic-ai-six-observations/</link>
		
		<dc:creator><![CDATA[Zachary Heck]]></dc:creator>
		<pubDate>Tue, 17 Jun 2025 14:21:09 +0000</pubDate>
				<category><![CDATA[Featured Posts]]></category>
		<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/17/cybersecurity-in-the-era-of-generative-and-agentic-ai-six-observations/</guid>

					<description><![CDATA[Last week, I had the privilege to attend one of the Midwest&#8217;s largest artificial intelligence conferences dedicated to AI developers, users, and enthusiasts: Cincy AI Week. During the three-day event, which brought together over 950 local professionals, I spoke on a panel entitled &#8220;Managing Risk in the Age of AI and Automation.&#8221; Here are six...]]></description>
										<content:encoded><![CDATA[
<figure style=" max-width: 100%; height: auto; " class="wp-block-image alignleft size-large is-resized"><img decoding="async" src="https://www.lexblog.com/wp-content/uploads/2025/06/black-microphone-conference-room-filtered-image-processed-v-640x427-1.jpg" alt="" class="wp-image-2883" style=" max-width: 100%; height: auto; width:260px;height:auto"></figure>



<p>Last week, I had the privilege to attend one of the Midwest&rsquo;s largest artificial intelligence conferences dedicated to AI developers, users, and enthusiasts: Cincy AI Week. During the three-day event, which brought together over 950 local professionals, I spoke on a panel entitled &ldquo;Managing Risk in the Age of AI and Automation.&rdquo; <br><br>Here are six important observations I shared during that panel:</p>



<span id="more-3233777"></span>



<p><strong>1.&nbsp; Organizations need to shift their risk posture from reactive to proactive.</strong></p>



<p>AI technologies are increasingly influencing decision-making, data handling, and customer interactions. Accordingly, legal risks &ndash; from security incidents and data breaches to algorithmic bias &ndash; are no longer hypothetical, but imminent. Organizations must embed legal and ethical oversight into each stage of AI development and deployment, rather than addressing issues only after they arise. This means engaging legal counsel early, implementing clear accountability frameworks (see below), and ensuring transparency and auditability in AI systems. Doing so not only mitigates risk, but also builds trust with regulators, clients, and stakeholders in a rapidly evolving legal landscape.</p>



<p><strong>2. Data governance, vendor risk, and internal misuse remain major vulnerabilities for organizations adopting AI tools.</strong></p>



<p>As organizations rapidly adopt AI tools, many overlook vulnerabilities in data governance, vendor management, and internal use, each posing significant legal and regulatory exposure. Inadequate data governance can lead to unauthorized use of sensitive or non-compliant data in model training. Such unauthorized use could violate data protection laws such as the GDPR, CCPA, HIPAA, and various state consumer privacy laws. Meanwhile, reliance on third-party AI vendors without thorough due diligence or contractual safeguards can expose organizations to hidden liability such as intellectual property infringement and undisclosed model risks. Internally, the misuse of AI &ndash; whether through employee error, lack of oversight, or improper prompts &ndash; can result in biased outputs, misinformation, or breaches of confidentiality. Without a comprehensive legal strategy to address these gaps, businesses risk serious reputational, financial, and legal consequences.</p>



<p><strong>3. Legal and technical teams need to collaborate on AI risk management.</strong></p>



<p>Effective AI risk management demands close collaboration between legal and technical teams because the challenges posed by AI straddle both legal obligations and technical complexities. Legal teams bring critical expertise in regulatory compliance, intellectual property, privacy, and liability. Technical teams, on the other hand, understand how AI models are built, trained, and deployed; thereby giving technical teams insight into an AI model&rsquo;s limitations and potential for unintended outcomes. Without alignment, organizations risk (a) legal teams overlooking how a system actually functions, and (b) technical teams missing key regulatory implications. Instead, by working together from the beginning of the AI development or adoption lifecycle, these teams can design and procure AI systems that are not only innovative and efficient, but also legally sound, transparent, and defensible under regulatory scrutiny. This integrated approach is necessary to mitigate risk, ensure accountability, and maintain trust in an increasingly AI-driven environment.</p>



<p><strong>4.&nbsp; NIST and ISO frameworks are a great starting point for AI-specific cybersecurity planning.</strong></p>



<p>The National Institute of Standards and Technology (NIST) AI Risk Management Framework offers structured guidance on identifying, assessing, and mitigating risks unique to AI systems, such as data poisoning, model inversion, and adversarial attacks. Adapting cybersecurity policies, such as those modeled on ISO/IEC 27001 or the NIST Cybersecurity Framework, to address AI-specific vulnerabilities ensures a more resilient security posture. In-house and outside counsel play a critical role in this process by ensuring that these frameworks are aligned with regulatory requirements and contractual obligations, particularly in sectors handling sensitive or high-risk data. Organizations that integrate these frameworks into their AI development lifecycle are better positioned to manage evolving threats while demonstrating compliance and due diligence to regulators, customers, and stakeholders.</p>



<p><strong>5.&nbsp; Transparency and explainability are essential for managing AI risk.</strong></p>



<p>Legal scrutiny around algorithmic decision-making has intensified over the last two years. Transparency and explainability are not just ethical ideals; they are practical safeguards.&nbsp; Transparent AI systems allow organizations to understand how decisions are made, which is essential for identifying and mitigating bias, ensuring compliance with anti-discrimination and data protection laws, and responding effectively to audits and litigation. Explainability strengthens accountability by enabling organizations to justify automated decisions to regulators, courts, and impacted individuals in high-risk environments like finance, healthcare, and employment. Lack of explainability, on the other hand, can lead to regulatory penalties, reputational damage, or invalidated outcomes. Building AI systems with transparency and explainability from the outset is a key step in reducing legal risk and fostering trust.</p>



<p><strong>6. Security-first cultures do not need to slow innovation.</strong></p>



<p>Building a security-first culture does not mean stifling progress. Instead, a security-first approach means adopting smart, scalable safeguards into the innovation process from day one (in other words, security-by-design). By aligning legal, security, and technical teams early in the development or adoption lifecycle, organizations can proactively address risks without creating bottlenecks. Clear policies, ongoing employee training, and cross-functional collaboration through an AI Governance Committee ensure that security is treated as a shared responsibility instead of a final checkpoint. This approach empowers teams to innovate with confidence by knowing that privacy, compliance, and risk management are integrated rather than obstructive.&nbsp;</p>



<p>As AI reshapes how organizations operate, the intersection of cybersecurity, legal compliance, and responsible innovation has never been more critical. The six observations I shared during Cincy AI Week underscore a central theme: managing AI risk is not just a technical or legal challenge; instead, it is a strategic imperative that demands proactive, cross-disciplinary coordination. By merging legal oversight into the development process, embracing transparency, leveraging established frameworks, and fostering a culture where security and innovation go hand-in-hand, organizations can mitigate risk and build resilient, trustworthy AI systems. In other words, as AI continues to evolve, so too must risk strategies. </p>



<p>For more information on data privacy and security regulations, AI, and other data privacy questions, please visit <a href="https://www.privacyanddatasecurityinsight.com/">Taft&rsquo;s Privacy and Data Security Insights blog</a>.</p>

]]></content:encoded>
					
		
		
		<source url='https://www.privacyanddatasecurityinsight.com/'>Taft Privacy &amp; Data Security Insights</source>
	</item>
		<item>
		<title>AI and Job Postings: Navigating Ontario’s Upcoming Requirements</title>
		<link>https://www.lexblog.com/2025/06/16/ai-and-job-postings-navigating-ontarios-upcoming-requirements/</link>
		
		<dc:creator><![CDATA[Imran Ahmad (CA), Domenic Presta (CA), Joseph Cohen-Lyons and Humna Shaikh]]></dc:creator>
		<pubDate>Mon, 16 Jun 2025 20:19:13 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/16/ai-and-job-postings-navigating-ontarios-upcoming-requirements/</guid>

					<description><![CDATA[On March 21, the Ontario’s Bill 149, Working for Workers Four Act, 2024 (“Bill 149”) received Royal Assent. ]]></description>
										<content:encoded><![CDATA[
<p>On March 21, 2024, Ontario&rsquo;s Bill 149, <em>Working for Workers Four Act</em>, 2024<a href="#_ftn1" id="_ftnref1">[1]</a> (&ldquo;<strong>Bill 149</strong>&rdquo;) received Royal Assent. It introduced significant amendments to the <em>Employment Standards Act</em>, 2000 (&ldquo;<strong>ESA</strong>&rdquo;), including a mandate for employers to disclose the use of artificial intelligence (&ldquo;<strong>AI</strong>&rdquo;) in publicly advertised job postings. As of January 1, 2026, employers with twenty-five (25) or more employees must include a statement on job postings if AI is used to screen, assess, or select applicants.</p>



<p>As employers prepare for this change, compliance efforts should align hiring practices with broader legal and ethical standards. Transparency and fairness are important considerations associated with the responsible use of AI, and Ontario&rsquo;s mandatory disclosure requirements take these into account. By being mindful of transparency and fairness when using AI, employers can reduce the risk of running afoul of disclosure requirements while demonstrating a commitment to ethical hiring practices in an increasingly automated landscape.</p>



<p><strong>AI Disclosure Requirements Under Bill 149</strong></p>



<p>Bill 149 amends the ESA by introducing several new requirements, including requirements with respect to AI, which it defines broadly as &ldquo;<em>a machine-based system that, for explicit or implicit objectives, infers from the input it receives in order to generate outputs such as predictions, content, recommendations or decisions that can influence physical or virtual environment.</em>&rdquo;<a id="_ftnref2" href="#_ftn2">[2]</a></p>



<p>With respect to AI, Bill 149 provides that employers must disclose if AI is used in assessing or selecting candidates in any publicly posted job.</p>



<p>For a detailed summary of the various changes implemented by Bill 149, see our blog post: <a href="https://www.nortonrosefulbright.com/en/knowledge/publications/68599b7e/ontarios-working-for-workers-four-act-receives-royal-assent">Ontario&rsquo;s Working for Workers Four Act receives royal assent | Global law firm | Norton Rose Fulbright</a>.</p>



<p><strong>Scope and Exceptions</strong></p>



<p>These rules apply to external job postings accessible to the public. Notably, Bill 149 exempts:</p>



<ul class="wp-block-list">
<li>Internal postings (for current employees only).</li>



<li>General &ldquo;help wanted&rdquo; signs.</li>



<li>Recruitment for work performed outside Ontario.</li>
</ul>



<p>Employers with fewer than twenty-five (25) employees are not currently subject to these obligations.<a href="#_ftn3" id="_ftnref3">[3]</a></p>



<p><strong>Legal and Human Rights Considerations</strong></p>



<p>Automated decision-making is increasingly being used by organizations to attract top talent, streamline the hiring process, or aid in performance evaluations.<a href="#_ftn4" id="_ftnref4">[4]</a> However, the potential impact of algorithmic bias and the need for transparency remain important considerations for the fair and responsible use of AI.</p>



<p>The Ontario Human Rights Commission (OHRC) has flagged AI use in employment as a growing risk, citing the potential for indirect discrimination through algorithmic bias.<a href="#_ftn5" id="_ftnref5">[5]</a> For example, AI tools trained on historical hiring data could replicate unfair requirements and biased language in job advertisements. This could unintentionally favor certain demographics and exclude others with certain characteristics, thereby potentially infringing rights under the Ontario <em>Human Rights Code</em> (the &ldquo;<strong>Code</strong>&rdquo;).</p>



<p>This means failure to adhere to Bill 149&rsquo;s AI disclosure requirements may not only trigger ESA enforcement but also human rights complaints under the Code, particularly if AI-driven decisions lead to discriminatory outcomes.</p>



<p><strong>Practical Risk Mitigation Strategies</strong></p>



<p>Although a regulation accompanying Bill 149 was released in late 2024<a href="#_ftn6" id="_ftnref6">[6]</a> and provides a definition of AI, it does not clarify which specific tools, systems, or processes fall within this definition and would therefore trigger the disclosure requirement for publicly advertised job postings.</p>



<p>The current lack of clarity around what constitutes AI use in job postings under Bill 149 may lead to inconsistent reporting (either underreporting or overreporting) which could inadvertently increase the risk of non-compliance. In the absence of further guidance from the Government of Ontario and given the AI disclosure requirements that are expected to take effect next January, employers should use the intervening months to prepare for compliance and consider taking the following steps:</p>



<ul class="wp-block-list">
<li><strong>Conduct Algorithmic Impact Assessments: </strong>Regular audits can identify biased outputs or data gaps in AI tools. Employers should document AI decision logic and test for discriminatory patterns.</li>



<li><strong>Develop Transparent Disclosure Practices: </strong>While Bill 149 does not clarify the exact language to meet the AI disclosure requirements, to err on the side of caution, job postings should clearly state how AI is used in the recruitment and onboarding process, and at what stages.</li>



<li><strong>Integrate Human Oversight: </strong>Employers already using AI should consider a &ldquo;human-in-the-loop&rdquo; approach for hiring practices. This helps ensure final hiring decisions are reviewed and contextualized by HR professionals in order to limit overreliance on algorithmic judgment and prevent undesired results (for example, inadvertently excluding certain demographics from the hiring pool due to poor training data relied-upon by AI models). The OHRC also recommends that employers demonstrate a reasonable degree of transparency associated with the AI solutions that are leveraged during the course of the hiring process.</li>



<li><strong>Train HR and Legal Teams: </strong>Staff should have a clear understanding of how AI is deployed by their organizations, as well as a firm grasp of their associated legal disclosure and privacy obligations. This includes reviewing and auditing existing practices and policies and conducting gap analyses to prepare for new obligations under Bill 149, such as the adoption of an AI governance framework for the responsible use of AI.</li>



<li><strong>Review Vendor Agreements: </strong>If an employer is using third-party AI solutions to assist with hiring, it would be advisable to confirm that such solutions will comply with Bill 149 and, to the extent there are compliance gaps, to remediate them through contractual amendments.</li>
</ul>



<p><strong>Key Takeaways</strong></p>



<p>Ontario&rsquo;s Bill 149 introduces new AI disclosure requirements that reflect the growing role of AI use in hiring practices. As these rules come into effect, businesses should begin assessing how AI is used in their recruitment processes and take steps to align with evolving legal expectations. While the focus is on transparency and fairness, these principles now carry legal weight across employment, human rights, and privacy frameworks. This is particularly important in the context of a gatekeeping function such as the hiring of employees. Employers must remain aware of and make efforts to mitigate against the risks associated with AI systems that may have a material adverse impact on individuals, whether by act or omission. Preparing early can help organizations navigate these changes smoothly and responsibly.</p>



<hr class="wp-block-separator has-alpha-channel-opacity">



<p><a href="#_ftnref1" id="_ftn1">[1]</a> <a href="https://www.ola.org/en/legislative-business/bills/parliament-43/session-1/bill-149">Bill 149, Working for Workers Four Act, 2024</a>.</p>



<p><a href="#_ftnref2" id="_ftn2">[2]</a> <a href="https://www.ontario.ca/laws/regulation/r24476">Section 2(1), Employment Standards Act, Ontario regulation 476/24, 2024</a>.</p>



<p><a href="#_ftnref3" id="_ftn3">[3]</a> <a href="https://www.ontario.ca/laws/regulation/r24476">Section 2(1), Employment Standards Act, Ontario regulation 476/24, 2024</a>.</p>



<p><a href="#_ftnref4" id="_ftn4">[4]</a> <a href="https://www.theglobeandmail.com/business/article-ai-eases-the-burden-of-repetitive-hr-work-but-the-human-touch-is-still/">AI eases the burden of repetitive HR work, but the human touch is still needed, Globe and Mail (2024)</a>.</p>



<p><a href="#_ftnref5" id="_ftn5">[5]</a> <a href="https://www3.ohrc.on.ca/en/news-center/ontario-human-rights-commission-submission-standing-committee-social-policy-regarding">Ontario Human Rights Commission Submission to the Standing Committee on Social Policy Regarding Bill 149, Working for Workers Four Act, 2023</a>.</p>



<p><a href="#_ftnref6" id="_ftn6">[6]</a> <a href="https://www.ontario.ca/laws/regulation/r24476">Section 2(1), Employment Standards Act, Ontario regulation 476/24, 2024</a>.</p>

]]></content:encoded>
					
		
		
		<source url='https://www.dataprotectionreport.com/'>Data Protection Report</source>
<enclosure url='https://www.lexblog.com/wp-content/uploads/2025/06/Print-AdobeStock_788399040-scaled-1.png' type='image/png' length='905671' />	</item>
		<item>
		<title>Growing List of States Attempting to Regulate Kids’ Social Media Accounts: Nebraska Husks Up</title>
		<link>https://www.lexblog.com/2025/06/16/growing-list-of-states-attempting-to-regulate-kids-social-media-accounts-nebraska-husks-up/</link>
		
		<dc:creator><![CDATA[Liisa Thomas, Kathryn Smith and James O&#039;Reilly*]]></dc:creator>
		<pubDate>Mon, 16 Jun 2025 16:22:20 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/16/growing-list-of-states-attempting-to-regulate-kids-social-media-accounts-nebraska-husks-up/</guid>

					<description><![CDATA[Nebraska&#8217;s governor signed a bill into law that, among other things, creates the Parental Rights in Social Media Act. The provisions of the law will go into effect July 1, 2026, unless challenged. The law is similar to several other states, most of which have been challenged (including Arkansas, California, and Utah) and some struck...]]></description>
										<content:encoded><![CDATA[
<p>Nebraska&rsquo;s governor signed a <a href="https://legiscan.com/NE/text/LB383/id/3241848">bill</a> into law that, among other things, creates the Parental Rights in Social Media Act. The provisions of the law will go into effect July 1, 2026, unless challenged. The law is similar to several other states, most of which have been challenged (including <a href="https://www.eyeonprivacy.com/2025/04/arkansas-kids-social-media-law-another-one-bites-the-dust/">Arkansas</a>, <a href="https://www.eyeonprivacy.com/2025/02/californias-kids-social-media-law-wrangling-continues-and-maryland-too/">California</a>, and <a href="https://www.eyeonprivacy.com/2024/04/mother-may-i-florida-and-utah-recently-regulations-for-minor-use-of-social-media-platforms/">Utah</a>) and some struck down.</p>



<span id="more-3232812"></span>



<p>If the law goes unchallenged, it will, unlike the laws of other states, create a private right of action. Anyone who violates the act may be subject to a lawsuit brought by an injured party and may be ordered to pay damages, attorney&rsquo;s fees, and other relief. In addition, the Nebraska Attorney General can enforce the law and seek penalties of up to $2,500 per violation.</p>



<p>Obligations placed on social media companies under the law include:</p>



<ul class="wp-block-list">
<li><strong>Age verification</strong>: Social media companies (or their vendors) will need to verify the ages of all people who attempt to create an account, and the law prohibits anyone under 18 from creating an account. The law also specifically requires that social media companies delete the identifying information they collect when checking user ages.</li>



<li><strong>Parental consent</strong>: The law requires parental consent before minors can create social media accounts. Companies must also give parents a mechanism to revoke their consent. If a parent revokes consent, the social media company must remove that child&rsquo;s account and must prevent the child from creating a new account unless the parent provides consent again.</li>



<li><strong>Parental supervision</strong>: Parents will need to be given a way to supervise their children&rsquo;s social media use. This includes access to their children&rsquo;s posts and messages, and controls over their privacy and account settings. In addition, parents must be able to monitor and limit the amount of time the minor spends using the social media site.</li>
</ul>



<p><strong>Putting it Into Practice: Nebraska joins a growing number of states attempting to regulate children&rsquo;s use of social media. We will continue to monitor the status of this new Nebraska law ahead of its mid-2026 effective date, and we anticipate seeing similar legislation in other states.</strong></p>



<hr class="wp-block-separator has-alpha-channel-opacity">

]]></content:encoded>
					
		
		
		<source url='https://www.eyeonprivacy.com/'>Eye On Privacy</source>
<enclosure url='https://www.lexblog.com/wp-content/uploads/2025/06/Social-Media-and-Games-Blog-Image_Social-Media-Icons_660x283.png' type='image/png' length='149549' />	</item>
		<item>
		<title>The Growing Cyber Risks from AI — and How Organizations Can Fight Back</title>
		<link>https://www.lexblog.com/2025/06/16/the-growing-cyber-risks-from-ai-and-how-organizations-can-fight-back/</link>
		
		<dc:creator><![CDATA[Joseph J. Lazzarotti]]></dc:creator>
		<pubDate>Mon, 16 Jun 2025 13:37:44 +0000</pubDate>
				<category><![CDATA[Employment & Labor]]></category>
		<category><![CDATA[Featured Posts]]></category>
		<category><![CDATA[Privacy & Data Security]]></category>
		<category><![CDATA[AI]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/16/the-growing-cyber-risks-from-ai-and-how-organizations-can-fight-back/</guid>

					<description><![CDATA[Artificial Intelligence (AI) is transforming businesses&#8212;automating tasks, powering analytics, and reshaping customer interactions. But like any powerful tool, AI is a double-edged sword. While some adopt AI for protection, attackers are using it to scale and intensify cybercrime. Here&#8217;s a high-level look at emerging AI-powered cyber risks in 2025&#8212;and steps organizations can take to defend themselves....]]></description>
										<content:encoded><![CDATA[
<p>Artificial Intelligence (AI) is transforming businesses&mdash;automating tasks, powering analytics, and reshaping customer interactions. But like any powerful tool, AI is a double-edged sword. While some adopt AI for protection, attackers are using it to scale and intensify cybercrime. Here&rsquo;s a high-level look at emerging AI-powered cyber risks in 2025&mdash;and steps organizations can take to defend themselves.</p>



<p><strong>AI-Generated Phishing &amp; Social Engineering</strong></p>



<p>Cybercriminals now use generative AI to craft near-perfect phishing messages&mdash;complete with accurate tone, logos, and language&mdash;making them hard to distinguish from real communications. Voice cloning tools enable &ldquo;deepfake&rdquo; calls from executives, while deepfake video can simulate someone giving fraudulent instructions.</p>



<p>Thanks to AI, <a href="https://tech-adv.com/blog/ai-cyber-attack-statistics/">according to Tech Advisors</a>, phishing attacks are skyrocketing&mdash;phishing surged 202% in late 2024, and over 80% of phishing emails now incorporate AI, with nearly 80% of recipients opening them. These messages are bypassing filters and fooling employees.</p>



<p><strong>Adaptive AI-Malware &amp; Autonomous Attacks</strong></p>



<p>It is not just the threat actors but the AI itself that drives the attack. As <a href="https://www.cyberdefensemagazine.com/ai-powered-cyber-attacks-and-data-privacy-in-the-age-of-big-data/">Cyber Defense Magazine</a> reports:</p>



<p class="is-style-callout"><em>Compared to the traditional process of cyber-attacks, the attacks driven by AI have the capability to automatically learn, adapt, and develop strategies with a minimum number of human interventions. These attacks proactively utilize the algorithms of machine learning, natural language processing, and deep learning models. They leverage these algorithms in the process of determining and analyzing issues or vulnerabilities, avoiding security and detection systems, and developing phishing campaigns that are believable.</em></p>



<p>As a result, attacks that once took days now unfold in minutes, and detection technology struggles to keep up, permitting faster, smarter strikes to slip through traditional defenses.</p>



<p><strong>Attacks Against AI Models Themselves</strong></p>



<p>Cyberattacks are no longer limited to business email compromises designed to effect fraudulent transfers, or to ransom demands intended to suppress sensitive and compromising personal information. Attackers are going after AI systems themselves. Techniques include:</p>



<ul class="wp-block-list">
<li><strong>Data poisoning</strong> &ndash; adding harmful or misleading data into AI training sets, leading to flawed outputs or missed threats.</li>



<li><strong>Prompt injection</strong> &ndash; embedding malicious instructions in user inputs to hijack AI behavior.</li>



<li><strong>Model theft/inversion</strong> &ndash; extracting proprietary data or reconstructing sensitive training datasets.</li>
</ul>



<p>Compromised AI can lead to skipped fraud alerts, leaked sensitive data, or disclosure of confidential corporate information. Guidance from NIST, <a href="https://nvlpubs.nist.gov/nistpubs/ai/NIST.AI.100-2e2023.pdf">Adversarial Machine Learning <em>A Taxonomy and Terminology of Attacks and Mitigations</em></a>, digs into these quite a bit more, and outlines helpful mitigation measures.</p>



<p><strong>Deepfakes &amp; Identity Fraud</strong></p>



<p>Deepfake audio and video are being used to mimic executives or trusted contacts, instructing staff to transfer funds, disclose passwords, or bypass security protocols.</p>



<p>Deepfakes have exploded&mdash;some reports indicate a 3,000% increase in deepfake fraud activity. These attacks can erode trust, fuel financial crime, and disrupt decision-making.</p>



<p><strong>Supply Chain &amp; Third-Party Attacks</strong></p>



<p>AI accelerates supply chain attacks, enabling automated scanning and compromise of vendor infrastructures. <a href="https://www.cyberdefensemagazine.com/table-stakes-in-2025-threat-intelligence-management-to-counter-emerging-challenges/">Attackers can breach a small provider and rapidly move across interconnected systems</a>. These ripple-effect attacks can disrupt entire industries and critical infrastructure, far beyond the initial target. We have seen these effects with more traditional supply chain cyberattacks; AI will only amplify them.</p>



<p><strong>Enhancing Cyber Resilience, Including Against AI Risks</strong></p>



<p>Here are some suggestions for stepping up defenses and mitigating risk:</p>



<ol start="1" class="wp-block-list">
<li><strong>Enhance Phishing Training for AI-level deception</strong><br>Employees should learn to recognize not just misspellings, but hyper-realistic phishing emails, voice calls, and video impersonations. Simulations should evolve to reflect current AI tactics.</li>



<li><strong>Inventory, vet, and govern AI systems</strong><br>Know which AI platforms you use&mdash;especially third-party tools. Vet them for data protection, model integrity, and update protocols. Keep a detailed registry and check vendor security practices. Relying on a vendor&rsquo;s SOC report simply may not be sufficient, particularly if it is not read carefully and in context.</li>



<li><strong>Validate AI inputs and monitor outputs</strong><br>Check training data for poisoning. Test and stress AI models to spot vulnerabilities. Use filters and anomaly detection to flag suspicious inputs or outputs.</li>



<li><strong>Use AI to defend against AI</strong><br>Deploy AI-driven defensive tools&mdash;like behavior-based detection, anomaly hunting, and automated response platforms&mdash;so you react in real time.</li>



<li><strong>Adopt zero trust and multi-factor authentication (MFA)</strong><br>Require authentication for every access, limit internal privileges, and verify every step&mdash;even when actions appear internal.</li>



<li><strong>Plan for AI-targeted incidents</strong><br>Update your incident response plan with scenarios like model poisoning, deepfake impersonation, or AI-driven malware. Include legal, communications, and other relevant stakeholders in your response teams.</li>



<li><strong>Share intelligence and collaborate</strong><br>Tap into threat intelligence communities, such as Information Sharing and Analysis Centers (&ldquo;ISACs&rdquo;), to share and receive knowledge of emerging AI threats.</li>
</ol>



<p>Organizations that can adapt to a rapidly changing threat landscape will be better positioned to defend against these emerging attack vectors and mitigate harm.</p>

]]></content:encoded>
					
		
		
		<source url='https://www.workplaceprivacyreport.com/'>Workplace Privacy, Data Management &amp; Security Report</source>
<enclosure url='https://www.lexblog.com/wp-content/uploads/2025/06/Futuristic-3D-Render-1750102876-3232598-4983-lxb_photo_0iV9LmPDn0lxb_photo--550x550.jpg' type='image/jpeg' length='275549' />	</item>
		<item>
		<title>UK: Data (Use and Access) Bill passes through Parliament</title>
		<link>https://www.lexblog.com/2025/06/16/uk-data-use-and-access-bill-passes-through-parliament/</link>
		
		<dc:creator><![CDATA[Rachel de Souza]]></dc:creator>
		<pubDate>Mon, 16 Jun 2025 08:21:53 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/16/uk-data-use-and-access-bill-passes-through-parliament/</guid>

					<description><![CDATA[On 11 June 2025, the UK&#8217;s Data (Use and Access) Act 2025 (&#8220;DUA Act&#8221;) was passed and now awaits Royal Assent. The government first announced plans for the new DUA Act in the King&#8217;s speech back in July 2024. The DUA Act introduces reforms to data protection and e-privacy laws and also includes a strong...]]></description>
										<content:encoded><![CDATA[
<p>On 11 June 2025, the UK&#8217;s Data (Use and Access) Act 2025 (&#8220;<strong>DUA Act</strong>&#8221;) was passed and now awaits Royal Assent.</p>



<p>The government first announced plans for the new DUA Act in the King&rsquo;s speech back in July 2024. The DUA Act introduces reforms to data protection and e-privacy laws and also includes a strong emphasis on wider data-related policy initiatives, focussed on facilitating digital identities and securing access to &lsquo;smart&rsquo; or &lsquo;open&rsquo; data sets.&nbsp;</p>



<p>The long-awaited DUA Act proposes very limited changes to the UK data protection regime, which are unlikely to have a material impact on day-to-day compliance for most businesses operating in the UK.</p>



<p>The specific areas of reform proposed include (see our <a href="https://privacymatters.dlapiper.com/2024/11/uk-data-use-and-access-bill-newcomer-or-a-familiar-face/">previous Privacy Matters blog</a> for more information on the changes):</p>



<ul class="wp-block-list">
<li>Creating a statutory <strong>definition of scientific research</strong> to help clarify how the various provisions in the UK GDPR which refer to &lsquo;research&rsquo; are intended to be applied.</li>



<li>Introducing the <strong>concept of &lsquo;recognised legitimate interests&rsquo;</strong> to provide a presumption of legitimacy to certain processing activities that a controller may wish to carry out under Article 6(1)(f) (legitimate interests).</li>



<li><strong>Removing the requirement to establish a qualifying lawful basis before conducting automated decision-making</strong> (the requirement currently at Article 22(2) UK GDPR), except where special category data is used.</li>



<li><strong>Granting the Secretary of State the authority to designate new special categories of personal data</strong> and additional processing activities that fall under the prohibition of processing special category data in Article 9(1) of the UK GDPR.</li>



<li>Introducing <strong>reforms to the rules on cookie consent</strong>.</li>



<li><strong>Aligning the UK GDPR / DPA and PECR</strong> enforcement regimes.</li>



<li>Introducing <strong>amendments that are designed to clarify the UK&rsquo;s approach to the transfer of personal data</strong> internationally and the UK&rsquo;s approach to conduct of adequacy assessments.</li>



<li>Introducing <strong>reforms to the ICO</strong>, including a name change to an Information Commission (rather than a Commissioner) and the introduction of a formal Board structure with an appointed CEO.</li>
</ul>



<p>In the final stages of the DUA Act&#8217;s passage through Parliament, much of the debate focused on the House of Lords&#8217; repeated proposed amendments to ensure transparency around data scraping and the use of text and data to train GPAI models. The proposed amendments required developers of AI models to publish information used in the pre-training, training, fine-tuning and retrieval-augmented generation of the AI model, and to provide an effective mechanism to allow copyright owners to identify all individual works that they own. However, the government argued that the DUA Act was not the correct place to deal with this difficult issue and the amendments were ultimately dropped.</p>



<p><strong>What next?</strong></p>



<p>The DUA Act now awaits Royal Assent but could come into law before Parliament breaks for summer recess in July.</p>



<p>Although some key changes have been introduced, the majority of the DUA Act simply reflects established principles or guidance and introduces minor changes around the edges of existing governance requirements, without overhauling them completely. Some of the more innovative elements (around smart data access and use) are still unclear as we await the detail of secondary legislation.</p>

]]></content:encoded>
					
		
		
		<source url='https://privacymatters.dlapiper.com/'>Privacy Matters</source>
<enclosure url='https://www.lexblog.com/wp-content/uploads/2025/06/Document_Pages_S_0243-copy.jpg' type='image/jpeg' length='100093' />	</item>
		<item>
		<title>Proposed State Privacy Law Update: June 16, 2025</title>
		<link>https://www.lexblog.com/2025/06/15/proposed-state-privacy-law-update-june-16-2025/</link>
		
		<dc:creator><![CDATA[David Stauss]]></dc:creator>
		<pubDate>Sun, 15 Jun 2025 14:42:46 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/15/proposed-state-privacy-law-update-june-16-2025/</guid>

					<description><![CDATA[Keypoint: Last week, the Vermont Governor signed the Vermont Age-Appropriate Design Code Act into law. Below is the twenty third weekly update on the status of proposed state privacy legislation in 2025. As always, the contents provided below are time-sensitive and subject to change. Table of Contents 1. What&#8217;s New The big news this week...]]></description>
										<content:encoded><![CDATA[
<p><strong><em>Keypoint: Last week, the Vermont Governor signed the Vermont Age-Appropriate Design Code Act into law.</em></strong></p>



<p>Below is the twenty-third weekly update on the status of proposed state privacy legislation in 2025. As always, the contents provided below are time-sensitive and subject to change.</p>



<p><strong>Table of Contents</strong></p>



<ol class="wp-block-list">
<li><strong>What&rsquo;s New</strong></li>



<li><strong>AI Bills</strong></li>



<li><strong>Bill Tracker Chart</strong></li>
</ol>



<p><strong>1. What&rsquo;s New</strong></p>



<p>The big news this week comes out of <strong>Vermont </strong>where Governor Phil Scott signed the Vermont Age-Appropriate Design Code Act (<a href="https://legislature.vermont.gov/bill/status/2026/S.69">S.69</a>) into law.  </p>



<p>In other children&#8217;s privacy news, <strong>New York&#8217;s</strong> <a href="https://www.nysenate.gov/legislation/bills/2025/S4505">S.4505</a> (social media warning labels) passed the Senate on June 12. The New York Senate adjourned <a href="https://nysfocus.com/2025/06/13/prison-reform-hochul-new-york-doccs">as scheduled</a> on June 12. However, the Assembly <a href="https://nysfocus.com/2025/06/13/prison-reform-hochul-new-york-doccs">reportedly</a> <a href="https://www.wbng.com/2025/06/06/state-lawmakers-preparing-end-session/">extended</a> its session to <a href="https://spectrumlocalnews.com/nys/central-ny/politics/2025/06/11/assembly-may-pass-epr-bill">June 17</a>. </p>



<p>Meanwhile, <strong>Louisiana&#8217;s </strong>legislature passed <a href="https://www.legis.la.gov/legis/BillInfo.aspx?s=25rs&amp;b=HB570&amp;sbi=y">HB 570</a> (app stores). This is the third legislature &#8211; following Utah and Texas &#8211; to pass an app store bill this year.</p>



<p>Turning to consumer data privacy bills, <strong>Michigan&#8217;s</strong> bill (<a href="https://legislature.mi.gov/Bills/Bill?ObjectName=2025-SB-0359">SB 359</a>) was voted out of committee on June 12 and referred to the Senate floor. </p>



<p>Finally, last week we reported that the <strong>Nevada </strong>legislature passed <a href="https://www.leg.state.nv.us/App/NELIS/REL/83rd2025/Bill/11863/Overview">SB 63</a> (children&#8217;s privacy). However, it appears that there was a late amendment in the Assembly and the bill was unable to repass the Senate prior to the legislature closing.</p>



<p><strong>2. AI Bills</strong></p>



<p>Our latest edition of&nbsp;<a href="https://byteback.beehiiv.com/">Byte Back AI</a>&nbsp;is now available to subscribers. Subscriptions start as low as $50/month. In this edition, we provide:</p>



<figure style=" max-width: 100%; height: auto; " class="wp-block-image alignright size-large is-resized"><img decoding="async" src="https://www.lexblog.com/wp-content/uploads/2025/06/Byte-Back-AI-for-Blog-656x331-3.jpg" alt="" class="wp-image-5600" style=" max-width: 100%; height: auto; width:341px;height:auto"></figure>



<ul class="wp-block-list">
<li>Updates on new laws in Nevada and Maine, bills passing the legislatures in New York and Oregon, and bills crossing chambers in Rhode Island and New York. </li>



<li>A summary of the work session for Oregon&rsquo;s HB 3592 (AI commission and Chief AI Officer). </li>



<li>Our special feature this week &#8211; a summary of Nebraska&rsquo;s LB 77 (Ensuring Transparency in Prior Authorization Act).</li>



<li>Our &ldquo;three things to know this week.&rdquo;</li>



<li>An updated state AI bill tracker chart.</li>
</ul>



<p>Click&nbsp;<a href="https://byteback.beehiiv.com/">here</a>&nbsp;for more information on paid subscriptions.</p>



<p><strong>3. Bill Tracker Chart</strong></p>



<p>For more information on all of the privacy bills introduced to date, including links to the bills, bill status, last action, and hearing dates, please see our bill tracker&nbsp;<a href="https://www.bytebacklaw.com/wp-content/uploads/sites/631/2025/06/2025-Privacy-Tracker-June-16.pdf">chart</a>.</p>

]]></content:encoded>
					
		
		
		<source url='https://www.bytebacklaw.com/'>Byte Back</source>
<enclosure url='https://www.lexblog.com/wp-content/uploads/2025/06/State-Privacy-Update_660x440-2.jpg' type='image/jpeg' length='352790' />	</item>
		<item>
		<title>Multiple States Enact Genetic Privacy Legislation in a Busy Start to 2025</title>
		<link>https://www.lexblog.com/2025/06/12/multiple-states-enact-genetic-privacy-legislation-in-a-busy-start-to-2025/</link>
		
		<dc:creator><![CDATA[Libbie Canter, Elizabeth Brim and Natalie Maas]]></dc:creator>
		<pubDate>Thu, 12 Jun 2025 20:47:28 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/12/multiple-states-enact-genetic-privacy-legislation-in-a-busy-start-to-2025/</guid>

					<description><![CDATA[Since the beginning of 2025, there has been a flurry of bills introduced at the state and federal level related to genetic privacy, which follows a similar trend over the past several years.&#160; These bills have focused on a range of issues, including general genetic privacy, national security implications of &#8220;foreign adversaries&#8221; accessing genetic information,...]]></description>
										<content:encoded><![CDATA[
<p>Since the beginning of 2025, there has been a flurry of bills introduced at the state and federal level related to genetic privacy, which follows a <a href="https://www.insideprivacy.com/health-privacy/alabama-enacts-genetic-privacy-bill/">similar trend</a> over the past several years.&nbsp; These bills have focused on a range of issues, including general genetic privacy, national security implications of &ldquo;foreign adversaries&rdquo; accessing genetic information, the privacy practices of direct-to-consumer (&ldquo;DTC&rdquo;) genetic testing companies, and the transfer of genetic data as part of bankruptcy proceedings, among others. &nbsp;We summarize a subset of such bills moving through state and federal legislatures below.</p>



<span id="more-3231095"></span>



<p><strong><u>State Legislation</u></strong></p>



<p><span style="text-decoration: underline">Montana SB 163</span></p>



<p>On May 1, the Montana governor signed <a href="https://bills.legmt.gov/#/laws/bill/2/LC0005?open_tab=sum">SB 163</a> to amend the state&rsquo;s Genetic Information Privacy Act (&ldquo;MT GIPA&rdquo;), which was originally enacted in 2023. &nbsp;Effective October 1, 2025, there will be several changes to the law, including:</p>



<ul class="wp-block-list">
<li><strong>Creating Deidentification Exemption</strong>: The original version of MT GIPA did not contain an express exemption for deidentified data.&nbsp; SB 163 amends the law to include an express exemption for the use of deidentified genetic data for certain research purposes. &nbsp;Specifically, SB 163 includes an exemption for &ldquo;deidentified genetic data obtained from a third party to the extent that the data is used to conduct internal, medical, or scientific research.&rdquo;&nbsp; The deidentification standard is similar to the standard adopted under many comprehensive state privacy laws and other state DTC genetic privacy laws.</li>



<li><strong>Waiver of Certain Rights in the Clinical Trial Context</strong>: The law provides that consumers&rsquo; rights to access and delete data, destroy samples, and revoke consent must be waived in a limited context related to the collection of genetic data as part of a clinical trial if certain conditions are met, including prescriptive requirements for consent. &nbsp;Specifically:<ul><li>The relevant entity generally must obtain express and informed written consent for participation in a clinical research trial, including the collection and use of any genetic data, which must, among other things, be in accordance with the good clinical practice (&ldquo;GCP&rdquo;) guideline issued by the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use and include the entity&rsquo;s biological sample and data retention, sharing, and use policies.</li></ul>
<ul class="wp-block-list">
<li>The biological sample and genetic data must be utilized for clinical research purposes only.</li>
</ul>
</li>
</ul>



<p class="is-style-indented">SB 163 states that these requirements are meant to &ldquo;supersede all exceptions to, and waivers of&rdquo; informed consent pursuant to the federal Common Rule.&nbsp; However, it is not clear how this new limited exemption is meant to interact with the existing exemption for entities that are engaged in collecting, using, or analyzing genetic data or biological samples in the context of scientific or clinical research with express consent of the individual and in accordance with human subject research frameworks, including GCP, the federal Common Rule, or FDA&rsquo;s human subjects research regulations at 21 C.F.R. parts 50 and 56.</p>



<ul class="wp-block-list">
<li><strong>Neural Data</strong>: The obligations under the law will now also apply to &ldquo;neurotechnology data&rdquo; defined as &ldquo;information that is captured by neurotechnologies, is generated by measuring the activity of an individual&#8217;s central or peripheral nervous systems, or is data associated with neural activity, which means the activity of neurons or glial cells in the central or peripheral nervous system, and that is not nonneural information.&rdquo;&nbsp; Montana becomes the <a href="https://www.insideprivacy.com/uncategorized/california-enacts-health-ai-bill-and-protections-for-neural-data/">third state</a> to enact a privacy law that specifically protects neural data. &nbsp;</li>
</ul>



<p><span style="text-decoration: underline">Texas HB 130</span></p>



<p>On May 23, the Texas legislature passed <a href="https://capitol.texas.gov/tlodocs/89R/billtext/pdf/HB00130F.pdf#navpanes=0">HB 130</a> (the &ldquo;Texas Genomic Act of 2025&rdquo;), which seeks to protect the genetic information of Texas residents by regulating the collection, storage, and use of genome sequencing data. &nbsp;The bill is focused on ensuring that &ldquo;foreign adversaries&rdquo; (as defined in <a href="https://www.ecfr.gov/current/title-15/subtitle-B/chapter-VII/subchapter-E/part-791/subpart-A/section-791.4">15 C.F.R. &sect; 791.4(a)</a>, which includes China, Cuba, Iran, North Korea, Russia, and Venezuela) are unable to access the genetic information of Texas residents.&nbsp; Notably, the definition of &ldquo;foreign adversaries&rdquo; aligns with the countries that are identified as &ldquo;countries of concern&rdquo; in the U.S. Department of Justice&rsquo;s recently finalized Data Security Program (&ldquo;DOJ DSP&rdquo;), which focuses on access to bulk U.S. sensitive personal data (including human genomic data) by these countries and certain &ldquo;covered persons.&rdquo;</p>



<p>The bill would go into effect on September 1, 2025. Key provisions are summarized below:</p>



<ul class="wp-block-list">
<li><strong>Broad Applicability</strong>: The bill would apply to a &ldquo;medical facility, research facility, company, or nonprofit organization that conducts research on or testing of genome sequencing or the human genome&rdquo; in Texas.</li>



<li><strong>Storage and Access to Genome Sequencing Data</strong>: The bill would impose obligations on regulated entities with respect to their storage of genome sequencing data of Texas residents within the borders of a foreign adversary, including access to such data within the borders of a foreign adversary. &nbsp;Notably, these requirements do not apply to the storage of genome sequencing data by the regulated entities that is collected as part of a clinical trial or biomedical research study subject to, or conducted in accordance with, the DOJ DSP.&nbsp; The bill also contains substantive provisions related to the use of genomic sequencing equipment or software produced by or on behalf of a foreign adversary or certain related parties.</li>



<li><strong>Sale or Transfer of Genomic Sequencing Data</strong>: The bill would prohibit the sale or transfer of genomic sequencing data of Texas residents as part of a bankruptcy proceeding to a foreign adversary, state-owned enterprise of a foreign adversary, or a company domiciled in a country that is a foreign adversary or its subsidiary or affiliate.&nbsp;</li>



<li><strong>Annual Certification: </strong>By December 31 of each year, each regulated entity must certify its compliance with the Texas Genomic Act of 2025 to the Texas Attorney General.</li>



<li><strong>Private Right of Action and AG Enforcement: </strong>Texas residents can seek recovery for alleged violations, including actual damages or statutory damages up to $5,000 per violation.&nbsp; The Attorney General can also seek up to $10,000 for each violation.&nbsp;</li>
</ul>



<p><strong><u>Florida SB 768</u></strong></p>



<p>Notably, the Florida governor recently signed a bill, <a href="https://www.flsenate.gov/Session/Bill/2025/768/?Tab=BillText">SB 768</a>, which is similar to certain provisions of the Texas Genomic Act of 2025 but is generally narrower in scope. Specifically, the bill amends the provisions of Florida law that apply to the licensing of laboratories to require that a laboratory not use any operational or research software for genetic sequencing that is produced by a &ldquo;foreign country of concern.&rdquo; &ldquo;Foreign country of concern&rdquo; aligns with &ldquo;foreign adversaries&rdquo; under the Texas Genomic Act of 2025, though it also includes Syria.</p>



<p><strong><u>Federal Legislation</u></strong></p>



<p>On May 22, 2025, a bipartisan group of Senators introduced the &ldquo;<a href="https://www.grassley.senate.gov/imo/media/doc/dont_sell_my_dna_act.pdf">Don&rsquo;t Sell My DNA Act</a>,&rdquo; which would amend the federal Bankruptcy Code, including to impose a notice and affirmative consumer consent requirement before genetic information is used, sold, or leased in a bankruptcy proceeding.</p>



<p>This is the second genetic privacy bill introduced during this Congress, with Senators Cassidy (R-LA) and Peters (D-MI) having introduced the Genomic Data Protection Act (&ldquo;GDPA&rdquo;) earlier this session in March, which specifically focuses on the privacy practices of DTC genomic testing companies.&nbsp; Shortly after being introduced, the GDPA was referred to committee and has not advanced further.&nbsp; We covered the GDPA in a prior Inside Privacy post, available <a href="https://www.insideprivacy.com/united-states/u-s-federal-and-state-legislative-initiatives/u-s-senate-introduces-genomic-data-protection-act/">here</a>.</p>

]]></content:encoded>
					
		
		
		<source url='https://www.insideprivacy.com/'>Inside Privacy</source>
	</item>
		<item>
		<title>Operationalizing Privacy Across Teams, Tools, and Tech</title>
		<link>https://www.lexblog.com/2025/06/12/operationalizing-privacy-across-teams-tools-and-tech/</link>
		
		<dc:creator><![CDATA[Jodi Daniels]]></dc:creator>
		<pubDate>Thu, 12 Jun 2025 09:00:00 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/12/operationalizing-privacy-across-teams-tools-and-tech/</guid>

					<description><![CDATA[Sarah Stalnecker is the Global Privacy Director at New Balance Athletics, Inc., where she leads the integration of privacy principles across the organization, driving awareness and compliance through education, streamlined processes, and technology solutions.]]></description>
										<content:encoded><![CDATA[
<section class="plain-text-section has-background has-background-color has-text-color has-transparent-background-color has-black-color default text-left" data-id="block_ae3260af8f6cc66274932b4db35d547d" data-aos="fade-in">
<div class="wrapper">
<div class="grid justify-center">
<div class="col-grid lg-col-8-12">
<div class="has-text-color has-black-color has-red-link-color" data-aos="fade-up">
<p><img style=" max-width: 100%; height: auto; " fetchpriority="high" decoding="async" class="alignright wp-image-39906 size-full" src="https://www.lexblog.com/wp-content/uploads/2025/06/sarah-stalnecker.jpg" alt="Sarah Stalnecker" width="225" height="225"></p>
<p>Sarah Stalnecker is the Global Privacy Director at New Balance Athletics, Inc., where she leads the integration of privacy principles across the organization, driving awareness and compliance through education, streamlined processes, and technology solutions.</p>
<p>&nbsp;</p>
<div></div>
<div class="iframe-container"><iframe src="https://www.youtube.com/embed/JRoaYtp1x0w?si=eMAiYGm2s4gWlBOs" width="560" height="315" frameborder="0"><span data-mce-type="bookmark" class="mce_SELRES_start">&#65279;</span><span data-mce-type="bookmark" class="mce_SELRES_start">&#65279;</span><span data-mce-type="bookmark" class="mce_SELRES_start">&#65279;</span><span data-mce-type="bookmark" class="mce_SELRES_start">&#65279;</span></iframe></div>
<p><iframe src="//html5-player.libsyn.com/embed/episode/id/36948285/height/90/theme/custom/thumbnail/yes/direction/backward/render-playlist/no/custom-color/87A93A/" width="100%" height="90" scrolling="no"></iframe></p>
<div class="center-block">
<div class="podwrap">
<div><a href="https://podcasts.apple.com/us/podcast/she-said-privacy-he-said-security/id1536859760" target="_blank" rel="noopener noreferrer"><img style=" max-width: 100%; height: auto; " decoding="async" class="alignleft wp-image-1313" src="https://www.lexblog.com/wp-content/uploads/2025/06/apple-1-1.png" alt="Available_Black copy" width="170"></a></div>
<div><a href="https://tunein.com/podcasts/Business--Economics-Podcasts/She-Said-PrivacyHe-Said-Security-p1375421/" target="_blank" rel="noopener noreferrer"><img style=" max-width: 100%; height: auto; " decoding="async" class="alignleft wp-image-1313" src="https://www.lexblog.com/wp-content/uploads/2025/06/tunein-1.png" alt="Tunein" width="170"></a></div>
<div><a href="https://open.spotify.com/show/5q8B2oYUPajIvmvZiLa4K4" target="_blank" rel="noopener noreferrer"><img style=" max-width: 100%; height: auto; " decoding="async" class="alignleft wp-image-1313 size-full" src="https://www.lexblog.com/wp-content/uploads/2025/06/spotify-1.png" alt="Available_Black copy" width="170"></a></div>
<div><a href="https://www.pandora.com/podcast/she-said-privacy-he-said-security/PC:1000580980" target="_blank" rel="noopener noreferrer"><img style=" max-width: 100%; height: auto; " loading="lazy" decoding="async" class="alignleft wp-image-1318" src="https://www.lexblog.com/wp-content/uploads/2025/06/pandora-btn-1-1.png" width="170" height="150"></a></div>
</div>
<div class="podwrap last">
<div><a href="https://www.deezer.com/show/1857852" target="_blank" rel="noopener noreferrer"><img style=" max-width: 100%; height: auto; " decoding="async" class="alignleft wp-image-1318" src="https://www.lexblog.com/wp-content/uploads/2025/06/deezer-1.png" alt="partner-share-lg" width="170"></a></div>
<div><a href="https://music.amazon.com/podcasts/b62761d1-45cb-4ea9-afda-9fe9a44b601a/She-Said-PrivacyHe-Said-Security" target="_blank" rel="noopener noreferrer"><img style=" max-width: 100%; height: auto; " decoding="async" class="alignleft wp-image-1318" src="https://www.lexblog.com/wp-content/uploads/2025/06/amazonmusic-1.png" alt="partner-share-lg" width="170px"></a></div>
<div><a href="https://www.iheart.com/podcast/263-she-said-privacy-he-said-s-75291815/" target="_blank" rel="noopener noreferrer"><img style=" max-width: 100%; height: auto; " decoding="async" class="alignleft wp-image-1318" src="https://www.lexblog.com/wp-content/uploads/2025/06/iheartradio-1.png" alt="partner-share-lg" width="170px"></a></div>
</div>
</div>
<div></div>
<h3>Here&rsquo;s a glimpse of what you&rsquo;ll learn:</h3>
<ul>
<li>Sarah Stalnecker&rsquo;s 20-year career journey from digital marketing to Global Privacy Director at New Balance</li>
<li>How to build privacy programs that balance business needs with regulatory compliance</li>
<li>Tips for using consumer personalization expectations to guide privacy conversations</li>
<li>Why privacy teams are naturally positioned to lead AI governance and mitigate AI risks</li>
<li>Methods to embed privacy requirements into company workflows</li>
<li>How to evaluate and select privacy technology tools</li>
<li>Tips for measuring privacy program success beyond traditional metrics</li>
</ul>
<h3>In this episode...</h3>
<p>Operationalizing privacy programs starts with translating legal requirements into actions that work across teams. This means aligning privacy with existing tools and workflows while meeting evolving privacy regulations and adapting to new technologies. Today&rsquo;s consumers also demand both personalization and privacy, and building trust means fulfilling these expectations without crossing the line. So, how can companies build a privacy program that meets regulatory requirements, integrates into daily operations, and earns consumer trust?</p>
<p>Embedding privacy into business operations involves more than just meeting regulatory requirements. It requires cultural change, leadership buy-in, and teamwork. Rather than forcing company teams to adapt to new privacy processes, organizations need to embed privacy requirements into existing workflows and systems that departments already use. Leading with consumer expectations instead of legal mandates helps shift mindsets and encourages collaborative dialogue about responsible data use. Documenting AI use cases and establishing an AI governance program also helps assess risks without reactive scrambling. Teams should also leverage privacy technology to scale processes and streamline compliance to ensure privacy becomes an embedded, organization-wide function rather than a siloed concern.</p>
<p>In this episode of <em>She Said Privacy/He Said Security, </em>Jodi and Justin Daniels chat with Sarah Stalnecker, Global Privacy Director at New Balance Athletics, about operationalizing privacy programs. Sarah shares how her team approaches data collection, embeds privacy into existing workflows, and uses consumer expectations to drive internal engagement. She also highlights the importance of documenting AI use cases and establishing AI governance to assess risk. Sarah provides tips on selecting and evaluating privacy technology and how to measure privacy program success beyond traditional metrics.</p>
<h3>Resources Mentioned in this episode</h3>
<ul>
<li><a href="https://www.linkedin.com/in/jodihoffmandaniels/" target="_blank" rel="noopener noreferrer">Jodi Daniels on LinkedIn</a></li>
<li><a href="https://www.linkedin.com/in/justinsdaniels/" target="_blank" rel="noopener noreferrer">Justin Daniels on LinkedIn</a></li>
<li><a href="https://redcloveradvisors.com/" target="_blank" rel="noopener noreferrer">Red Clover Advisors&rsquo; website</a></li>
<li><a href="https://www.linkedin.com/company/redcloveradvisors/" target="_blank" rel="noopener noreferrer">Red Clover Advisors on LinkedIn</a></li>
<li><a href="https://www.facebook.com/RedCloverAdvisors/" target="_blank" rel="noopener noreferrer">Red Clover Advisors on Facebook</a></li>
<li>Red Clover Advisors&rsquo; email: <a href="mailto:info@redcloveradvisors.com" target="_blank" rel="noopener noreferrer">info@redcloveradvisors.com</a></li>
<li><a href="https://redcloveradvisors.com/book-sales/" target="_blank" rel="noopener noreferrer"><em>Data Reimagined: Building Trust One Byte at a Time </em>by Jodi and Justin Daniels</a></li>
<li><a href="https://www.linkedin.com/in/sarah-stocke-stalnecker-a382957/" target="_blank" rel="noopener noreferrer">Sarah Stalnecker on LinkedIn</a></li>
<li><a href="https://www.newbalance.com/" target="_blank" rel="noopener noreferrer">New Balance</a></li>
</ul>
<h3>Sponsor for this episode...</h3>
<p>This episode is brought to you by <a href="https://redcloveradvisors.com/" target="_blank" rel="noopener">Red Clover Advisors</a>.</p>
<p>Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.</p>
<p>Founded by <a href="https://www.linkedin.com/in/jodihoffmandaniels/" target="_blank" rel="noopener">Jodi Daniels</a>, Red Clover Advisors helps companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media.</p>
<p>To learn more, and to check out their Wall Street Journal best-selling book, <em>Data Reimagined: Building Trust One Byte At a Time</em>, visit <a href="http://www.redcloveradvisors.com/" target="_blank" rel="noopener">www.redcloveradvisors.com.</a></p>
</div></div>
</div></div>
</section>
<section class="accordion-section has-background has-background-color has-text-color has-transparent-background-color has-black-color default" data-id="block_a90991ffb9b62e2105117a77adedbfcf" data-aos="fade-in">
<div class="wrapper">
<div class="grid justify-center">
<div class="col-grid xxl-col-10-12">
<div class="accordion">
<div class="accordion-title">Click for Full Transcript</div>
<div class="accordion-content">
<p><strong>Intro&nbsp; 0:01&nbsp;&nbsp;</strong></p>
<p>Welcome to the <em>She Said Privacy/He Said Security</em> Podcast, like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st Century.</p>
<p>&nbsp;</p>
<p><strong>Jodi Daniels&nbsp; 0:21&nbsp;&nbsp;</strong></p>
<p>Hi, Jodi Daniels here. I&rsquo;m the Founder and CEO of Red Clover Advisors, a certified women-owned privacy consultancy. I&rsquo;m a privacy consultant and Certified Information Privacy Professional providing practical privacy advice to overwhelmed companies.</p>
<p>&nbsp;</p>
<p><strong>Justin Daniels&nbsp; 0:36&nbsp;&nbsp;</strong></p>
<p>Hi. I&rsquo;m Justin Daniels, I am a shareholder and corporate M&amp;A and tech transaction lawyer at the law firm, Baker Donelson, advising companies in the deployment and scaling of technology. Since data is critical to every transaction, I help clients make informed business decisions while managing data privacy and cybersecurity risk. And when needed, I lead the legal cyber data breach response brigade.</p>
<p>&nbsp;</p>
<p><strong>Jodi Daniels&nbsp; 1:00&nbsp;&nbsp;</strong></p>
<p>And this episode is brought to you by Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business together. We&rsquo;re creating a future where there&rsquo;s greater trust between companies and consumers. To learn more, and to check out our best-selling book, <em>Data Reimagined: Building Trust One Byte at a Time</em>, visit redcloveradvisors.com. Well, we&rsquo;ve had so much fun in our pre-show. I know this episode is going to be fantabulous. Why are you laughing at me? It&rsquo;s fun. I don&rsquo;t think it&rsquo;s fun to laugh at me. And I like the word fantabulous. Okay, fantastic and fabulous. That&rsquo;s totally a real word.</p>
<p>&nbsp;</p>
<p><strong>Justin Daniels&nbsp; 1:49&nbsp;&nbsp;</strong></p>
<p>I think we&rsquo;re gonna have a really good show. We are. I enjoy the company. We&rsquo;ve had a great pre-show discussion. So, Jodi, why don&rsquo;t you hop to it?</p>
<p>&nbsp;</p>
<p><strong>Jodi Daniels&nbsp; 1:57&nbsp;&nbsp;</strong></p>
<p>Oh yes. Well, today we have Sarah Stalnecker, who is the Global Privacy Director at New Balance Athletics. I might have quite a collection of New Balance shoes in our household. Sarah is the Global Privacy Director, as I just said, where she leads the integration of privacy principles across the organization, driving awareness and compliance through education, streamlined processes, and technology solutions. Well, Sarah, we&rsquo;re so excited that you&rsquo;re here. Thank you guys for having me.</p>
<p>&nbsp;</p>
<p><strong>Justin Daniels&nbsp; 2:26&nbsp;&nbsp;</strong></p>
<p>So why don&rsquo;t you tell us a little bit about your career journey to where you&rsquo;re at now.</p>
<p>&nbsp;</p>
<p><strong>Sarah Stalnecker&nbsp; 2:32&nbsp;&nbsp;</strong></p>
<p>So this is my favorite question, because I started my career 20 years ago. I promise this won&rsquo;t be me hitting, tick for tick, every experience that I&rsquo;ve had in those last 20 years. But if you would have told me I would have ended up in a legal department as a non-attorney, I would have laughed. And I think, you know, what&rsquo;s really interesting about the privacy space is you see so many people coming from either a legal background or a non-legal background. But essentially, I got my start in marketing, so I spent the better part of 15 years, really, in media agencies thinking strategically about, how do we make sure that we&rsquo;re showing up right for brands like McDonald&rsquo;s and Oracle? So what was really fascinating is, when I started my career, digital marketing was just getting going, and so it was a time where everybody bought TV and print, and that was very well known. And then there was this disruption of digital marketing and figuring out, how do we embrace it, and how do we make sure that we&rsquo;re using it in a way that makes sense? And so that kind of continued to be a theme. I then went over to brands like Luxottica and Anheuser-Busch, and there I worked in a variety of functions, but it was this idea of really understanding how and why things like digital marketing work within companies, and really this explosion of data. So, you know, you went from I&rsquo;m going to buy a piece of content and make sure that my brand is adjacent to a piece of content, to really shifting to say I&rsquo;m buying audiences.
And so the world, I think, significantly changed, where suddenly there was first-party data, there was third-party data, and I think a lot of brands were grappling with, how do we make sense of it all, and how do we make sure that, you know, we are staying on the cusp of what&rsquo;s happening? So what was really interesting in those roles is you start to go from things like search engine marketing, so I was doing things like search engine marketing, convincing brands that they no longer needed print Yellow Pages, to programmatic buying, buying fans on Facebook, which now feels really old. And then about 10 years ago, I started at New Balance. And I started at New Balance in a marketing analytics function. It was a new function within the organization, and they were trying to figure out, how do we measure brand performance? And so it was putting together the framework of what&rsquo;s working, what&rsquo;s not working in our marketing activities, and why. That really quickly evolved into consumer data. So, really understanding our customer. And in order to do that, you have a bunch of different disparate data sets that we had to bring together in order to say, Okay, I know who Jodi is, for example, because I know how she interacts on our website. I know that she&rsquo;s visiting a retail store, and then I know, like, baseline demographic profiles of her. But the reality is all of that was stuck in different systems, and so bringing that together was really the focus that I had for the better part of three or four years of creating infrastructure where we could drive those kinds of insights, because we had a consolidated view of people, and then also enable things like personalization. And while we were doing that, obviously all of these privacy regulations started to pop up. And so what happened was, I was working on a task force where they were like, look, we&rsquo;ve got to figure out, how do we maintain our ability to deliver relevance to our consumers? 
Because we know that&rsquo;s what they&rsquo;re demanding in a world that&rsquo;s changing very, very rapidly. And so we had, at the time, something that was called pods, but essentially it was like a cross-functional working group that was figuring out, how do we tackle this thing of privacy? And through that work, the legal team said, hey, look, we&rsquo;re actually hiring somebody to help build the privacy program. Is that something that you want to do? And so that&rsquo;s where I found myself building a privacy program about three years ago. And you know, I think what&rsquo;s really interesting, and I was saying this in the pre-show, is in my analytics function, it was really this idea of, how do you translate between what the business is trying to do and what technology is capable of delivering, and in order to do that, you really needed a translator function, someone that was able to navigate between those two worlds in order to deliver against the goals. And I see privacy sort of filling that same role, but just adding the third piece, which is the legal landscape: how do we make sure we&rsquo;re continuing to enable the business to progress the way that it wants to progress, leveraging the existing technology set, but obviously abiding by the fact that there&rsquo;s these really changing privacy laws happening in the background.</p>
<p>&nbsp;</p>
<p><strong>Jodi Daniels&nbsp; 7:07&nbsp;&nbsp;</strong></p>
<p>Well, Sarah, I love your background, and we are very similar. So I had worked at media companies, so I very much remember looking at charts of when digital was going to overtake the traditional media of newspaper, print, and radio. It was very, very fascinating to look at where it is, and then I also had the opportunity in marketing. So, as you had just mentioned, you&rsquo;re kind of being that translator. Help us share a little bit: what is your approach to building a privacy program that&rsquo;s trying to balance all these new privacy laws while retail is trying to move quickly?</p>
<p>&nbsp;</p>
<p><strong>Sarah Stalnecker&nbsp; 7:45&nbsp;&nbsp;</strong></p>
<p>Yeah, I mean, you know, I think what&rsquo;s been the number one challenge is making people care, making people care about privacy. And what&rsquo;s been effective, I think, is recognizing the fact that the world has changed dramatically in the last 10 years. We consistently show a chart that says, you know, 10 years ago, like 10% of the population was covered under a modern privacy reg; it&rsquo;s now close to 80%. So that&rsquo;s such a seismic shift that&rsquo;s happened in a very short amount of time. And for companies, what that means is, I think we used to feel like once the data entered our four walls, it was our data, and we could do whatever we want with it. And I think now this reality has set in: no, the data still belongs to the individual. They still have rights with respect to that data. So how do we start going about the change management that needs to occur? In particular, it&rsquo;s not saying you can&rsquo;t deliver data-driven experiences, you can&rsquo;t deliver relevance, but you need to do it in a way that is transparent and really thinks about, what is the data that we actually need, versus collecting everything that we can, right? So I think companies used to be like, let&rsquo;s collect everything, because we might need it down the line, and the analytics team might want to use it for something. Now, everything is really purpose-built and purpose-driven, and so I think just getting people comfortable with the fact that the world has changed, and that means we now have obligations, and we have to be even more thoughtful about the way that we collect and otherwise process data. And I think it&rsquo;s just a learning curve that every company, I think, is going, you know, going through right now. 
And none of us are going to be perfect from the jump, but I think it&rsquo;s this recognition that we just need to make sure we&rsquo;re changing the way that we think about data and think about the people and the experiences we&rsquo;re trying to power with it.</p>
<p>&nbsp;</p>
<p><strong>Jodi Daniels&nbsp; 9:39&nbsp;&nbsp;</strong></p>
<p>Is there a story that you might be able to share where you were able to get people to care? Because I hear that a lot, and I had the exact same thing. It was very hard to get people to care about privacy. Just curious if you have something that you can offer listeners.</p>
<p>&nbsp;</p>
<p><strong>Sarah Stalnecker&nbsp; 9:57&nbsp;&nbsp;</strong></p>
<p>Yeah, I had the same problem. Well, no one really wants to talk about the law. Yes, that&rsquo;s number one. But I think talking about the consumer has been a really helpful way in, and so really thinking about, what are consumers&rsquo; expectations? And so I consistently share, like, here&rsquo;s consumer feedback as it relates to personalization and privacy, and it&rsquo;s a really dynamic tension of people expecting privacy and relevance simultaneously, right? Like, it&rsquo;s dead center. And so what that means for, I think, brands and businesses is this idea of, how do you deliver against that relevance expectation, but do it in a way that honors their privacy? And one of the things that we say is we have an unofficial &ldquo;don&rsquo;t be creepy&rdquo; policy, and that seems to really resonate, where you repeat back, like someone goes, hey, we want to do x, y, z, this is the personalization experience that we want to power. And you say, play that back: as a user, would you find that creepy, right? Like, is this an experience that&rsquo;s valuable enough, and you think is relevant enough to the consumer, where they&rsquo;re going to go, yes, that&rsquo;s an expectation, and I&rsquo;m okay, right, with the data exchange that&rsquo;s required to power that? So I think that starting with consumers, and really then putting people back in the position where they&rsquo;re thinking about it through their own lens, versus I&rsquo;m a brand, I want to do this, it&rsquo;s, hey, if you were on the receiving end of that, how would that play? And I think that&rsquo;s driven a lot of traction and actually shifted, I think, the mindset of seeing privacy as the roadblock versus privacy as sort of an enabler, right? Like, let&rsquo;s have the conversation about how to do it in a way that honors that consumer expectation. I also use a lot of memes. Memes and GIFs seem to really work in this world. 
So the more you can find those, I think it just like kind of puts people into a more comfortable place of having conversations about this topic.</p>
<p>&nbsp;</p>
<p><strong>Jodi Daniels&nbsp; 11:59&nbsp;&nbsp;</strong></p>
<p>I love it, memes and GIFs. Why are you giggling over here?</p>
<p>&nbsp;</p>
<p><strong>Justin Daniels&nbsp; 12:02&nbsp;&nbsp;</strong></p>
<p>Because it&rsquo;s like privacy to forestall the creep factor.</p>
<p>&nbsp;</p>
<p><strong>Jodi Daniels&nbsp; 12:05&nbsp;&nbsp;</strong></p>
<p>Yeah, &ldquo;don&rsquo;t be creepy&rdquo; is also an official phrase in the privacy space, absolutely.</p>
<p>&nbsp;</p>
<p><strong>Justin Daniels&nbsp; 12:12&nbsp;&nbsp;</strong></p>
<p>Well, I actually want to pull a bit of an audible and kind of peel away a little bit of what we talked about in our pre-show, which is, at least in my practice, every day I&rsquo;m either using, encountering, or advising on things related to artificial intelligence. And one of the things that&rsquo;s really interesting to me is how much, particularly at enterprise clients, privacy is playing a real central leadership role. And Sarah, I was wondering if you&rsquo;d talk a little bit about how that has played out within the confines of your organization?</p>
<p>&nbsp;</p>
<p><strong>Sarah Stalnecker&nbsp; 12:47&nbsp;&nbsp;</strong></p>
<p>Yeah, I think, as I was mentioning, you know, privacy teams had done a lot of work on what was happening in terms of data processing across organizations, right? So to catch up with all the documentation requirements in privacy laws, we were doing a lot of quick work to make sure we were understanding what was happening across, in our case, a global organization. And then you have AI coming, and what we were noticing is that every vendor that was being reviewed had some sort of AI capability or AI component, and it got to the point where we really needed to say, look, we can look at this through the lens of risk, but we also need to look at it through the lens of what&rsquo;s the strategic benefit to the organization. And so really creating a governance process where the right people are getting in the room to talk through risk and risk management strategies, and then also just ensuring there&rsquo;s transparency, so that when we are taking some sort of a risk, it&rsquo;s a strategic risk versus something that&rsquo;s just happening in the background, and people don&rsquo;t even know, right, that they&rsquo;re assuming risk on behalf of the org.</p>
<p>&nbsp;</p>
<p><strong>Jodi Daniels&nbsp; 13:56&nbsp;&nbsp;</strong></p>
<p>I feel like you have no thoughts brewing in that head of yours.</p>
<p>&nbsp;</p>
<p><strong>Justin Daniels&nbsp; 13:59&nbsp;&nbsp;</strong></p>
<p>No, I just wanted to hear more about this, because the challenge with AI is similar to our podcast, is you have to think about it across privacy, cybersecurity, intellectual property, bias, transparency, and then you have to figure out how all of that applies to various different use cases, because you might be able to use AI, and I think it&rsquo;s being used for your privacy program, but then the risks start to shift if you&rsquo;re going to use it to say, be customer facing, or you want to create intellectual</p>
<p>&nbsp;</p>
<p><strong>Sarah Stalnecker&nbsp; 14:34&nbsp;&nbsp;</strong></p>
<p>property, yeah, and I think we learned a lot, you know, coming out of what happened when all the privacy laws passed. I think a lot of companies were in a hurry to figure out what was happening. How were we, you know, using data across the organization? And I think the AI piece gave us an opportunity to say, wait a minute, like, let&rsquo;s take a step back and make sure that we just at least have an idea of all of the use cases as it relates to AI, so we have that documentation as these laws pass, as they&rsquo;re enforced. We&rsquo;re not taking some sort of knee-jerk reaction of, oh no, like we don&rsquo;t even know how AI is being deployed across the organization. No, we have that transparency, and we can make adjustments as needed. And so I think that piece is really powerful to the extent that you have that level of documentation so that you aren&rsquo;t overreacting, because this space is going to change. It&rsquo;s going to change rapidly, and we can&rsquo;t govern what we don&rsquo;t know about.</p>
<p>&nbsp;</p>
<p><strong>Jodi Daniels&nbsp; 15:31&nbsp;&nbsp;</strong></p>
<p>Yeah, Sarah, you mentioned earlier that there&rsquo;s this balance of trust and relevancy, and that you also are moving towards purpose-driven data collection, which could still be a lot of data, and we&rsquo;re trying to make privacy kind of infused all the time. How have you tackled that? Just like AI, privacy takes this multidisciplinary approach too, and I&rsquo;m curious if you can share a little bit about your experiences and how that&rsquo;s working.</p>
<p>&nbsp;</p>
<p><strong>Sarah Stalnecker&nbsp; 16:04&nbsp;&nbsp;</strong></p>
<p>So I&rsquo;d love to say we have one process and it meets everybody&rsquo;s needs. The reality is that&rsquo;s not the case. Practically speaking, what we&rsquo;ve done is kind of met the different departments where they are. And what I mean by that is, if our e-commerce department, or a direct-to-consumer team, uses a certain technology, and we need to embed a privacy process within that, we do that. So we&rsquo;re not always saying you need to go use our tools, but rather we say, hey, look, we know you work, in the case of our e-commerce team, in JIRA. That&rsquo;s your ticketing system. So for something like our website tracking, we have documentation of all the website tracking across our global websites. We put that in JIRA because we know that&rsquo;s where the team is used to working. That&rsquo;s where they&rsquo;re going to work on this activity. And so if we can embed the privacy requirements there, versus forcing people to go to a different place, it just means that we&rsquo;re going to generate adoption much faster. And so I&rsquo;d say meeting people where they are has been, I think, the single biggest opportunity that we&rsquo;ve taken advantage of, so that we&rsquo;re not trying to push, but rather trying to bring people along. Similarly for marketing, we make sure that we have discussions again. We lead with the consumer, we lead with trust, and we try to make sure that we&rsquo;re having a conversation in a way, and using language, that makes sense for the audience, so that it does not feel like a roadblock. My goal is to make sure we&rsquo;re not a roadblock, because everyone wants to circumvent a roadblock, but rather to get them to understand the why and ensure that our processes are adapted for them. And I think that has been a much more successful way to generate adoption.</p>
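Embedding privacy requirements into a team's existing ticketing system, as described here, can be sketched in code. This is a hypothetical illustration (the project key, checklist items, and field values are invented, not the team's actual configuration); the payload shape follows Jira's REST API v2 issue format:

```python
# Hypothetical sketch: filing a privacy review into a team's existing Jira
# workflow. Project key, checklist items, and labels are illustrative only.

def build_privacy_review_issue(project_key, tracker_name, site_url):
    """Build a Jira issue payload (REST API v2 shape) for a tracking review."""
    checklist = [
        "Purpose of the tracker documented",
        "Consent banner category assigned",
        "Data recipients / vendors listed",
        "Retention period confirmed",
    ]
    description = (
        f"Privacy review for tracker '{tracker_name}' on {site_url}\n\n"
        + "\n".join(f"* {item}" for item in checklist)
    )
    return {
        "fields": {
            "project": {"key": project_key},
            "summary": f"Privacy review: {tracker_name} ({site_url})",
            "description": description,
            "issuetype": {"name": "Task"},
            "labels": ["privacy-review", "website-tracking"],
        }
    }

payload = build_privacy_review_issue("ECOM", "analytics-pixel", "example.com")
# The payload would be POSTed to <jira-base-url>/rest/api/2/issue, so the
# review lands in the queue the team already works from.
```

Because the checklist lives inside the team's own ticket queue, no one has to visit a separate privacy tool to complete the review.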
<p>&nbsp;</p>
<p><strong>Jodi Daniels&nbsp; 17:58&nbsp;&nbsp;</strong></p>
<p>Yeah, change management for people is hard, so hard. And what you&rsquo;ve shared is that instead of trying to completely change how everyone has to do everything, you&rsquo;re adding some change into an existing process. And I love how you said using their language. I imagine that is true throughout the whole organization. When you&rsquo;re talking to the people team or the finance team, you&rsquo;re using the different language that they use, because, again, it&rsquo;s really hard to change. And we&rsquo;re not trying to make all these different functions be privacy experts. That&rsquo;s your team, right? We just need them to understand a little bit. That was very, very helpful, what you shared. So thank you.</p>
<p>&nbsp;</p>
<p><strong>Justin Daniels&nbsp; 18:39&nbsp;&nbsp;</strong></p>
<p>So what role does privacy technology play in your program today? And then how do you decide when you&rsquo;re gonna adopt a new tool versus sticking with an existing process? And to me, AI would be a great example. I can imagine you could dump in all kinds of data about Jodi buying online, Jodi coming to your store, and her demographics, and, using AI, start to make new associations that could create a marketing program. So then you&rsquo;re saying, oh, this AI seems great, but wait a second, how do I vet this tool? Or maybe we&rsquo;re not quite ready for that because it&rsquo;s unknown. How do you approach that?</p>
<p>&nbsp;</p>
<p><strong>Sarah Stalnecker&nbsp; 19:16&nbsp;&nbsp;</strong></p>
<p>Yeah, so to clarify, are you talking about other people&rsquo;s use of tools and how we ensure privacy is embedded, or the privacy tools that we use and how we evaluate our privacy tools?</p>
<p>&nbsp;</p>
<p><strong>Justin Daniels&nbsp; 19:30&nbsp;&nbsp;</strong></p>
<p>I think it&rsquo;s, what privacy tools do you use? And when do you come to say, hey, do I need a new tool, or do I have a process? And maybe I inartfully tried to use AI as an example of something that&rsquo;s used as a tool but has significant privacy implications.</p>
<p>&nbsp;</p>
<p><strong>Sarah Stalnecker&nbsp; 19:43&nbsp;&nbsp;</strong></p>
<p>Yeah, so, you know, I think for a lot of us, we got started with some of the privacy tools just to meet the needs for things like cookie compliance, and that very quickly evolved to things like, how do you honor privacy rights requests? So the idea is, where does the technology come into play where it&rsquo;s actually making our lives easier? In our case, we use it because we know we need to scale process globally, and to do that, using some of these privacy tools is really the only way to achieve that scale. We&rsquo;re constantly reevaluating the providers. One, the space is changing really quickly. I think of this as very similar to how marketing technology was in the early 2010s, right? Where you saw that terrifying LUMAscape of 3,000 vendors that had entered the space. I sort of feel like the privacy world over the course of the last five years is in a similar place, obviously not at that scale, but that same idea of all of the software popping up to try to solve these problems of complying with the laws. We&rsquo;re in a space now where we&rsquo;re reevaluating what we have in place, and then, how do we streamline process? Right? Because, as you were talking about before, information security has its own process, particularly as it relates to third-party vendor management. Are there ways for us to collapse and use similar processes, similar vendors, and both achieve what we need to achieve through that work and also have a centralized framework where we&rsquo;re thinking about risk management? So to the extent that we can consolidate, that&rsquo;s also something we&rsquo;re looking at. It also makes it easier for the business, because we&rsquo;re using a common lexicon; they sort of understand what the vendor is and what the vendor is doing, and we&rsquo;re not trying to introduce so many disparate vendors in this space.
So I&rsquo;d say that&rsquo;s really our goal right now: consolidate, make sure we have the right tools for the need, and then always make sure the technology is helping us versus just making it more difficult for us to work through some problems.</p>
<p>&nbsp;</p>
<p><strong>Jodi Daniels&nbsp; 21:52&nbsp;&nbsp;</strong></p>
<p>I always compare, or often compare, the privacy tech space to the marketing space. Maybe it&rsquo;s our marketing backgrounds; probably. I just see incredible commonalities between them, and I had this little smile when you started to compare, because I think it&rsquo;s so true, and I think it&rsquo;s going to remain that way. I think you&rsquo;re going to end up with a few big behemoths, kind of like what you have in the marketing space, and then some really niche products that might bolt on; they might integrate, they might not. Some people really like a smaller niche. And the other thing I have found is that some people prefer an all-in-one; it&rsquo;s a little bit simpler from a vendor perspective. And then others prefer, nope, I want product A, product B, product C, because each meets their needs a little bit better. And there&rsquo;s not a right or wrong; it&rsquo;s just literally the way the company operates best. So I think it&rsquo;ll be interesting to continue to see the privacy tech market evolve. Which brings us to, how do you measure success of this program? You have tech, you have process, it&rsquo;s continuously changing. My prediction of more laws did not happen quite this moment, but instead we got amendment city. So now we have to go back and figure out how all these amendments impact things. We have new AI regulations, so it&rsquo;s constantly changing, and right, we&rsquo;re kind of building a foundation and trying to evolve it. So how do you measure any of this?</p>
<p>&nbsp;</p>
<p><strong>Sarah Stalnecker&nbsp; 23:20&nbsp;&nbsp;</strong></p>
<p>It&rsquo;s a great question. And I think a lot of companies start with the same thing, because the things that you would traditionally look at, like how many privacy requests are we fulfilling, things that are inherently very measurable, I don&rsquo;t know that they tell you, meaningfully, whether you are changing the trajectory of how your brand or your organization is embracing privacy. I&rsquo;m always heartened when I hear, and I&rsquo;m not on the call, but someone will tell me, someone said, data minimization. We were talking through an integration of two systems, and someone said, well, wait a minute, we can&rsquo;t send all of that data, because data minimization principles need to be followed. And so a lot of what I think of as success is this: are we generating adoption of these principles? Is it just becoming part of workflow, part of process, versus people trying to circumvent us? And so that&rsquo;s what I think we&rsquo;ve paid the most attention to. You know, our information security team always goes, well, privacy is a thing here, which is good. So to me, it&rsquo;s that unofficial barometer that helps to say, are we changing the culture? And I think that, for me, is the biggest win, when I see this adoption of principles, and not seeing us as, hey, we&rsquo;re going to have to tick off that box at the end with the people that, you know, want to kill all of our fun, but rather, oh, we&rsquo;re doing this process soup to nuts, we need to involve privacy. And I&rsquo;ve seen more and more traction on that over the course of the last couple of years. And that, to me, says change is happening. It&rsquo;s not going to be overnight, but I&rsquo;m heartened to see how much change we&rsquo;ve seen in, you know, the last three years.</p>
<p>&nbsp;</p>
<p><strong>Jodi Daniels&nbsp; 25:09&nbsp;&nbsp;</strong></p>
<p>That&rsquo;s so exciting and probably very rewarding.</p>
<p>&nbsp;</p>
<p><strong>Sarah Stalnecker&nbsp; 25:13&nbsp;&nbsp;</strong></p>
<p>I mean, I think, you know, when you bring it back to the human and you bring it back to trust, people in an organization respond to that. We&rsquo;ve spent over 100 years building trust with customers and consumers, and it can be undone so quickly, as marketers all talk about all the time. But it&rsquo;s this idea that if there&rsquo;s a value exchange, and we&rsquo;re collecting the data people expect us to have and using it in a way that people expect us to use it, that trust continues. And I think that&rsquo;s resonated very much with the brand and the business, because there&rsquo;s this level of understanding on a human level of, like, yeah, I wouldn&rsquo;t want my data used for XYZ, or I wouldn&rsquo;t expect that company to have that piece of data. So that&rsquo;s where the focus, I think, has really been: just making sure people understand. It&rsquo;s ultimately about trust. It&rsquo;s complying with laws, but it&rsquo;s also trust.</p>
<p>&nbsp;</p>
<p><strong>Jodi Daniels&nbsp; 26:10&nbsp;&nbsp;</strong></p>
<p>Yep. Now when you are not building a privacy program, what do you like to do for fun?</p>
<p>&nbsp;</p>
<p><strong>Sarah Stalnecker&nbsp; 26:15&nbsp;&nbsp;</strong></p>
<p>Right now, I&rsquo;m just sitting on baseball fields endlessly because I have two boys playing baseball, so that&rsquo;s taking up pretty much every weekend, day and night. But outside of that, I love to travel, love to walk around new cities. Having time to wander is like the ultimate luxury, right? Where you have no destination in mind. So I&rsquo;d say if that&rsquo;s available, that&rsquo;s the thing I want to do.</p>
<p>&nbsp;</p>
<p><strong>Jodi Daniels&nbsp; 26:44&nbsp;&nbsp;</strong></p>
<p>That sounds super fun. One day, maybe. It&rsquo;s something to aspire to. There&rsquo;s always a long list, always something to be thinking about; at least for me, there is. If people would like to connect and learn more, where could they go?</p>
<p>&nbsp;</p>
<p><strong>Sarah Stalnecker&nbsp; 26:59&nbsp;&nbsp;</strong></p>
<p>So I am active on LinkedIn, so I&rsquo;m more than happy for people to connect with me on LinkedIn, and I do respond to those messages. So I think that&rsquo;s probably the best spot.</p>
<p>&nbsp;</p>
<p><strong>Jodi Daniels&nbsp; 27:10&nbsp;&nbsp;</strong></p>
<p>Awesome. Proof of life is over there. We&rsquo;re so excited that you joined us today and shared how you have been building your privacy program. I know it&rsquo;s been really valuable to everyone. So thank you so much. Thanks for having me.</p>
<p>&nbsp;</p>
<p><strong>Outro 27:30&nbsp;&nbsp;</strong></p>
<p>Thanks for listening to the<em> She Said Privacy, He Said Security</em> Podcast. If you haven&rsquo;t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.</p>
</p></div>
</p></div>
</p></div>
</div></div>
</section>
<section class="cta-section has-background has-background-color has-transparent-background-color has-white-color default" data-id="block_1d18ca1eab812f4116b726b97f95eb4f" data-aos="fade-in">
<div class="wrapper">
<div class="cta has-background-color has-background has-dark-red-background-color has-image jarallax">
<div class="grid align-center">
<div class="col-grid lg-col-6-12 offset-lg-col-1-12">
<h3 class="xs-mb-16 has-text-color has-white-color" data-aos="fade-up">Privacy doesn&rsquo;t have to be complicated.</h3>
<div class="has-text-color has-white-color has-white-link-color" data-aos="fade-up">
<p>As privacy experts passionate about trust, we help you define your goals and achieve them. We consider every factor of privacy that impacts your business so you can focus on what you do best.</p>
</div></div>
<div class="col-grid lg-col-2-12 offset-lg-col-2-12">
<div class="buttons" data-aos="fade-up">
                    <a href="https://redcloveradvisors.com/contact/" target="_self" class="button button-primary-reverse"><span>Schedule Call</span></a>                </div>
</p></div>
</p></div>
<p>            <img style=" max-width: 100%; height: auto; " decoding="async" src="https://www.lexblog.com/wp-content/uploads/2025/06/red-wave-rt-1.png" alt=""></p></div>
</p></div>
</section>
<p>The post <a rel="nofollow" href="https://redcloveradvisors.com/operationalizing-privacy-across-teams-tools-and-tech/">Operationalizing Privacy Across Teams, Tools, and Tech</a> appeared first on <a rel="nofollow" href="https://redcloveradvisors.com">Red Clover Advisors</a>.</p>
]]></content:encoded>
					
		
		
		<source url='https://redcloveradvisors.com/feed/'>Red Clover Advisors Blog</source>
	</item>
		<item>
		<title>PART 3: Data Categories and Surveillance Pricing: Ferguson’s Nuanced Approach to Privacy Innovation</title>
		<link>https://www.lexblog.com/2025/06/11/part-3-data-categories-and-surveillance-pricing-fergusons-nuanced-approach-to-privacy-innovation/</link>
		
		<dc:creator><![CDATA[Danner Kline]]></dc:creator>
		<pubDate>Wed, 11 Jun 2025 18:29:09 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/11/part-3-data-categories-and-surveillance-pricing-fergusons-nuanced-approach-to-privacy-innovation/</guid>

					<description><![CDATA[As FTC Chair Andrew Ferguson establishes his enforcement priorities, his positions on data categorization and surveillance pricing reveal a consistent philosophy that balances privacy protection with innovation. This is the third post in our series on what to expect from the FTC under Ferguson as chair. Our previous posts examined Ferguson&#8217;s broad regulatory philosophy of...]]></description>
										<content:encoded><![CDATA[
<p>As FTC Chair Andrew Ferguson establishes his enforcement priorities, his positions on data categorization and surveillance pricing reveal a consistent philosophy that balances privacy protection with innovation. This is the third post in our series on what to expect from the FTC under Ferguson as chair.</p>



<p>Our previous posts examined Ferguson&rsquo;s broad regulatory philosophy of &ldquo;staying in our lane&rdquo; and his priority enforcement areas in children&rsquo;s privacy and location data. This post explores Ferguson&rsquo;s approach to emerging privacy issues that don&rsquo;t fit neatly into established legal frameworks.</p>



<h2 class="wp-block-heading"><strong>Skepticism of &ldquo;Sensitive Categories&rdquo; Designation</strong></h2>



<p>Ferguson has expressed significant skepticism about the FTC designating certain categories of data as inherently &ldquo;sensitive&rdquo; without clear statutory basis. In <a href="https://www.ftc.gov/system/files/ftc_gov/pdf/ferguson-statement-social-media-6b.pdf" target="_blank" rel="noreferrer noopener">his September 2024 statement on the Social Media and Video Streaming Services Report</a>, Ferguson criticized this approach:</p>



<p>&ldquo;I am skeptical that this is the kind of injury the law should try to address&#8230; I doubt it could. Any such line would tend toward arbitrariness and is not a stable system on which to decide whether advertisements are illegal.&rdquo;</p>



<p>Ferguson&rsquo;s critique reflects his broader concern that creating subjective lists of &ldquo;sensitive&rdquo; data categories raises several problems:</p>



<ol start="1" class="wp-block-list">
<li><strong>Arbitrary line-drawing</strong> &#8211; Determining which categories qualify as &ldquo;sensitive&rdquo; is inherently subjective and potentially politicized.</li>



<li><strong>Lack of statutory basis</strong> &#8211; Section 5 does not provide clear guidance on which categories of data should receive special protection.</li>



<li><strong>Inconsistent application</strong> &#8211; When regulators decide which categories deserve protection, the resulting lists may reflect the decision-makers&rsquo; preferences rather than objective criteria.</li>
</ol>



<p>Ferguson&rsquo;s <a href="https://www.ftc.gov/system/files/ftc_gov/pdf/gravy_-mobilewalla-ferguson-concurrence.pdf" target="_blank" rel="noreferrer noopener">December 2024 concurrence in the Mobilewalla case</a> provides the clearest view of his position on sensitive data categorization, where he wrote: &ldquo;The FTC Act does not limit how someone who lawfully acquired those data might choose to analyze those data, or the conclusions that one might draw from them.&rdquo; This reveals a fundamental distinction in his approach: While he believes the initial collection of sensitive data without consent may violate Section 5, he is skeptical that the FTC can regulate how lawfully obtained data is subsequently categorized or analyzed.</p>



<p>Ferguson&rsquo;s analogy to private investigators is particularly telling: Just as investigators may legally observe someone entering a church and conclude they practice that religion, Ferguson believes that drawing conclusions from lawfully collected data is not, in itself, a Section 5 violation.</p>



<h2 class="wp-block-heading"><strong>Surveillance Pricing: Fact-Finding Over Speculation</strong></h2>



<p>Ferguson has demonstrated a measured approach to emerging data practices like surveillance pricing &mdash; the use of consumer data to set personalized prices. In July 2024, <a href="https://www.ftc.gov/system/files/ftc_gov/pdf/surveillance-pricing-6b-ferguson-concurrence.pdf" target="_blank" rel="noreferrer noopener">he supported the FTC&rsquo;s 6(b) study</a> into these practices, explaining:</p>



<p>&ldquo;One of the most important duties with which Congress has entrusted us is studying markets and industries and reporting to the public and Congress what we learn&#8230; These studies may inform future Commission enforcement actions, but they need not.&rdquo;</p>



<p>His statement emphasized the importance of thorough fact-finding before developing policy positions, noting:</p>



<p>&ldquo;Congress and the American people should be made aware of whether and how consumers&rsquo; private data may be used to affect their pocketbooks.&rdquo;</p>



<p>However, in January 2025, <a href="https://www.ftc.gov/system/files/ftc_gov/pdf/surveillance-pricing-6b-research-summaries-ferguson-dissent-final.pdf" target="_blank" rel="noreferrer noopener">Ferguson joined Commissioner Melissa Holyoak in dissenting</a> from the release of preliminary &ldquo;research summaries&rdquo; on surveillance pricing. His dissent criticized the rushed release of early findings:</p>



<p>&ldquo;Issuing these research summaries degrades the Commission&rsquo;s Section 6(b) process. The Commission should not be releasing staff&rsquo;s early impressions that &lsquo;can be outdated with new information&rsquo; because the fact gathering process on the very issues being presented to the public is still underway.&rdquo;</p>



<p>This suggests a commitment by Ferguson to thorough investigation of privacy issues before regulation, particularly with emerging practices that implicate consumer data.</p>



<h2 class="wp-block-heading"><strong>Balancing Evidence and Action</strong></h2>



<p>Ferguson&rsquo;s approach to both sensitive data categories and surveillance pricing illustrates his broader privacy philosophy:</p>



<ol start="1" class="wp-block-list">
<li><strong>Demand robust evidence</strong> &#8211; Before taking regulatory action on privacy practices, Ferguson wants complete factual records that demonstrate actual harm.</li>



<li><strong>Favor established laws over novel theories</strong> &#8211; His skepticism of &ldquo;sensitive categories&rdquo; shows preference for established legal frameworks rather than expanding statutory interpretations.</li>



<li><strong>Emphasize procedural integrity</strong> &#8211; His objection to preliminary research summaries reveals concern with fair, thorough processes before reaching conclusions about data practices.</li>
</ol>



<p>Ferguson appears to maintain a genuine openness to evidence that might show consumer benefits from practices such as data categorization or personalized pricing. His insistence on completing thorough market studies reflects not just procedural formalism but a substantive commitment to evidence-based regulation that considers both potential harms and benefits.</p>



<h2 class="wp-block-heading"><strong>What This Means for Businesses</strong></h2>



<p>Based on Ferguson&rsquo;s positions, here are some considerations for businesses:</p>



<p><strong>For Data Categorization:</strong></p>



<ul class="wp-block-list">
<li>Focus on consent mechanisms for data collection rather than worrying about how lawfully collected data is analyzed.</li>



<li>Document legitimate business purposes for data analysis.</li>



<li>Keep watch for potential future legislation that might specifically designate certain data categories for special protection.</li>



<li>Distinguish clearly between initial data collection practices (which face greater scrutiny) and subsequent analysis of lawfully collected data (which faces less scrutiny).</li>
</ul>



<p><strong>For Surveillance Pricing and Similar Practices:</strong></p>



<ul class="wp-block-list">
<li>Expect continued scrutiny of personalized pricing practices, but through careful study rather than immediate regulation.</li>



<li>Maintain transparency about how customer data influences pricing.</li>



<li>Document how pricing algorithms use personal data.</li>



<li>Consider implementing clear opt-out mechanisms for data-based pricing.</li>



<li>Document instances where personalized pricing benefits consumers through lower prices or increased access, as Ferguson&rsquo;s evidence-based approach may be receptive to such benefits.</li>
</ul>
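The documentation practice above can be sketched in code. The following is an illustrative example only (the profile field names and pricing rule are invented, not any company's actual logic): a pricing function that records exactly which personal-data fields influenced each quote, and honors an opt-out by falling back to the base price.

```python
# Hypothetical sketch: a personalized pricing function that documents which
# personal-data fields drove each price, supporting transparency and opt-out.

def personalized_price(base_price, profile, audit_log):
    """Quote a price and append an audit record of the fields used."""
    used_fields = []
    price = base_price
    if profile.get("loyalty_tier") == "gold":
        price *= 0.95                      # illustrative loyalty discount
        used_fields.append("loyalty_tier")
    if profile.get("opted_out_of_personalization"):
        price = base_price                 # honor the opt-out
        used_fields = []                   # no personal data used
    audit_log.append({"fields_used": used_fields, "price": round(price, 2)})
    return round(price, 2)

log = []
quote = personalized_price(100.0, {"loyalty_tier": "gold"}, log)
# quote == 95.0, and the log records that loyalty_tier drove the discount
```

An audit trail like this is the kind of record that could later show a regulator both how personal data affected prices and where personalization benefited the consumer.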



<h2 class="wp-block-heading"><strong>Evolution Rather Than Revolution</strong></h2>



<p>Ferguson&rsquo;s approach suggests the FTC under his leadership will maintain strong privacy enforcement but with a focus on clear statutory violations rather than expanding interpretations of unfairness. For data categorization and surveillance pricing, this means:</p>



<ol start="1" class="wp-block-list">
<li><strong>Continued fact-finding</strong> &#8211; The commission will likely invest in thorough market studies before developing policy positions.</li>



<li><strong>Focus on deception over unfairness</strong> &#8211; Companies making false or misleading claims about data practices will face scrutiny, while novel &ldquo;unfairness&rdquo; theories will receive more skepticism.</li>



<li><strong>Emphasis on consent and transparency</strong> &#8211; Proper notice, consent, and transparency will remain central to the FTC&rsquo;s privacy enforcement.</li>
</ol>



<p>This approach represents evolution rather than revolution in the commission&rsquo;s privacy work, with a measured path that balances consumer protection with business certainty and technological innovation.</p>

]]></content:encoded>
					
		
		
		<source url='https://www.onlineandonpoint.com/'>Online and On Point</source>
<enclosure url='https://www.lexblog.com/wp-content/uploads/2025/06/Data-Folders-GettyImages-2163344994.jpg' type='image/jpeg' length='614576' />	</item>
		<item>
		<title>Using Facial Recognition? Regulators Expect Detailed Risk Assessments</title>
		<link>https://www.lexblog.com/2025/06/11/using-facial-recognition-regulators-expect-detailed-risk-assessments/</link>
		
		<dc:creator><![CDATA[Odia Kagan]]></dc:creator>
		<pubDate>Wed, 11 Jun 2025 16:34:48 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/11/using-facial-recognition-regulators-expect-detailed-risk-assessments/</guid>

					<description><![CDATA[Following the Federal Trade Commission&#8217;s decision in December 2023 to ban Rite Aid from using AI facial recognition, it has become crystal clear that U.S. regulators expect a risk assessment when a retailer uses facial recognition technology.A new, and detailed, report from the New Zealand privacy commission provides helpful considerations for such Data Protection Impact...]]></description>
										<content:encoded><![CDATA[
<p>Following the Federal Trade Commission&#8217;s decision in December 2023 to <a href="https://www.ftc.gov/news-events/news/press-releases/2023/12/rite-aid-banned-using-ai-facial-recognition-after-ftc-says-retailer-deployed-technology-without">ban Rite Aid from using AI facial recognition</a>, it has become crystal clear that U.S. regulators expect a risk assessment when a retailer uses facial recognition technology.<br>A <a href="https://www.privacy.org.nz/assets/DOCUMENTS/20250603-FRT-Inquiry-Report-A1082856.pdf">new, and detailed, report</a> from the New Zealand privacy commission provides helpful considerations for such Data Protection Impact Assessments (DPIAs). They include:</p>



<ul class="wp-block-list">
<li>Was the model trained on data that includes minorities?</li>



<li>How long will the retailer retain data that wasn&#8217;t matched?</li>



<li>Data minimization techniques (including when to share among stores and when to add to a watchlist).</li>



<li>How accurate should the match be to trigger consideration (92.5%)?</li>
</ul>
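The match-accuracy and data-minimization considerations above can be made concrete. A minimal sketch, assuming hypothetical match scores and using the 92.5% figure discussed in the report: a match is only actionable above the configured confidence level, and unmatched probe data is queued for deletion rather than retained.

```python
# Hedged sketch of the DPIA questions above: threshold a face-match
# confidence score (the report discusses 92.5%) and route unmatched
# probes to deletion rather than retention. Scores are illustrative.

MATCH_THRESHOLD = 0.925

def evaluate_match(probe_score, deletion_queue):
    """Return True only if the score clears the threshold; otherwise
    queue the probe for prompt deletion (data minimization)."""
    if probe_score >= MATCH_THRESHOLD:
        return True
    deletion_queue.append(probe_score)  # unmatched: delete, don't retain
    return False

pending_deletion = []
evaluate_match(0.93, pending_deletion)  # clears 92.5%: actionable match
evaluate_match(0.80, pending_deletion)  # below threshold: queued for deletion
```

A DPIA would document the chosen threshold, why it was chosen, and how quickly the deletion queue is actually purged.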

]]></content:encoded>
					
		
		
		<source url='https://dataprivacy.foxrothschild.com/'>Privacy Compliance &amp; Data Security</source>
<enclosure url='https://www.lexblog.com/wp-content/uploads/2025/06/38419767_s.jpg' type='image/jpeg' length='162558' />	</item>
		<item>
		<title>Different Country, Same Challenges: Lessons from a Breach That Could Have Been Prevented</title>
		<link>https://www.lexblog.com/2025/06/11/different-country-same-challenges-lessons-from-a-breach-that-could-have-been-prevented/</link>
		
		<dc:creator><![CDATA[Joseph J. Lazzarotti]]></dc:creator>
		<pubDate>Wed, 11 Jun 2025 12:18:46 +0000</pubDate>
				<category><![CDATA[Employment & Labor]]></category>
		<category><![CDATA[Featured Posts]]></category>
		<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/11/different-country-same-challenges-lessons-from-a-breach-that-could-have-been-prevented/</guid>

					<description><![CDATA[A recent breach involving Indian fintech company Kirana Pro serves as a reminder to organizations worldwide: even the most sophisticated cybersecurity technology cannot make up for poor administrative data security hygiene. According to a June 7 article in India Today, KiranaPro suffered a massive data wipe affecting critical business information and customer data. The company&#8217;s...]]></description>
										<content:encoded><![CDATA[
<p>A recent breach involving Indian fintech company KiranaPro serves as a reminder to organizations worldwide: even the most sophisticated cybersecurity technology cannot make up for poor administrative data security hygiene.</p>



<p>According to a <a href="https://www.indiatoday.in/technology/news/story/kiranapro-blames-ex-employee-after-massive-data-wipe-but-will-not-rule-out-hack-2737264-2025-06-07">June 7 article in <em>India Today</em></a>, KiranaPro suffered a massive data wipe affecting critical business information and customer data. The company&rsquo;s CEO believes the incident was likely the result of a disgruntled former employee, though he has not ruled out the possibility of an external hack, according to reporting. <a href="https://techcrunch.com/2025/06/06/after-its-data-was-wiped-kiranapros-co-founder-cannot-rule-out-an-external-hack/">TechCrunch</a> explained:  </p>



<p class="is-style-callout">The company confirmed it did not remove the employee&rsquo;s access to its data and GitHub account following his departure. &ldquo;Employee offboarding was not being handled properly because there was no full-time HR,&rdquo; KiranaPro&rsquo;s chief technology officer, Saurav Kumar, confirmed to TechCrunch.</p>



<p>Unfortunately, this is not a uniquely Indian problem. Globally, organizations invest heavily in technical safeguards&mdash;firewalls, multi-factor authentication, encryption, endpoint detection, and more. These tools are essential, but not sufficient. </p>



<p><strong>The Silent Risk of Inactive Accounts</strong></p>



<p>One of the most common (and preventable) vectors for insider incidents or credential abuse is failure to promptly deactivate system access when an employee departs. Whether termination is amicable or not, if a former employee retains credentials to email, cloud storage, or enterprise software, the organization is vulnerable. These accounts may be exploited intentionally (as suspected in the KiranaPro case) or unintentionally if credentials are stolen or phished later.</p>



<p>Some organizations assume their IT department is handling these terminations automatically. Others rely on inconsistent handoffs between HR, legal, and IT teams. Either way, failure to follow a formal offboarding checklist&mdash;and verify deactivation&mdash;may be a systemic weakness, not a fluke.</p>



<p><strong>It&rsquo;s Not Just About Tech&mdash;It&rsquo;s About Governance</strong></p>



<p>This breach illustrates the point that information security is as much about <strong>governance and process</strong> as it is about technology. Managing who has access to what systems, when, and why is a core component of security frameworks such as NIST, ISO 27001, and the CIS Controls. In fact, user access management&mdash;including timely revocation of access upon employee separation&mdash;is a foundational expectation in every major cybersecurity risk assessment.</p>



<p>Organizations should implement the following best practices:</p>



<ol class="wp-block-list">
<li><strong>Establish a formal offboarding procedure.</strong> Involve HR, IT, and Legal to ensure immediate deactivation of all accounts upon separation.</li>



<li><strong>Automate user provisioning and deprovisioning</strong> where possible, using identity and access management (IAM) tools.</li>



<li><strong>Maintain a system of record for all access rights.</strong> Periodically audit active accounts and reconcile them against current employees and vendors.</li>



<li><strong>Train supervisors and HR personnel</strong> to notify IT or security teams immediately upon termination or resignation. There may also be cases where monitoring an employee&#8217;s system activity in anticipation of termination is prudent.</li>
</ol>
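<p>The audit-and-reconcile step in item 3 can be sketched in a few lines of Python. This is an illustrative sketch only: the account names are invented, and in practice an IAM tool or directory service would supply both sets automatically.</p>

```python
# Reconcile active system accounts against the current employee roster.
# Illustrative sketch: real deployments would pull these sets from an
# IAM tool or HR system rather than hard-coding them.

def find_stale_accounts(active_accounts, current_employees):
    """Return account IDs that are active but belong to no current employee."""
    return sorted(set(active_accounts) - set(current_employees))

active_accounts = {"asmith", "bjones", "ckumar", "svc-backup"}
current_employees = {"asmith", "bjones"}

stale = find_stale_accounts(active_accounts, current_employees)
print(stale)  # ['ckumar', 'svc-backup']
# Every entry in `stale` should be deactivated or explicitly justified
# (e.g., a documented service account such as "svc-backup").
```

<p>Run periodically, a reconciliation like this surfaces exactly the kind of lingering access that went unnoticed in the KiranaPro incident.</p>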



<p><strong>The Takeaway</strong></p>



<p>Wherever your company does business and regardless of industry, the fundamentals are the same: a lapse in basic access control can cause as much damage as a ransomware attack. The KiranaPro incident is a timely cautionary tale. Organizations must view cybersecurity not only as a technical discipline but as an enterprise-wide responsibility.</p>

]]></content:encoded>
					
		
		
		<source url='https://www.workplaceprivacyreport.com/'>Workplace Privacy, Data Management &amp; Security Report</source>
<enclosure url='https://www.lexblog.com/wp-content/uploads/2025/06/Made-with-Canon-5d-Mark-1749669669-3228809-2858-lxb_photohvSr_CVecVIlxb_photo--550x550.jpg' type='image/jpeg' length='170312' />	</item>
		<item>
		<title>EU: Brussels Court of Appeal rules on IAB Europe and the TC String – Implications for GDPR Compliance</title>
		<link>https://www.lexblog.com/2025/06/11/eu-brussels-court-of-appeal-rules-on-iab-europe-and-the-tc-string-implications-for-gdpr-compliance/</link>
		
		<dc:creator><![CDATA[Heidi Waem, Verena Grentzenberg and Luca Sawatzki]]></dc:creator>
		<pubDate>Wed, 11 Jun 2025 11:22:08 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/11/eu-brussels-court-of-appeal-rules-on-iab-europe-and-the-tc-string-implications-for-gdpr-compliance/</guid>

					<description><![CDATA[On 14 May 2025, the Brussels Court of Appeal (Market Court) delivered the long-awaited judgement in the case concerning the Transparency &#38; Consent Framework (&#8220;TCF&#8221;) (case no.&#160;2022/AR/292). The Court largely upheld the findings of the Belgian Data Protection Authority (&#8220;Belgian DPA&#8221;), concluding that the TCF&#8217;s use of the Transparency and Consent String (&#8220;TC String&#8220;) fails...]]></description>
										<content:encoded><![CDATA[
<p>On 14 May 2025, the Brussels Court of Appeal (Market Court) delivered the long-awaited judgement in the case concerning the Transparency &amp; Consent Framework (&ldquo;<strong>TCF</strong>&rdquo;) (case no.&nbsp;2022/AR/292). The Court largely upheld the findings of the Belgian Data Protection Authority (&ldquo;<strong>Belgian DPA</strong>&rdquo;), concluding that the TCF&#8217;s use of the Transparency and Consent String (&#8220;<strong>TC String</strong>&#8221;) fails to fully comply with the General Data Protection Regulation (&ldquo;<strong>GDPR</strong>&rdquo;). This decision followed a preliminary ruling of the Court of Justice of the European Union (&ldquo;<strong>CJEU</strong>&rdquo;) in March 2024 (case no.&nbsp;C-604/22) (see our <a href="https://privacymatters.dlapiper.com/2024/03/cjeu-ruling-clarifies-data-protection-and-e-privacy-issues-in-the-ad-tech-space/">previous blog post</a>).</p>



<p><strong>Background</strong></p>



<p>IAB Europe is a non-profit association that represents the digital advertising and marketing sector at the European level. It developed the TCF to standardize the way online publishers, advertisers, and other participants in the ad-tech ecosystem collect user consent for data processing in accordance with the GDPR.</p>



<p>The TCF is widely applied in the context of a real-time auctioning system used to acquire advertising space for the display of targeted advertisements online. A key component of the TCF is the TC String.</p>



<p>The TC String is a combination of letters and characters which encodes and records user preferences through consent management platforms (&ldquo;<strong>CMPs</strong>&rdquo;) when users visit a website or app. The TC String is then shared with ad platforms and other participants of the ad-tech ecosystem; the CMP also places a specific cookie on the user device. When combined, the TC String and this cookie can be linked to the user&rsquo;s IP address.</p>
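<p>Mechanically, a TC String is a dot-separated, base64url-encoded bit field; under the published IAB TCF v2 specification, the first six bits of the core segment encode the format version. The following Python sketch is illustrative only (real strings carry many further fields, such as timestamps and per-vendor consent bits):</p>

```python
import base64

def tc_string_version(tc_string):
    """Read the 6-bit version field at the start of a TCF core string.

    Illustrative sketch: a real TC String contains many further fields
    defined by the IAB TCF specification.
    """
    core = tc_string.split(".")[0]           # first segment is the core string
    padded = core + "=" * (-len(core) % 4)   # restore base64 padding
    raw = base64.urlsafe_b64decode(padded)
    return raw[0] >> 2                       # top 6 bits of the first byte

# A synthetic core segment whose first 6 bits encode version 2:
example = base64.urlsafe_b64encode(bytes([2 << 2, 0, 0])).decode().rstrip("=")
print(tc_string_version(example))  # 2
```

<p>The point for present purposes: the string is an opaque, machine-readable record of preferences, which is exactly why its linkability to an IP address matters for the personal data analysis below.</p>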



<p>In February 2022, the Belgian DPA decided that:</p>



<ul class="wp-block-list">
<li>the TC String constitutes personal data under Art.&nbsp;4(1) GDPR,</li>



<li>IAB Europe qualifies as a controller under Art.&nbsp;4(7) GDPR, and</li>



<li>the TCF, as implemented, did not meet certain GDPR requirements (for details see our previous <a href="https://privacymatters.dlapiper.com/2022/03/belgian-dpa-decision-on-iab-transparency-and-consent-framework/">blogpost</a>).</li>
</ul>



<p>IAB Europe appealed the decision, and in the course of proceedings, the Brussels Court of Appeal referred several key legal questions to the CJEU. In March 2024, the CJEU confirmed that:</p>



<ul class="wp-block-list">
<li>the TC String may constitute personal data under Art.&nbsp;4(1) GDPR if it can be combined with other data to identify a user.</li>



<li>IAB Europe can, under certain circumstances, be considered a controller under Art.&nbsp;4(7) GDPR.</li>



<li>IAB Europe is not automatically a controller for further processing activities by third parties if it does not determine the purposes or means of those processing activities (for details see our previous <a href="https://privacymatters.dlapiper.com/2024/03/cjeu-ruling-clarifies-data-protection-and-e-privacy-issues-in-the-ad-tech-space/">blogpost</a>).</li>
</ul>



<p><strong>Findings of the Brussels Court of Appeal</strong></p>



<p>The Brussels Court of Appeal partially annulled the Belgian DPA&rsquo;s original &euro;250,000 fine against IAB Europe&mdash;but solely on procedural grounds. Substantively, the Court confirmed the Belgian DPA&rsquo;s main findings.</p>



<p>The key rulings include:</p>



<ul class="wp-block-list">
<li><strong>Personal Data Classification</strong>: in line with the CJEU, the Court confirmed that the TC String qualifies as personal data when combined with additional identifying information such as the user&rsquo;s IP address.<br></li>



<li><strong>Controller Status</strong>: the Court agreed that IAB Europe acts as a (joint) controller within the meaning of Art.&nbsp;26 GDPR for processing operations within the TCF. However, IAB Europe was found not to be a controller for processing activities conducted solely within the OpenRTB protocol framework, over which it has no control.<br></li>



<li><strong>GDPR Violations Identified</strong>:<ul class="wp-block-list">
<li>IAB Europe failed to establish a valid legal basis under Art.&nbsp;6(1) GDPR for processing personal data via the TC String. In particular, legitimate interest under Art.&nbsp;6(1)(f) GDPR could not be relied upon, as the necessity of the processing was not demonstrated.</li>
<li><strong>Lack of Transparency</strong>: the framework failed to provide users with clear and accessible information, in violation of Art.&nbsp;12&ndash;14 GDPR.</li>
<li><strong>No Data Protection Impact Assessment (&ldquo;DPIA&rdquo;)</strong>: a DPIA required under Art.&nbsp;35 GDPR was not conducted.</li>
<li><strong>Additional Violations</strong>: the Court found further infringements of Art.&nbsp;5(1)(f), 25, 32, and 37 GDPR, concerning data security, privacy by design, and data protection officer obligations.</li>
</ul>
</li>
</ul>



<p><strong>Conclusion</strong></p>



<p>At first glance, the judgment provides important guidance on IAB Europe&rsquo;s GDPR compliance when implementing the TCF.</p>



<p>However, it remains to be seen what the consequences of the judgment will be. It is worth noting that the judgment applies specifically to TCF versions 1.0 and 2.0; the currently deployed version &ndash; TCF 2.2 &ndash; was not subject to review in this case, and many of the points criticized have already been rectified in version 2.2. Whether future implementations of the TCF will satisfy GDPR requirements remains an open question and may be subject to future regulatory scrutiny. The remaining uncertainty is whether, and under which circumstances, IAB Europe can qualify as a (joint) controller in relation to the TC String in its current version; this largely depends on whether IAB Europe has real decision-making power over the purposes and means of processing.</p>



<p>Moreover, the question of whether an individual company is in breach of GDPR when using the TCF remains a matter for national data protection authorities to decide on a case-by-case basis.</p>

]]></content:encoded>
					
		
		
		<source url='https://privacymatters.dlapiper.com/'>Privacy Matters</source>
<enclosure url='https://www.lexblog.com/wp-content/uploads/2025/06/Programming_Code_S_2309-copy-1.jpg' type='image/jpeg' length='158769' />	</item>
		<item>
		<title>Data Protection Meets Consumer Protection: The Crucial Role of Clear Terms in Service Contracts</title>
		<link>https://www.lexblog.com/2025/06/10/data-protection-meets-consumer-protection-the-crucial-role-of-clear-terms-in-service-contracts/</link>
		
		<dc:creator><![CDATA[Dan Cooper and Anna Sophia Oberschelp de Meneses]]></dc:creator>
		<pubDate>Tue, 10 Jun 2025 21:58:34 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/10/data-protection-meets-consumer-protection-the-crucial-role-of-clear-terms-in-service-contracts/</guid>

					<description><![CDATA[On June 10, 2025, the Finnish Data Protection Ombudsman published a decision (in FI) where it found that the processing of personal data for enforcing parking violations was unlawful because the enforcement mechanism was not described in the parking rental agreement.&#160; This recent decision is a striking example of how data protection and consumer protection...]]></description>
										<content:encoded><![CDATA[
<p>On June 10, 2025, the Finnish Data Protection Ombudsman published a <a href="https://tietosuoja.fi/-/pysakoinninvalvonta-ilman-sopimusta-johti-henkilotietojen-lainvastaiseen-kasittelyyn">decision</a> (in FI) where it found that the processing of personal data for enforcing parking violations was unlawful because the enforcement mechanism was not described in the parking rental agreement.&nbsp; This recent decision is a striking example of how data protection and consumer protection law are increasingly intertwined.&nbsp; The case demonstrates that the way in which customer services&mdash;and any related enforcement mechanisms for non-performance&mdash;are described in contracts is not just a matter of consumer transparency, but a legal requirement for the lawful processing of personal data under Article 6(1)(b) of the GDPR (&ldquo;processing [that] is necessary for the performance of a contract&rdquo;).</p>



<span id="more-3228364"></span>



<p><span style="text-decoration: underline">Background</span></p>



<p>In this case, persons who violated certain parking rules for a designated area set out in a property rental agreement (<em>e.g</em>., by not displaying a badge indicating their right to park) faced collection actions from a third-party debt collection company for the alleged violations.&nbsp; However, the rental agreement made no mention of such enforcement of the rules or any requirement to display a parking permit.&nbsp; Despite this, the debt collection agency processed the renter&rsquo;s personal data to pursue contested parking fines.</p>



<p>Upon review, the Data Protection Ombudsman determined that the absence of any contractual provision pertaining to the enforcement of the parking rules in the rental agreement meant there was no lawful basis for processing the renter&rsquo;s personal data for enforcement purposes.&nbsp; In this case, the third-party debt collection company could not rely on Article 6(1)(b) of the GDPR (&ldquo;processing [that] is necessary for the performance of a contract to which the data subject is party&rdquo;).&nbsp; From a contract law perspective, the agreement between the party renting out the parking spaces and the third-party debt collection company did not bind individual renters, as its terms were not referenced in the renters&rsquo; rental agreements.</p>



<p><span style="text-decoration: underline">Why Service Descriptions Matter in Contracts</span></p>



<p>While the General Data Protection Regulation (GDPR) requires a lawful basis for personal data processing, EU consumer law&mdash;specifically the Consumer Rights Directive (CRD) and the Unfair Commercial Practices Directive (UCPD)&mdash;requires that consumers receive clear, comprehensive information about the characteristics of products and services before entering into a contract. &nbsp;This includes details on enforcement mechanisms, such as fines or collection actions, if they are part of the service.</p>



<ul class="wp-block-list">
<li>The CRD (Directive 2011/83/EU) obliges traders to inform consumers about the main characteristics of services and any conditions for enforcement or termination before the contract is concluded.</li>



<li>The UCPD (Directive 2005/29/EC) prohibits misleading omissions, ensuring consumers are not left in the dark about key contract features.</li>
</ul>



<p>If enforcement mechanisms are not clearly described and agreed upon, not only may the consumer&rsquo;s right to information be infringed, but any data processing for enforcement purposes may lack a lawful basis under the GDPR.</p>



<p><span style="text-decoration: underline">Implications for Service Providers</span></p>



<p>For service providers, the key takeaways are:</p>



<ul class="wp-block-list">
<li>Review and update contract templates to ensure that all enforcement mechanisms and data processing purposes for fulfilling and enforcing the contract are clearly described.</li>



<li>Before processing personal data with the intention of enforcing contractual terms that have been breached or other purposes, verify that the contract with the consumer expressly covers these activities.</li>
</ul>



<p class="has-text-align-center">*&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; *&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; *</p>



<p>Covington &amp; Burling continues to monitor and advise companies on navigating EU data protection law and its intersection with EU consumer protection law. &nbsp;Please do reach out if you need assistance in these areas.</p>



<p>(<em>This blog post was written with the contribution of Alberto Vogel</em>.)</p>




]]></content:encoded>
					
		
		
		<source url='https://www.insideprivacy.com/'>Inside Privacy</source>
	</item>
		<item>
		<title>Quantum Computing and its Impact on the Life Science Industry</title>
		<link>https://www.lexblog.com/2025/06/10/quantum-computing-and-its-impact-on-the-life-science-industry-3/</link>
		
		<dc:creator><![CDATA[Nira Pandya and Tamzin Bond]]></dc:creator>
		<pubDate>Tue, 10 Jun 2025 13:59:14 +0000</pubDate>
				<category><![CDATA[Health Care]]></category>
		<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/10/quantum-computing-and-its-impact-on-the-life-science-industry-3/</guid>

					<description><![CDATA[Quantum computing uses quantum mechanics principles to solve certain complex mathematical problems faster than classical computers.&#160; Whilst classical computers use binary &#8220;bits&#8221; to perform calculations, quantum computers use quantum bits (&#8220;qubits&#8221;).&#160; The value of a bit can only be zero or one, whereas a qubit can exist as zero, one, or a combination of both...]]></description>
										<content:encoded><![CDATA[
<p>Quantum computing uses quantum mechanics principles to solve certain complex mathematical problems faster than classical computers.&nbsp; Whilst classical computers use binary &ldquo;bits&rdquo; to perform calculations, quantum computers use quantum bits (&ldquo;qubits&rdquo;).&nbsp; The value of a bit can only be zero or one, whereas a qubit can exist as zero, one, or a combination of both states (a phenomenon known as superposition) allowing quantum computers to solve certain problems exponentially faster than classical computers.</p>
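<p>Superposition can be made concrete with a short numerical sketch in plain Python (no quantum hardware or SDK involved): a qubit is a normalized pair of amplitudes, and applying a Hadamard gate to the |0&gt; state produces an equal superposition, so a measurement yields zero or one with probability 0.5 each.</p>

```python
import math

# A qubit state is a normalized pair of amplitudes (alpha, beta):
# the probability of measuring 0 is |alpha|^2, of measuring 1 is |beta|^2.
def hadamard(state):
    """Apply the Hadamard gate, which maps |0> to an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

zero = (1.0, 0.0)               # the classical bit 0, written as a qubit
superposed = hadamard(zero)     # amplitudes (0.707..., 0.707...)

probs = (abs(superposed[0]) ** 2, abs(superposed[1]) ** 2)
print(probs)                    # ~(0.5, 0.5): measuring gives 0 or 1 equally often
```

<p>The computational advantage comes from operating on many such amplitudes at once, which classical simulation can only approximate at exponential cost.</p>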



<p>The potential applications of quantum computing are wide-ranging and industry-agnostic. For instance, quantum computers could be used to enhance the analysis of large, complex data sets, optimize supply-chain processes, and improve artificial intelligence (&ldquo;AI&rdquo;) technologies and machine learning algorithms.</p>



<p>Given the potential applications, quantum computing could have a significant impact on companies in the life sciences sector, and more specifically could be used to improve:</p>



<span id="more-3227706"></span>



<p class="is-style-indented"><strong>1. Drug discovery</strong></p>



<p class="is-style-indented">Classical computational methods play a crucial role in the drug discovery and design process by providing tools and techniques to model, predict and analyze the behavior of chemical systems. Quantum computing technologies have the potential to offer a more powerful, accurate and efficient alternative to classical computers, and can simulate more intricate chemical structures and interactions, leading to more optimized drug design.</p>



<p class="is-style-indented">Quantum computing can also be used as a complementary technology in conjunction with AI, which is already being used for drug discovery, thus exponentially increasing the accuracy and speed of the drug discovery process and potentially reducing the associated costs.</p>



<p class="is-style-indented"><strong>2. Clinical development</strong></p>



<p class="is-style-indented">Quantum computers are far superior to classical computers when handling problems with multiple variables and complex datasets and it is exactly these capabilities that can be leveraged in clinical development phases. &nbsp;For example, quantum computing can be used to optimize the design of clinical trials; to enhance the data analysis and modelling process post-clinical trial by quickly detecting complex patterns and correlations; to monitor and adapt clinical trials in real-time, allowing for dynamic adjustments to protocols to maximize individual patient outcomes and address emerging safety concerns.</p>



<p class="is-style-indented">Quantum computers can be used in conjunction with AI, specifically machine learning, to analyze genetic and biomolecular data in order to better predict individual responses to specific treatments and create optimal treatment plans tailored to the genetic makeup and health condition of a specific patient.</p>



<p class="is-style-indented"><strong>3. Diagnostics</strong></p>



<p class="is-style-indented">Quantum computing technologies have the potential to enhance patient diagnosis and personalized care. Not only can quantum computing improve image reconstruction and quality, but it also has the potential to aid faster, more accurate and more precise image analysis and interpretation by considering multiple data points at the same time, when compared to classical computing methods.</p>



<p class="is-style-indented">In addition, quantum sensing is an emerging technology that uses quantum technology principles to detect or &lsquo;sense&rsquo; changes in physical qualities such as temperature, motion, light and chemical composition. The high sensitivity and resolution of this technology offers the possibility of more efficient and accurate medical diagnosis, and it is likely we will see more of quantum sensing not just in classical medical diagnostic imaging settings, but also in wearables and health monitors.&nbsp;</p>



<p class="is-style-indented">For life sciences companies operating in the medical imaging field, quantum computing offers a real and unprecedented opportunity to tackle complex diagnostic challenges.&nbsp; For patients, the exploitation of quantum technologies could dramatically improve disease diagnosis and prognosis.</p>



<p class="is-style-indented"><strong>4. Therapy</strong></p>



<p class="is-style-indented">Quantum computers have the potential to vastly advance the field of personalized medicine.&nbsp; The exponential power of a quantum computer, compared to a classical computer, can be utilized to more efficiently and accurately calculate dosing regimens, improve treatment plans, and predict and improve individual patient outcomes.</p>



<p class="is-style-indented"><strong>5. Manufacturing and supply chain processes</strong></p>



<p class="is-style-indented">Quantum computing technologies have the potential to optimize both the manufacturing and distribution of products. Quantum computing&rsquo;s near real-time data processing capabilities can be used for more effective inventory management, demand forecasting and route planning, leading to a more robust supply chain, particularly in the context of the pharmaceutical cold chain.</p>



<p class="is-style-indented"><strong>6. Generation of synthetic data</strong></p>



<p class="is-style-indented">As the world becomes increasingly driven by data, the most valuable resource of the future will likely be data itself.&nbsp; One potential issue is that there will not be enough credible, high-quality data to meet demand. Quantum computing has the power to generate higher-quality, more accurate synthetic data that simulates real-world data, thereby working towards alleviating the data scarcity issue.&nbsp;</p>



<p>Quantum computing and emerging quantum technologies have the potential to reduce the costs associated with research and development, and thus to open up research in rare, complex, and underfunded disease areas. However, the benefits of quantum computing are not without risk. Most significantly for the life sciences sector, there is a concern that, in the future, quantum technologies may be able to solve the complex mathematical problems that underpin currently used cryptography methods, posing a threat to personal and sensitive patient data.</p>



<p>It remains to be seen how and when the field of quantum computing will develop, and how its potential impacts will be seen and felt. Quantum computing is still in the early stages of development and government policy, investment and regulation will likely play a crucial role in the growth of this technology in the EU, UK, U.S. and beyond.</p>



<p>Covington is monitoring developments globally in this fast-growing area.</p>



<p>Visit Covington&rsquo;s <a href="https://www.cov.com/en/topics/quantum-computing">Quantum Computing</a> web page for additional updates.&nbsp; Please reach out to a member of the team with any inquiries.</p>

]]></content:encoded>
					
		
		
		<source url='https://www.covingtondigitalhealth.com/'>Covington Digital Health</source>
<enclosure url='https://www.lexblog.com/wp-content/uploads/2025/06/Abstract-Connection-Concept_jpg-2.jpg' type='image/jpeg' length='53080' />	</item>
		<item>
		<title>Employee Privacy Notice: Why Your Business Can’t Afford to Wing It  </title>
		<link>https://www.lexblog.com/2025/06/10/employee-privacy-notice-why-your-business-cant-afford-to-wing-it/</link>
		
		<dc:creator><![CDATA[Jodi Daniels]]></dc:creator>
		<pubDate>Tue, 10 Jun 2025 05:00:00 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/10/employee-privacy-notice-why-your-business-cant-afford-to-wing-it/</guid>

					<description><![CDATA[The tech world is known for its jargon. &#8220;Garbage in, garbage out.&#8221; &#8220;Culture eats strategy for breakfast.&#8221; &#8220;Disrupt or be disrupted.&#8221;&#160; One that&#8217;s especially popular, though, is &#8220;build the plane while flying it.&#8221;&#160; This phrase evokes agility, responsiveness, and innovation&#8212;even if it doesn&#8217;t speak to a totally-thought-through roadmap. But the fact is, in plenty of...]]></description>
										<content:encoded><![CDATA[<p>The tech world is known for its jargon. &ldquo;Garbage in, garbage out.&rdquo; &ldquo;Culture eats strategy for breakfast.&rdquo; &ldquo;Disrupt or be disrupted.&rdquo;&nbsp;</p>
<p>One that&rsquo;s especially popular, though, is &ldquo;build the plane while flying it.&rdquo;&nbsp;</p>
<p>This phrase evokes agility, responsiveness, and innovation&mdash;even if it doesn&rsquo;t speak to a totally-thought-through roadmap. But the fact is, in plenty of contexts&mdash;iterative product launches, new market expansions&mdash;it&rsquo;s not a bad approach. Sometimes you have to get off the ground before adjusting midair.</p>
<p>But when it comes to <a href="https://redcloveradvisors.com/a-guide-to-employee-data-privacy/">employee privacy</a>? You need the plane fully built before takeoff.</p>
<p>Otherwise, you&rsquo;re not just looking at some harmless turbulence. You&rsquo;re facing regulatory headwinds, damaged employee trust, potential lawsuits, and a lot of metaphorical oxygen masks dropping from the ceiling.</p>
<h2 class="wp-block-heading">Why employee privacy matters more now than ever</h2>
<p>How companies collect, store, and manage employee data has changed&mdash;and so have the expectations around how that information should be handled.</p>
<p>Today, employee information lives across cloud-based HR platforms, payroll systems, benefits portals, <a href="https://redcloveradvisors.com/privacyoperations/thirdpartyriskmanagement/">third-party vendors</a>, collaboration apps, and <a href="https://redcloveradvisors.com/privacyoperations/aigovernance/">AI-driven business tools</a>. At the same time, remote work has made employee data more vulnerable.</p>
<p>Employees are also paying closer attention. They want to know how their personal information is used, who has access to it, and what rights they have to control it.</p>
<p>Regulators have responded to this shift. Laws like the <a href="https://redcloveradvisors.com/by-regulation/california-consumer-privacy-act-ccpa/">California Consumer Privacy Act (CCPA)</a> and the <a href="https://redcloveradvisors.com/gdpr-compliance/">European Union&rsquo;s General Data Protection Regulation (GDPR)</a> establish enforceable rights for employees over their data. Under these frameworks, companies are required to:</p>
<ul class="wp-block-list">
<li>Disclose what they collect and why&mdash;before collecting it</li>
<li>Provide employees access to their data, and allow them to correct, delete, or limit its use</li>
<li>Enforce data minimization and retention policies</li>
<li>Monitor third-party vendors that process employee information</li>
</ul>
<p>These regulations are broad in scope. GDPR, for example, applies to organizations that are processing data on those in the EU, even if the company itself is based in the US. (This goes for the California Consumer Privacy Act (CCPA) too&mdash;a global company based outside of California, for example, would be in scope for the law if it has employees in the state.) Remote work makes jurisdictional boundaries even harder to manage, adding risk for businesses that apply inconsistent standards across locations.</p>
<p>Just as employee privacy is spread across numerous systems, risk is spread across multiple spheres: regulatory, brand credibility, and talent retention.&nbsp;</p>
<h2 class="wp-block-heading">What a good employee privacy notice does</h2>
<p>At its core, a strong employee privacy notice accomplishes three things:</p>
<ol class="wp-block-list">
<li>It keeps you compliant with laws such as the CCPA and GDPR.</li>
<li>It explains what data is collected, used, stored, and shared, as well as the privacy rights employees hold over that data.</li>
<li>It makes your privacy posture visible to employees, so they don&rsquo;t have to wonder how their data is handled.</li>
</ol>
<p>But what goes into a &ldquo;good&rdquo; employee privacy notice? This will depend on your organization&rsquo;s needs, but at minimum, your notice should answer:</p>
<ul class="wp-block-list">
<li>What personal data do you collect?</li>
<li>Why are you collecting it?</li>
<li>Who can access it?</li>
<li>How long will you keep it?</li>
<li>What rights do employees have regarding accessing, correcting, or deleting their information?</li>
</ul>
<p>But the best notices go beyond listing facts. They show real thinking about how employee data moves through your systems&mdash;and how your company upholds employee rights in practice.</p>
<p>For example:</p>
<ul class="wp-block-list">
<li>If you use productivity monitoring software, the notice should spell out precisely what&rsquo;s tracked&mdash;keystrokes, browsing history, login times&mdash;and who can see that information. It should also explain how the data is protected, when it&rsquo;s deleted, and, if required, how employees&rsquo; written consent is obtained before monitoring begins.</li>
</ul>
<ul class="wp-block-list">
<li>If you collect health information for benefits programs, employees should know if that information stays internal, is shared with vendors, or is segregated from other HR files.</li>
<li>And if employees have the right to request data access, correction, or deletion (as they do under CCPA and GDPR), the notice must explain the process clearly, without legalese.</li>
</ul>
<h2 class="wp-block-heading">Bridging the gap between policy and practice</h2>
<p>Most companies have gaps between their employee privacy notice and their daily practices. The key is to identify where those breakdowns occur and address them systematically.</p>
<p>Here are some of the most common problems businesses are facing&mdash;and practical steps to close the gaps:</p>
<h3 class="wp-block-heading">Problem: employee data sprawl with no real oversight</h3>
<p>Employee data doesn&rsquo;t live in one clean database; it&rsquo;s scattered across HR platforms, payroll systems, benefits portals, and vendor tools. Without a unified view, it&rsquo;s almost impossible to protect employee data or respond promptly (and correctly) to a rights request.</p>
<p><strong>How to fix it: </strong>Conduct a complete employee <a href="https://redcloveradvisors.com/privacyoperations/datainventory/">data inventory</a> across all departments and vendors. Map what you collect, where it&rsquo;s stored, how it&rsquo;s used, who has access, and how long it&rsquo;s retained.&nbsp;</p>
<p>Start simple: spreadsheets are fine. For closer management, label sensitive fields like Social Security numbers, bank information, or health records. Update your inventory quarterly, or whenever a new system goes live or new data is collected or used in a different manner.&nbsp;</p>
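<p>The same spreadsheet-style inventory can also start life as a tiny structured record. The sketch below is hypothetical: the field names, labels, and example entry are illustrative, not a compliance standard.</p>

```python
from dataclasses import dataclass

# Illustrative labels for fields that warrant stricter handling.
SENSITIVE = {"ssn", "bank_account", "health_record"}

@dataclass
class InventoryEntry:
    system: str      # where the data lives
    fields: list     # what is collected
    purpose: str     # why it is collected
    access: list     # who can see it
    retention: str   # how long it is kept

    def sensitive_fields(self):
        """Flag collected fields that appear on the sensitive list."""
        return [f for f in self.fields if f in SENSITIVE]

payroll = InventoryEntry(
    system="payroll platform",
    fields=["name", "ssn", "bank_account"],
    purpose="salary payment",
    access=["HR", "finance"],
    retention="7 years after separation",
)
print(payroll.sensitive_fields())  # ['ssn', 'bank_account']
```

<p>Whether the inventory lives in a spreadsheet or a script, the point is the same: every field is tied to a system, a purpose, an access list, and a retention period.</p>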
<h3 class="wp-block-heading">Problem: over-collecting information without a clear purpose</h3>
<p>Organizations often default to &ldquo;gather it all&rdquo; thinking: extra fields on onboarding forms, surveys, and benefits enrollment portals that serve no immediate, critical purpose.&nbsp;</p>
<p>However, the more data you collect, the more regulatory and security exposure you create.</p>
<p><strong>How to fix it: </strong>Audit every point where employee data is collected. If you can&rsquo;t point to a legal requirement or a documented business purpose for a field, eliminate it. Focus especially on <a href="https://redcloveradvisors.com/sensitive-data-understanding-managing-and-complying-with-privacy-laws/">sensitive categories</a> where laws may impose stricter standards.&nbsp;</p>
<p>Moving forward, incorporate data minimization reviews into annual HR processes.</p>
<h3 class="wp-block-heading">Problem: shadow IT and unsanctioned tools</h3>
<p>When teams adopt business tools without going through a privacy review, employee information can end up in unsecured environments. This makes it impossible to enforce retention, access, or deletion rights.</p>
<p><strong>How to fix it: </strong>Formalize a lightweight software approval workflow. Require teams to submit a quick intake form describing any new tool that touches employee data. Privacy and IT teams can review access controls, vendor practices, and data sharing settings before approval.&nbsp;</p>
<p>Make sure employee training programs&mdash;especially those related to AI usage&mdash;explain why entering sensitive data into tools like chatbots is risky.</p>
<h3 class="wp-block-heading">Problem: no real system for handling employee rights requests</h3>
<p>Under laws like the CCPA and GDPR, employees have the legal right to access, correct, or delete their personal data. That means your company needs a defined way to intake, verify, and respond to those requests&mdash;whether you get five a year or fifty.</p>
<p>Manual processes aren&rsquo;t necessarily non-compliant. But they can create delays, confusion, or missed steps&mdash;especially if requests span multiple systems, involve sensitive data, or require input from more than one team.</p>
<p><strong>How to fix it:</strong> Establish a clear workflow for handling employee rights requests. Assign responsibility for verifying identity, coordinating across departments, and tracking deadlines. If requests are rare, this might be a well-documented manual process. If volume or complexity increases, consider adding software to log activity, route requests, and maintain an audit trail.</p>
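<p>To make that workflow concrete, here is a minimal Python sketch of a rights-request log with deadline tracking. The 45-day default mirrors the CCPA&rsquo;s general response window; the class and field names are illustrative, not a prescribed implementation:</p>

```python
# Hypothetical sketch of a rights-request intake log with deadline tracking.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class RightsRequest:
    request_type: str          # "access", "correction", or "deletion"
    received: date
    identity_verified: bool = False
    completed: bool = False

    def due_date(self, days: int = 45) -> date:
        # CCPA generally allows 45 days to respond; treated here as a
        # configurable default rather than a hard-coded legal rule.
        return self.received + timedelta(days=days)

    def overdue(self, today: date) -> bool:
        return not self.completed and today > self.due_date()

# Intake a deletion request and check its status against the clock.
req = RightsRequest("deletion", date(2025, 1, 10))
```

<p>Whether this lives in a spreadsheet, a ticketing system, or purpose-built software, the point is the same: every request gets a verified identity, an owner, and a deadline that someone is watching.</p>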
<h2 class="wp-block-heading">Let&rsquo;s get your employee privacy notice off the ground.</h2>
<p>At Red Clover Advisors, we help businesses turn privacy from a compliance headache into a real operational advantage. Whether you need a clear employee privacy notice, a working data inventory, a rights request process, or all of the above, we&rsquo;ll help you make privacy something you can be proud of, not something you scramble to fix later.</p>
<section class="cta-section has-background has-background-color has-transparent-background-color has-white-color default" data-id="block_b40b45cd0e824aa385725dd450be5cc0" data-aos="fade-in">
<div class="wrapper">
<div class="cta has-background-color has-background has-dark-red-background-color has-image jarallax">
<div class="grid align-center">
<div class="col-grid lg-col-6-12 offset-lg-col-1-12">
                <span class="preheader xs-mb-5 has-text-color has-pale-gray-color" data-aos="fade-up">Downloadable Resource</span>
<h3 class="xs-mb-16 has-text-color has-white-color" data-aos="fade-up">Privacy Notice Roadmap: Business Guide</h3>
<div class="has-text-color has-white-color has-white-link-color" data-aos="fade-up">
<p><img style=" max-width: 100%; height: auto; " loading="lazy" decoding="async" class="size-medium wp-image-38710 alignleft" src="https://www.lexblog.com/wp-content/uploads/2025/06/rca-guide-privacy-notice-2024-7-26-232x300-1.png" alt="" width="232" height="300"></p>
<p>Download our Privacy Notice Roadmap and take the guesswork out of creating clear, compliant, and consumer-friendly privacy notices.</p>
</div></div>
<div class="col-grid lg-col-2-12 offset-lg-col-2-12">
<div class="buttons" data-aos="fade-up">
                    <a href="https://redcloveradvisors.com/privacy-notice-roadmap-business-guide/" target="_self" class="button button-primary-reverse"><span>Learn More</span></a>                </div>
</div>
</div>
<p>            <img style=" max-width: 100%; height: auto; " decoding="async" src="https://www.lexblog.com/wp-content/uploads/2025/06/GettyImages-1319151669.jpg" alt=""></p></div>
</div>
</section>
<p>The post <a rel="nofollow" href="https://redcloveradvisors.com/employee-privacy-notice/">Employee Privacy Notice: Why Your Business Can&rsquo;t Afford to Wing It&nbsp;&nbsp;</a> appeared first on <a rel="nofollow" href="https://redcloveradvisors.com">Red Clover Advisors</a>.</p>
]]></content:encoded>
					
		
		
		<source url='https://redcloveradvisors.com/feed/'>Red Clover Advisors Blog</source>
	</item>
		<item>
		<title>One Month to Go: What You Need to Know about the U.S. Department of Justice’s Data Security Program</title>
		<link>https://www.lexblog.com/2025/06/09/one-month-to-go-what-you-need-to-know-about-the-u-s-department-of-justices-data-security-program/</link>
		
		<dc:creator><![CDATA[Jordan Jennings and Scot Ganow]]></dc:creator>
		<pubDate>Mon, 09 Jun 2025 21:08:12 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/09/one-month-to-go-what-you-need-to-know-about-the-u-s-department-of-justices-data-security-program/</guid>

					<description><![CDATA[Last year, we wrote about updates from the Department of Justice (DOJ) and the DOJ&#8217;s proposed enforcement efforts and regulations implementing Executive Order 14117 &#8220;Preventing Access to Americans&#8217; Bulk Sensitive Personal Data and United States Data by Countries of Concern&#8221; (Rule). A year later, the DOJ has finalized the Rule and developed guidance on what...]]></description>
										<content:encoded><![CDATA[
<figure style=" max-width: 100%; height: auto; " class="wp-block-image alignright size-large is-resized"><img decoding="async" src="https://www.lexblog.com/wp-content/uploads/2025/06/retro-black-alarm-clock-dissolving-into-little-particles-time-can-return-never-wait-anyone-time-management-concept-640x427-1.jpg" alt="" class="wp-image-2875" style=" max-width: 100%; height: auto; width:334px;height:auto"></figure>



<p>Last year, we wrote about <a href="https://www.privacyanddatasecurityinsight.com/2024/05/recent-executive-order-and-doj-rulemaking-prioritize-the-protection-of-sensitive-personal-data-from-countries-of-concern/">updates from the Department of Justice (DOJ)</a> and the DOJ&rsquo;s proposed enforcement efforts and regulations implementing Executive Order 14117 &ldquo;Preventing Access to Americans&rsquo; Bulk Sensitive Personal Data and United States Data by Countries of Concern&rdquo; (Rule). </p>



<p>A year later, the DOJ has finalized the Rule and developed guidance on what companies handling (i) bulk U.S. sensitive personal data or (ii) U.S. Government-related data must know, especially when interacting with persons and entities in &ldquo;Countries of Concern,&rdquo; which currently include:</p>



<ul class="wp-block-list">
<li>China (includes Hong Kong and Macau)</li>



<li>Cuba</li>



<li>Iran</li>



<li>North Korea</li>



<li>Russia</li>



<li>Venezuela</li>
</ul>



<p>In April of this year, the DOJ&rsquo;s National Security Division (NSD) issued its <a href="https://www.dwt.com/-/media/files/blogs/privacy-and-security-blog/2025/04/nsd-data-security-program--compliance-guide--04112.pdf?rev=ba573cfa002a495c8bde974c0c6ce46f&amp;hash=D95D1DC49D70768709D0F19609AB0E29">Data Security Program and corresponding Compliance Guide</a> (DSP) and <a href="https://www.justice.gov/opa/media/1396351/dl">Frequently Asked Questions</a> (FAQs) providing information that all U.S. entities must understand and follow to comply with the Rule. The NSD&rsquo;s stated primary mission with respect to the implementation and enforcement of the DSP is to protect U.S. national security from Countries of Concern that may seek to collect and weaponize both government data and Americans&rsquo; most sensitive personal data.</p>



<p>As we have written previously, the DSP will require U.S. organizations to look deeply into their data collection and data sharing practices to determine whether they are (i) providing covered data to a Country of Concern and (ii) subject to the DSP&rsquo;s requirements.<br><br><strong>All U.S. organizations handling government-related data and bulk U.S. sensitive personal data must make good-faith efforts to comply with the DSP by July 8, 2025</strong>. </p>



<span id="more-3227087"></span>



<p>Specifically, between now and then, those organizations should quickly consider doing the following:</p>



<ul class="wp-block-list">
<li><strong>Assess Applicability.</strong> U.S. organizations first need to understand if they process and share either &ldquo;government-related data&rdquo; or bulk transfers of sensitive personal information as part of &ldquo;covered transactions&rdquo; with &ldquo;covered persons&rdquo; in Countries of Concern.<br></li>



<li><strong>Evaluate Compliance Status and Address Any Gaps.</strong> Assuming the law applies to a U.S. organization&rsquo;s data processing operations, the organization should conduct necessary diligence and take steps to comply. The DOJ has stated that, while the Rule is currently in effect, it will commence enforcement on July 8, 2025.</li>
</ul>



<p>To assist in this endeavor, the following is a general overview of what U.S. organizations should understand and be working on as the applicable compliance deadlines approach. As with all Taft PDS posts, this is not intended to be an exhaustive summary or legal advice. Businesses considering compliance should consult qualified legal counsel.&nbsp;</p>



<p><strong><u>Part 1: The Basics</u></strong></p>



<p>A. <strong>What is the DSP&rsquo;s Purpose? </strong>In line with the DSP&rsquo;s stated purposes to protect the data of the United States and its citizens, the DSP establishes export controls that prevent foreign adversaries in Countries of Concern, and those subject to their control and direction, from accessing U.S. Government-related data and bulk U.S. sensitive personal data.</p>



<p>B. <strong>What is &ldquo;U.S. Government-Related Data&rdquo; and &ldquo;Bulk U.S. Sensitive Personal Data?&rdquo;</strong></p>






<ol style="list-style-type:lower-roman" class="wp-block-list">
<li><strong>U.S. Government-Related Data.</strong> There are two types of government-related data.
<ul class="wp-block-list">
<li>The first type is any precise geolocation data, regardless of volume, for any location within any area enumerated on the <a href="https://www.ecfr.gov/current/title-28/chapter-I/part-202">Government-Related Location Data List</a>. Examples of such locations include: (i) worksite or duty station of Federal Government employees or contractors who occupy a national security position; (ii) a military installation; or (iii) facilities or locations that otherwise support the Federal Government&#8217;s national security, defense, intelligence, law enforcement, or foreign policy missions.</li>



<li>The second type of government-related data is any sensitive personal data, regardless of volume, that a transacting party markets as linked or linkable to current or recent former employees or contractors, or former senior officials, of the United States Government, including the military and the intelligence community.<br><br>The terms &ldquo;recent former employees&rdquo; or &ldquo;recent former contractors&rdquo; mean employees or contractors who worked for or provided services to the United States Government, in a paid or unpaid status, within the past two years of a potential Covered Data Transaction (defined below) with a Country of Concern or Covered Person (defined below).<br></li>
</ul>
</li>



<li><strong>Bulk U.S. Sensitive Personal Data.</strong> The term &ldquo;bulk U.S. sensitive personal data&rdquo; means a collection or set of sensitive personal data relating to U.S. persons, in any format, <em><u>regardless of whether the data is anonymized, pseudonymized, de-identified, or encrypted</u></em>, (emphasis added), where such data meets or exceeds the applicable &ldquo;bulk&rdquo; threshold set forth below.
<ul class="wp-block-list">
<li><span>&ldquo;Sensitive personal data&rdquo; means covered personal identifiers, precise geolocation data, biometric identifiers, human &lsquo;omic data, personal health data, personal financial data, or any combination thereof.</span></li>



<li>&ldquo;Bulk&rdquo; means any sensitive personal data that meets or exceeds the following thresholds at any point in the preceding 12 months, whether through a single Covered Data Transaction or aggregated across Covered Data Transactions involving the same U.S. person and the same foreign person or Covered Person:
<ul class="wp-block-list">
<li>Human &lsquo;omic data collected about or maintained on more than <strong>1,000 </strong>U.S. persons, or, in the case of human genomic data, more than <strong>100</strong> U.S. persons;</li>



<li>Biometric identifiers collected about or maintained on more than <strong>1,000</strong> U.S. persons;</li>



<li>Precise geolocation data collected about or maintained on more than <strong>1,000 </strong>U.S. devices;</li>



<li>Personal health data collected about or maintained on more than <strong>10,000</strong> U.S. persons;</li>



<li>Personal financial data collected about or maintained on more than <strong>10,000 </strong>U.S. persons;</li>



<li>Covered personal identifiers collected about or maintained on more than <strong>100,000</strong> U.S. persons; or</li>



<li>Combined data, meaning any collection or set of data that contains more than one of the categories above, or that contains any listed identifier linked to categories in <a href="https://www.ecfr.gov/current/title-28/section-202.205#p-202.205(a)">paragraphs (a)</a> through <a href="https://www.ecfr.gov/current/title-28/section-202.205#p-202.205(e)">(e)</a> in &sect; 202.205, where any individual data type meets the threshold number of persons or devices collected or maintained in the aggregate for the lowest number of U.S. persons or U.S. devices in that category of data.</li>
</ul>
</li>
</ul>
</li>
</ol>
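<p>The bulk thresholds above lend themselves to a simple lookup table. The Python sketch below encodes them for a first-pass screen of an organization&rsquo;s data holdings. Note that it simplifies the combined-data rule (which applies the lowest applicable threshold to a combined set) and, like everything in this post, is illustrative only, not legal advice:</p>

```python
# The DSP "bulk" thresholds described above, encoded as a lookup table.
# Counts are distinct U.S. persons (or U.S. devices, for geolocation)
# over the preceding 12 months. Category keys are illustrative labels.
BULK_THRESHOLDS = {
    "human_genomic": 100,
    "human_omic": 1_000,
    "biometric_identifiers": 1_000,
    "precise_geolocation": 1_000,   # measured in U.S. devices
    "personal_health": 10_000,
    "personal_financial": 10_000,
    "covered_identifiers": 100_000,
}

def meets_bulk_threshold(counts: dict) -> bool:
    """First-pass screen: True if any single category meets or exceeds
    its threshold. (The Rule's combined-data provision, which applies the
    lowest applicable threshold to a combined set, is simplified here.)"""
    return any(
        counts.get(category, 0) >= limit
        for category, limit in BULK_THRESHOLDS.items()
    )
```

<p>For example, genomic data on 150 U.S. persons exceeds the 100-person threshold, while health data on 9,999 U.S. persons falls just under its 10,000-person threshold.</p>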



<p>C. <strong>Which Entities Must Abide by the DSP? </strong>The NSD expects all U.S. organizations to understand their transactions and comply with the DSP. Specifically, U.S. organizations should know:</p>



<ul class="wp-block-list">
<li>any Covered Person, including vendors, contractors, and employees in a Country of Concern, that the organization interacts with;</li>



<li>the kinds of data the organization collects or maintains on U.S. persons or U.S. devices;</li>



<li>the volume of data the organization collects or maintains on U.S. persons or U.S. devices (including whether this volume meets the &ldquo;bulk&rdquo; thresholds under the Rule);</li>



<li>how the organization uses the data;</li>



<li>whether the organization engages in Covered Data Transactions; and</li>



<li>how such data is marketed, particularly with respect to current or recent former employees or contractors, or former senior officials, of the United States government, including the military and intelligence community (e.g., if the organization collects certain data from members that are military/government employees).</li>
</ul>



<p>D. <strong>What is a Covered Data Transaction under the Rule and DSP? </strong>A Covered Data Transaction is a transaction that (1) involves any access by a Country of Concern or Covered Person to any government-related data or bulk U.S. sensitive personal data and (2) involves: (a) data brokerage; (b) a vendor agreement; (c) an employment agreement; or (d) an investment agreement. These transactions are further classified as prohibited and restricted transactions, with specific obligations and restrictions that attach to each class of transaction. The DSP does not address purely domestic data transactions between U.S. persons&mdash;such as the collection, maintenance, processing, or use of data by U.S. persons within the United States&mdash;unless such U.S. persons are designated as Covered Persons by the NSD.</p>

<p>E. <strong>Who are Covered Persons? </strong>There are five classes of Covered Persons under the Rule and DSP, which include:</p>



<ul class="wp-block-list">
<li>an entity owned by, controlled by, or subject to the jurisdiction or direction of a Country of Concern;</li>



<li>a foreign person who is an employee or contractor of such an entity;</li>



<li>a foreign person who is an employee or contractor of a Country of Concern;</li>



<li>a foreign person who is primarily resident in the territorial jurisdiction of a Country of Concern;</li>



<li>those persons NSD designates and publicly identifies (including both foreign and U.S. persons) as Covered Persons. The NSD will add designated persons to the Covered Persons List published in the Federal Register. Designated Covered Persons retain their Covered Persons status, even when located in the United States.</li>
</ul>



<p>F. <strong>Prohibitions, restrictions, and exemptions.</strong></p>



<ul class="wp-block-list">
<li><strong>Prohibitions. </strong>U.S. companies are prohibited from knowingly engaging in data brokerage transactions, the sharing of government-related data, or the bulk transfer of sensitive personal data with Covered Persons in Countries of Concern. U.S. companies must not knowingly engage in the sharing of Covered Data as part of Covered Transactions with Covered Persons in Countries of Concern after April 8, 2025.&nbsp; Likewise, companies should not direct other entities or foreign individuals to do so.&nbsp; This knowledge requirement is met if a U.S. organization has &ldquo;actual knowledge or reasonably should have known that the transaction involved access to covered data by a covered person.&rdquo;</li>



<li><strong>Restrictions.</strong> The Rule also places certain restrictions on other Covered Transactions involving vendor, employment, or investment agreements.&nbsp; Restricted transactions are allowed, provided the business or person satisfies the following requirements:
<ul class="wp-block-list">
<li>CISA safeguards. The organization must adhere to cybersecurity requirements issued by the Cybersecurity and Infrastructure Security Agency (CISA);</li>



<li>Compliance program. The organization must establish and maintain an individualized, risk-based and written data compliance program, which meets several minimum requirements.</li>



<li>Annual audits. The organization must conduct independent audits on an annual basis that address the requirements of the DSP.</li>

<li>Record requirements. The organization must also comply with applicable recordkeeping and reporting obligations.</li>
</ul>
</li>



<li><strong>Exemptions.</strong> The Rule, through the DSP, also provides the following exemptions from the Rule&rsquo;s requirements:
<ul class="wp-block-list">
<li>personal communications that do not involve the transfer of anything of value;</li>



<li>importation or exportation of any information or informational materials (which is limited to expressive material);</li>

<li>activities or transactions ordinarily incident to travel to or from any country and related transactions;</li>



<li>transactions conducted for official U.S. government business;</li>



<li>transactions ordinarily incident to and part of the provision of financial services described in the regulations;</li>



<li>corporate group transactions to the extent that they are ordinarily incident to and part of administrative or ancillary business operations (such as, among other things, payroll transactions or business taxes);</li>



<li>transactions required or authorized by federal law or international agreement, or necessary to comply with federal law;</li>

<li>investment agreements subject to a Committee on Foreign Investment in the United States (CFIUS) action defined under the regulations (DSP obligations apply until and unless CFIUS takes action);</li>



<li>transactions ordinarily incident to telecommunications services, including the provision of voice and data communications services, but not all internet-based services, like cloud computing (this exemption does not apply to transactions involving data brokerage); and</li>



<li>certain drug, biological product, and medical device authorizations, and other clinical investigations and post-marketing surveillance data.</li>
</ul>
</li>
</ul>
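<p>Putting the prohibited and restricted categories together, a first-pass triage of a proposed transaction might look like the Python sketch below. The categories and names are an illustrative simplification of the Rule&rsquo;s framework, not an authoritative encoding, and real determinations require counsel:</p>

```python
# Illustrative triage of a transaction under the framework above:
# data brokerage with a Covered Person is prohibited; vendor, employment,
# and investment agreements are restricted (permitted only with CISA
# safeguards, a compliance program, audits, and recordkeeping).
# This sketch is a simplification, not legal advice.
PROHIBITED_TYPES = {"data_brokerage"}
RESTRICTED_TYPES = {"vendor_agreement", "employment_agreement",
                    "investment_agreement"}

def classify_transaction(txn_type: str,
                         involves_covered_person: bool,
                         involves_covered_data: bool) -> str:
    if not (involves_covered_person and involves_covered_data):
        return "out_of_scope"   # not a Covered Data Transaction
    if txn_type in PROHIBITED_TYPES:
        return "prohibited"
    if txn_type in RESTRICTED_TYPES:
        return "restricted"
    return "review_required"    # exemptions and edge cases need counsel
```

<p>A screen like this can help route transactions for review, but it deliberately ignores the exemptions listed above, which turn on facts no simple lookup can capture.</p>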



<p><strong><u>Part 2. Implementation, Compliance and Enforcement</u></strong></p>



<p>A. <strong>What are the Penalties for Violating the DSP? </strong>The NSD, under the International Emergency Economic Powers Act (IEEPA) and the DSP, has the authority to bring both civil enforcement actions and criminal penalties for violations of the DSP&rsquo;s requirements. Unlawful acts under the IEEPA may lead to penalties up to the greater of $368,136 or twice the value of each violative transaction. Willful violations of IEEPA are punishable by imprisonment of up to 20 years and a $1,000,000 fine.</p>



<p>B.<strong> Relevant Compliance Timeline &amp; Grace Period</strong></p>



<p>The NSD&rsquo;s core prohibitions and restrictions on Covered Data Transactions took effect April 8, 2025. <em><u>The NSD stated a grace period is in effect and it will not prioritize its civil enforcement of the DSP until July 8, 2025</u></em>. However, the NSD will immediately pursue enforcement for willful and egregious violations of the DSP.</p>



<p>Additional DSP compliance requirements, including those related to due diligence, auditing, and reporting, take effect on October 6, 2025.</p>



<p>C.<strong> Business Considerations During the DSP Grace Period</strong></p>



<p>All U.S. organizations handling government-related data and bulk U.S. sensitive personal data must make good-faith efforts to comply with the DSP. Specifically, for the period between now and the end of the enforcement grace period on July 8, 2025, the NSD has identified initial compliance efforts for covered entities, including:</p>



<ul class="wp-block-list">
<li>conducting an internal review of access to sensitive personal data, including whether transactions involving access to such data flows constitute data brokerage;</li>



<li>reviewing internal datasets and datatypes to determine if they are potentially subject to the DSP;</li>



<li>renegotiating vendor agreements or negotiating contracts with new vendors;</li>



<li>transferring products and services to new vendors;</li>



<li>conducting due diligence on potential new vendors;</li>



<li>negotiating contractual onward transfer provisions with foreign persons who are the counterparties to data brokerage transactions;</li>



<li>adjusting employee work locations, roles, or responsibilities;</li>



<li>evaluating investments from Countries of Concern to Covered Persons;</li>



<li>renegotiating investment agreements with Countries of Concern or Covered Persons;</li>



<li>implementing the Cybersecurity and Infrastructure Security Agency (CISA) <a href="https://www.federalregister.gov/documents/2025/01/08/2024-31479/notice-of-availability-of-security-requirements-for-restricted-transactions-under-executive-order">Security Requirements</a>, including the combination of data-level requirements necessary to preclude Covered Person access to regulated data for restricted transactions.</li>
</ul>



<p>The robust initial steps prescribed by the NSD show that DSP compliance is not something companies should place on the back burner. While the steps above may seem daunting and time-consuming, the first step every U.S. organization should take is determining whether it interacts with Covered Persons or does business in a Country of Concern. From there, working with legal counsel to get up to speed on DSP compliance is imperative. The NSD also encourages the public to contact the Division at <a href="mailto:nsd.firs.datasecurity@usdoj.gov">nsd.firs.datasecurity@usdoj.gov</a> with questions or information about the DSP and the guidance NSD has released.</p>






<p><a href="https://www.taftlaw.com/services/practices/privacy-and-data-security/">Taft&rsquo;s Privacy &amp; Data Security team</a> will continue to monitor updates from the DOJ on the Rule, DSP compliance and DSP enforcement actions. For more data privacy &amp; security-related updates, please visit <a href="https://www.privacyanddatasecurityinsight.com/" target="_blank" rel="noreferrer noopener">Taft&rsquo;s Privacy &amp; Data Security Insights blog</a>.</p>

]]></content:encoded>
					
		
		
		<source url='https://www.privacyanddatasecurityinsight.com/'>Taft Privacy &amp; Data Security Insights</source>
	</item>
		<item>
		<title>DAA Launches AI-Focused Review of Interest-Based Advertising Self-Regulatory Principles</title>
		<link>https://www.lexblog.com/2025/06/09/daa-launches-ai-focused-review-of-interest-based-advertising-self-regulatory-principles/</link>
		
		<dc:creator><![CDATA[Mo Pham-Khan and Gregory P. Szewczyk]]></dc:creator>
		<pubDate>Mon, 09 Jun 2025 19:52:23 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<category><![CDATA[Technology]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/09/daa-launches-ai-focused-review-of-interest-based-advertising-self-regulatory-principles/</guid>

					<description><![CDATA[On June 4, 2025, the Digital Advertising Alliance (&#8220;DAA&#8221;), the self-regulatory body that sets and enforces privacy standards for digital advertising, announced it is launching a process to determine if it is necessary to issue new guidance to address how the DAA&#8217;s Self-Regulatory Principles apply to the use of artificial intelligence systems and tools that...]]></description>
										<content:encoded><![CDATA[
<p class="is-style-default">On June 4, 2025, the Digital Advertising Alliance (&ldquo;DAA&rdquo;), the self-regulatory body that sets and enforces privacy standards for digital advertising, announced it is launching a process to determine if it is necessary to issue new guidance to address how the <a href="https://digitaladvertisingalliance.org/principles">DAA&rsquo;s Self-Regulatory Principles</a> apply to the use of artificial intelligence systems and tools that leverage interest-based advertising (&ldquo;IBA&rdquo;) data.&nbsp;</p>



<p>The DAA intends to meet with relevant stakeholders, such as trade associations, advertisers, publishers, and ad tech companies, over the coming weeks to consider the following issues:</p>



<ul class="wp-block-list">
<li>the appropriate industry participants;</li>



<li>the current and anticipated use cases for IBA data by AI systems and tools;</li>



<li>consumer expectations around the collection and use of such data; and</li>



<li>the legal and regulatory gaps/overlaps with any such guidance.</li>
</ul>



<p>While it is too early to tell what specific guidance will entail, the CEO of the DAA stated in the DAA&rsquo;s announcement that the goal of the review is to &ldquo;look at the steps companies can take to ensure they are providing appropriate information and control to consumers around the collection and use of IBA data by those [artificial intelligence] systems.&rdquo;</p>

]]></content:encoded>
					
		
		
		<source url='https://www.cyberadviserblog.com/'>CyberAdviser</source>
<enclosure url='https://www.lexblog.com/wp-content/uploads/2025/06/online-advertising-6693945_960_720.png' type='image/png' length='36399' />	</item>
		<item>
		<title>White House Issues New Cybersecurity Executive Order</title>
		<link>https://www.lexblog.com/2025/06/09/white-house-issues-new-cybersecurity-executive-order-2/</link>
		
		<dc:creator><![CDATA[Ashden Fein, Susan B. Cassidy, Robert Huffman, Micaela McMurrough, Caleb Skeath, Joshua Williams, Ryan Burnette, Shayan Karbassi, Sierra Stubbs and Krissy Chapman]]></dc:creator>
		<pubDate>Mon, 09 Jun 2025 19:00:29 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/09/white-house-issues-new-cybersecurity-executive-order-2/</guid>

					<description><![CDATA[On June 6, 2025, President Trump issued an Executive Order (&#8220;Sustaining Select Efforts to Strengthen the Nation&#8217;s Cybersecurity and Amending Executive Order 13694 and Executive Order 14144&#8221;) (the &#8220;Order&#8221;) that modifies certain initiatives in prior Executive Orders issued by Presidents Obama and Biden and highlights key cybersecurity priorities for the current Administration. &#160;Specifically, the Order...]]></description>
										<content:encoded><![CDATA[
<p>On June 6, 2025, President Trump issued an <a href="https://www.whitehouse.gov/presidential-actions/2025/06/sustaining-select-efforts-to-strengthen-the-nations-cybersecurity-and-amending-executive-order-13694-and-executive-order-14144/">Executive Order (&ldquo;Sustaining Select Efforts to Strengthen the Nation&rsquo;s Cybersecurity and Amending Executive Order 13694 and Executive Order 14144&rdquo;</a>) (the &ldquo;Order&rdquo;) that modifies certain initiatives in prior Executive Orders issued by Presidents Obama and Biden and highlights key cybersecurity priorities for the current Administration. &nbsp;Specifically, the Order (i) directs that existing federal government regulations and policy be revised to focus on securing third-party software supply chains, quantum cryptography, artificial intelligence, and Internet of Things (&ldquo;IoT&rdquo;) devices and (ii) more expressly focuses cybersecurity-related sanctions authorities on &ldquo;foreign&rdquo; persons.&nbsp; Although the Order makes certain changes to prior cybersecurity related Executive Orders issued under previous administrations, it generally leaves the framework of those Executive Orders in place.&nbsp; Further, it does not appear to modify other cybersecurity Executive Orders.<a href="#_ftn1" id="_ftnref1">[1]</a>&nbsp; To that end, although the Order highlights some areas where the Trump administration has taken a different approach than prior administrations, it also signals a more general alignment between administrations on core cybersecurity principles.</p>



<span id="more-3227228"></span>



<p>The first section below provides a summary of revisions to existing federal government policy.&nbsp; The second section provides a chart of new directives to federal government departments and agencies.</p>



<p><strong>Amendments to Prior Orders</strong></p>



<p>The new Order seeks to amend existing federal government policies and regulations (as previously set by Executive Orders <a href="https://www.insidegovernmentcontracts.com/2025/03/january-and-february-2025-cybersecurity-developments-under-the-biden-and-trump-administrations/">14144</a> and <a href="https://www.cov.com/-/media/files/corporate/publications/2017/01/us_expands_sanctions_takes_other_steps_in_response_to_russias_election_related_cyber_operations.pdf">13694</a>) to: (i) remove certain requirements for secure software development attestations, directives tied to acceptance of digital identity documentation, and certain technical hardening measures for identity verification and email encryption, and (ii) more expressly focus cybersecurity-related sanctions authorities specifically to foreign (as opposed to any) cyber threat actors that target U.S. critical infrastructure.</p>



<ul class="wp-block-list">
<li><span style="text-decoration: underline">Secure Software Acquisition</span>: &nbsp;The Order removes certain requirements relating to secure software attestations that federal government contractors must submit to contracting agencies.&nbsp; This includes elimination of the requirement that attestations be in machine-readable format.&nbsp; It also includes elimination of the directive for centralized validation of software attestations by the Cybersecurity and Infrastructure Security Agency (&ldquo;CISA&rdquo;).&nbsp; Likewise, the associated directive to the Federal Acquisition Regulatory Council to amend the Federal Acquisition Regulation (&ldquo;FAR&rdquo;) to incorporate those requirements has been eliminated.&nbsp; The Fact Sheet accompanying the Order notes that one goal was to eliminate requirements &ldquo;imposing unproven and burdensome software accounting processes that prioritized compliance checklists over genuine security investments.&rdquo;&nbsp; However, the Order did not address the more general requirement for software attestations that appeared in the May 2021 Executive Order No. 14028, &ldquo;Improving the Nation&rsquo;s Cybersecurity,&rdquo; as implemented through Office of Management and Budget (&ldquo;OMB&rdquo;) Memoranda (<a href="https://www.whitehouse.gov/wp-content/uploads/2023/06/M-23-16-Update-to-M-22-18-Enhancing-Software-Security.pdf">M-23-16</a> and <a href="https://www.whitehouse.gov/wp-content/uploads/2022/09/M-22-18.pdf">M-22-18</a>) and the CISA Common Self-Attestation Form.&nbsp; Thus, it is unclear whether this Administration will promulgate regulations that would implement those requirements from the 2021 EO within the FAR, suspend any requirement for further attestations until NIST issues the final update of its Secure Software Development Framework required by the Order, eliminate the requirement for attestations altogether, or impose attestation requirements on a contract-by-contract basis and continue to maintain the CISA repository for those forms.</li>
</ul>



<ul class="wp-block-list">
<li><span style="text-decoration: underline">Solutions to Combat Cyber Crime and Fraud</span>:&nbsp; The Order removes prior directives for federal government agencies to accept digital identity documentation (e.g., digital driver&rsquo;s licenses) for public benefit programs.</li>
</ul>



<ul class="wp-block-list">
<li><span style="text-decoration: underline">Identity Technologies</span>: The Order removes prior requirements for the Federal Civilian Executive Branch (&ldquo;FCEB&rdquo;) to deploy commercial phishing-resistant standards such as &ldquo;WebAuthn.&rdquo;</li>
</ul>



<ul class="wp-block-list">
<li><span style="text-decoration: underline">Email Encryption</span>: The Order removes a directive to OMB to require the expanded use of authenticated transport-layer encryption (&ldquo;TLS&rdquo;) between email servers used by FCEB agencies to send and receive emails.</li>
</ul>



<ul class="wp-block-list">
<li><span style="text-decoration: underline">Quantum Computing</span>:&nbsp; The Order scales back quantum computing initiatives, included as part of <a href="https://bidenwhitehouse.archives.gov/briefing-room/statements-releases/2022/05/04/national-security-memorandum-on-promoting-united-states-leadership-in-quantum-computing-while-mitigating-risks-to-vulnerable-cryptographic-systems/">National Security Memorandum 10</a> (&ldquo;NSM-10&rdquo;) (&ldquo;On Promoting United States Leadership in Quantum Computing While Mitigating Risks to Vulnerable Cryptographic Systems,&rdquo; May 4, 2022) and implemented through OMB Memorandum <a href="https://www.whitehouse.gov/wp-content/uploads/2022/11/M-23-02-M-Memo-on-Migrating-to-Post-Quantum-Cryptography.pdf">M-23-02</a>, that required federal agencies to adopt post-quantum cryptography (&ldquo;PQC&rdquo;) as quickly as feasible, to encourage technology vendors to do the same, and to push for PQC&rsquo;s acceptance internationally.&nbsp; The Order retains only a requirement for CISA to maintain a list of product categories where PQC-enabled tools are widely available.</li>
</ul>



<ul class="wp-block-list">
<li><span style="text-decoration: underline">Artificial Intelligence (&ldquo;AI&rdquo;)</span>: The Order amends E.O. 14144&rsquo;s existing approach to security with and in AI as well as E.O. 14110 (&ldquo;Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence,&rdquo; October 30, 2023), which encouraged AI-driven collaboration across industry and had tasked federal agencies with aggressively exploring artificial intelligence for cybersecurity defense. &nbsp;The Order instead takes a more focused view, requiring agencies to make existing datasets for cyber defense research accessible to the academic community to the extent feasible and to incorporate AI software vulnerabilities and compromises into their existing processes for vulnerability management and disclosure.</li>
</ul>



<ul class="wp-block-list">
<li><span style="text-decoration: underline">Focus on &ldquo;Foreign&rdquo; Cyber Threat Actors</span>: The Order amends existing cybersecurity-related sanctions authorities for malicious actors engaged in cyber-enabled activities that pose a threat to U.S. national security, foreign policy, economic health, or financial stability, including those targeting U.S. critical infrastructure, to limit these authorities to <em>foreign</em> malicious actors, thereby more clearly excluding domestic individuals or activities from the scope of the authorities.&nbsp; The accompanying <a href="https://www.whitehouse.gov/fact-sheets/2025/06/fact-sheet-president-donald-j-trump-reprioritizes-cybersecurity-efforts-to-protect-america/">Fact Sheet</a> further explains that the focus on <em>foreign </em>malicious actors is to prevent &ldquo;misuse&rdquo; of the sanctions authorities &ldquo;against domestic political opponents,&rdquo; and clarifies that &ldquo;sanctions do not apply to election-related activities.&rdquo;&nbsp; The Order and the accompanying Fact Sheet do not provide any additional information about whether the amendments are intended to exempt foreign cyber operations directed at U.S. election activities, though the underlying sanctions authorities do still address malicious cyber-enabled activities that involve &ldquo;tampering with, altering, or causing a misappropriation of information with the purpose of or that involves interfering with or undermining election processes or institutions.&rdquo;</li>
</ul>



<p>The Federal Communications Commission&rsquo;s cybersecurity labeling program, Cybersecurity Labeling for Internet of Things (proposed rule, 47 CFR Part 8), remains in place. &nbsp;This program was modeled after the Energy Star efficiency label and will certify internet-connected consumer products, such as IoT devices, based on whether they meet certain cybersecurity criteria verified by accredited labs.</p>



<p><strong>Timeline of New Directives</strong></p>



<p>The below table outlines the directives to federal government departments and agencies, including: the Departments of Commerce, Defense, Energy, and Homeland Security as well as CISA, OMB, the National Institute of Standards and Technology (&ldquo;NIST&rdquo;), the Office of the Director of National Intelligence (&ldquo;ODNI&rdquo;), the National Security Agency (&ldquo;NSA&rdquo;), the National Science Foundation (&ldquo;NSF&rdquo;), the Office of Science and Technology Policy (&ldquo;OSTP&rdquo;), and the Office of the National Cyber Director (&ldquo;ONCD&rdquo;).<a id="_ftnref2" href="#_ftn2">[2]</a></p>



<figure style=" max-width: 100%; height: auto; " class="wp-block-image aligncenter size-full is-resized"><img decoding="async" src="https://www.lexblog.com/wp-content/uploads/2025/06/image-2.png" alt="" class="wp-image-15615" style=" max-width: 100%; height: auto; width:642px;height:auto"></figure>



<figure style=" max-width: 100%; height: auto; " class="wp-block-image aligncenter size-full"><img style=" max-width: 100%; height: auto; " decoding="async" src="https://www.lexblog.com/wp-content/uploads/2025/06/image-3.png" alt="" class="wp-image-15616"></figure>



<p class="has-text-align-center"><em>Table 1:&nbsp; Summary of directives to Departments and Agencies.</em></p>






<hr class="wp-block-separator has-alpha-channel-opacity">



<p><a href="#_ftnref1" id="_ftn1">[1]</a> Section 2 also provides, &ldquo;Except as specifically provided for in subsection 4(f) of [Executive Order <a href="https://www.insidegovernmentcontracts.com/2025/03/january-and-february-2025-cybersecurity-developments-under-the-biden-and-trump-administrations/">14144</a>], sections 1 through 7 of [Executive Order <a href="https://www.insidegovernmentcontracts.com/2025/03/january-and-february-2025-cybersecurity-developments-under-the-biden-and-trump-administrations/">14144</a>] shall not apply to Federal information systems that are NSS or are otherwise identified by the Department of Defense or the Intelligence Community as debilitating impact systems.&rdquo;</p>



<hr class="wp-block-separator has-alpha-channel-opacity">



<p><a id="_ftn2" href="#_ftnref2">[2]</a> For example, the Order does not rescind or modify Biden&rsquo;s Executive Order 14028 (&ldquo;Improving the Nation&rsquo;s Cybersecurity&rdquo;).&nbsp;</p>

]]></content:encoded>
					
		
		
		<source url='https://www.insideprivacy.com/'>Inside Privacy</source>
	</item>
		<item>
		<title>New Jersey SLAPPs Back: New Jersey Court of Appeals Eradicates Anti-SLAPP Loophole</title>
		<link>https://www.lexblog.com/2025/06/09/new-jersey-slapps-back-new-jersey-court-of-appeals-eradicates-anti-slapp-loophole/</link>
		
		<dc:creator><![CDATA[Hannah Yozzo and Blaine C. Kimrey]]></dc:creator>
		<pubDate>Mon, 09 Jun 2025 19:51:30 +0000</pubDate>
				<category><![CDATA[Communications, Media & Entertainment]]></category>
		<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/09/new-jersey-slapps-back-new-jersey-court-of-appeals-eradicates-anti-slapp-loophole/</guid>

					<description><![CDATA[On May 29, 2025, the New Jersey Court of Appeals reversed dismissal in Satz v. Starr, No. A-2785-23, 2025 WL 1522032 (N.J. Super. Ct. App. Div. May 29, 2025), holding that the plaintiff&#8217;s voluntary dismissal of his claims did not preclude the defendants from seeking counsel fees and costs under New Jersey&#8217;s anti-SLAPP law, the...]]></description>
										<content:encoded><![CDATA[
<p>On May 29, 2025, the New Jersey Court of Appeals reversed dismissal in <em>Satz v. Starr</em>, No. A-2785-23, 2025 WL 1522032 (N.J. Super. Ct. App. Div. May 29, 2025), holding that the plaintiff&rsquo;s voluntary dismissal of his claims did not preclude the defendants from seeking counsel fees and costs under New Jersey&rsquo;s anti-SLAPP law, the Uniform Public Expression Protection Act (UPEPA), N.J.S.A. 2A:53A-49. While binding precedent only in New Jersey, this decision can be looked to as persuasive authority in other UPEPA jurisdictions that a plaintiff cannot avoid liability under UPEPA simply by voluntarily dismissing the complaint.</p>



<span id="more-3226989"></span>



<p>The plaintiff, Allen Satz, brought suit against the <em>Jewish Link</em>, Organization For The Resolution Of Agunot, Keshet Starr, Beis Medrash, and Rabbi Neal Turk after they circulated a flyer in the <em>Jewish Link</em>, a Jewish newspaper, encouraging Satz to grant his wife a religious divorce. <em>Id.</em> at *1. Satz alleged there was an unflattering picture of him in the flyer and that the flyer called for a protest outside his parents&rsquo; home. <em>Id.</em></p>



<p>The defendants argued that Satz was targeting their First Amendment rights; therefore, the complaint was subject to dismissal under UPEPA. <em>Id.</em> The defendants&rsquo; reply brief included an order to show cause, seeking attorney&rsquo;s fees and costs under UPEPA. <em>Id.</em> Satz voluntarily dismissed his complaint. <em>Id.</em> The defendants asked the trial court to reopen the case for the limited purpose of ruling on their motion for relief and order to show cause under UPEPA. <em>Id.</em> at *2. The trial court denied their motion because they had not answered the complaint (even though they had had the opportunity to answer). <em>Id.</em></p>



<p>The defendants appealed this decision, arguing that their motion should have been granted because there is a strong public interest in ensuring that a plaintiff cannot simply dismiss its complaint to avoid liability under UPEPA. <em>Id.</em> The defendants asserted that by allowing the trial court&rsquo;s decision to stand, the Court of Appeals would create a loophole in UPEPA that would allow other litigants to bring similar suits to punish public participation. <em>Id.</em></p>



<p>In a decision delivered by Appellate Judge Hany Mawla, the New Jersey Court of Appeals reversed the trial court&rsquo;s decision and remanded the case, directing the trial court to hear the defendants&rsquo; order to show cause according to the UPEPA procedures. <em>Id.</em> at *5. The court held that allowing the trial court&rsquo;s ruling to stand would &ldquo;contravene legislative intent and create a loophole in the UPEPA allowing SLAPP plaintiffs to financially harm New Jersey residents who are the subject of their lawsuits and then strategically dismiss their suits.&rdquo; <em>Id.</em> The Court of Appeals concluded that the trial court misapplied New Jersey law and that the plain language of N.J.S.A. 2A:53A-51 does not require a defendant to have filed an answer before moving to dismiss a complaint. <em>Id.</em> at *4. The Court of Appeals also ruled, contrary to the trial court&rsquo;s reasoning that the defendants could file a separate suit for fees, that UPEPA contemplates applications for fees and costs to be heard on orders to show cause. <em>Id.</em> Since the Uniform Law Commission adopted UPEPA on July 15, 2020, 13 states have enacted some version of UPEPA as their anti-SLAPP law: Hawaii, Idaho, Iowa, Kentucky, Maine, Minnesota, Montana, New Jersey, Ohio, Oregon, Pennsylvania, Utah, and Washington.<a id="_ftnref1" href="#_ftn1">[1]</a> The <em>Satz</em> decision likely will be cited across the country as support for protecting the intent of UPEPA and shielding defendants from groundless lawsuits brought to curtail free speech. As more states adopt UPEPA each year, the decision&rsquo;s importance will only grow.</p>



<p><a id="_ftn1" href="#_ftnref1">[1]</a> <em>Public Expression Protection Act</em>, Uniform Law Commission, last visited June 4, 2025, <a href="https://www.uniformlaws.org/committees/community-home?communitykey=4f486460-199c-49d7-9fac-05570be1e7b1">https://www.uniformlaws.org/committees/community-home?communitykey=4f486460-199c-49d7-9fac-05570be1e7b1</a>.</p>

]]></content:encoded>
					
		
		
		<source url='https://www.mediaandprivacyriskreport.com/'>Media &amp; Privacy Risk Report</source>
<enclosure url='https://www.lexblog.com/wp-content/uploads/2025/06/158540094.jpg' type='image/jpeg' length='210352' />	</item>
		<item>
		<title>CISA Releases AI Data Security Guidance</title>
		<link>https://www.lexblog.com/2025/06/09/cisa-releases-ai-data-security-guidance-2/</link>
		
		<dc:creator><![CDATA[Susan B. Cassidy, Ashden Fein, Caleb Skeath, Micaela McMurrough, Robert Huffman, Moriah Daugherty, Ryan Burnette, Bolatito Adetula and Grace Howard]]></dc:creator>
		<pubDate>Mon, 09 Jun 2025 18:22:51 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<category><![CDATA[AI]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/09/cisa-releases-ai-data-security-guidance-2/</guid>

					<description><![CDATA[On May 22, 2025, the Cybersecurity and Infrastructure Security Agency (&#8220;CISA&#8221;), which sits within the Department of Homeland Security (&#8220;DHS&#8221;) released guidance for AI system operators regarding managing data security risks.&#160; The associated press release explains that the guidance provides &#8220;best practices for system operators to mitigate cyber risks through the artificial intelligence lifecycle, including...]]></description>
										<content:encoded><![CDATA[
<p>On May 22, 2025, the Cybersecurity and Infrastructure Security Agency (&ldquo;CISA&rdquo;), which sits within the Department of Homeland Security (&ldquo;DHS&rdquo;), <a href="https://media.defense.gov/2025/May/22/2003720601/-1/-1/0/CSI_AI_DATA_SECURITY.PDF">released guidance</a> for AI system operators regarding managing data security risks.&nbsp; The associated <a href="https://www.cisa.gov/news-events/alerts/2025/05/22/new-best-practices-guide-securing-ai-data-released">press release</a> explains that the guidance provides &ldquo;best practices for system operators to mitigate cyber risks through the artificial intelligence lifecycle, including consideration on securing the data supply chain and protecting data against unauthorized modification by threat actors.&rdquo;&nbsp; CISA published the guidance in conjunction with the National Security Agency, the Federal Bureau of Investigation, and cyber agencies from Australia, the United Kingdom, and New Zealand.&nbsp; The guidance is intended for organizations using AI systems in their operations, including the Defense Industrial Base, National Security Systems owners, federal agencies, and Critical Infrastructure owners and operators. It builds on the <a href="https://www.cisa.gov/news-events/alerts/2024/04/15/joint-guidance-deploying-ai-systems-securely">Joint Guidance on Deploying AI Systems Securely</a> released by CISA and several other U.S. and foreign agencies in April 2024.</p>



<span id="more-3226882"></span>



<p>The guidance&rsquo;s stated goals include raising awareness of the potential data security risks of AI systems, providing best practices for securing AI, and establishing a strong foundation for data security in AI systems. &nbsp;The first part of the guidance outlines a set of cybersecurity best practices for AI systems, after which the guidance provides additional detail on three separate risk categories for AI systems (data supply chain risks, maliciously modified data, and data drift) and describes mitigation recommendations for each risk category.</p>



<p>The guidance outlines ten cybersecurity best practices that are specific to AI systems, and refers to <a href="https://csrc.nist.gov/pubs/sp/800/53/r5/upd1/final">NIST SP 800-53</a>, &ldquo;Security and Privacy Controls for Information Systems and Organizations,&rdquo; for additional details on general cybersecurity best practices (though it does not specify any particular applicable baseline).&nbsp; Several of the best practices, such as &ldquo;source reliable data and track data provenance&rdquo; and &ldquo;verify and maintain data integrity during storage and transport,&rdquo; align with the data supply chain risks discussed in greater detail later in the guidance.&nbsp; Many of the other best practices build on security practices described in NIST SP 800-53 and other common security frameworks, such as classifying data, leveraging access controls and trusted infrastructure, encrypting data, and storing and deleting data securely.&nbsp; The guidance&rsquo;s best practices also reference leveraging privacy-preserving techniques, such as data depersonalization or differential privacy, and conducting ongoing data security risk assessments.</p>



<p>In the section of the guidance devoted to data supply chain risks, the guidance discusses general risks and identifies three specific risks.&nbsp; The general risks section warns that &ldquo;one cannot simply assume that [web-scale] datasets are clean, accurate, and free of malicious content.&rdquo;&nbsp; The guidance offers several mitigation strategies, including dataset verification, using content credentials to track the provenance of data, requesting assurances when relying on a foundation model trained by another party, requiring certification from dataset providers, and securely storing data after ingest.</p>



<p>In addition to this general risk, the guidance identifies &ldquo;curated web-scale datasets&rdquo; as the first of three specific data supply chain risks.&nbsp; The guidance notes that curated AI datasets are vulnerable to a technique known as &ldquo;split-view poisoning,&rdquo; which can arise when someone purchases an expired domain and manipulates the data.&nbsp; The second risk is &ldquo;collected web-scale datasets,&rdquo; which are vulnerable to &ldquo;frontrunning poisoning techniques.&rdquo;&nbsp; This occurs when malicious examples are injected just before crowd-sourced content is collected from a website.&nbsp; The third risk is &ldquo;web-crawled datasets,&rdquo; which is described as an inherently risky type of dataset because it is less curated.&nbsp; The guidance provides a variety of mitigation strategies, ranging from broad recommendations, such as dataset verification to detect abnormalities, to more specific ones, such as using raw data hashes with hash verification.</p>



<p>Next, the guidance identifies risks and mitigation strategies for maliciously modified data, explaining that &ldquo;deliberate manipulation of data can result in inaccurate outcomes, poor decisions, and compromised security.&rdquo;&nbsp; The risks include adversarial machine learning threats, bad data statements, statistical bias, data poisoning from inaccurate information, and data duplications.&nbsp; The guidance proposes various mitigation strategies to address these risks.&nbsp; For example, it recommends sanitizing the training data to reduce the impact of outliers and poisoned inputs.&nbsp; Similarly, it suggests that metadata validation may be helpful to check the completeness and consistency of metadata before it is used for AI training.</p>



<p>Finally, the guidance describes risks associated with data drift.&nbsp; The guidance explains that data drift occurs naturally over time as the statistical properties of input data become different from those of the original data used to train the model.&nbsp; The guidance suggests that data drift can be mitigated by &ldquo;incorporating application-specific data management protocols,&rdquo; including continuous monitoring, retraining a model with new data, and data cleansing.&nbsp; Many of the mitigation strategies posed in the earlier sections are good practices that can be applied here as well. Overall, the guidance notes that &ldquo;organizations can fortify their AI systems against potential threats and safeguard sensitive, proprietary, and mission critical data used in the development and operation of their AI systems&rdquo; by identifying risks and adopting best practices.&nbsp; The guidance serves as a reminder to organizations of the importance of data security to maintaining the accuracy, reliability, and integrity of AI, and of the unique cybersecurity risks that apply to these types of systems.</p>

]]></content:encoded>
					
		
		
		<source url='https://www.insideprivacy.com/'>Inside Privacy</source>
	</item>
		<item>
		<title>What’s New in Digital Asset Policy?</title>
		<link>https://www.lexblog.com/2025/06/09/whats-new-in-digital-asset-policy/</link>
		
		<dc:creator><![CDATA[Mimi Bair and David Hirsch]]></dc:creator>
		<pubDate>Mon, 09 Jun 2025 18:14:38 +0000</pubDate>
				<category><![CDATA[Privacy & Data Security]]></category>
		<guid isPermaLink="false">https://www.lexblog.com/2025/06/09/whats-new-in-digital-asset-policy/</guid>

					<description><![CDATA[On May 20, 2025, the Senate cleared procedural obstacles to consider the GENIUS Act on the Senate floor. Originally introduced on Feb. 4, by Senator Bill Hagerty, R-TN, along with Senate Banking Committee Chairman Tim Scott, R-SC, Kirsten Gillibrand, D-NY, and Cynthia Lummis, R-WY, the Guiding and Establishing National Innovation for U.S. Stablecoins of 2025...]]></description>
										<content:encoded><![CDATA[
<p>On May 20, 2025, the Senate cleared procedural obstacles to consider the GENIUS Act on the Senate floor. Originally introduced on Feb. 4 by Senator Bill Hagerty, R-TN, along with Senate Banking Committee Chairman Tim Scott, R-SC, and Senators Kirsten Gillibrand, D-NY, and Cynthia Lummis, R-WY, the Guiding and Establishing National Innovation for U.S. Stablecoins (GENIUS) Act of 2025 would define and regulate payment stablecoins.&nbsp;Payment stablecoins are digital assets designed to maintain a stable value relative to another asset. More than 99% of stablecoins tie their value to the U.S. dollar.</p>



<div class="wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex">
<div class="wp-block-button is-style-outline is-style-outline--1"><a class="wp-block-button__link wp-element-button" href="https://mwcllc.com/2025/06/05/whats-new-in-digital-asset-policy/" target="_blank" rel="noreferrer noopener">Read more</a></div>
</div>

]]></content:encoded>
					
		
		
		<source url='https://www.passwordprotectedlaw.com/'>Password Protected</source>
	</item>
	</channel>
</rss>
