<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Data, Privacy &amp; Cybersecurity Insights</title>
	<atom:link href="https://www.goodwinprivacyblog.com/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.goodwinprivacyblog.com/</link>
	<description></description>
	<lastBuildDate>Fri, 19 Sep 2025 15:10:31 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	

<image>
	<url>https://www.goodwinprivacyblog.com/wp-content/uploads/sites/12/2021/08/cropped-cropped-cropped-goodwin_logo-01-32x32.png</url>
	<title>Data, Privacy &amp; Cybersecurity Insights</title>
	<link>https://www.goodwinprivacyblog.com/</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Multistate Privacy Enforcement Sweep Puts Global Privacy Control in the Spotlight</title>
		<link>https://www.goodwinprivacyblog.com/2025/09/17/multistate-privacy-enforcement-sweep-puts-global-privacy-control-in-the-spotlight/</link>
		
		<dc:creator><![CDATA[Omer Tene|Jacqueline Klosek|Gabe Maldoff]]></dc:creator>
		<pubDate>Wed, 17 Sep 2025 16:12:26 +0000</pubDate>
				<category><![CDATA[Cybersecurity Preparedness & Response]]></category>
		<guid isPermaLink="false">https://www.goodwinprivacyblog.com/?p=728</guid>

					<description><![CDATA[<p>Recent enforcement actions and announcements from the California Privacy Protection Agency (CPPA) and state attorneys general (AGs) in California, Colorado and Connecticut, and a California bill that passed the state legislature, signal a new phase of heightened enforcement, focused on honoring consumers’ opt-out requests, including through cookie banners and the...</p>
<p><a class="more-link" href="https://www.goodwinprivacyblog.com/2025/09/17/multistate-privacy-enforcement-sweep-puts-global-privacy-control-in-the-spotlight/#more-728">Read More</a></p>
<p>The post <a href="https://www.goodwinprivacyblog.com/2025/09/17/multistate-privacy-enforcement-sweep-puts-global-privacy-control-in-the-spotlight/">Multistate Privacy Enforcement Sweep Puts Global Privacy Control in the Spotlight</a> appeared first on <a href="https://www.goodwinprivacyblog.com">Data, Privacy &amp; Cybersecurity Insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Recent enforcement actions and announcements from the California Privacy Protection Agency (CPPA) and state attorneys general (AGs) in California, Colorado and Connecticut, and a California bill that passed the state legislature, signal a new phase of heightened enforcement, focused on honoring consumers’ opt-out requests, including through cookie banners and the Global Privacy Control (GPC).</p>
<p>Two critical developments stand out from our review of these actions and announcements: a crackdown on non-functional opt-out tools, highlighted by a multi-million-dollar settlement and a multistate investigative sweep; and a renewed emphasis on mandatory risk assessments for data “selling” and “sharing,” including for commonplace online advertising practices.</p>
<p>To read the full alert, <a href="https://www.goodwinlaw.com/en/insights/publications/2025/09/alerts-technology-dpc-multistate-privacy-enforcement-sweep">click here</a>.</p>
<p>The post <a href="https://www.goodwinprivacyblog.com/2025/09/17/multistate-privacy-enforcement-sweep-puts-global-privacy-control-in-the-spotlight/">Multistate Privacy Enforcement Sweep Puts Global Privacy Control in the Spotlight</a> appeared first on <a href="https://www.goodwinprivacyblog.com">Data, Privacy &amp; Cybersecurity Insights</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Colorado Proposes Children’s Privacy Amendments to Privacy Act Regulations</title>
		<link>https://www.goodwinprivacyblog.com/2025/08/08/colorado-proposes-childrens-privacy-amendments-to-privacy-act-regulations/</link>
		
		<dc:creator><![CDATA[Jacqueline Klosek|Gabe Maldoff|Jacob Lee]]></dc:creator>
		<pubDate>Fri, 08 Aug 2025 13:43:34 +0000</pubDate>
				<category><![CDATA[Privacy Compliance]]></category>
		<guid isPermaLink="false">https://www.goodwinprivacyblog.com/?p=718</guid>

					<description><![CDATA[<p>What started as a flurry when California included protections for data about known teens in its 2018 privacy law soon became a blizzard. State after state passed new protections for teens into their own privacy laws, with each version raising the standards above the previous ones. Now, even in the depths of...</p>
<p><a class="more-link" href="https://www.goodwinprivacyblog.com/2025/08/08/colorado-proposes-childrens-privacy-amendments-to-privacy-act-regulations/#more-718">Read More</a></p>
<p>The post <a href="https://www.goodwinprivacyblog.com/2025/08/08/colorado-proposes-childrens-privacy-amendments-to-privacy-act-regulations/">Colorado Proposes Children’s Privacy Amendments to Privacy Act Regulations</a> appeared first on <a href="https://www.goodwinprivacyblog.com">Data, Privacy &amp; Cybersecurity Insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>What started as a flurry when California included protections for data about <em>known</em> teens in its 2018 privacy law soon became a blizzard. State after state passed new protections for teens into their own privacy laws, with each version raising the standards above the previous ones.</p>
<p>Now, even in the depths of summer, an avalanche is forming in Colorado’s mountain passes, as amendments to the Colorado Privacy Act (CPA) are set to come into force on October 1, 2025.</p>
<p>Draft rules implementing the CPA amendments (the <a href="https://coag.gov/app/uploads/2025/07/CPA2025ProposedRuleAmendments-1.pdf" target="_blank" rel="noopener noreferrer">Proposed Rules</a>) issued July 29, 2025, by the Colorado Department of Law (DOL) aim to extend heightened protections not only to data about <em>known</em> minors under 18 but also <em>any</em> personal information collected on websites or services <em>directed</em> to such minors, even when a business does not know the ages of its users.</p>
<p>Among other things, this will require controllers operating services intended for minor audiences to seek <em>opt-in consent</em> before engaging in many standard business practices, such as targeted advertising, profiling, and extended data retention. Controllers will also need to conduct data protection assessments, implement technical safeguards related to geolocation data and communication tools, and avoid design features intended to increase engagement or addiction by minors.</p>
<p>The DOL has opened the Proposed Rules to public comment from July 29 to September 10, 2025. A public hearing on September 10 will follow the public comment period. The Proposed Rules do not include an effective date, but it is likely that the DOL set the rulemaking timeline with the goal of finalizing the Proposed Rules by October 1, 2025, when the CPA amendments are scheduled to go into effect.</p>
<p>To read the full alert, <a href="https://www.goodwinlaw.com/en/insights/publications/2025/08/alerts-practices-dpc-colorado-proposes-childrens-privacy-amendments?utm_source=alrts&amp;utm_medium=email&amp;utm_campaign=PRA&amp;utm_term=DPC&amp;utm_content=20250807_alrts_pra_dpc_Colorado%20Proposes%20Childrens%20Privacy%20Amendments%20to%20Privacy%20Act%20Regulation#msdynmkt_trackingcontext=423bc40d-2a2d-4621-a7d6-249d353b0200">click here</a>.</p>
<p>The post <a href="https://www.goodwinprivacyblog.com/2025/08/08/colorado-proposes-childrens-privacy-amendments-to-privacy-act-regulations/">Colorado Proposes Children’s Privacy Amendments to Privacy Act Regulations</a> appeared first on <a href="https://www.goodwinprivacyblog.com">Data, Privacy &amp; Cybersecurity Insights</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>California’s New Privacy and Cybersecurity Regulations on Risk Assessments, Automated Decision making and Cybersecurity Audits: What Businesses Need to Know</title>
		<link>https://www.goodwinprivacyblog.com/2025/08/01/californias-new-privacy-and-cybersecurity-regulations-on-risk-assessments-automated-decision-making-and-cybersecurity-audits-what-businesses-need-to-know/</link>
		
		<dc:creator><![CDATA[Omer Tene|Jacqueline Klosek|Peter Marta|Jud Welle|Kaitlin Betancourt|Gabe Maldoff|Reema Moussa]]></dc:creator>
		<pubDate>Fri, 01 Aug 2025 20:48:57 +0000</pubDate>
				<category><![CDATA[Cybersecurity Preparedness & Response]]></category>
		<category><![CDATA[Privacy Compliance]]></category>
		<guid isPermaLink="false">https://www.goodwinprivacyblog.com/?p=713</guid>

					<description><![CDATA[<p>During a Board Meeting on July 24, 2025, the California Privacy Protection Agency (CPPA) unanimously approved the long-awaited final text of its second rulemaking package, implementing a broad swath of new requirements regarding risk assessments, automated decisionmaking technology (ADMT), and cybersecurity audits. The regulations, under the California Consumer Privacy Act (CCPA), also...</p>
<p><a class="more-link" href="https://www.goodwinprivacyblog.com/2025/08/01/californias-new-privacy-and-cybersecurity-regulations-on-risk-assessments-automated-decision-making-and-cybersecurity-audits-what-businesses-need-to-know/#more-713">Read More</a></p>
<p>The post <a href="https://www.goodwinprivacyblog.com/2025/08/01/californias-new-privacy-and-cybersecurity-regulations-on-risk-assessments-automated-decision-making-and-cybersecurity-audits-what-businesses-need-to-know/">California’s New Privacy and Cybersecurity Regulations on Risk Assessments, Automated Decision making and Cybersecurity Audits: What Businesses Need to Know</a> appeared first on <a href="https://www.goodwinprivacyblog.com">Data, Privacy &amp; Cybersecurity Insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>During a Board Meeting on July 24, 2025, the California Privacy Protection Agency (CPPA) unanimously approved the long-awaited <a href="https://cppa.ca.gov/regulations/pdf/ccpa_updates_cyber_risk_admt_mod_txt_pro_reg.pdf" target="_blank" rel="noopener noreferrer">final text</a> of its second rulemaking package, implementing a broad swath of new requirements regarding risk assessments, automated decisionmaking technology (ADMT), and cybersecurity audits. The regulations, under the California Consumer Privacy Act (CCPA), also amended various provisions of the initial CCPA regulations. While not using – and, in fact, removing from previous drafts – the words “artificial intelligence,” the regulations very much impact AI, through risk assessment and ADMT rules, and require companies to enhance their data privacy and cybersecurity programs, including undergoing an annual evidence-based cybersecurity audit.</p>
<p>To read the full alert, <a href="https://www.goodwinlaw.com/en/insights/publications/2025/07/alerts-practices-dpc-californias-new-privacy-and-cybersecurity-regulations">click here</a>.</p>
<p>The post <a href="https://www.goodwinprivacyblog.com/2025/08/01/californias-new-privacy-and-cybersecurity-regulations-on-risk-assessments-automated-decision-making-and-cybersecurity-audits-what-businesses-need-to-know/">California’s New Privacy and Cybersecurity Regulations on Risk Assessments, Automated Decision making and Cybersecurity Audits: What Businesses Need to Know</a> appeared first on <a href="https://www.goodwinprivacyblog.com">Data, Privacy &amp; Cybersecurity Insights</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>EU Supervisory Authorities Approve Irish Data Protection Commission’s Decision on TikTok’s International Data Flows</title>
		<link>https://www.goodwinprivacyblog.com/2025/04/02/eu-supervisory-authorities-approve-irish-data-protection-commissions-decision-on-tiktoks-international-data-flows/</link>
		
		<dc:creator><![CDATA[Gretchen Scott|Curtis McCluskey|Elliot Luke]]></dc:creator>
		<pubDate>Wed, 02 Apr 2025 12:13:18 +0000</pubDate>
				<category><![CDATA[Data Privacy in Transactions and Agreements]]></category>
		<category><![CDATA[Data Protection]]></category>
		<category><![CDATA[Data Transfers]]></category>
		<category><![CDATA[GDPR]]></category>
		<guid isPermaLink="false">https://www.goodwinprivacyblog.com/?p=703</guid>

					<description><![CDATA[<p>On 25 March 2025, the Irish Data Protection Commission (‘DPC’) confirmed that it received no objections to its draft decision on how TikTok Technology Limited (‘TikTok’) transfers personal data to China. The DPC, in its role as the lead supervisory authority for the Irish-headquartered TikTok, opened two ex officio inquiries...</p>
<p><a class="more-link" href="https://www.goodwinprivacyblog.com/2025/04/02/eu-supervisory-authorities-approve-irish-data-protection-commissions-decision-on-tiktoks-international-data-flows/#more-703">Read More</a></p>
<p>The post <a href="https://www.goodwinprivacyblog.com/2025/04/02/eu-supervisory-authorities-approve-irish-data-protection-commissions-decision-on-tiktoks-international-data-flows/">EU Supervisory Authorities Approve Irish Data Protection Commission’s Decision on TikTok’s International Data Flows</a> appeared first on <a href="https://www.goodwinprivacyblog.com">Data, Privacy &amp; Cybersecurity Insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>On 25 March 2025, the Irish Data Protection Commission (‘<strong>DPC</strong>’) confirmed that it received no objections to its draft decision on how TikTok Technology Limited (‘<strong>TikTok</strong>’) transfers personal data to China.</p>
<p>The DPC, in its role as the lead supervisory authority for the Irish-headquartered TikTok, opened two ex officio inquiries into TikTok’s GDPR compliance in September 2021 in relation to:</p>
<ol>
<li>the processing of children’s personal data; and</li>
<li>the data the platform sends to China.</li>
</ol>
<p>The first of these resulted in the DPC imposing a <strong>EUR 345 million fine</strong> on TikTok in 2023 for insufficient user age-verification processes during account creation, as well as for the company’s ‘public-by-default’ setting for children’s accounts, under which a video posted from a public account was published to ‘Everyone’ by default.</p>
<p>Following the second inquiry, the DPC submitted its draft decision regarding TikTok’s international data transfers to China on 21 February 2025 to the other concerned supervisory authorities across the European Economic Area, which were able to raise objections under Article 60 GDPR. The DPC received no objections, indicating alignment from the other regulators on the DPC’s proposed findings. As such, the draft decision will become final, with a formal decision expected to be announced in the coming weeks.</p>
<p>The announcement reiterates the consensus amongst European supervisory authorities that non-compliant international data transfers – to territories such as China – are a top priority, particularly following recent decisions and investigations opened by the Italian, Dutch and German regulators over the last month in relation to DeepSeek.</p>
<p>We await the text of the DPC’s final decision. Once released, we will provide an update on the DPC’s findings and any resulting enforcement actions.</p>
<p><em>At Goodwin, we are dedicated to helping companies navigate the complexities of their data protection requirements, in the EU, UK and globally. We have experts who understand the challenges posed by these laws. Goodwin provides tailored support to help businesses anticipate and meet their obligations.</em></p>
<p>The post <a href="https://www.goodwinprivacyblog.com/2025/04/02/eu-supervisory-authorities-approve-irish-data-protection-commissions-decision-on-tiktoks-international-data-flows/">EU Supervisory Authorities Approve Irish Data Protection Commission’s Decision on TikTok’s International Data Flows</a> appeared first on <a href="https://www.goodwinprivacyblog.com">Data, Privacy &amp; Cybersecurity Insights</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>California Privacy Agency Signals Stronger CCPA Enforcement in Settlement with Honda</title>
		<link>https://www.goodwinprivacyblog.com/2025/03/21/california-privacy-agency-signals-stronger-ccpa-enforcement-in-settlement-with-honda/</link>
		
		<dc:creator><![CDATA[Federica De Santis|Omer Tene]]></dc:creator>
		<pubDate>Fri, 21 Mar 2025 22:13:15 +0000</pubDate>
				<category><![CDATA[Data Privacy in Transactions and Agreements]]></category>
		<category><![CDATA[Litigation & Enforcement]]></category>
		<category><![CDATA[Privacy Compliance]]></category>
		<category><![CDATA[AdTech]]></category>
		<category><![CDATA[Data Privacy]]></category>
		<category><![CDATA[Data Protection]]></category>
		<category><![CDATA[Personal Data]]></category>
		<category><![CDATA[US Data Privacy Laws]]></category>
		<guid isPermaLink="false">https://www.goodwinprivacyblog.com/?p=699</guid>

					<description><![CDATA[<p>On March 7, 2025, the California Privacy Protection Agency (Agency) reached a settlement with American Honda Motor Co. (Honda) resolving allegations that the company violated the California Consumer Privacy Act (CCPA). The order required Honda to pay a $632,500 fine and implement changes to its data privacy practices. The Agency...</p>
<p><a class="more-link" href="https://www.goodwinprivacyblog.com/2025/03/21/california-privacy-agency-signals-stronger-ccpa-enforcement-in-settlement-with-honda/#more-699">Read More</a></p>
<p>The post <a href="https://www.goodwinprivacyblog.com/2025/03/21/california-privacy-agency-signals-stronger-ccpa-enforcement-in-settlement-with-honda/">California Privacy Agency Signals Stronger CCPA Enforcement in Settlement with Honda</a> appeared first on <a href="https://www.goodwinprivacyblog.com">Data, Privacy &amp; Cybersecurity Insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>On March 7, 2025, the California Privacy Protection Agency (Agency) reached a <a href="https://cppa.ca.gov/regulations/pdf/20250307_hmc_order.pdf">settlement</a> with American Honda Motor Co. (Honda) resolving allegations that the company violated the California Consumer Privacy Act (CCPA). The order required Honda to pay a $632,500 fine and implement changes to its data privacy practices.</p>
<p>The Agency alleged that Honda improperly required consumers to verify their identity, or an agent’s authorization to act on their behalf, in order to submit opt-out requests. While the CCPA allows businesses to verify the identity of consumers or the authorization of agents for the exercise of other rights, such as access or deletion, it does not permit verification for opt-out requests. Moreover, the Agency alleged that Honda’s cookie management tool made it more difficult for consumers to opt out of tracking than to opt in. Given the latest wave of class action litigation and claims around cookies and pixels, businesses should pay close attention to the Agency’s analysis of Honda’s consent modal.</p>
<p>This was the first enforcement action brought under the CCPA by the Agency, the first government body in the U.S. dedicated to privacy enforcement; until now, the law had been enforced only by the California Attorney General. The order stems from an <a href="https://cppa.ca.gov/announcements/2023/20230731.html">ongoing investigation</a> by the Agency into data privacy practices in the connected car industry, signaling increased regulatory scrutiny for automakers and other technology-driven businesses. It comes on the heels of a <a href="https://www.texasattorneygeneral.gov/news/releases/attorney-general-ken-paxton-sues-general-motors-unlawfully-collecting-drivers-private-data-and">privacy enforcement action</a> by the Texas Attorney General against General Motors.</p>
<p><strong>Key Takeaways</strong></p>
<ul>
<li>The settlement highlights the importance of ensuring that consumer-facing privacy interfaces comply with regulatory requirements and provide a user-friendly experience. Website forms that collect personal information, opt-out mechanisms, and consent management tools must be clear, accessible, and designed to facilitate—rather than hinder—consumer rights requests. Regulators actively monitor these interfaces through audits, automated compliance tools, and consumer complaints, making public-facing compliance a critical area of focus.</li>
<li>Companies, particularly those handling large volumes of consumer data or using ad tech tools, should regularly evaluate and refine their privacy touchpoints to ensure they align with CCPA requirements and prioritize ease of use for consumers. In particular, businesses should:
<ul>
<li>Ensure consumer data request workflows collect only necessary information and apply identity verification only where necessary and permitted.</li>
<li>Review consent management tools (e.g., cookie banners) for fairness and symmetry—ensuring consumers can opt out as easily as they can opt in.</li>
<li>Verify that contracts with all third-party data recipients contain CCPA-required terms and can be produced on demand.</li>
<li>Regularly train relevant staff on CCPA compliance.</li>
</ul>
</li>
</ul>
<p>Below, we summarize the key findings of the Agency’s order.</p>
<p><strong>Findings and Violations</strong></p>
<p>The Agency <a href="https://cppa.ca.gov/regulations/pdf/20250307_hmc_order.pdf">alleged</a> four violations of the CCPA:</p>
<ol>
<li><strong>Excessive Information Requests for Data Subject Rights</strong>.</li>
</ol>
<p>The Agency alleged that Honda required consumers to provide more information than necessary when exercising their privacy rights.</p>
<p>Under the CCPA:</p>
<ul>
<li>Requests to access, delete, or correct personal information require identity verification (verifiable consumer requests).</li>
<li>Requests to opt out of the sale or sharing of personal data or limit the use of sensitive personal information do not require verification.</li>
</ul>
<p>However, Honda applied the same verification process to all requests, requiring consumers to fill in eight separate data fields (including full name, address, email, and phone number) in its online form. The Agency argued that this practice created an unnecessary barrier for consumers attempting to opt out or limit the use of their data.</p>
<p>The Agency also noted that Honda typically needs only two data points to identify a consumer in its database. Hence, Honda’s verification process demanded more information than necessary even for requests that did require identity verification.</p>
<p>The focus on data minimization in this enforcement action aligns with the Agency’s <a href="https://cppa.ca.gov/pdf/enfadvisory202401.pdf">enforcement advisory </a>on data minimization issued in April 2024, which emphasized that businesses should only collect the minimum data necessary to process consumer requests and should not impose unnecessary burdens that could deter consumers from exercising their rights.</p>
<ol start="2">
<li><strong>Improper Authorized Agent Verification</strong>.</li>
</ol>
<p>Under the CCPA, consumers can designate an authorized agent to request an opt-out of sale/sharing or to limit the use and disclosure of sensitive personal information on their behalf.</p>
<p>While businesses may require proof of authorization (such as a signed permission document) for an agent’s submission of these requests, they may not require the consumer to directly confirm the authorization with the business. Direct confirmation is only permitted for access, correction, or deletion requests, not for opt-out or requests to limit.</p>
<p>The Agency alleged that Honda failed to distinguish between these types of requests and unlawfully required consumers to personally confirm that they had authorized an agent to submit opt-out requests or requests to limit on their behalf. This added an unnecessary step, making it harder for consumers to exercise their rights.</p>
<ol start="3">
<li><strong>Asymmetry in Opt-In and Opt-Out Choices</strong>.</li>
</ol>
<p>Honda provides consumers the ability to submit requests to opt out of sale/sharing for cross-context behavioral advertising through a third-party cookie management tool.</p>
<p>The Agency alleged that Honda’s privacy settings in its cookie management tool made it harder to opt out of data sharing than to opt in. In Honda’s cookie management tool, a consumer had to take two steps to opt out of the sale or sharing of personal information (e.g., toggling off a setting <em>and</em> clicking “Confirm My Choices”) whereas opting back in required just a single click (via an “Accept All” button). The Agency found that this discrepancy violated the CCPA’s requirement for symmetry in choice, which mandates that privacy-protective options be as easy to exercise as less protective ones. According to the Agency, Honda should have provided a “Reject All” option alongside “Accept All” to ensure an equal or symmetrical choice.</p>
<ol start="4">
<li><strong>Failure to Establish Contracts with Ad Tech Partners</strong>.</li>
</ol>
<p>The Agency found that Honda shared consumers’ personal information with advertising technology vendors without the required contractual safeguards. The CCPA requires businesses to execute contracts with third-party data recipients that restrict use of personal data to specified purposes and require compliance with the CCPA. Honda was unable to produce contracts that included the mandatory CCPA provisions for its ad tech vendors.</p>
<p><strong>Fine and Corrective Actions</strong></p>
<p>Honda was ordered to pay a $632,500 administrative fine, with $382,500 specifically tied to 153 consumers whose rights were affected by the company’s practices. Under the CCPA, the Agency can impose fines of up to $2,500 per violation—or $7,500 per intentional violation—with adjustments for inflation. This case highlights that fines are calculated on a <em>per-violation</em> basis, meaning even routine compliance missteps can quickly escalate into substantial penalties when they impact multiple consumers.</p>
<p>In addition to the fine, the company must implement a series of remediation steps, including:</p>
<ul>
<li>Establishing separate methods for submitting verifiable requests (requests to access, correct, or delete data, which require identity verification) versus non-verifiable requests (requests to opt out of sale/sharing and limit the use and disclosure of sensitive personal information).</li>
<li>Changing its authorized agent process so that agents can submit opt-out or limit requests without requiring direct consumer confirmation.</li>
<li>Within its cookie management platform, including a “Reject All” button to provide symmetry in choice with the “Allow All” button.</li>
<li>Engaging a UX designer to assess Honda’s methods for submitting CCPA requests and recommend improvements.</li>
<li>Providing updated training to all personnel who handle CCPA requests.</li>
<li>Ensuring contracts with all third-party data recipients contain CCPA-compliant terms.</li>
</ul>
<p><strong>Takeaway for Businesses</strong></p>
<p>The Agency is ramping up enforcement of consumer data rights and data minimization practices. Even standard compliance efforts can draw penalties if they are implemented in a way that frustrates consumer choice. The Honda case serves as a clear warning that regulators will not tolerate unnecessary hurdles—whether excessive form fields, multi-step opt-outs, or improper verification requirements.</p>
<p>The post <a href="https://www.goodwinprivacyblog.com/2025/03/21/california-privacy-agency-signals-stronger-ccpa-enforcement-in-settlement-with-honda/">California Privacy Agency Signals Stronger CCPA Enforcement in Settlement with Honda</a> appeared first on <a href="https://www.goodwinprivacyblog.com">Data, Privacy &amp; Cybersecurity Insights</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Trump 2.0 Tech Policy Rundown: Breakneck Pace Continues</title>
		<link>https://www.goodwinprivacyblog.com/2025/02/14/trump-2-0-tech-policy-rundown-breakneck-pace-continues/</link>
		
		<dc:creator><![CDATA[Omer Tene|Reema Moussa]]></dc:creator>
		<pubDate>Fri, 14 Feb 2025 21:13:37 +0000</pubDate>
				<category><![CDATA[Cybersecurity Preparedness & Response]]></category>
		<category><![CDATA[Privacy Compliance]]></category>
		<guid isPermaLink="false">https://www.goodwinprivacyblog.com/?p=696</guid>

					<description><![CDATA[<p>The Trump Administration has not slowed its rollout of sweeping technology policy changes with potentially significant impacts to be felt throughout the country and around the globe. Personnel changes and public announcements of new priorities are the throughline of new actions crossing various sectors and agencies at the...</p>
<p><a class="more-link" href="https://www.goodwinprivacyblog.com/2025/02/14/trump-2-0-tech-policy-rundown-breakneck-pace-continues/#more-696">Read More</a></p>
<p>The post <a href="https://www.goodwinprivacyblog.com/2025/02/14/trump-2-0-tech-policy-rundown-breakneck-pace-continues/">Trump 2.0 Tech Policy Rundown: Breakneck Pace Continues</a> appeared first on <a href="https://www.goodwinprivacyblog.com">Data, Privacy &amp; Cybersecurity Insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>The Trump Administration has not slowed its rollout of sweeping technology policy changes with potentially significant impacts to be felt throughout the country and around the globe. Personnel changes and public announcements of new priorities are the throughline of new actions across various sectors and agencies at the federal level, notably at the Federal Trade Commission (FTC), the Consumer Financial Protection Bureau (CFPB), the Privacy and Civil Liberties Oversight Board (PCLOB), and more.</p>
<p>To read the full rundown, <a href="https://www.goodwinlaw.com/en/insights/publications/2025/02/alerts-technology-dpc-trump-20-tech-policy-rundown-breakneck">click here.</a></p>
<p>The post <a href="https://www.goodwinprivacyblog.com/2025/02/14/trump-2-0-tech-policy-rundown-breakneck-pace-continues/">Trump 2.0 Tech Policy Rundown: Breakneck Pace Continues</a> appeared first on <a href="https://www.goodwinprivacyblog.com">Data, Privacy &amp; Cybersecurity Insights</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>California Forges a New Path on Automated Decision-Making Technology, Risk Assessments, and Cybersecurity Audits</title>
		<link>https://www.goodwinprivacyblog.com/2025/02/05/california-forges-a-new-path-on-automated-decision-making-technology-risk-assessments-and-cybersecurity-audits/</link>
		
		<dc:creator><![CDATA[Omer Tene|Jud Welle|Reema Moussa|Karl Dragosz|Victoria F. Volpe|Gabe Maldoff]]></dc:creator>
		<pubDate>Wed, 05 Feb 2025 18:58:31 +0000</pubDate>
				<category><![CDATA[Cybersecurity Preparedness & Response]]></category>
		<category><![CDATA[Privacy Compliance]]></category>
		<category><![CDATA[AdTech]]></category>
		<category><![CDATA[Advertising]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Automated Decisionmaking]]></category>
		<category><![CDATA[CCPA / CPRA]]></category>
		<category><![CDATA[Cybersecurity]]></category>
		<category><![CDATA[Data Privacy]]></category>
		<category><![CDATA[US Data Privacy Laws]]></category>
		<guid isPermaLink="false">https://www.goodwinprivacyblog.com/?p=682</guid>

					<description><![CDATA[<p>Introduction As the United States transitions to a new administration, federal policymaking is beginning to shift away from civil rights and other Biden-era AI governance priorities and towards AI policies focused on “out-innovating the rest of the world,” securing US technological advantage, and national security, defense, and cybersecurity. In the...</p>
<p><a class="more-link" href="https://www.goodwinprivacyblog.com/2025/02/05/california-forges-a-new-path-on-automated-decision-making-technology-risk-assessments-and-cybersecurity-audits/#more-682">Read More</a></p>
<p>The post <a href="https://www.goodwinprivacyblog.com/2025/02/05/california-forges-a-new-path-on-automated-decision-making-technology-risk-assessments-and-cybersecurity-audits/">California Forges a New Path on Automated Decision-Making Technology, Risk Assessments, and Cybersecurity Audits</a> appeared first on <a href="https://www.goodwinprivacyblog.com">Data, Privacy &amp; Cybersecurity Insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><strong>Introduction</strong></p>
<p>As the United States <a href="https://www.goodwinlaw.com/en/flex-pages/new-administration-hub">transitions to a new administration</a>, federal policymaking is beginning to <a href="https://apnews.com/article/trump-ai-repeal-biden-executive-order-artificial-intelligence-18cb6e4ffd1ca87151d48c3a0e1ad7c1">shift</a> away from civil rights and other Biden-era AI governance priorities and towards AI policies focused on <a href="https://apnews.com/article/trump-ai-artificial-intelligence-executive-order-eef1e5b9bec861eaf9b36217d547929c">“out-innovating the rest of the world,”</a> securing US technological advantage, and national security, defense, and cybersecurity. In the meantime, states will play a critical role in technology policy, as they continue to innovate on privacy, AI, and cyber policymaking.</p>
<p>Against the backdrop of the raft of state-level AI, privacy, and cyber laws passed in recent years – including Colorado’s sweeping AI Act – and the many more proposals percolating within state legislatures as the 2025 legislative session opens, the California Privacy Protection Agency (CPPA) has proposed new regulations that would, if adopted, establish AI, privacy, and cybersecurity norms likely to ripple across the US market.</p>
<p>Specifically, on November 8, 2024, the CPPA advanced proposed regulations implementing the following provisions of the California Consumer Privacy Act (CCPA):</p>
<ul>
<li><span class="sd-light"><strong>Risk Assessments.</strong> “Businesses” regulated by the CCPA would need to conduct and document risk assessments for a variety of enumerated activities, including “selling” personal information, “sharing” personal information for “cross-context behavioral advertising,” processing “sensitive personal information” (such as government IDs, financial information, precise geolocation, biometric data, health and genetic information, and information relating to race, ethnicity, and sexual orientation), using “automated decision-making technologies” (ADMT) which replace or substantially facilitate human decision making, engaging in “extensive profiling,” and training AI models. AI is defined within the proposed regulations as a “machine-based system that infers, from the input it receives, how to generate outputs [including predictions, content, recommendations, or decisions] that can influence physical or virtual environments” and may operate with “varying levels of autonomy” to achieve “explicit or implicit objectives.” The regulations provide detailed guidance on the content of these risk assessments and prohibit businesses from proceeding with an activity when its risks outweigh the benefits.</span></li>
<li><span class="sd-light"><strong>ADMT Notices, Opt-Outs, and Explanations – Including a New Right to Opt Out of First-Party Advertising.</strong> The proposed regulations require businesses to provide consumers with “pre-use notices” before using ADMT and permit consumers to challenge decisions made using ADMT. Businesses would also need to provide consumers with explanations of the logic employed by any ADMT systems, among other requirements. Importantly, the CPPA’s proposed regulations would extend these ADMT requirements to AI-<em>assisted</em> decisions – not just “solely” automated decisions as in other privacy laws, such as the EU’s General Data Protection Regulation (GDPR). The proposed regulations also seek to create an opt-out of first-party advertising that was not previously regulated by the CCPA.</span></li>
<li><span class="sd-light"><strong>Cybersecurity Audits.</strong> The proposed regulations would require businesses meeting certain thresholds relating to the volume and sensitivity of personal information they process to conduct annual “cybersecurity audits” meeting specified <em>content, scope,</em> and <em>methodology</em> requirements. Compliance with the audit requirements will need to be certified annually by a member of the board or a high-ranking executive. The proposed regulations will indirectly compel a wide range of businesses not currently subject to prescriptive cybersecurity requirements to improve their security controls and overall cyber program.</span></li>
<li><span class="sd-light"><strong>CCPA Application to Insurers. </strong>The proposed regulations would apply the CCPA to state-regulated insurance companies, which currently fall outside the scope of the CCPA for any activities regulated by the Gramm-Leach-Bliley Act (GLBA).</span></li>
</ul>
<p>The CPPA held a hearing on January 14, 2025, to provide an opportunity for public comment on the proposed rules, with an additional hearing and extended deadline for public comment on February 19. If the agency votes to proceed with rulemaking following public comment, the agency could move to finalize the regulations as soon as April 1, 2025. Any substantial changes would require additional public consultation.</p>
<p>Below are the key takeaways for businesses regulated by the CCPA.</p>
<p><strong>Risk Assessments </strong></p>
<p>The draft regulations require businesses to conduct risk assessments for data processing that presents “significant risk” to consumers. Risk assessments would need to identify the benefits and risks of the proposed processing activities, as well as the safeguards employed to address such risks.</p>
<p>While risk assessment requirements exist in other privacy laws, including the GDPR (in the form of a “data protection impact assessment”) and other US state laws, the proposed regulations would impose several novel requirements, including:</p>
<ol>
<li><strong>Extension to new types of processing activities.</strong> The proposed regulations require businesses that engage in certain specified activities to conduct risk assessments. Such specified activities include “selling” personal information, “targeted advertising,” use of ADMT to make certain types of “significant decisions” (such as determining compensation, hiring, allocation of work, promotions, admission into an academic program, etc.), and processing “sensitive personal information.” All of these activities would also require risk assessments under several other state consumer privacy laws. In addition to these activities, the proposed regulations require risk assessments for activities that are not subject to risk assessment requirements under other state laws, including:
<ul>
<li>“<strong><em>Extensive profiling</em></strong>,” defined as systematic observation of job applicants, employees, or students, systematic observation of a publicly accessible place, or profiling a consumer for behavioral advertising; or</li>
<li>Using personal information to<strong><em> “train” certain types of AI/ADMT models. </em></strong>“Training” is defined as the process through which ADMT or AI “discovers underlying patterns, learns a series of actions, or is taught to generate a desired output.” This requirement applies only to AI and ADMT models with the following characteristics:
<ul>
<li> Generative models, such as large language models (LLMs); or</li>
<li>Models that are capable of being used for: “profiling” consumers (i.e., to analyze interests, preferences, reliability, behavior, location, health, performance at work, etc.); making significant decisions (i.e., decisions affecting financial or lending services, housing, insurance, education enrollment, employment opportunities, healthcare, or essential goods or services, etc.); establishing individual identity, or for physical or biological identification (i.e., information that depicts or describes a consumer’s physical or biological characteristics, or measurements of or relating to their body, such as biometric information, vocal intonation, facial expression, and gestures); or generating a deepfake (i.e., manipulated or synthetic audio, image or video content that is falsely represented as a truthful depiction of a consumer).</li>
</ul>
</li>
</ul>
</li>
<li> <strong>Specific, additional requirements for AI and ADMT applications. </strong>The proposed regulations would require businesses that use AI or ADMT to document the “completeness, representativeness, timeliness, validity, accuracy, consistency, and reliability” of personal information used in connection with AI and ADMT, as well as the “logic of the [ADMT], including any assumptions or limitations in the logic.” Service providers that make AI or ADMT available to their business customers would be required to provide such customers with the facts necessary to permit them to conduct their own risk assessment.</li>
<li><strong>Substantive restrictions on activities that involve disproportionate risks to consumers.</strong> The proposed regulations prohibit businesses from engaging in activities in which the risk to consumers outweighs the intended benefits to consumers, the business, or other third parties. However, the proposed regulations do not specify how businesses should weigh the relative risks and benefits of an activity, particularly where such risks and benefits accrue to different parties, or whether a business’s analysis is entitled to deference from regulators. If the proposed regulations take effect without material changes, enforcement actions and judicial decisions will be critical to understanding where businesses will need to draw the line regarding such risks and benefits.</li>
<li><strong>Privilege and confidentiality considerations.</strong> The proposed regulations require businesses to disclose their risk assessment in its entirety to the CPPA within 10 days of a request by the agency. Businesses will need to carefully consider how to protect any confidential and/or privileged information that forms the basis of a business’s assessment.</li>
</ol>
<p>The proposed regulations allow a business to repurpose a CCPA risk assessment for compliance with other applicable laws and for other “compatible” processing activities. Given that other US state risk assessment requirements are less prescriptive, the CPPA’s proposed approach is likely to become the default standard for US risk assessments.</p>
<p><strong>ADMT</strong></p>
<p>Drawing from frameworks on both sides of the Atlantic, the CPPA’s proposed ADMT regulations bridge the gap between ADMT requirements that have been a longstanding feature of privacy laws and the newer generation of AI-specific laws, such as the EU AI Act and the Colorado AI Act.</p>
<p>Key features of the proposed framework include:</p>
<ol>
<li><strong>Expansion of ADMT to AI-<em>assisted </em>decisions, not just decisions that are “solely” automated. </strong>The proposed regulations define ADMT as “any technology that processes personal information and uses computation to execute a decision, replace <em>or substantially facilitate human decision-making</em>” (emphasis added). The extension of ADMT requirements to tools that facilitate human decisions could bring into scope a wide range of technologies that do not make solely automated decisions, such as analytic and diagnostic tools designed to inform human judgment, rather than replace it.</li>
<li><strong>Application of ADMT requirements to AI training, “public profiling,” and “extensive profiling” even in the absence of a significant decision.</strong> Previous ADMT frameworks have focused on decisions with substantial impact for consumers, such as “legal” or “significant” decisions under GDPR, “eligibility” decisions under the US Fair Credit Reporting Act (FCRA), and “consequential” decisions under the Colorado AI Act. The proposed regulations would extend further, affecting technologies that “profile” consumers, including for advertising, employment or educational purposes, and that use a consumer’s personal information to train an ADMT system <em>capable</em> of making significant decisions, profiling consumers, establishing identity, or generating deepfakes, even if the technologies have not been used to make any significant decisions involving the consumer.</li>
<li><strong>Separate “pre-use notices” must describe any proposed use of ADMT. </strong>The proposed regulations would require businesses to provide consumers with a “pre-use notice” – before employing ADMT – that explains key features of the ADMT system, including its “logic.” Pre-use notices will require additional specificity around data uses that may exceed businesses’ current disclosure practices. For example, describing processing activities “in generic terms, such as ‘to improve our services’” will not be sufficient.</li>
<li><strong>Consumers can opt out of ADMT, with limited exceptions.</strong> The proposed regulations permit consumers to opt out of ADMT, except for certain security, fraud prevention, and safety purposes. In addition, in the employment and educational contexts, businesses can give consumers the option to have a human review the decision, rather than allowing them to opt out altogether. Businesses cannot ask consumers for proof of identity for opt-out requests, unless the business has a good faith, reasonable, and documented belief that a request is fraudulent.</li>
<li><strong>A new right to opt out of first-party advertising. </strong>By extending ADMT requirements to “behavioral advertising,” which the CPPA defines to include targeted advertising based on a consumer’s activity within the business’s own “<em>distinctly-branded websites, applications or services</em>” – in addition to “cross-context behavioral advertising” already regulated by the CCPA – the proposed regulations create an opt-out of first-party advertising that was not previously regulated by the CCPA. This new requirement generated significant feedback during the January 14 hearing, with commenters noting the risk of the CPPA exceeding its statutory authority.</li>
<li><strong>“Access” and “explanation” requirements will create challenges for LLMs and black-box systems. </strong>The proposed regulations grant consumers a broad right of access that includes “plain language explanations” of the ADMT’s outputs, how the business plans to use the outputs, and “how the [ADMT] worked with respect to the consumer.” Explaining the logic of ADMT systems, not only in general terms but also as applied to any particular consumer, is likely to present significant challenges for businesses that employ sophisticated AI tools.</li>
<li><strong>Significant overlap with other federally-regulated sectors.</strong> Many of the examples of ADMT in the proposed regulations overlap with activities that are regulated by federal frameworks and exempted from the CCPA. For example, the proposed regulations define ADMT to include housing, employment, and financial eligibility decisions that often rely on “consumer reports” regulated by FCRA. While the CCPA expressly exempts from its scope personal information regulated by such frameworks, the inclusion of these examples within the proposed regulations raises questions about how the CPPA will interpret the breadth of its authority.</li>
</ol>
<p><strong>“Cybersecurity Audits”</strong></p>
<p>The draft regulations require covered businesses to conduct annual, <em>independent</em> cybersecurity audits that assess and document how the business’s cybersecurity program protects personal information from unauthorized access, destruction, use, modification, or disclosure; and protects against unauthorized activity resulting in the loss of availability of personal information.</p>
<p>Businesses whose processing of consumers’ personal information presents a “significant risk to consumers’ security” would be required to comply with the cybersecurity audit requirements. Such businesses include those that:</p>
<ol>
<li>Derive 50% or more of their annual revenue from selling or sharing consumers’ personal information; <strong><u>or</u></strong></li>
<li>Made over $28 million<a href="#_ftn1" name="_ftnref1">[1]</a> in gross annual revenue the preceding year <strong><em>and</em></strong>:
<ul>
<li>processed personal information of 250,000 or more consumers or households in the preceding calendar year; <strong><em>or</em></strong></li>
<li>processed the “sensitive personal information” of 50,000 or more consumers in the preceding calendar year.</li>
</ul>
</li>
</ol>
<p>Key considerations include:</p>
<ol>
<li><strong>Extension of auditing requirements to whole classes of businesses not currently subject to cybersecurity regulation.</strong> US cybersecurity laws and regulations historically focused on specific regulated sectors, such as financial services, healthcare, and critical infrastructure. The proposed regulations, by contrast, are sector-agnostic and could apply to any business that meets the data processing thresholds described above.</li>
<li><strong>Expanded and more detailed scope.</strong> The proposed scope of the cybersecurity audit includes both broad and specific requirements. Broadly, the audit must include an assessment of the business’s cybersecurity program appropriate to its size, complexity, the nature and scope of processing activities, and the cost of implementing the components of a cybersecurity program. Specifically, the audit must include an assessment of 18 categories with numerous subcategories and examples that align closely to the requirements of prevailing cybersecurity frameworks and sector-specific audits, such as those that appear within NIST’s Cybersecurity Framework and financial sector security frameworks.</li>
<li><strong>Qualified and independent auditors</strong>. Cybersecurity audits would need to be performed by a qualified, objective, and independent professional. While the proposed regulations permit internal auditors to perform cybersecurity audits, internal auditors must report directly to the board or governing body and not to business management.</li>
<li><strong>Certification requirements create an avenue for individual board member or executive liability</strong>. The draft regulations require businesses to submit a certificate of completion directly to the CPPA each calendar year, signed by a member of the board or governing body or, if no such board or equivalent body exists, the business’s highest-ranking executive responsible for cybersecurity. In other contexts, such as Sarbanes-Oxley Act compliance, similar certification requirements have been used to create personal liability for the attesting company officer. Businesses will need to carefully oversee their cybersecurity audits and document their practices to protect responsible officers and the company from enforcement risk.</li>
<li><strong>Required documentation of inapplicable controls</strong>. If a business determines that it does not need to have any one or more of the security controls listed within scope of the CPPA’s proposed audit requirements, the business must document and explain, within the cybersecurity audit, why such control is not necessary to the business’s protection of personal information, and how the safeguards in place provide at least equivalent security. Such a requirement may force many companies to accelerate or take on investment in maturing their cybersecurity controls and risk management processes.</li>
<li><strong>Record retention</strong>. Records of previously conducted cybersecurity audits must be retained for five years.</li>
<li><strong>Assessment of prior audits</strong>. Cybersecurity audits must identify existing gaps or weaknesses and address the status of gaps or weaknesses identified in previous cybersecurity audits (and any corrections or amendments made in response).</li>
<li><strong>Security incident documentation, with implications for incidents that have no connection to California.</strong> If a business has experienced a security incident that it notified to any regulator <strong>with jurisdiction over privacy laws or other data processing authority</strong> – anywhere in the world – the proposed regulations require the business to include within the cybersecurity audit a description of the incident and copies of relevant notifications.</li>
<li><strong>Equivalent audits would not need to be duplicated. </strong>The proposed regulations would not require a business to duplicate a cybersecurity audit if it already conducted one that complies with the requirements. Given the highly prescriptive criteria for required audits under these proposed rules, however, it is unlikely that audits businesses already conduct will be extensive enough to meet these criteria on their own.</li>
</ol>
<p><strong>Insurance Companies</strong></p>
<p>The proposed regulations require insurance companies subject to the California Insurance Code to comply with the CCPA for any collection of personal information not preempted by the Code. Currently, personal information processed by state-regulated insurance companies in connection with any financial products or services they offer is usually exempt from the CCPA because such personal information is regulated by the GLBA and state-level financial privacy laws, such as the California Financial Information Privacy Act.</p>
<p>If the proposed regulations are implemented in their current form, California-regulated insurance companies that meet the thresholds of a “business” will need to extend their CCPA-compliance programs to capture GLBA-regulated consumer personal information. The proposed regulations do not address how such insurance companies would address any conflicts or inconsistencies between existing financial privacy requirements and the CCPA.</p>
<p><strong>Conclusion</strong></p>
<p>The proposed regulations, if approved and implemented in their current form, would impose significant new requirements for businesses regulated by the CCPA, and would introduce several novel consumer choices, including a right to opt out of first-party behavioral advertising and a right to opt out of AI training. The proposed regulations are also likely to raise privacy, cybersecurity, and AI governance standards across sectors that previously were not subject to US requirements to conduct internal risk assessment and cybersecurity audits.</p>
<p>In past CPPA rulemakings, the final form of proposed draft regulations has not changed significantly after advancing to the formal rulemaking process. However, these proposed regulations have triggered division within the CPPA unlike any previous rulemaking, with two of five CPPA board members (including Alastair Mactaggart, an original author of the CCPA) dissenting and calling the proposed regulations “overreach.” On January 31, one of the members who voted <a href="https://cppa.ca.gov/announcements/2023/20231127.html">in favor</a> of the proposed regulations, Vinhcent Le, was replaced on the CPPA Board by Brandie Nonnecke, a tech policy researcher and Founding Director of the CITRIS Policy Lab at the University of California, Berkeley.</p>
<p>Following the close of the comment period on February 19, the CPPA will review submitted comments and may revise the proposed regulations if needed. Under the California Administrative Procedure Act (APA), if substantial changes are made, an additional 15-day public comment period for the revised text would be required. Barring any major changes, the CPPA could implement the rules as soon as April 1.</p>
<p><a href="#_ftnref1" name="_ftn1">[1]</a> This figure reflects the legally required adjustment to account for increases in the Consumer Price Index. See Draft Update to Existing Regulations, March 2023, at § 7005(b)(1), available at https://cppa.ca.gov/meetings/materials/20240308_item4_draft_update.pdf</p>
<p>The post <a href="https://www.goodwinprivacyblog.com/2025/02/05/california-forges-a-new-path-on-automated-decision-making-technology-risk-assessments-and-cybersecurity-audits/">California Forges a New Path on Automated Decision-Making Technology, Risk Assessments, and Cybersecurity Audits</a> appeared first on <a href="https://www.goodwinprivacyblog.com">Data, Privacy &amp; Cybersecurity Insights</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>UK Ransomware Consultation: Government Moves to Rein in Attacks</title>
		<link>https://www.goodwinprivacyblog.com/2025/01/30/uk-ransomware-consultation-government-moves-to-rein-in-attacks/</link>
		
		<dc:creator><![CDATA[Curtis McCluskey|Tobin Cleary]]></dc:creator>
		<pubDate>Thu, 30 Jan 2025 19:14:29 +0000</pubDate>
				<category><![CDATA[Cybersecurity Preparedness & Response]]></category>
		<category><![CDATA[Cybersecurity]]></category>
		<guid isPermaLink="false">https://www.goodwinprivacyblog.com/?p=679</guid>

					<description><![CDATA[<p>On 14 January 2025, the UK government launched a public consultation on proposed legislative measures to combat the ever-increasing threat of ransomware. With these proposals, the UK government is seeking to step up its efforts to understand, deter and prosecute ransomware attacks by gathering more information from victims and undermining...</p>
<p><a class="more-link" href="https://www.goodwinprivacyblog.com/2025/01/30/uk-ransomware-consultation-government-moves-to-rein-in-attacks/#more-679">Read More</a></p>
<p>The post <a href="https://www.goodwinprivacyblog.com/2025/01/30/uk-ransomware-consultation-government-moves-to-rein-in-attacks/">UK Ransomware Consultation: Government Moves to Rein in Attacks</a> appeared first on <a href="https://www.goodwinprivacyblog.com">Data, Privacy &amp; Cybersecurity Insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>On 14 January 2025, the UK government launched a <a href="https://www.gov.uk/government/consultations/ransomware-proposals-to-increase-incident-reporting-and-reduce-payments-to-criminals" target="_blank" rel="noopener">public consultation</a> on <a href="https://assets.publishing.service.gov.uk/media/67864097c6428e013188175a/Consultation-Document-Proposals-v2.pdf" target="_blank" rel="noopener">proposed legislative measures</a> to combat the ever-increasing threat of ransomware. With these proposals, the UK government is seeking to step up its efforts to understand, deter and prosecute ransomware attacks by gathering more information from victims and undermining the ransomware business model.</p>
<p>The new framework would ban ransom payments in the public sector and for certain critical infrastructure providers, and more broadly would require all companies to report ransomware attacks, including whether they plan to pay the ransom. The government is seeking views on these proposals, including the introduction of criminal sanctions, and whether the regime should cover all UK individuals and organisations, or be limited by size of the organisation and/or ransom. The public consultation is open until <strong>8 April 2025</strong>.</p>
<p><strong>The Three Proposals</strong></p>
<ol>
<li><strong><em>Ban on Ransomware Payments for the Public Sector and CNI</em></strong></li>
</ol>
<ul>
<li><strong>Proposal:</strong> All organisations in the UK public sector – including local government, as well as owners and operators of critical national infrastructure (“CNI”) that are regulated, or that have competent authorities – would be prohibited from making payments to cyber criminals in response to ransomware incidents. The proposal expands the current principle that government departments cannot make ransomware payments.</li>
<li><strong>Goal:</strong> The public sector has been increasingly targeted by bad actors, resulting in serious harm to the UK public. The proposal aims to disincentivise cyber criminals – who will not receive a payout from their targets – from targeting essential agencies and infrastructure, thereby protecting the UK’s public services and CNI from the disruption caused by ransomware attacks.</li>
<li><strong>Consultation:</strong> The government is seeking views on (a) whether additional businesses, including essential suppliers to these sectors, should also be in scope; and (b) effective and proportionate measures to encourage compliance with the proposed ban, including criminal and civil penalties.</li>
</ul>
<ol start="2">
<li><strong><em>Ransomware Payment Prevention Regime</em></strong></li>
</ol>
<ul>
<li><strong>Proposal:</strong> All companies and individuals not covered by the ban would be required, before making a payment in response to a ransomware attack, to report their intention to make a payment to the government. Following notification, the government would review the proposed payment and open a dialogue with the reporting company on next steps, including exploring alternative options. The government could ultimately block any payment.</li>
<li><strong>Goal:</strong> The intention behind this proposal is to:
<ul>
<li>increase the intelligence available to support operational activity, major investigations, and the government’s understanding of the ransomware payment landscape;</li>
<li>influence the behaviour and experience of victims of ransomware through the provision of advice and guidance; and</li>
<li>prevent payments that would breach sanctions or terrorism finance legislation.</li>
</ul>
</li>
<li><strong>Consultation:</strong> The government is seeking views on (a) measures for encouraging compliance with the regime, such as whether to impose criminal and/or civil penalties for non-compliance; and (b) whether the regime and any accompanying compliance measures should be subject to a threshold determined by the size of the organisation and/or the amount of the ransom demanded.</li>
</ul>
<ol start="3">
<li><strong><em>Ransomware Incident Reporting Regime</em></strong></li>
</ol>
<ul>
<li><strong>Proposal:</strong> Companies and individuals would be required to report a ransomware attack to the government, regardless of their intention to pay the ransom. The government intends to harmonise the new ransomware regime with the NIS Regulations and the upcoming Cyber Security and Resilience Bill, so that UK victims will only have to report an individual ransomware incident once.</li>
<li><strong>Goal:</strong> To assist the government’s understanding of the scale, type and source of the ransomware threats that individuals and organisations in the UK face.</li>
<li><strong>Consultation:</strong> The government is seeking views on whether the mandatory reporting requirement should only impact organisations and individuals that meet a certain threshold. If the regime is introduced with a threshold, the government would continue to encourage all victims of a ransomware incident to report through the same mechanism.</li>
</ul>
<p><strong>Conclusion</strong></p>
<p>Whilst the proposals have the clear aim of disincentivising cyber criminals, these reporting obligations will introduce another layer of complexity and accountability during the early stages of a ransomware attack. If the proposals are implemented in their most extreme form, many UK businesses and individuals will be effectively barred from making ransomware payments and will face additional reporting obligations. The government is, however, open to input, including on scope and sanctions. Any businesses that wish to submit comments on the proposals should do so <a href="https://www.homeofficesurveys.homeoffice.gov.uk/s/E6ROXH/" target="_blank" rel="noopener">here</a> by <strong>8 April 2025</strong>.</p>
<p>At Goodwin, we have a dedicated team of data, privacy and cybersecurity experts to help clients navigate their legal obligations following a ransomware attack, whether in the UK, EU or globally.</p>
<div style="margin-top: 0px; margin-bottom: 20px;" class="sharethis-inline-share-buttons" ></div><p>The post <a href="https://www.goodwinprivacyblog.com/2025/01/30/uk-ransomware-consultation-government-moves-to-rein-in-attacks/">UK Ransomware Consultation: Government Moves to Rein in Attacks</a> appeared first on <a href="https://www.goodwinprivacyblog.com">Data, Privacy &amp; Cybersecurity Insights</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>US Privacy and AI Outlook for 2025: Less Feds, More States</title>
		<link>https://www.goodwinprivacyblog.com/2025/01/07/us-privacy-and-ai-outlook-for-2025-less-feds-more-states/</link>
		
		<dc:creator><![CDATA[Omer Tene|Jacqueline Klosek]]></dc:creator>
		<pubDate>Tue, 07 Jan 2025 20:14:42 +0000</pubDate>
				<category><![CDATA[Privacy Compliance]]></category>
		<guid isPermaLink="false">https://www.goodwinprivacyblog.com/?p=672</guid>

					<description><![CDATA[<p>Aristotle, the Greek philosopher and polymath of the fourth century BC is known to have coined the phrase horror vacui &#8211; nature abhors a vacuum, meaning that in nature, a vacuum or a void isn’t a steady state. Fast forward 2,400 years, and the same holds true for tech regulation....</p>
<div style="margin-top: 0px; margin-bottom: 0px;" class="sharethis-inline-share-buttons" data-url="https://www.goodwinprivacyblog.com/2025/01/07/us-privacy-and-ai-outlook-for-2025-less-feds-more-states/"></div>
<p><a class="more-link" href="https://www.goodwinprivacyblog.com/2025/01/07/us-privacy-and-ai-outlook-for-2025-less-feds-more-states/#more-672">Read More</a></p>
<p>The post <a href="https://www.goodwinprivacyblog.com/2025/01/07/us-privacy-and-ai-outlook-for-2025-less-feds-more-states/">US Privacy and AI Outlook for 2025: Less Feds, More States</a> appeared first on <a href="https://www.goodwinprivacyblog.com">Data, Privacy &amp; Cybersecurity Insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Aristotle, the Greek philosopher and polymath of the fourth century BC, is known to have coined the phrase <em>horror vacui</em> &#8211; nature abhors a vacuum, meaning that in nature, a vacuum or a void isn’t a steady state. Fast forward 2,400 years, and the same holds true for tech regulation. If Washington policymakers take their foot off the gas, don’t expect privacy and AI laws to slow down. Enter Sacramento, Austin and Albany, Tallahassee, Trenton and Olympia. This land is your land, and it is rife with tech policy initiatives. No federal privacy law? Introducing 20 states &#8211; and counting. No federal AI regulation? Prepare for an avalanche of state action.</p>
<p>As we enter the early days of 2025, it is an ideal moment to consider what the new year may bring to privacy regulation. We are in a period of continued evolution, marked by the emergence of new technologies, most notably artificial intelligence (AI), growing awareness of personal data rights, and rising public demand for stronger protections. This year, however, the shift is not only technological but also political, with Republicans consolidating power in Washington, holding control over the executive, legislative, and, for the most part, judicial branches.</p>
<p>While some observers have suggested this may signal a deregulatory approach to privacy and AI, in the U.S. the absence of federal regulation may very well open the floodgates of state laws. Blue states in particular are expected to double down on protecting consumer health information and imposing anti-bias and discrimination measures on developers and deployers of AI.</p>
<p>With a diverse and complex regulatory landscape at both the federal and state levels, businesses and individuals alike are navigating the implications of this shift. At the state level, privacy legislation continues to gain momentum, with states like California, Texas, and Colorado leading the charge, setting precedents with groundbreaking laws, while other states move toward similar measures.</p>
<p>Here are the key privacy and AI issues that will shape 2025:</p>
<p><strong>Federal Regulation &#8211; </strong></p>
<p>Consider this curveball: with Republicans in control of both chambers of Congress as well as the White House, federal privacy legislation could actually get done. To be sure, Republican lawmakers have historically been more circumspect about privacy regulation. At the same time, industry has been clamoring for a federal law that would preempt the growing patchwork of state privacy laws. While we wouldn’t hold our breath waiting for a federal privacy law, such an initiative could gain momentum in Congress. If it happens, with Sen. Ted Cruz (R-TX) at the helm of Senate Commerce and Rep. Brett Guthrie (R-KY) chairing House Energy and Commerce, we predict a proposal for a new comprehensive federal law that looks more like the Texas Data Privacy and Security Act than the last iteration of a federal privacy bill, the American Privacy Rights Act (APRA). That is, a federal law with strong preemption, no private right of action, less emphasis on civil rights and data minimization, and a stronger focus on security. Short of a federal privacy law, we may well see kids’ privacy legislation pass in this Congress. Recent efforts, such as the <a href="https://www.congress.gov/bill/118th-congress/senate-bill/2073/text">Kids Online Safety and Privacy Act (“KOSPA”)</a>, have strong bicameral and bipartisan support. If lawmakers want to show they can get <em>something</em> done, this would be a good place to start.</p>
<p>The new Federal Trade Commission (FTC), led by Trump’s choice for chair, sitting Commissioner Andrew Ferguson, and with a Republican majority including new Commissioner Mark Meador, will look starkly different from outgoing Chair Lina Khan’s agency. For starters, the agency is all but certain to scrap its “<a href="https://www.ftc.gov/legal-library/browse/federal-register-notices/commercial-surveillance-data-security-rulemaking">Commercial Surveillance and Data Security Rulemaking</a>”. In fact, we’re unlikely to even hear the term “commercial surveillance” under this leadership. Moreover, nominated Chair Ferguson has already stated that he intends to roll back the agency’s recent efforts to become a de facto AI enforcement agency, focusing instead on its traditional role as a competition and consumer protection regulator. This intent can be witnessed in his <a href="https://www.ftc.gov/system/files/ftc_gov/pdf/ferguson-rytr-statement.pdf"><em>Rytr</em> dissent</a>, where Ferguson stated the agency “certainly should not chill innovation by threatening to hold AI companies liable for whatever illegal use some clever fraudster might find for their technology”. On privacy, we expect the FTC to shift from an activist enforcement stance, which often relies on the unfairness prong of its Section 5 authority, to a more conservative approach based on anti-deception.</p>
<p><strong>State Law Frenzy &#8211; </strong></p>
<p>You can surely expect 2025 to mark a pivotal year in state-level privacy regulation as more states implement comprehensive privacy laws that offer robust consumer rights. Many of these laws reflect a trend toward more consumer control over personal data, corporate transparency, and greater accountability for organizations that process sensitive information. In 2024, the number of comprehensive state privacy laws grew from five to eight, with the laws in Texas, Oregon and Montana coming into force. This year, the tally will balloon to 16, as privacy laws go into effect in Tennessee, Delaware, Iowa, New Jersey, New Hampshire, Nebraska, Maryland and Minnesota. Some of these laws, including in Delaware, Iowa, Nebraska and New Hampshire, already entered into force on January 1, and New Jersey’s is imminent on January 15. Further, in certain states where such laws are already in effect, new requirements are now coming online. For example, in Texas and Connecticut, businesses must now respect Universal Opt-Out Mechanisms (UOOMs). All in all, 19 states (plus Florida, whose law’s scope is somewhat more limited) have passed a comprehensive privacy law, with many others in the pipeline.</p>
<p>In addition to directly affecting the rights of consumers and the obligations of businesses in the states at play, these laws may come to have a broader impact by influencing discussions around federal privacy legislation. Importantly, the new Maryland law adopted the data minimization paradigm from one of the stalled federal bills, marking a shift in US state law from a permissive, largely notice-and-opt-out approach to a stricter, GDPR-like regime limiting the collection of data to only what’s “necessary”.</p>
<p>While, for several years, businesses approached the fracturing US privacy landscape as “California plus”, grounding their compliance efforts in CCPA while keeping an eye on state-by-state developments, this approach will now change to a more integrative strategy. California is no longer the strictest regime, though it continues to be an outlier in its application to employees’ data, and new frameworks, such as <a href="https://mgaleg.maryland.gov/2024RS/Chapters_noln/CH_455_sb0541e.pdf">Maryland’s data minimization principles</a> or Washington’s sector-specific <a href="https://app.leg.wa.gov/RCW/default.aspx?cite=19.373&amp;full=true">My Health My Data Act</a>, will need to be accounted for. State laws also diverge in application thresholds &#8211; for example, Texas law introduced a small business exemption based not on revenue thresholds but on being defined as a small business by the United States Small Business Administration. And while some states provide entity-level exemptions for businesses covered by federal laws such as HIPAA, GLBA or FERPA, others allow a much more limited data-based exemption. Other differences manifest in state law treatment of specific categories of data, such as kids’ information or biometrics, or in the scope of consumer privacy rights.</p>
<p><strong>Health Privacy &#8211; </strong></p>
<p>Don’t be surprised if the red wave in federal elections leads to a blue counterwave in state efforts to protect consumer health information, particularly &#8211; though not only &#8211; in the context of reproductive rights and gender-affirming care. Already, back in 2023, Washington state enacted the <a href="https://app.leg.wa.gov/billsummary?BillNumber=1155&amp;Initiative=false&amp;Year=2023">My Health My Data Act</a>, which has been emulated by legislation in <a href="https://www.leg.state.nv.us/Session/82nd2023/Bills/SB/SB370.pdf">Nevada</a>. Over the past couple of months, we’ve seen a reproductive data privacy geofencing bill in New York (A 5517); a bill specifically scoped to reproductive data privacy in Michigan (SB 1082); a bill on AI and mental health data in Texas (HB 1265); multiple bills on location health data in California (AB 45 and AB 67); and a bill on privacy in the context of reproductive rights and gender-affirming care in Virginia (SB 754).</p>
<p>At the federal level, we will see significant changes to the Health Insurance Portability and Accountability Act of 1996 (HIPAA) regulations in the near future. Recently, the U.S. Department of Health and Human Services (HHS) proposed updates to the HIPAA Security Rule to strengthen the security of electronic protected health information (ePHI). The <a href="https://www.federalregister.gov/documents/2025/01/06/2024-30983/hipaa-security-rule-to-strengthen-the-cybersecurity-of-electronic-protected-health-information">proposed rule</a> outlines several major revisions to the Security Rule, including the removal of the distinction between &#8220;required&#8221; and &#8220;addressable&#8221; implementation specifications, the introduction of compliance deadlines and documentation requirements, and more detailed risk analysis guidelines. Additionally, the rule proposes new notice requirements for changes in workforce access termination status, updates to contingency plans and security incident response procedures, and the introduction of new security controls. It further mandates the encryption of ePHI both at rest and in transit, along with the required use of multi-factor authentication.</p>
<p><strong>Kids’ Privacy &#8211; </strong></p>
<p>Few areas are as contentious &#8211; yet draw as much bipartisan convergence &#8211; as kids’ privacy. Policymakers, educators and parents alike assert that <em>something </em>must be done about kids’ privacy, <em>but what</em>? Everyone is anxious about kids’ use of devices, apps and social media sites that generate and consume a gushing stream of data, but regulatory protective measures present formidable risks to free speech and online anonymity. As discussed above, this may be one area where Congress steps up and delivers federal legislation. The latest bipartisan efforts include KOSPA, sponsored by Sen. Marsha Blackburn (R-TN) and Richard Blumenthal (D-CT), and <a href="https://www.markey.senate.gov/news/press-releases/senators-markey-cassidy-announce-chair-cantwell-and-ranking-member-cruz-as-cosponsors-of-coppa-20-childrens-privacy-legislation">COPPA 2.0</a>, sponsored by Sen. Ed Markey (D-MA) and Bill Cassidy (R-LA).</p>
<p>In this area too, states haven’t been sitting on their hands. Just this past year, New York passed two important kids’ privacy laws, the <a href="https://legislation.nysenate.gov/pdf/bills/2023/S7695A">Child Data Protection Act</a>, which focuses on kids’ data protection, and the <a href="https://www.nysenate.gov/legislation/bills/2023/S7694/amendment/A">Stop Addictive Feeds Exploitation (SAFE) For Kids Act</a>, which focuses on preventing addictive apps and technologies. Meanwhile, Maryland passed an <a href="https://mgaleg.maryland.gov/2024RS/bills/hb/hb0603T.pdf">Age Appropriate Design Code Act</a>, modeled after the California law that, for now, is held up in constitutional litigation.</p>
<p>Regulators, too, have waded into this space. While the FTC’s commercial surveillance rulemaking is kaput, its <a href="https://www.federalregister.gov/documents/2024/01/11/2023-28569/childrens-online-privacy-protection-rule">COPPA Rule</a> refresh is very much viable. Furthermore, enforcement agencies from the FTC to the California and Texas AGs have demonstrated vigor in enforcing against violations of kids’ privacy rights. We expect state-level interest in and enforcement of children’s privacy rights to grow in 2025 and beyond.</p>
<p><strong>Sensitive Data, Ad Tech and Data Brokers &#8211; </strong></p>
<p>In 2024, we witnessed the regulation of data brokers veer off, for the first time, from a privacy track to one anchored in national security. As tensions with China heightened over national security and trade policy, policymakers recognized concerns around foreign governments’ access to US persons’ sensitive personal data. Consequently, the Biden administration passed legislation and an Executive Order intended to protect sensitive personal information of U.S. persons from being accessed by China or entities under its control. While the “TikTok law” &#8211; <a href="https://www.congress.gov/bill/118th-congress/house-bill/7521">Protecting Americans from Foreign Adversary Controlled Applications Act of 2024</a> &#8211; drew a lot of media attention, the administration also passed the <a href="https://www.congress.gov/bill/118th-congress/house-bill/7520/text/eh">Protecting Americans’ Data from Foreign Adversaries Act of 2024 (PADFA)</a> as an add-on to a supplemental appropriations bill to support Israel and Ukraine. In addition, the <a href="https://www.federalregister.gov/documents/2024/03/01/2024-04573/preventing-access-to-americans-bulk-sensitive-personal-data-and-united-states-government-related">President’s Executive Order (EO) 14117 on Preventing Access to Americans’ Bulk Sensitive Personal Data and United States Government-Related Data by Countries of Concern</a> was issued and accompanied by extensive <a href="https://www.justice.gov/d9/2024-12/NSD%20104%20-%20Data%20Security%20-%201124-AA01%20-%20Final%20Rule_0.pdf">rulemaking by the DOJ</a>, published just days before the end of 2024. While PADFA focuses on preventing data brokers from selling personally identifiable sensitive data of U.S. individuals to a foreign adversary country or an entity controlled by a foreign adversary country, the EO has a much broader remit. The EO applies not only to data brokers, but to any entity that makes Americans’ bulk sensitive personal data and certain U.S. Government-related data accessible in countries of concern, notably China (including Hong Kong), and extends not only to data selling but also to investment, employment or vendor agreements involving such entities and access to data.</p>
<p>Data brokers attracted the scrutiny of other regulators too. Last month, the CFPB issued a proposed <a href="https://files.consumerfinance.gov/f/documents/cfpb_nprm-protecting-ams-from-harmful-data-broker-practices_2024-12.pdf">rule for Protecting Americans from Harmful Data Broker Practices</a>. The rule would apply the protections under the Fair Credit Reporting Act to a category of data brokers that have traditionally not been viewed as consumer reporting agencies, if such businesses sold information about a consumer’s income or financial tier, credit history, credit score, or debt payments, regardless of how the information is used. The rule also asserts that advertising and marketing are not “permissible purposes” for which consumer reporting agencies may furnish consumer reports. However, given the November election results, the fate of the rule &#8211; and indeed of the CFPB itself &#8211; has become unclear.</p>
<p>The FTC also trained its sights on data brokers. In a series of enforcement actions, starting with <a href="https://www.ftc.gov/news-events/news/press-releases/2024/05/ftc-finalizes-order-inmarket-prohibiting-it-selling-or-sharing-precise-location-data">InMarket</a> and <a href="https://www.ftc.gov/news-events/news/press-releases/2024/04/ftc-finalizes-order-x-mode-successor-outlogic-prohibiting-it-sharing-or-selling-sensitive-location">X-Mode</a> at the beginning of 2024 and culminating with <a href="https://www.ftc.gov/news-events/news/press-releases/2024/12/ftc-takes-action-against-mobilewalla-collecting-selling-sensitive-location-data">Mobilewalla</a> and <a href="https://www.ftc.gov/news-events/news/press-releases/2024/12/ftc-takes-action-against-gravy-analytics-venntel-unlawfully-selling-location-data-tracking-consumers">Gravy Analytics</a> at the end of the year, the agency cracked down on data brokers selling location data, particularly in sensitive contexts. Importantly for this industry, the FTC held that brokers cannot simply rely on contractual language with data providers to verify consumer consent; rejected data brokers’ creation of sensitive location segments, such as ones tied to healthcare clinics, places of worship, LGBTQ gatherings, or political rallies; and cracked down on brokers’ practice of monetizing data inappropriately collected and retained from real-time bidding (RTB) exchanges. The FTC’s decisions bear implications not only for data brokers, but also for companies up and down the data supply chain, requiring businesses to ensure consumers provide verifiable consent, honor opt-outs, block sensitive locations, disclose retention practices and verify data hygiene and accountability. 
Industry groups too tightened their <a href="https://thenai.org/accountability/precise-location-information-solution-provider-voluntary-enhanced-standards/">best practices</a> to prohibit the use, sale, and transfer of precise location information related to “sensitive points of interest”.</p>
<p>The FTC criticized the ad tech industry beyond just data brokers. In September, the agency released a lengthy report titled, “<a href="https://www.ftc.gov/system/files/ftc_gov/pdf/Social-Media-6b-Report-9-11-2024.pdf">A Look Behind the Screens: Examining the Data Practices of Social Media and Video Streaming Services</a>,” with recommendations addressing data minimization, restrictions on data sharing, protection of kids’ and teens’ personal information, and automated decision making. In a <a href="https://www.ftc.gov/policy/advocacy-research/tech-at-ftc/2024/11/data-clean-rooms-separating-fact-fiction">November blog post</a>, the FTC criticized Data Clean Rooms, stating they “are not rooms, do not clean data, and have complicated implications for user privacy, despite their squeaky-clean name.” The FTC added: “Companies shouldn’t view Data Clean Rooms as a way to get around their obligations under the law or the promises they have made to consumers.”</p>
<p>States too kept their eyes on the data broker industry. With Oregon’s data broker law going into effect and the Texas Attorney General <a href="https://www.texasattorneygeneral.gov/news/releases/attorney-general-ken-paxton-notifies-over-100-companies-their-apparent-failure-comply-texas-data">announcing an enforcement sweep</a> against unregistered data brokers, registration requirements now figure in four states: Oregon and Texas, along with California and Vermont. In recent rulemaking under California’s DELETE Act, the CPPA narrowed the “direct relationship” exception in the statutory definition of &#8220;data broker&#8221; to situations where &#8220;a consumer intentionally interacts with a business for the purpose of obtaining information about, accessing, purchasing, using, or requesting the business&#8217;s products or services within the preceding three years.&#8221; That means that the CPPA would consider a business to be a data broker <em>even if it had a direct relationship with consumers</em> as long as it sold personal information about such consumers that the business did not collect directly from them. While still a year out, the DELETE Act requires the CPPA to establish a &#8220;Deletion Request and Opt-Out Platform&#8221; by January 1, 2026. By August 2026, data brokers will have to respond to single-click deletion requests and review the deletion system every 45 days. As a result, opt-out rates are expected to skyrocket.</p>
<p>And if the intentions of policymakers weren’t clear enough, the powerful attorneys general of <a href="https://www.texasattorneygeneral.gov/news/releases/attorney-general-ken-paxton-notifies-over-100-companies-their-apparent-failure-comply-texas-data">Texas</a> and <a href="https://cppa.ca.gov/announcements/2024/20241114.html">California</a> announced broad enforcement sweeps against data brokers, culminating in a set of fines and penalties.</p>
<p><strong>Artificial Intelligence &#8211; </strong></p>
<p>Anyone say AI? No technological or business development has captured the public’s imagination over the past few years like the AI revolution. Whether it’s generative AI that puts personal data and intellectual property at risk or artificial general intelligence (AGI) that threatens human existence, everyone is searching for AI governance frameworks. One of President-Elect Trump’s campaign promises was to repeal President Biden’s Executive Order on AI, and his AI policy more broadly, on day one in office. The Trump Administration will address AI policy primarily through the lens of (a) competition with China on innovation; (b) national security; and (c) energy policy. While Trump’s 2019 Executive Order on AI paid tribute to civil rights, that issue received nowhere near the prominence it had in Biden’s 2023 Executive Order or his 2022 “Blueprint for an AI Bill of Rights.”</p>
<p>But here too, the states may have the final say. By stepping back from AI governance and ethics, the Administration will effectively cede territory to the states. Already, Colorado is implementing its <a href="https://leg.colorado.gov/sites/default/files/2024a_205_signed.pdf">Artificial Intelligence Act</a>, focused on AI rendering consequential decisions, and <a href="https://leginfo.legislature.ca.gov/faces/billNavClient.xhtml?bill_id=202320240AB2013">California’s AB 2013</a> requires generative AI providers to ensure training data transparency. California will likely re-litigate <a href="https://leginfo.legislature.ca.gov/faces/billNavClient.xhtml?bill_id=202320240SB1047">SB 1047</a>, the bill requiring developers of “frontier models” to implement prescriptive safety measures. In Texas, Rep. Giovanni Capriglione (R), who authored the state’s privacy law, recently submitted his <a href="https://www.mba.org/docs/default-source/policy/state-relations/draft_texas-ai_10.28.24.pdf?sfvrsn=9f83267e_1">Responsible AI Governance Act</a>. Indeed, dozens or even hundreds of AI laws are making their way through the legislative pipeline in state capitals.</p>
<p><strong>Litigation &#8211; </strong></p>
<p>Historically, privacy wasn’t a field of vibrant litigation. Enforcement was largely the remit of regulatory agencies, such as the FTC and HHS OCR. No more. The past couple of years have seen a surge in privacy litigation, with the plaintiffs’ bar weaving privacy causes of action out of federal and state laws ranging from the VPPA to CIPA, BIPA and even trap-and-trace legislation and Daniel’s Law. These cases target business practices ranging from integrating cookies, pixels, session replay or SDKs, to using third parties to offer customer service chatbots or track email open rates. Despite some recent cases the industry views as helpful &#8211; such as the Supreme Judicial Court of Massachusetts decision in <a href="https://law.justia.com/cases/massachusetts/supreme-court/2024/sjc-13542.html">Vita v. New England Baptist Hospital</a>, rejecting application of wiretapping laws to the use of pixel technology on websites, and the Ninth Circuit Court of Appeals decision in <a href="https://cdn.ca9.uscourts.gov/datastore/opinions/2024/06/17/22-16925.pdf">Zellmer v. Meta</a>, holding that under BIPA, biometric identifiers must be capable of identifying a person &#8211; plaintiffs are likely to continue to pepper businesses with demand letters and claims threatening mass arbitration or class action litigation.</p>
<p>***</p>
<p>Privacy regulation and enforcement activity were neither quiet nor predictable last year. Rather, the landscape of privacy and AI law in 2024 continued to evolve at breakneck speed, with both federal and state governments taking a more active role in defining the regulatory framework. Key developments include a growing focus on protecting consumer health data, enhancing children&#8217;s privacy, and addressing the practices of data brokers. Litigation around privacy violations has surged, highlighting the urgency of robust protections and measures for risk mitigation. As we look ahead to 2025 and beyond, we can expect continued efforts to standardize privacy regulations, with increased emphasis on consumer rights, transparency, and the accountability of companies handling sensitive data. And wherever federal policymakers and regulators take a step back, expect state lawmakers and attorneys general to press forward. There’s never a vacuum in technology regulation, an issue that draws intense media focus and broad political appeal. The intersection of technology and regulation will remain dynamic, requiring both innovation and adaptation to ensure individuals&#8217; privacy and security in an interconnected world. As new challenges arise, the push for stronger enforcement, clearer guidelines for businesses, and more comprehensive protections for vulnerable groups like children will be central to the ongoing dialogue. The coming years will shape the future of privacy and AI law, balancing innovation with the need to protect personal freedoms.</p>
<p><em>Reprinted with permission from IAPP</em></p>
<div style="margin-top: 0px; margin-bottom: 20px;" class="sharethis-inline-share-buttons" ></div><p>The post <a href="https://www.goodwinprivacyblog.com/2025/01/07/us-privacy-and-ai-outlook-for-2025-less-feds-more-states/">US Privacy and AI Outlook for 2025: Less Feds, More States</a> appeared first on <a href="https://www.goodwinprivacyblog.com">Data, Privacy &amp; Cybersecurity Insights</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>The NIS 2 Era is Here: Are You Compliance-Ready?</title>
		<link>https://www.goodwinprivacyblog.com/2024/11/01/the-nis-2-era-is-here-are-you-compliance-ready/</link>
		
		<dc:creator><![CDATA[Curtis McCluskey|Jack McCarthy|Mihaela Angelova]]></dc:creator>
		<pubDate>Fri, 01 Nov 2024 16:14:09 +0000</pubDate>
				<category><![CDATA[Cybersecurity Preparedness & Response]]></category>
		<guid isPermaLink="false">https://www.goodwinprivacyblog.com/?p=664</guid>

					<description><![CDATA[<p>With the deadline for Member States to transpose the European Union’s updated Network and Information Systems Directive (Directive (EU) 2022/2555) (“NIS 2” or “Directive”) into national law having passed on 18 October 2024, organisations operating in or servicing the EU market face significant new cybersecurity obligations. The revised Directive, which...</p>
<div style="margin-top: 0px; margin-bottom: 0px;" class="sharethis-inline-share-buttons" data-url="https://www.goodwinprivacyblog.com/2024/11/01/the-nis-2-era-is-here-are-you-compliance-ready/"></div>
<p><a class="more-link" href="https://www.goodwinprivacyblog.com/2024/11/01/the-nis-2-era-is-here-are-you-compliance-ready/#more-664">Read More</a></p>
<p>The post <a href="https://www.goodwinprivacyblog.com/2024/11/01/the-nis-2-era-is-here-are-you-compliance-ready/">The NIS 2 Era is Here: Are You Compliance-Ready?</a> appeared first on <a href="https://www.goodwinprivacyblog.com">Data, Privacy &amp; Cybersecurity Insights</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><img decoding="async" class="wp-image-331 size-full alignleft" src="https://www.goodwinprivacyblog.com/wp-content/uploads/sites/12/2022/03/shutterstock_1702974184_250x170.jpg" alt="" width="250" height="170" /></p>
<p>With the deadline for Member States to transpose the <a href="https://digital-strategy.ec.europa.eu/en/policies/nis2-directive" target="_blank" rel="noopener">European Union’s updated Network and Information Systems Directive (Directive (EU) 2022/2555)</a> (“<strong>NIS 2</strong>” or “<strong>Directive</strong>”) into national law having passed on 18 October 2024, organisations operating in or servicing the EU market face significant new cybersecurity obligations. The revised Directive, which repeals and expands on the original NIS framework (“<strong>NIS 1</strong>”), broadens its regulatory scope and imposes enhanced compliance requirements to address the growing threats in a new era of digitalisation.</p>
<h3>Background of the NIS 2 Directive</h3>
<p>NIS 1 (in effect since 2018) primarily impacted ‘<em>operators of essential services</em>’ and ‘<em>digital service providers</em>’, such as online search engines and cloud services, requiring them to maintain a certain level of cybersecurity. NIS 2 replaces these categories with ‘<em>essential</em>’ and ‘<em>important</em>’ entities, classifications defined by reference to a range of sectors, including the provision of public electronic communications networks or services and the operation of domain name registries or DNS services. In short, these are organisations whose disruption could affect public safety, security, or health, or give rise to systemic risks. As a result, digital infrastructure and digital providers (including social networking services platforms), the manufacture of critical products such as medical devices, and the food, space, postal and courier services, and public administration sectors all fall within the broader scope of NIS 2.</p>
<h3>New Requirements Under NIS 2</h3>
<p>NIS 2 aims to address the increased prevalence of cyber threats across the expanding digital landscape. To bolster security, the Directive imposes comprehensive cybersecurity management and reporting obligations on in-scope organisations. These obligations are structured to prompt entities to actively manage risks, monitor vulnerabilities, and respond to incidents promptly.<br />
Core compliance obligations include:</p>
<ol>
<li><em>Enhanced Security and Risk Management</em>: Organisations must implement comprehensive cybersecurity measures to address risks across network and information systems, including incident detection, vulnerability disclosure, and data encryption.</li>
<li><em>Incident Reporting</em>: In a significant shift from NIS 1&#8217;s ‘<em>without undue delay</em>’ standard, NIS 2 introduces a more onerous, staged reporting timeline: entities must submit an early warning within 24 hours of becoming aware of a significant incident, a more detailed incident notification within 72 hours, and a final report within one month. The definition of ‘<em>significant</em>’ has also been simplified to avoid overreporting. Entities may also be required to notify the general public.</li>
<li><em>Increased Management Accountability</em>: Senior management must approve and oversee cybersecurity measures and may face personal liability for failing to meet the requirements set out under NIS 2. The Directive does not define ‘<em>management body</em>’, leaving the term to be determined by each Member State. This requirement underscores the importance of leadership in driving and maintaining cybersecurity standards, and it obliges management to undertake continuous training so that they have the necessary skills to assess the risks their entity faces.</li>
<li><em>Supply Chain Security</em>: Recognising the risk posed by third-party providers, NIS 2 mandates that organisations actively monitor the security practices of their suppliers and incorporate these into their own risk management processes. NIS 2 applies to both large and medium-sized organisations in high-risk sectors and indirectly affects certain small entities through the supply chain, imposing standards for incident response, risk management, and compliance.</li>
<li><em>Regular Security Audits</em>: Essential entities are subject to regular audits and spot checks, while important entities undergo audits based on reasonable suspicion.</li>
</ol>
<h3>Key Implications for Organisations</h3>
<p><em>Compliance Costs</em>: The new obligations under NIS 2 are expected to impose additional costs on entities, particularly those newly subject to these requirements. Compliance measures, including additional staff training, consulting cybersecurity experts, and technology investments, will require significant planning and budget allocation.<br />
<em>Fines and Penalties</em>: NIS 2 provides for stringent penalties for non-compliance: Member States must enable fines of up to €10 million or 2% of global annual turnover (whichever is higher) for essential entities, and up to €7 million or 1.4% of global annual turnover (whichever is higher) for important entities. Member States also retain discretion to set their own rules on penalties for other infringements. This reinforces the EU&#8217;s prioritisation of cybersecurity and serves as a strong deterrent against non-compliance.<br />
<em>Operational Adjustments</em>: Affected organisations must integrate NIS 2’s requirements into their existing cybersecurity framework. For example, risk management practices need updating, and incident response plans should be revised to accommodate the Directive’s quick turnaround times for reporting.</p>
<h3>Steps to Prepare For NIS 2 Compliance</h3>
<ul>
<li><em>Applicability Assessment</em>: Evaluate whether your organisation qualifies as an ‘<em>essential</em>’ or ‘<em>important</em>’ entity under NIS 2 and assess which services and sectors are impacted.</li>
<li><em>Resource Allocation and Protocols Revision</em>: Ensure adequate budget and personnel are in place to implement cybersecurity measures, including regular audits, management training, and incident response.</li>
<li><em>Cybersecurity Expertise Engagement</em>: For entities new to EU cybersecurity regulation, consulting with experts can clarify compliance steps, especially for technical aspects like supply chain security and risk management.</li>
<li><em>Supply Chain Security Strengthening</em>: Evaluate supplier relationships, assess their cybersecurity standards, and ensure they align with NIS 2 requirements.</li>
<li><em>Documentation Preparation</em>: Entities should establish audit trails and reporting mechanisms to meet the documentation and accountability expectations under NIS 2.</li>
<li><em>EU/UK Regulatory Discrepancies</em>: Organisations operating in both the UK and the EU must also be mindful of regulatory divergence: NIS 2 does not apply in the UK following Brexit, and the UK is instead pursuing its own Cyber Security and Resilience Bill, anticipated in 2025. UK-based entities working with EU clients must therefore align with NIS 2 while remaining compliant with UK cybersecurity standards.</li>
</ul>
<h3>Concluding Insights</h3>
<p>The NIS 2 Directive represents a significant step forward in strengthening the EU’s digital security landscape, with its expansive coverage and stringent compliance measures. For organisations, this Directive provides an opportunity to enhance cybersecurity and build resilience against growing digital threats. By preparing now, organisations can not only meet regulatory standards but also strengthen their position as cybersecurity-conscious leaders in their industries.<br />
For assistance with NIS 2 compliance and guidance on implementing effective cybersecurity measures, please contact our <a href="https://www.goodwinlaw.com/en/expertise/practices/data-and-privacy-and-cybersecurity" target="_blank" rel="noopener">Data Privacy &amp; Cybersecurity team</a>.</p>
<div style="margin-top: 0px; margin-bottom: 20px;" class="sharethis-inline-share-buttons" ></div><p>The post <a href="https://www.goodwinprivacyblog.com/2024/11/01/the-nis-2-era-is-here-are-you-compliance-ready/">The NIS 2 Era is Here: Are You Compliance-Ready?</a> appeared first on <a href="https://www.goodwinprivacyblog.com">Data, Privacy &amp; Cybersecurity Insights</a>.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
