<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Perkins on Privacy</title>
	<atom:link href="https://www.perkinsonprivacy.com/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.perkinsonprivacy.com/</link>
	<description>Perkins Coie Brings You the Latest on Privacy</description>
	<lastBuildDate>Wed, 14 Aug 2024 20:24:43 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.5.5&amp;lxb_maple_bar_source=lxb_maple_bar_source</generator>

<image>
	<url>https://privacyquicktipsredesign.perkinscoieblogs.com/wp-content/uploads/sites/54/2019/02/cropped-favicon-32x32.png</url>
	<title>Perkins on Privacy</title>
	<link>https://www.perkinsonprivacy.com/</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>FTC Order Bans Anonymous Messaging App from Serving Minors</title>
		<link>https://www.perkinsonprivacy.com/2024/08/ftc-order-bans-anonymous-messaging-app-from-serving-minors/</link>
		
		<dc:creator><![CDATA[Janis Kestenbaum, Aaron Haberman and Courtney Otto]]></dc:creator>
		<pubDate>Wed, 14 Aug 2024 20:24:18 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<guid isPermaLink="false">https://www.perkinsonprivacy.com/?p=3411</guid>

					<description><![CDATA[<p>The Federal Trade Commission (FTC) and the Los Angeles County District Attorney on July 9, 2024, announced a complaint and proposed stipulated order against NGL Labs, LLC, and two NGL co-founders concerning the “NGL: ask me anything” anonymous messaging app. The complaint alleges that NGL marketed the app to children and teens despite awareness of... <a href="https://www.perkinsonprivacy.com/2024/08/ftc-order-bans-anonymous-messaging-app-from-serving-minors/">Continue Reading…</a></p>]]></description>
										<content:encoded><![CDATA[<p>The Federal Trade Commission (FTC) and the Los Angeles County District Attorney on July 9, 2024, announced a <a href="https://www.ftc.gov/system/files/ftc_gov/pdf/NGL-Complaint.pdf">complaint</a> and <a href="https://www.ftc.gov/system/files/ftc_gov/pdf/NGL-ProposedConsentOrder.pdf">proposed stipulated order</a> against NGL Labs, LLC, and two NGL co-founders concerning the “NGL: ask me anything” anonymous messaging app. The complaint alleges that NGL marketed the app to children and teens despite awareness of cyberbullying risks from the app to these groups, sent fake messages to drive up usage, knowingly collected the data of children without parental consent, made deceptive claims about the app and its use of artificial intelligence (AI) to filter out harmful messages, and failed to disclose recurring subscription fees. Notably, the order resolving the claims requires NGL to bar anyone under age 18 from accessing its service through a neutral age gate, which is the first time the FTC has required a business to block minors from an online service.</p>



<span id="more-3411"></span>



<h2 class="wp-block-heading"><strong>Complaint</strong></h2>



<p>NGL operates a social media app that allows users to solicit anonymous comments from “their friends and social media contacts” by selecting a prompt and posting a personalized link on other social media platforms. According to the complaint, once a user posted a prompt inviting anonymous messages, they received automated messages that appeared to come from their contacts but actually came from NGL. The complaint also alleges that to induce users to upgrade to NGL Pro, the paid version of the app, the company sent messages like “I know what you did” and “one of your friends is hiding s[o]mething from u” that appeared to come from the user’s contacts with the false promise that they would learn the identity of the sender if they upgraded.</p>



<p>In addition, NGL claimed to use “world class AI content moderation” but failed to filter out harmful content on the NGL app, such as messages like “Everyone hates you.” The app has also been used to send threatening and sexually explicit content, as the company knew from reports by users, parents, and the media.</p>



<p>The complaint alleges the following violations of law:</p>



<ul>
<li><strong>Children’s Online Privacy Protection Act (COPPA).</strong> The complaint alleges that NGL had actual knowledge it was collecting and maintaining personal information from children under age 13 due to complaints from parents and children that explicitly referenced the child’s age. The complaint alleges that NGL violated COPPA because it nonetheless failed to provide notice and obtain parental consent before any collection or use of children’s personal information, retained the data of children indefinitely, and did not delete children’s personal information upon parental request.</li>



<li><strong>Restore Online Shoppers’ Confidence Act (ROSCA).</strong> The complaint alleges that NGL violated ROSCA by (1) failing to clearly disclose the recurring charges for the NGL Pro version of the app as well as other material terms, namely, that consumers would not, as represented, be told who had sent them anonymous messages, and (2) failing to obtain express informed consent before charging consumers for an upgrade to the NGL Pro version.</li>



<li><strong>Section 5 of the FTC Act and California False Advertising Law—deceptive statements about app features and AI capabilities.</strong> The complaint alleges that NGL misrepresented that (1) users would receive anonymous messages “from their friends or other social media contacts” when some of the messages were automatically sent by NGL and (2) NGL used “world class AI content moderation” to effectively filter out harmful content when explicit content and cyberbullying messages bypassed the filters.</li>



<li><strong>Section 5 of the FTC Act</strong>—<strong>unfair marketing of anonymous messaging app to children and teens.</strong> The complaint also alleges that NGL acted unfairly by marketing the NGL app to children and teens “knowing that use of anonymous messaging apps by these groups causes substantial injury” in the form of cyberbullying and exposure to sexually explicit and other harmful content.</li>



<li><strong>California Unfair Competition Law (UCL).</strong> The complaint also alleges that NGL violated the UCL by violating Section 5, COPPA, and ROSCA.</li>
</ul>



<h2 class="wp-block-heading">Order</h2>



<p>Notably, the order bans NGL from offering anonymous messaging apps to users under age 18. NGL must comply by:</p>



<ul>
<li><strong>Ceasing all marketing directed to children or teens. </strong>NGL is prohibited from “[e]ngaging in any marketing or advertising activities that are directed to [c]hildren or [t]eens.”</li>



<li><strong>Implementing a neutral age screen that bars users under age 18 from using the app. </strong>The order requires NGL to adopt a neutral age screen, which the order defines in part as a process that “ask[s] age in a neutral manner” by “not default[ing] to an age 18 or over” and “avoids encouraging [u]sers to falsify age information by . . . stating that certain features will not be available” to those under the age of 18. The ordered neutral age screen must restrict access or use to individuals age 18 or over.</li>
</ul>



<p>The order also prohibits misrepresentations on a host of subjects, including that messages a user receives through a messaging app come from friends, social media contacts, or other live persons; that consumers will be able to learn the identity of those who sent them anonymous messages; that cyberbullying will be completely or mostly filtered out through content moderation or AI; and misrepresentations about the capabilities of any AI technology.</p>



<p>Related to the ROSCA claims, the order prohibits misrepresentations of any material fact of the transaction (including those related to negative option features), requires clear and conspicuous disclosure of the key terms of recurring charges, and mandates a simple cancellation method for recurring charges.</p>



<p>The order further prohibits NGL from violating a variety of COPPA provisions and requires NGL to delete children’s personal information it has knowingly collected without having obtained verifiable parental consent under COPPA. </p>



<p>The defendants must pay $5 million ($4.5 million to the FTC and $500,000 to the State of California).</p>



<h2 class="wp-block-heading"><strong>Takeaways</strong></h2>



<p>This case highlights the FTC’s concerns with the risk of cyberbullying and other online harms to teens and children. However, it is unclear whether the unfairness claim alleged here for marketing the service to children and teens or the remedy of an outright ban on letting minors access the service is something the FTC would or could apply in other contexts, or if it was only appropriate here as fencing-in relief given the alleged egregiousness of the conduct. In support of a more limited view, in his concurring statement, <a href="https://www.ftc.gov/system/files/ftc_gov/pdf/ngl-ferguson-concurrence-final-version.pdf" target="_blank" rel="noreferrer noopener">Commissioner Andrew Ferguson</a> observed that the alleged conduct of NGL was “reprehensible” because it was “tailormade to manipulate the vulnerable teenage psyche.” And as both he and <a href="https://www.ftc.gov/system/files/ftc_gov/pdf/2024.7.8-holyoak-statement-re-ngl.pdf" target="_blank" rel="noreferrer noopener">Commissioner Melissa Holyoak</a> explained in their respective concurring statements, not all anonymous messaging services offered to teens and/or children are necessarily unfair. Rather, such services can be beneficial, such as in the context of a mental health app. Commissioner Ferguson also highlighted potential First Amendment issues, arguing that a limitation on the promotion of anonymous messaging platforms to children and teens could “be in serious tension with the recognized First Amendment rights of minors,” such as where teens seek to express a disfavored opinion. </p>



<p>The other commissioners did not respond to the Ferguson and Holyoak concurring statements, and it remains to be seen what circumstances would support an unfairness claim for offering anonymous messaging to teens and children or the remedy of an outright ban on allowing minors to access an online service.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Cybersecurity for Lawyers: A Series</title>
		<link>https://www.perkinsonprivacy.com/2024/08/cybersecurity-for-lawyers-zero-trust-and-the-plight-of-the-royal-food-taster/</link>
		
		<dc:creator><![CDATA[Andrew Pak and Emma Roberts]]></dc:creator>
		<pubDate>Fri, 09 Aug 2024 18:34:34 +0000</pubDate>
				<category><![CDATA[Cybersecurity]]></category>
		<guid isPermaLink="false">https://www.perkinsonprivacy.com/?p=3403</guid>

					<description><![CDATA[<p>Introduction If you are an attorney covering cybersecurity, not only do you have to stay on top of ever-evolving legal obligations and risks, you have to be able to speak competently with your technical counterparts. While there are plenty of technical resources, very few are geared to the needs of cybersecurity counsel. With that said,... <a href="https://www.perkinsonprivacy.com/2024/08/cybersecurity-for-lawyers-zero-trust-and-the-plight-of-the-royal-food-taster/">Continue Reading…</a></p>]]></description>
										<content:encoded><![CDATA[<h3 class="wp-block-heading">Introduction</h3>



<p>If you are an attorney covering cybersecurity, not only do you have to stay on top of ever-evolving legal obligations and risks, but you also have to be able to speak competently with your technical counterparts. While there are plenty of technical resources, very few are geared to the needs of cybersecurity counsel. Our goal in this series, “Cybersecurity for Lawyers,” is to talk through a variety of cybersecurity topics and issues of the day, with a particular focus on providing the relevant basics and practical context around complex technical issues for in-house counsel who handle cybersecurity matters.</p>



<p>In this first post in this series, we talk broadly about the concept of Zero Trust (and royal food tasters), which is a term you may see come up a lot in the cybersecurity arena. Zero Trust is not a specific tool, product, or solution, but rather a security philosophy that “assumes the breach” and therefore justifies the expenditure of resources that wouldn’t be required if your perimeter were completely secure (because it can’t be). At the same time, there are many different ways to implement a Zero Trust framework, and some are better than others depending on the specific organization and scenario. </p>



<p>Please feel free to send an email to <a href="mailto:APak@perkinscoie.com">Andrew Pak</a> to suggest any topics you would like us to cover.</p>



<span id="more-3403"></span>



<h2 class="wp-block-heading">Cybersecurity for Lawyers: Zero Trust and the Plight of the Royal Food Taster</h2>



<p>You are the head of security for the new empress, in charge of meals. You’ve learned from your predecessor, no longer alive, that you can’t always trust your cooking staff because the last monarch was poisoned. You devise a new security protocol that includes terminating the entire kitchen staff and house staff immediately and bringing in a new pool of trusted employees who will be vetted with background checks, psychological profiling, and months of surveillance prior to hiring. You present your plan, and the empress asks, “But how can we trust the people who do the hiring? How can we ensure someone doesn’t masquerade as a vetted hire?” You respond with silence, causing the empress to shake her head and say, “You are missing the problem; there’s the potential for security breaches every step of the way even if we’ve vetted these people at the beginning of their employment! So, I have a new title for you: Royal Food Taster. I hope you are hungry and have a strong stomach.”</p>



<p>The empress is focused on Zero Trust. She does not assume that the people in her kitchen can be trusted—even if they were well vetted at the door—or that the food placed before her is the same that her kitchen prepared. She therefore wants you to <strong>Never Trust, Always Verify.</strong></p>



<h3 class="wp-block-heading"><strong>Principles of Zero Trust</strong></h3>



<p>Many products and services incorporate the term “Zero Trust,” but there is nothing proprietary about the term—it refers to a security strategy for protecting networks. If you’ve heard the phrase “assume the breach,” then the concept behind Zero Trust will be familiar. Unlike traditional security models that assume that users within an organization’s network are trustworthy, the Zero Trust model of security fundamentally relies on “never trust, always verify.” The Zero Trust security strategy focuses on enforcing security policies on each user, device, application, and piece of data, rather than protecting only the network perimeter.<a id="_ftnref1" href="#_ftn1"><sup>[1]</sup></a> As Zero Trust security systems become increasingly popular within organizations, it is important to understand the ins and outs of Zero Trust and the opportunities and implications of the security strategy. While this article is not meant to survey all of the definitions of Zero Trust available, these are some core principles:<a id="_ftnref2" href="#_ftn2"><sup>[2]</sup></a></p>



<ol>
<li><strong>Continuous Explicit Verification</strong>. Zero Trust intentionally makes all network systems and assets requiring protection inaccessible by default. Every user, device, location, and workload requiring protection must be authenticated and authorized, based on all available data, prior to accessing any resource.</li>



<li><strong>Least Privilege</strong>. In a Zero Trust system, each user and device has “least-privilege” access to an organization’s resources. Each user is given the minimum level of permission required to complete a task and must request additional permission to access further resources, minimizing users’ (and potential threat actors’) exposure to sensitive parts of an organization’s network.</li>



<li><strong>Assuming the Breach</strong>. Zero Trust assumes that hackers have already breached an organization’s network. Thus, actions to minimize the scope and reach of an ongoing breach are a key strategy. Because Zero Trust assumes attackers are both inside and outside of the network, organizations using the Zero Trust security model may implement a variety of breach minimization tools, such as segmentation, microsegmentation (making it more difficult for a hacker to traverse across portions of the network), threat detection, data encryption, and multi-factor authentication (MFA), which are all examples of practices that support a Zero Trust security model.</li>
</ol>



<p>While an organization may appropriately employ some or all of these strategies, they are all supportive of a Zero Trust strategy because they assume that threat actors can bypass the security perimeter, and are mechanisms to search for or mitigate malicious internal activity within that perimeter.</p>



<h3 class="wp-block-heading">Examples of Zero Trust Strategies</h3>



<p>MFA in many ways exemplifies the core principles of Zero Trust because it adds an additional authentication check even after a user presents an accurate login and password. But beyond the implementation of a single control, a Zero Trust model would include multiple ways to gauge the overall likelihood that a particular user account is controlled by a malicious actor. For instance, while not a guaranteed sign of malicious versus non-malicious activity, an organization may add additional security checks to any device that attempts to log in outside of normal working hours or from an unusual geographic location, or look for other user activity that suggests malicious intent. While these sorts of checks do not guarantee that a particular user is who they claim to be, they are examples of the layers of additional checks an organization could do on an account that has already passed an authentication challenge, and they are therefore consistent with Zero Trust principles.</p>
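To make the layering concrete, here is a minimal, hypothetical sketch of how post-authentication signals like the ones above might be combined into an access decision. The signal names, weights, and thresholds are our own illustrative assumptions, not a reference implementation of any particular Zero Trust product:

```python
from dataclasses import dataclass

# Hypothetical post-authentication risk check, illustrating the
# "never trust, always verify" layering described above. Signals,
# weights, and thresholds here are invented for illustration only.

@dataclass
class LoginAttempt:
    hour: int             # local hour of the login (0-23)
    country: str          # country the request originated from
    usual_countries: set  # countries this user normally logs in from
    known_device: bool    # has this device been seen before?

def risk_score(attempt: LoginAttempt) -> int:
    """Each independent signal adds to the score; no single signal decides."""
    score = 0
    if attempt.hour < 6 or attempt.hour > 22:           # outside normal working hours
        score += 1
    if attempt.country not in attempt.usual_countries:  # unusual geographic location
        score += 2
    if not attempt.known_device:                        # unrecognized device
        score += 2
    return score

def decide(attempt: LoginAttempt) -> str:
    """Even a valid password earns only conditional access."""
    score = risk_score(attempt)
    if score == 0:
        return "allow"
    if score <= 2:
        return "step-up"  # require an additional factor (e.g., an MFA prompt)
    return "deny"
```

For example, a login at 3 a.m. from an unfamiliar country on an unknown device would score high and be denied outright, while a slightly unusual login (say, late at night but otherwise normal) would trigger only an extra authentication challenge. The point is the layering itself, not any particular weight.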



<h3 class="wp-block-heading"><strong>A Broad Range of Controls Can Support a Zero Trust Strategy</strong></h3>



<p>Many controls and tools can accurately be called consistent with Zero Trust principles, but some security checks are more effective than others. Take, for example, the plight of our Royal Food Taster, who has to taste a monarch’s food for poison. That is a form of Zero Trust. The food taster would argue that there are other forms of Zero Trust, such as monitoring all the people who come in contact with the food, monitoring the food itself at all times, or announcing rewards for tips regarding assassination attempts. The food taster would be right in some sense, because each of these approaches, rather than trusting that the kitchen staff are all loyal subjects based on the vetting process, assumes that any one of them could have malicious intentions. But while all of these practices employ a Zero Trust strategy, unfortunately for our food taster, some are more effective than others. In the same way, Zero Trust controls come in many varieties, with some being more effective than others. Nevertheless, an effective Zero Trust strategy layers many controls to reach the desired effect. While the use of a food taster may be the most effective check available at the time, when dealing with something as important as the life of the empress, a good strategy would also employ the less drastic measures (e.g., vetting, monitoring), providing a layered approach to security and perhaps saving a few food tasters.</p>



<p>The thing to remember about Zero Trust is that it is a strategy, and a very broad and amorphous one at that—not a specific tool, product, or even control. Anything that adds a verification control to an already authenticated user is arguably Zero Trust, but that does not in and of itself make the control a good or bad idea for a particular organization’s security posture, or convert the entirety of its cybersecurity program into a Zero Trust architecture. There are controls within a Zero Trust framework that might make sense for one organization but not another. And some of the tools that use the term “Zero Trust” in their branding and/or marketing materials would actually represent a downgrade in security if used by certain companies. If someone tells you their organization employs Zero Trust controls, that means little on its own, because there are so many different ways to layer, configure, and implement these controls. Don’t get us wrong—the development of Zero Trust principles has been incredibly important to improving cybersecurity writ large. But its main value is its recognition that there is no such thing as a perfectly secure perimeter, which serves as a foundational assumption for a good cybersecurity program.</p>



<p><em>The authors wish to acknowledge Summer Associate Laurel McCabe’s contributions to this blog</em>.</p>



<hr class="wp-block-separator has-alpha-channel-opacity">



<p><a href="#_ftnref1" id="_ftn1"><sup>[1]</sup></a> <em>See </em><a href="https://www.nist.gov/publications/zero-trust-architecture">Zero Trust Architecture | NIST</a>.</p>



<p><a href="#_ftnref2" id="_ftn2"><sup>[2]</sup></a> <em>See </em><a href="https://learn.microsoft.com/en-us/security/zero-trust/zero-trust-overview">What is Zero Trust? | Microsoft Learn</a>; <em>see also </em><a href="https://www.nist.gov/publications/zero-trust-architecture">Zero Trust Architecture | NIST</a>.</p>



]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Illinois’ Plan to Limit Privacy Violation Damages Opens New Door</title>
		<link>https://www.perkinsonprivacy.com/2024/07/illinois-plan-to-limit-privacy-violation-damages-opens-new-door/</link>
		
		<dc:creator><![CDATA[Ryan Spear, Nicola Menaldo, Mylan Traylor and Akua Asare-Konadu]]></dc:creator>
		<pubDate>Wed, 31 Jul 2024 16:18:05 +0000</pubDate>
				<category><![CDATA[Biometrics]]></category>
		<guid isPermaLink="false">https://www.perkinsonprivacy.com/?p=3395</guid>

					<description><![CDATA[<p>Changes to Illinois’ Biometric Information Privacy Act are awaiting Gov. J.B. Pritzker’s (D) signature. The legislation offers much-needed clarity for businesses but has raised questions about whether courts would apply the changes retroactively to past or ongoing lawsuits. We believe that the legislative history of the measure provides judges with a powerful basis to limit... <a href="https://www.perkinsonprivacy.com/2024/07/illinois-plan-to-limit-privacy-violation-damages-opens-new-door/">Continue Reading…</a></p>]]></description>
										<content:encoded><![CDATA[<p>Changes to Illinois’ Biometric Information Privacy Act are awaiting Gov. J.B. Pritzker’s (D) signature. The legislation offers much-needed clarity for businesses but has raised questions about whether courts would apply the changes retroactively to past or ongoing lawsuits.</p>



<span id="more-3395"></span>



<p>We believe that the legislative history of the measure provides judges with a powerful basis to limit excessive damages for defendants in both future and pending litigation.</p>



<h2 class="wp-block-heading">Response to Ruling</h2>



<p>The legislation, known as <a href="https://ilga.gov/legislation/fulltext.asp?DocName=&amp;SessionId=112&amp;GA=103&amp;DocTypeId=SB&amp;DocNum=2979&amp;GAID=17&amp;LegID=152094&amp;SpecSess=&amp;Session=" target="_blank" rel="noreferrer noopener">SB 2979</a>, amends BIPA by defining “electronic signature” to expressly allow electronic consent and capping statutory damages on a per-person basis instead of a per-scan basis for violations involving the same individual.</p>



<p>If a company violates Section 15(b) or 15(d) of BIPA multiple times in the same way with respect to the same person, it will face a single award of $1,000 or $5,000 in statutory damages, not a separate penalty for each scan or disclosure.</p>



<p>SB 2979 is a direct response to the Illinois Supreme Court’s concerns in <a href="https://www.bloomberglaw.com/public/document/CothronvWhiteCastleSysInc2023IL128004CourtOpinion?doc_id=XIV85CKG000N" target="_blank" rel="noreferrer noopener"><em>Cothron v. White Castle System</em></a><em>. </em>The court decided that each instance of a company collecting biometric data without consent counted as a separate violation and a separate damage award under BIPA.</p>



<p><em>Cothron</em> opened the door to potentially astronomical damages for many companies, especially those using biometric timekeeping that requires employees to scan their fingerprints multiple times each day. The court noted that White Castle’s potential damages could exceed $17 billion.</p>



<p>The court acknowledged that its ruling could create ruinous damages awards for some companies, but it declined to hold that such damages aren’t permitted under BIPA. Instead, it noted that courts have discretion to fashion damage awards that fairly compensate plaintiffs, without destroying a defendant’s business. It requested that “the legislature review these policy concerns and make clear its intent regarding the assessment of damages.”</p>



<p>In limiting statutory damages under BIPA to a per-person basis, SB 2979 reflects that it isn’t the legislature’s intent to authorize a damage award that would result in the financial disruption of a business. According to Rep. Ann Williams (D), the legislature sought to make BIPA awards “a little more reasonable, consistent with the request made by the Court in <em>Cothron</em>.”</p>



<h2 class="wp-block-heading">Legislative Intent</h2>



<p>SB 2979 faced pushback from some lawmakers and industry lobbyists because it doesn’t clearly state whether it applies retroactively. But the legislative history includes clear language suggesting judges should consider retroactive application in pending cases and even when deciding whether to reduce past awards under the statute.</p>



<p>Generally, if the Illinois legislature indicates the temporal reach of an amendment in the bill or through legislative history, courts must honor the intent in the absence of specific exceptions, which aren’t relevant here. SB 2979 doesn’t directly address retroactivity, but there are hints of legislative intent.</p>



<p>During a House floor debate on May 16, Rep. Daniel Didech (D) asked if the amendment applied retroactively to past or pending cases. Williams replied that it didn’t but noted that “a reviewing court could take judicial notice of our amendment to the Act in determining an initial award or in reducing an award.”</p>



<p>Other bill sponsors, including Sen. Bill Cunningham (D), stated that one of the reasons the legislature left out a specific retroactivity provision was to avoid pushback for potentially undoing previous judgments. However, Cunningham advised judges in pending lawsuits to note the legislature was answering the Illinois Supreme Court’s invitation for guidance about how liability should be accrued: on a per-person, not per-violation basis.</p>



<p>The lack of an explicit retroactive provision in SB 2979 may disappoint defendants in ongoing BIPA litigation. However, the bill offers multiple ways for judges to use their discretion to limit BIPA.</p>



<p>While defendants may be unable to argue that SB 2979 technically applies retroactively, they can argue that courts should take judicial notice of the bill and its legislative history, which show the legislature’s intent to limit BIPA damages to a single violation and to prevent ruinous damages for defendants.</p>



<p>Finally, if the statements by Williams and Cunningham blur the legislative intent, courts will decide whether the amendment is about procedures or actual rights. Usually, if a law only changes how<em> </em>things are done (a procedural amendment), it can apply retroactively. An amendment affecting a statute’s remedy provision is generally considered to be a procedural change and thus would support retroactive application.</p>



<p>Reproduced with permission. Published June 18, 2024. Copyright 2024 Bloomberg Industry Group 800-372-1033. For further use please visit <a href="https://www.bloombergindustry.com/copyright-and-usage-guidelines-copyright/" target="_blank" rel="noreferrer noopener">https://www.bloombergindustry.com/copyright-and-usage-guidelines-copyright/</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>CPPA Regulatory Delays and Enforcement Updates: Takeaways from July Board Meeting</title>
		<link>https://www.perkinsonprivacy.com/2024/07/cppa-regulatory-delays-and-enforcement-updates-takeaways-from-july-board-meeting/</link>
		
		<dc:creator><![CDATA[James G. Snell, Peter Hegel, Rohan Andresen and Francys Guevara]]></dc:creator>
		<pubDate>Thu, 25 Jul 2024 15:29:48 +0000</pubDate>
				<category><![CDATA[CCPA]]></category>
		<category><![CDATA[Regulatory Enforcement]]></category>
		<category><![CDATA[State Privacy Laws]]></category>
		<guid isPermaLink="false">https://www.perkinsonprivacy.com/?p=3393</guid>

					<description><![CDATA[<p>On July 16, the California Privacy Protection Agency (CPPA) held a public meeting of its Board (the Board). Four days before the meeting, the CPPA released revised draft rulemaking totaling several hundred pages—including a revised combined draft rulemaking package on risk assessment regulations, cybersecurity audit regulations, and automated decision-making technology (ADMT) regulations. The meeting itself... <a href="https://www.perkinsonprivacy.com/2024/07/cppa-regulatory-delays-and-enforcement-updates-takeaways-from-july-board-meeting/">Continue Reading…</a></p>]]></description>
										<content:encoded><![CDATA[<p>On July 16, the California Privacy Protection Agency (CPPA) held a public meeting of its Board (the Board). Four days before the meeting, the CPPA released revised draft rulemaking totaling several hundred pages—including a revised combined draft <a href="https://cppa.ca.gov/meetings/materials/20240716_item8_draft_text.pdf" target="_blank" rel="noreferrer noopener">rulemaking package on risk assessment regulations, cybersecurity audit regulations, and automated decision-making technology (ADMT) regulations</a>.</p>



<p>The meeting itself focused much more on ADMT and artificial intelligence concepts than previous meetings, but it nonetheless resulted in several important updates related to privacy. Below, we summarize several key takeaways from the July Board meeting that provide insight into future compliance considerations.</p>



<span id="more-3393"></span>



<h2 class="wp-block-heading"><strong>CPPA Defers Action on Revised Draft Regulations</strong></h2>



<p>Building up to the July meeting, there was some preliminary indication that the Board would vote at the meeting on whether to enter the revised draft rulemaking package into final rulemaking procedures. Although the Board engaged in robust back-and-forth discussion of the revised draft regulations, it ultimately concluded that a more complete economic analysis must first be conducted and deferred the vote until a future meeting (likely the Board’s next meeting in September). The Board and CPPA staff tacitly agreed that, at the next meeting, the draft regulations package will likely look similar to its current form but with several proposed alternatives based on the results of the economic analysis.</p>



<p>The Board was particularly divided on issues relating to the proposed ADMT regulations, notably whether the current requirements for conducting a risk assessment are overbroad, especially since, under the current proposed draft, risk assessments are required in all instances in which an entity uses ADMT for a “significant decision concerning a consumer.” The Board similarly debated whether the definition of ADMT is overbroad, expressing concern that the current definition would encapsulate simple technologies that may not otherwise involve high-risk processing of personal information.</p>



<p>Ultimately, the Board gave the CPPA staff a series of topics to research with the expectation that the staff will return with proposed alternatives—as well as a more thorough economic analysis—at the September meeting.</p>



<h2 class="wp-block-heading"><strong>New Enforcement Priorities and More Advisories on the Horizon</strong></h2>



<p>Separate from the discussion surrounding the proposed draft regulatory package, CPPA Deputy Director of Enforcement Michael Macko presented a summary of the past year’s enforcement efforts (compiled in a <a href="https://cppa.ca.gov/meetings/materials/20240716_item6_enforcement_update_and_priorities.pdf" target="_blank" rel="noreferrer noopener">slide deck</a> released as part of the meeting materials). Macko highlighted the Enforcement Division’s infrastructural improvements, with increases in staff and case management capacity that allowed the division to handle over 2,000 complaints in the past year. The most common categories of complaints pertained to (i) consumer deletion rights, (ii) alleged improper collection, use, or storage of personal information, and (iii) consumer rights to opt out of the “sale” and “sharing” of their personal information. Notably, the Enforcement Division grew from only 10% of its attorney capacity at the beginning of the year to over 82% by the end of the fiscal year, indicating that it is well positioned to take on additional enforcement actions in the coming year.</p>



<p>Macko also unveiled a new set of priorities that he stated will inform enforcement efforts for the coming year, specifically focusing on:</p>



<ol>
<li>Businesses that fail to honor opt-out requests unless a consumer submits verification.</li>



<li>Businesses that sell or share personal information without the required notices or opt-out mechanisms.</li>



<li>Businesses that use dark patterns to prevent consumers from exercising their rights.</li>



<li>Businesses that violate the law in ways that affect vulnerable populations or groups.</li>
</ol>



<p>Finally, Macko touched on the CPPA Enforcement Division’s issuance of enforcement advisories (as seen recently in the CPPA’s first enforcement advisory, issued in <a href="https://cppa.ca.gov/pdf/enfadvisory202401.pdf" target="_blank" rel="noreferrer noopener">April, focusing on data minimization</a>), stressing that advisories are intended to deter violations of the law and hinting that another enforcement advisory may be issued soon.</p>



<h2 class="wp-block-heading"><strong>CPPA Prioritizes Seeking GDPR Adequacy Decision</strong></h2>



<p>The July Board meeting also covered the CPPA’s cooperation with other state, federal, and international agencies, with a particular focus on seeking an adequacy decision under the General Data Protection Regulation (GDPR). As next steps, the CPPA agreed that it will invite various European regulators to future meetings to clarify what steps would be necessary to obtain an adequacy decision, and the Board expressed a desire to work closely with the California state government to promote the legislative action needed to support such a decision.</p>



<p class="has-text-align-center">* * * *</p>



<p>If the July Board meeting is any indication, future Board meetings will continue to address privacy, enforcement, and other issues, with a particular focus on artificial intelligence and ADMT concerns. Meanwhile, the draft rulemaking package is not expected to change significantly before the September CPPA Board meeting, and the draft regulations include many provisions that companies may want to comment on. Perkins Coie has been involved in rulemaking since the California Consumer Privacy Act was passed and will continue to assist clients seeking practical changes to the draft regulations.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>“Biometric Identifiers Must Identify”: The Ninth Circuit Clarifies the Scope of BIPA</title>
		<link>https://www.perkinsonprivacy.com/2024/07/biometric-identifiers-must-identify-the-ninth-circuit-clarifies-the-scope-of-bipa/</link>
		
		<dc:creator><![CDATA[Susan Fahringer, Lauren Tsuji, Hayden Schottlaender and Emma Roberts]]></dc:creator>
		<pubDate>Tue, 23 Jul 2024 17:05:44 +0000</pubDate>
				<category><![CDATA[Biometrics]]></category>
		<guid isPermaLink="false">https://www.perkinsonprivacy.com/?p=3391</guid>

					<description><![CDATA[<p>The U.S. Court of Appeals for the Ninth Circuit issued an opinion in Zellmer v. Meta Platforms, Inc., on June 17, 2024, affirming dismissal of a putative class action filed under the Illinois Biometric Information Privacy Act. In what is expected to be an influential opinion, the panel held that the “face signatures” at issue were not... <a href="https://www.perkinsonprivacy.com/2024/07/biometric-identifiers-must-identify-the-ninth-circuit-clarifies-the-scope-of-bipa/">Continue Reading…</a></p>]]></description>
										<content:encoded><![CDATA[<p>The U.S. Court of Appeals for the Ninth Circuit issued an opinion in <em>Zellmer v. Meta Platforms, Inc.,</em> on June 17, 2024, affirming dismissal of a putative class action filed under the Illinois Biometric Information Privacy Act. In what is expected to be an influential opinion, the panel held that the “face signatures” at issue were not covered by the statute because they could not be used to identify a person.</p>



<p><a href="https://www.perkinscoie.com/en/news-insights/2024_0708-biometric-identifiers-ninth-circuit-clarifies-the-scope-of-bipa.html" target="_blank" rel="noreferrer noopener">Read the full Update here.</a></p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>A Midsummer State Privacy Law Update</title>
		<link>https://www.perkinsonprivacy.com/2024/07/a-midsummer-state-privacy-law-update/</link>
		
		<dc:creator><![CDATA[Peter Hegel, Rohan Andresen and Francys Guevara]]></dc:creator>
		<pubDate>Tue, 09 Jul 2024 21:13:45 +0000</pubDate>
				<category><![CDATA[CCPA]]></category>
		<category><![CDATA[Regulatory Enforcement]]></category>
		<category><![CDATA[State Privacy Laws]]></category>
		<guid isPermaLink="false">https://www.perkinsonprivacy.com/?p=3387</guid>

					<description><![CDATA[<p>APRA Cancellation, Rhode Island’s Privacy Act, and CPPA’s International Cooperation In an active summer on the privacy front, we share a few recent updates: Cancellation of APRA House Markup On the morning of June 27, 2024, as congressional staffers and audience members prepared to hear the latest updates on the American Privacy Rights Act (APRA),... <a href="https://www.perkinsonprivacy.com/2024/07/a-midsummer-state-privacy-law-update/">Continue Reading…</a></p>]]></description>
										<content:encoded><![CDATA[<h2 class="wp-block-heading">APRA Cancellation, Rhode Island’s Privacy Act, and CPPA’s International Cooperation</h2>



<p>In an active summer on the privacy front, we share a few recent updates:</p>



<p><strong>Cancellation of APRA House Markup</strong></p>



<p>On the morning of June 27, 2024, as congressional staffers and audience members prepared to hear the latest updates on the American Privacy Rights Act (APRA), the House Committee on Energy and Commerce announced that it was canceling its meeting to mark up and vote on the latest draft of the APRA. The next steps are unclear.</p>



<span id="more-3387"></span>



<p><strong>Rhode Island Data Privacy Law Diverges from the Pack</strong></p>



<p>Marking the latest addition to state comprehensive consumer privacy laws, on June 28, 2024, Rhode Island Governor Daniel McKee allowed the Data Transparency and Privacy Protection Act (the Act) (<a href="https://legiscan.com/RI/text/S2500/id/3009034/Rhode_Island-2024-S2500-Amended.pdf" target="_blank" rel="noreferrer noopener">SB2500</a> &amp; <a href="https://legiscan.com/RI/text/H7787/id/3008296/Rhode_Island-2024-H7787-Amended.pdf" target="_blank" rel="noreferrer noopener">HB7787</a>) to become law without his signature. The Act takes effect on January 1, 2026.</p>



<p><em>To Whom Does the Act Apply?</em></p>



<p>Notably, the Act applies to a broader swath of entities than most other state privacy laws. Similar to other privacy laws, the Act applies to “for-profit entities that conduct business in [Rhode Island] or for-profit entities that produce products or services that are targeted to residents of the state and that during the preceding calendar year did any of the following:</p>



<ol>
<li>Controlled or processed the personal data of not less than thirty-five thousand (35,000) customers, excluding personal data controlled or processed solely for the purpose of completing a payment transaction.</li>



<li>Controlled or processed the personal data of not less than ten thousand (10,000) customers and derived more than twenty percent (20%) of their gross revenue from the sale of personal data.”</li>
</ol>



<p>Critically, though, the Act includes a number of required privacy policy disclosures that would apply to “[a]ny commercial website or internet service provider conducting business in Rhode Island or with customers in Rhode Island or otherwise subject to Rhode Island jurisdiction”—regardless of the size of the entity or the quantity of data processing conducted by the entity.</p>



<p><em>The Act’s Focus on Targeted Advertising</em></p>



<p>The Act requires any commercial website or internet service provider that conducts business in Rhode Island or with customers in Rhode Island to post a notice that identifies “all third parties to whom the controller has sold or <strong><em>may </em></strong>sell customers’ personally identifiable information” (emphasis added). In addition to requiring privacy disclosures to name the entities to which a controller currently sells data, Rhode Island ups the state privacy law ante by requiring controllers to identify all third parties to which the controller may, at some undefined point in the future, sell customers’ personally identifiable information. Interestingly, and in contrast with the rest of the Act, this provision applies to “personally identifiable information,” which, unlike “personal data” and “sensitive data,” is not defined anywhere in the Act. These ambiguities raise questions about how entities can comply with the Act as well as how it will be enforced.</p>



<p>The Act generally aligns with the basic provisions of other state privacy laws; however, it does not include some of the more novel provisions that have become commonplace in recent laws. For instance, the Act does not contain any requirements for entities to adhere to data minimization practices, does not provide any further protections for individuals between the ages of 13 and 17, and does not include a temporary cure period following the effective date. Additionally, unlike some recently proposed legislation, such as <a href="https://ago.vermont.gov/blog/2024/06/14/statement-attorney-general-clark-governors-veto-h121-vermont-data-privacy-act" target="_blank" rel="noreferrer noopener">Vermont’s privacy bill, which was ultimately vetoed by the governor</a>, the Act does not contain a private right of action, and enforcement is left entirely to the discretion of the attorney general. Furthermore, the Act does not require the recognition of universal opt-out mechanisms.</p>



<p>Given the lack of a cure provision and the novel privacy policy disclosure requirements, entities should take this opportunity to review their privacy practices to ensure they meet state privacy law requirements by the effective date.</p>



<p><strong>CPPA Announces Cooperation with CNIL</strong></p>



<p>On Tuesday, June 25, in Paris, the <a href="https://cppa.ca.gov/announcements/2024/20240625.html" target="_blank" rel="noreferrer noopener">California Privacy Protection Agency (CPPA) signed a “declaration of cooperation” with the French Commission Nationale de l’Informatique et des Libertés (CNIL)</a>, allowing both authorities to “collaborate on their efforts to safeguard personal information and advance privacy.” According to the CPPA’s press update, “[t]his declaration establishes a general framework of cooperation to facilitate joint internal research and education related to new technologies and data protection issues, share best practices, and convene periodic meetings.”</p>



<p>This is the CPPA’s latest cross-border collaborative arrangement (the agency previously joined other international initiatives, including the <a href="https://www.cppa.ca.gov/announcements/2023/20230512.html" target="_blank" rel="noreferrer noopener">Asia Pacific Privacy Authorities</a> and the <a href="https://www.cppa.ca.gov/announcements/2022/20221027.html" target="_blank" rel="noreferrer noopener">Global Privacy Assembly</a>), indicating the CPPA’s intention to coordinate with privacy regulators around the globe. It follows on the heels of recent remarks from CPPA Board members that the CPPA <a href="https://cppa.ca.gov/meetings/materials/20240510_transcript.pdf" target="_blank" rel="noreferrer noopener">wishes to obtain an adequacy decision under the General Data Protection Regulation (GDPR)</a>. These developments suggest the CPPA’s desire to become a player on the global stage.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>FCC Proposes New Internet Routing Security Rules for Telecoms</title>
		<link>https://www.perkinsonprivacy.com/2024/06/fcc-proposes-new-internet-routing-security-rules-for-telecoms/</link>
		
		<dc:creator><![CDATA[Marc S. Martin, Brandon Thompson and Joshua Perez]]></dc:creator>
		<pubDate>Fri, 28 Jun 2024 16:15:27 +0000</pubDate>
				<category><![CDATA[Cybersecurity]]></category>
		<guid isPermaLink="false">https://www.perkinsonprivacy.com/?p=3384</guid>

					<description><![CDATA[<p>Building on its renewed jurisdictional authority over broadband internet access service providers following the reinstatement of net neutrality, the Federal Communications Commission has adopted proposed internet routing security rules in a notice of proposed rulemaking designed to prevent foreign manipulation of internet traffic. Read the full Update here. <a href="https://www.perkinsonprivacy.com/2024/06/fcc-proposes-new-internet-routing-security-rules-for-telecoms/">Continue Reading…</a></p>]]></description>
										<content:encoded><![CDATA[<p>Building on its renewed jurisdictional authority over broadband internet access service providers following the reinstatement of net neutrality, the Federal Communications Commission has adopted proposed internet routing security rules in a notice of proposed rulemaking designed to prevent foreign manipulation of internet traffic.</p>



<p><a href="https://www.perkinscoie.com/en/news-insights/fcc-proposes-new-internet-routing-security-rules-for-telecoms.html" target="_blank" rel="noreferrer noopener">Read the full Update here</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Texas AG Turns Up the Heat on Privacy and Data Security</title>
		<link>https://www.perkinsonprivacy.com/2024/06/texas-ag-turns-up-the-heat-on-privacy-and-data-security/</link>
		
		<dc:creator><![CDATA[Hayden Schottlaender, Samantha Ettari, Michael Nguyen, Elijah Roden and Perkins Coie]]></dc:creator>
		<pubDate>Fri, 28 Jun 2024 14:23:29 +0000</pubDate>
				<category><![CDATA[Biometrics]]></category>
		<category><![CDATA[Privacy Compliance]]></category>
		<guid isPermaLink="false">https://www.perkinsonprivacy.com/?p=3382</guid>

					<description><![CDATA[<p>The Texas Data Protection and Security Act goes into effect on Monday, July 1, 2024. Eliminating any speculation that this omnibus consumer privacy law might sit on the cupboard shelf, unenforced, the Texas attorney general announced that his office has formed a task force to enforce the TDPSA, along with Texas’ several other data privacy... <a href="https://www.perkinsonprivacy.com/2024/06/texas-ag-turns-up-the-heat-on-privacy-and-data-security/">Continue Reading…</a></p>]]></description>
										<content:encoded><![CDATA[<p>The Texas Data Protection and Security Act goes into effect on Monday, July 1, 2024. Eliminating any speculation that this omnibus consumer privacy law might sit on the cupboard shelf, unenforced, the Texas attorney general announced that his office has formed a task force to enforce the TDPSA, along with Texas’ several other data privacy laws. This announcement was consistent with the Texas AG office’s recent enforcement of Texas’ biometrics law and newly enacted Data Broker Law. Data privacy enforcement in Texas is just beginning to heat up.</p>



<p><a href="https://www.perkinscoie.com/en/news-insights/2024_0625-texas-data-privacy-law-enforcement.html" target="_blank" rel="noreferrer noopener">Read the full Update here</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>California Attorney General Announces Children’s Privacy Settlement with Mobile Game</title>
		<link>https://www.perkinsonprivacy.com/2024/06/california-attorney-general-announces-childrens-privacy-settlement-with-mobile-game/</link>
		
		<dc:creator><![CDATA[Janis Kestenbaum]]></dc:creator>
		<pubDate>Thu, 27 Jun 2024 14:48:00 +0000</pubDate>
				<category><![CDATA[CCPA]]></category>
		<category><![CDATA[Privacy Compliance]]></category>
		<guid isPermaLink="false">https://www.perkinsonprivacy.com/?p=3374</guid>

					<description><![CDATA[<p>On June 18, 2024, the California Attorney General announced a settlement with Tilting Point Media LLC, the developer and publisher of the mobile game “SpongeBob: Krusty Cook-Off” (SpongeBob app), resolving allegations of unauthorized disclosure of children’s personal information under the federal Children’s Online Privacy Protection Act (COPPA) and the California Consumer Privacy Act (CCPA), as... <a href="https://www.perkinsonprivacy.com/2024/06/california-attorney-general-announces-childrens-privacy-settlement-with-mobile-game/">Continue Reading…</a></p>]]></description>
										<content:encoded><![CDATA[<p>On June 18, 2024, the California Attorney General <a href="https://oag.ca.gov/news/press-releases/attorney-general-bonta-la-city-attorney-feldstein-soto-announce-500000" target="_blank" rel="noreferrer noopener">announced</a> a settlement with Tilting Point Media LLC, the developer and publisher of the mobile game “SpongeBob: Krusty Cook-Off” (SpongeBob app), resolving allegations of unauthorized disclosure of children’s personal information under the federal Children’s Online Privacy Protection Act (COPPA) and the California Consumer Privacy Act (CCPA), as well as claims of unlawful advertising tactics under the California Unfair Competition Law (UCL). The settlement includes a $500,000 civil penalty and injunctive relief. The Los Angeles City Attorney, who has concurrent authority with the California Attorney General to enforce the UCL, joined the complaint and settlement.</p>



<span id="more-3374"></span>



<p><strong>Complaint</strong></p>



<p>According to the <a href="https://oag.ca.gov/system/files/attachments/press-docs/Complaint%20People%20v%20Tilting%20Point%20Media%20LLC%20%28filed%29.pdf" target="_blank" rel="noreferrer noopener">complaint</a>, the SpongeBob app was directed to children under COPPA and the company had actual knowledge that children were using its app even though Tilting Point’s terms of service and privacy policy stated that consumers under age 13 were not authorized to use the company’s services.</p>



<p><em>COPPA and CCPA. </em>As to both COPPA and the CCPA, the complaint alleges that the age screen was not neutral: it defaulted to a 1953 birth year, likely encouraging children to indicate that they were older than they actually were and steering them to the adult version of the game, which featured broader data collection designed for older users. In addition:</p>



<ul>
<li>Tilting Point incorrectly configured third-party software development kits (SDKs) embedded in the SpongeBob app to collect and disclose personal information for targeted advertising without the necessary consent, regardless of what age consumers indicated. Under COPPA, parental consent is required to collect personal information online from children under age 13. Under the CCPA, businesses may not sell or share the personal information of children under age 13 without affirmative authorization from the parent or, in the case of children age 13 through 15, affirmative authorization from the consumer. The complaint alleges that Tilting Point failed to obtain (1) the requisite parental consent for children under age 13 under COPPA and the CCPA and (2) the consumer’s opt-in consent for the sale or sharing of personal information for consumers age 13 through 15 under the CCPA.</li>



<li>Tilting Point failed to provide the required disclosures to parents and consumers of its practices with respect to targeted advertising and children. More specifically, with respect to COPPA, the complaint alleges that Tilting Point failed to describe its data processing practices in a direct notice to parents or on its website privacy policy. With respect to the CCPA, the complaint alleges that Tilting Point’s privacy policy insufficiently disclosed the collection, sale, or sharing of children’s personal data or the use and purpose of SDKs sufficient to allow consumers or parents to understand and exercise their CCPA rights.  </li>
</ul>



<p><em>UCL.</em> According to the complaint, Tilting Point engaged in a range of deceptive and unfair advertising tactics in the SpongeBob app:</p>



<ul>
<li>Displaying ads that were not clearly labeled as ads, including full-screen videos that lacked clear exit methods;</li>



<li>Displaying ads that could not be stopped or dismissed until the player engaged with the ad or downloaded unnecessary apps;</li>



<li>Using unfair, deceptive, or other manipulative tactics to encourage excessive ad viewing by children and teens, as well as causing children and teens to inadvertently engage with ads or download additional apps; and</li>



<li>Displaying ads that were age-inappropriate (e.g., for a gambling app and a game about growing marijuana). </li>
</ul>



<p><strong>Order</strong></p>



<p>The proposed <a href="https://oag.ca.gov/system/files/attachments/press-docs/Proposed%20Final%20Judgment%20and%20Permanent%20Injunction%20People%20v%20Tilting%20Point%20Media%20%28filed%29.pdf" target="_blank" rel="noreferrer noopener">stipulated court order</a> requires payment of a $500,000 civil penalty and subjects Tilting Point to a permanent injunction that includes the following key provisions:</p>



<ul>
<li><strong>COPPA. </strong>Comply with relevant provisions of COPPA related to the disclosure of children’s data in the SpongeBob app and all of its games directed to children (e.g., providing a direct notice to parents and obtaining verifiable parental consent).</li>



<li><strong>CCPA. </strong>Comply with relevant provisions of the CCPA related to the sale or sharing of personal information of children under 16 (e.g., refrain from selling or sharing the personal information of consumers less than 13 years old without parental consent and refrain from selling or sharing the personal information of consumers at least 13 and less than 16 years old without the consumer’s affirmative opt-in consent, where Tilting Point has actual knowledge or willfully disregards the consumer’s age).</li>



<li><strong>Neutral Age Screens. </strong>Use only neutral age screens that encourage children to enter their age accurately.</li>



<li><strong>SDK Compliance.</strong> Implement and maintain an SDK governance program to review the use of SDKs that collect personal information in apps directed to children (including mixed-audience apps), including evaluating the SDKs’ configuration settings, controls, and contracts and confirming and documenting measures to ensure that the sale or sharing of personal information complies with the order. Tilting Point must annually assess compliance.</li>



<li><strong>Data Minimization.</strong> Collect only the personal information reasonably necessary for a child to participate in any online service directed to children or for consumers age 13 through 15 to participate in any activity or game.</li>



<li><strong>Advertising. </strong>Ensure that any ad displayed on any website or online service directed to children under age 13 complies with the following: (1) identifies the ad as an ad and not part of gameplay; (2) includes a prominent “X” or “Close” button to promptly close the ad in one click; (3) does not manipulate or deceive consumers into engaging with the ad or downloading or installing unnecessary apps, making unnecessary purchases, or providing unnecessary personal information; and (4) does not promote activities in which children cannot legally engage or products they cannot legally possess.</li>
</ul>



<p><strong>Takeaways</strong></p>



<p>This case illustrates that state regulators are scrutinizing children’s privacy practices under COPPA and a growing body of state laws. The case also highlights that:</p>



<ul>
<li>In determining whether children’s privacy laws apply, regulators give little, if any, weight to statements in privacy policies or terms of service that children are not authorized to use a service.</li>



<li>Age screens should be designed to operate neutrally (e.g., allowing individuals to freely enter month and year of birth).</li>



<li>Regulators expect businesses to confirm that third-party SDKs integrated into apps function as intended.</li>



<li>Regulators are increasingly focused on whether businesses are properly identifying promotional content to children as advertising, as reflected in the <a href="https://www.ftc.gov/system/files/ftc_gov/pdf/p214505kidsadvertisingstaffperspective092023.pdf" target="_blank" rel="noreferrer noopener">FTC’s staff report</a> on so-called “stealth advertising” to kids, in addition to their broader focus on online child safety issues.</li>
</ul>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>A New Privacy Paradigm: Understanding Maryland’s Trailblazing Approach to Online Privacy</title>
		<link>https://www.perkinsonprivacy.com/2024/06/a-new-privacy-paradigm-understanding-marylands-trailblazing-approach-to-online-privacy/</link>
		
		<dc:creator><![CDATA[James G. Snell, Peter Hegel, Rohan Andresen and Francys Guevara]]></dc:creator>
		<pubDate>Thu, 20 Jun 2024 19:38:55 +0000</pubDate>
				<category><![CDATA[CCPA]]></category>
		<category><![CDATA[Regulatory Enforcement]]></category>
		<category><![CDATA[State Privacy Laws]]></category>
		<guid isPermaLink="false">https://www.perkinsonprivacy.com/?p=3360</guid>

					<description><![CDATA[<p>The end of Maryland’s legislative session has ushered in one of the year’s most ambitious and comprehensive consumer privacy laws. Maryland Governor Wes Moore officially signed into law the Maryland Online Data Privacy Act (MODPA) on May 9, 2024. Set to take effect on October 1, 2025, this law not only expands the online protections... <a href="https://www.perkinsonprivacy.com/2024/06/a-new-privacy-paradigm-understanding-marylands-trailblazing-approach-to-online-privacy/">Continue Reading…</a></p>]]></description>
										<content:encoded><![CDATA[<p>The end of Maryland’s legislative session has ushered in one of the year’s most ambitious and comprehensive consumer privacy laws. Maryland Governor Wes Moore officially signed into law the <a href="https://mgaleg.maryland.gov/2024RS/bills/sb/sb0541E.pdf" target="_blank" rel="noreferrer noopener">Maryland Online Data Privacy Act (MODPA)</a> on May 9, 2024. Set to take effect on October 1, 2025, this law not only expands the online protections consumers have come to expect from state privacy laws, but it also introduces additional measures designed to protect consumer data, including, among other things:</p>



<ul>
<li>Increased protections for processing sensitive data.</li>



<li>Protections for consumer health data.</li>



<li>New standards for processing biometric data.</li>



<li>Increased protections for treatment of youth data.</li>



<li>New limitations for loyalty programs.</li>



<li>Heightened data minimization standards.</li>
</ul>



<span id="more-3360"></span>



<p>Historically, states have modeled their privacy frameworks on either the <a href="https://leginfo.legislature.ca.gov/faces/codes_displayText.xhtml?division=3.&amp;part=4.&amp;lawCode=CIV&amp;title=1.81.5" target="_blank" rel="noreferrer noopener">California Consumer Privacy Act</a> (CCPA) or the <a href="https://law.lis.virginia.gov/vacodefull/title59.1/chapter53/" target="_blank" rel="noreferrer noopener">Virginia Consumer Data Protection Act</a> (VCDPA). The MODPA sets a new benchmark for data protection, with additional restrictions that businesses will need to navigate. Below, we discuss some noteworthy provisions that distinguish MODPA from other state privacy laws.</p>



<h2 class="wp-block-heading">Lower Threshold for Defining Controllers Under the Law</h2>



<p>MODPA applies to persons who control or process the personal data of at least 35,000 Maryland residents (consumers), or to persons who control or process the personal data of at least 10,000 consumers and derive more than 20% of their gross revenue from the sale of personal data. Compared to other states with similar populations, MODPA has a lower applicability threshold, meaning that smaller businesses doing business in Maryland may have to navigate the robust data protection framework there.</p>



<h2 class="wp-block-heading">Sale and Collection of Sensitive Data</h2>



<p>MODPA broadly prohibits the sale of sensitive data and the collection, processing, or sharing of sensitive data unless it is “strictly necessary” to provide or maintain a specific product or service requested by the consumer. The exceptions to these prohibitions, however, may still allow controllers to sell personal data in cases where the consumer affirmatively requests a service, directs the controller to disclose the personal data, or intentionally uses the controller to interact with a third party.</p>



<h2 class="wp-block-heading">Consumer Health Data Protections</h2>



<p>Going beyond other state laws, MODPA includes robust protections for consumer health data, akin to the health data definitions found in Washington’s My Health My Data (MHMD) law and Nevada’s consumer health data law. Under MODPA, “consumer health data” includes any personal data used to identify a consumer’s physical or mental health status; the definition explicitly includes gender-affirming treatment and reproductive or sexual care, and all consumer health data is considered sensitive data. MODPA also imposes a smaller geofencing radius than other laws: it prohibits the use of geofencing technology to create a virtual boundary within 1,750 feet (as opposed to 2,000 feet under Washington’s MHMD) of any mental health, reproductive, or sexual health facility for the purpose of tracking or collecting data from consumers or sending them health-related notifications.</p>



<h2 class="wp-block-heading">Biometric Data</h2>



<p>Additionally, MODPA’s definition of biometric data is broader than that found in other state laws. Where most other state laws define biometric data as data generated by automatic measurements of a consumer’s biological characteristics used to uniquely <em>authenticate</em> that consumer, MODPA broadens the definition to mean data that <strong><em>can be </em></strong>used to uniquely identify a consumer. This broader definition creates uncertainty about which data will be treated as biometric data under MODPA and how the provision will be enforced.</p>



<h2 class="wp-block-heading">Increased Protection for Children</h2>



<p>Maryland is now one of two states to offer enhanced protections for all minors over 13 years old. Most notably, MODPA prohibits controllers from selling or processing personal data for the purposes of targeted advertising to consumers the controller knows or should know are under 18 years old. This markedly expands the rights of minors in two ways: first, it incorporates a constructive-knowledge “should know” standard, which may increase controllers’ obligations to manage knowledge of their consumers’ ages; and second, it moves away from an opt-in standard toward a complete bar on the sale or processing of personal data for the purposes of targeted advertising to minors. While other states have opt-in provisions that allow controllers to process the data of minors who give their consent, this section of MODPA contains no consent provision. Although MODPA allows controllers to sell personal data in cases where the consumer affirmatively requests a service, the extent to which that exception applies here is unclear.</p>



<h2 class="wp-block-heading">Limitations on Bona Fide Loyalty Programs</h2>



<p>Additionally, MODPA restricts controllers from offering bona fide loyalty programs where the sale of personal data is a “condition of participation in the program.” While “condition of participation” is undefined under MODPA, this new language places potential limitations on bona fide loyalty programs not found in any other state’s privacy law.</p>



<h2 class="wp-block-heading">Data Minimization</h2>



<p>Lastly, MODPA requires controllers to comply with data minimization standards by precluding controllers from processing data that is not reasonably necessary or compatible with disclosed purposes, unless the consumer consents. And, when it comes to processing sensitive data, MODPA employs a higher standard, restricting controllers from collecting, processing, or sharing sensitive data concerning a consumer unless it is <strong><em>strictly</em></strong> necessary to provide or maintain a specific product or service requested by the consumer. “Strictly necessary” is not defined and, thus, it is unclear how this provision will be interpreted for enforcement.</p>



<h2 class="wp-block-heading">Enforcement</h2>



<p>With its new approach to biometric data, children’s privacy, and data minimization, MODPA is poised to significantly impact both consumers and businesses. The bill does not grant consumers a private right of action, but consumers are not prevented from pursuing any other remedy provided by law. Additionally, MODPA gives the Maryland Attorney General discretion to provide a 60-day cure period, which sunsets on April 1, 2027. While the statutory text does not explicitly grant the attorney general rulemaking authority, Maryland Code § 13-205 allows the Division of Consumer Protection to engage in permissive rulemaking with respect to unfair or deceptive trade practices.</p>



<p class="has-text-align-center">* * * * *</p>



<p>MODPA contains many provisions that go beyond what other state laws currently require of companies. As a result, businesses should take this opportunity to review their privacy practices in light of this new state law.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
