
NYLJ “Regulating Commercial Speech in ‘Pixel Space’: Court Examines the New York Algorithmic Pricing Disclosure Act”


By Steve Kramarsky


How much is the Uber going to cost? There is only one way to answer that question: open the app and check. Is it rush hour, a holiday, or a rainy Tuesday evening? Is there something undefinable that makes Uber think you really need a ride? The cost for the same trip may be different for you and the person standing right next to you on the corner, and sometimes just opening a competing app can make the price go down. The only way to know is to check. Rideshare services are the most familiar example of a technology called algorithmic pricing, in which the service provider uses available data about the individual customer to set the offered price, but that technology is rapidly spreading.

Most consumers are familiar with dynamic pricing models, like those used for airfare or hotel reservations, that reward strategic timing and comparison shopping. In fact, many digital commerce platforms, Amazon among them, keep their prices in constant flux, adjusting offers based on demand and consumer behavior.


But algorithmic pricing is different: it uses the customer’s personal data to set an individual price applicable only to that specific shopper. Different shoppers are charged different prices based on data collected from a variety of sources, including their past purchases, their browsing history, their zip code, or even the kind of phone they are using. AI pricing models can use that information to generate the price at which a particular individual is most likely to buy.


Proponents of the technology assert that algorithmic pricing models can result in more efficient markets and lower prices for some consumers. Opponents find the surveillance aspects troubling and worry that the technology can be used in discriminatory or consumer-hostile ways. In addition, AI models trained to respond in real time to competitors’ pricing can result in reduced price competition, creating antitrust concerns in concentrated markets. For these and other reasons, several states have moved to regulate the use of algorithmic pricing technology, with predictable pushback from retailers.


New York is the first state to pass a law requiring the disclosure of algorithmic pricing, the Algorithmic Pricing Disclosure Act, N.Y. Gen. Bus. Law § 349-a. The law was set to go into effect in July 2025, but was stayed pending the resolution of a challenge brought in the Southern District of New York by the National Retail Federation, an industry trade group. Recently, the court dismissed that challenge, and New Yorkers should expect to see the required disclosure popping up soon. Judge Rakoff’s opinion in that case, National Retail Fed’n v. James, No. 25-cv-5500 (JSR), 2025 WL 2848212 (S.D.N.Y. Oct. 8, 2025), offers a thorough analysis of the issues surrounding regulation of this kind of technology, and it is worth a closer look as the battle is likely to continue in New York and across the country.


The Challenge to the New York Law

As originally drafted, New York’s new algorithmic pricing law required the disclosure of algorithmic pricing technology and prohibited the use of protected-class data (personal information about race, gender, or other protected traits) to provide differential pricing. During the legislative process, the anti-discrimination provisions were removed, and the law as enacted contains only the disclosure requirement, though discriminatory pricing practices could still be actionable under the New York Human Rights Law (among other statutes). The law requires, in relevant part, that any entity doing business in New York that “sets the price of a specific good or service using personalized algorithmic pricing” include with its pricing “a clear and conspicuous disclosure that states: ‘THIS PRICE WAS SET BY AN ALGORITHM USING YOUR PERSONAL DATA.’” N.Y. Gen. Bus. Law § 349-a(2).


While this disclosure may not seem especially onerous, some retailers object to the law on the grounds that the required statement might give the impression that algorithmic pricing is harmful to consumers. On July 2, 2025, the National Retail Federation filed for an injunction to block the law on First Amendment grounds, naming New York Attorney General Letitia James as defendant. NRF is a retail trade association whose members use price-setting technologies to publish prices to consumers in New York and who are therefore subject to the disclosure requirement. National Retail Fed’n, 2025 WL 2848212, at *2.


NRF argued that the law compels “a broad range of retailers,” including its members, to express a misleading and controverted government-scripted opinion without justification, thereby violating its members’ First Amendment rights. The court denied the injunction and granted defendant’s motion to dismiss after an extensive analysis focused on the appropriate regulation of commercial speech in the e-commerce context.

The First Amendment Framework

The first question for a court considering a First Amendment challenge to a regulation of commercial speech is what level of scrutiny applies. Commercial speech is entitled to First Amendment protection, but that protection is less stringent than in the context of artistic expression or core political speech. In short, commercial speech regulations can be entitled to “intermediate scrutiny” or the even less stringent “reasonable relation” test, depending on what the regulation requires.

A law that prohibits or restricts commercial speech must survive so-called “intermediate” scrutiny in order to pass constitutional muster. Under this test, set out by the Supreme Court in Central Hudson Gas & Elec. Corp. v. Pub. Serv. Comm’n of New York, 447 U.S. 557 (1980), the regulation must directly advance a substantial governmental interest and must be no more extensive than necessary to serve that interest.


By contrast, a law that requires the disclosure of “purely factual and uncontroversial information” about the product offered, rather than restricting or prohibiting speech, is governed by a more permissive standard set out in Zauderer v. Off. of Disciplinary Counsel, 471 U.S. 626 (1985). Under the Zauderer standard, a commercial disclosure law passes muster as long as it is “reasonably related” to the state’s interest in preventing deception of consumers and is not unduly burdensome. The more relaxed standard under Zauderer comes from the fact that the First Amendment protection of commercial speech is mostly justified by the value of that speech to consumers, so requiring more disclosures, without prohibiting any particular speech, will generally be seen as constitutionally permissible in the commercial context.


What Standard Applies?

In the NRF case, the court’s first job was to determine which standard should apply to the New York Algorithmic Pricing Disclosure law. Defendant argued that the Zauderer “reasonable relation” standard should apply, because the law mandates only the disclosure of “purely factual and uncontroversial” commercial speech. National Retail Fed’n, 2025 WL 2848212, at *3. Plaintiff sought review under the stricter Central Hudson test, arguing that the regulation is based on the content and speaker at issue, and that the compelled disclosure is neither purely factual nor uncontroversial. The court rejected plaintiff’s arguments and applied the Zauderer standard.


First, the court noted that the “content and speaker” test is applicable only to restrictions or prohibitions on commercial speech, not to compelled disclosures. Restrictions on speech can unfairly limit market participation and inhibit the free flow of ideas if applicable only to certain speakers or subjects, so they should be examined carefully. Regulations that require additional disclosure do not create any such concerns because they promote the free flow of information rather than restricting it.


Second, the court held that the statement required by the law (“THIS PRICE WAS SET BY AN ALGORITHM USING YOUR PERSONAL DATA”) is factual and accurate. Plaintiff conceded that the regulated technology uses personal data gathered by the seller to set prices and the disclosure only appears when the technology is in use. However, plaintiff pointed to Ninth Circuit precedent holding that even “literally true” disclosures can be subject to heightened scrutiny if they are misleading and thus not “pure” statements of fact.


The court rejected this argument, noting that no similar Second Circuit precedent exists and that plaintiff had failed to show the required disclosure was misleading.

The case relied on by plaintiff involved a required disclosure that a certain chemical was “known to cause cancer.” The Ninth Circuit held that statement was not “purely factual” because the statutory definition of the word “known,” in terms of scientific consensus, was different from what the ordinary person might understand the word to mean. Here plaintiff identified no similar ambiguity. It simply argued that the fact of the disclosure, in the context of a consumer protection law, would give consumers the misleading and unsubstantiated impression that price-setting algorithms are dangerous and involve non-consensual invasive surveillance. The court rejected these arguments as speculative and irrelevant to the Zauderer analysis.


In addition, plaintiff argued that a more stringent First Amendment analysis was warranted because the regulation compelled it to “take sides” in a public debate on a controversial issue. The court rejected this argument as well, noting that a disclosure is not rendered controversial merely because the regulated entity would prefer not to make it. The court noted Second Circuit precedent applying Zauderer to laws requiring disclosure in controversial contexts, such as social media moderation policies and reproductive health.


It held that the mere fact that a disclosure touches on a controversial subject does not require heightened scrutiny. In addition, the court noted that plaintiff had not shown the disclosure at issue to be controversial. Even assuming there is robust public debate about the benefits of algorithmic pricing technology, the required statement does not take sides in that debate; it merely discloses that the technology is in use.


Pixel Space: Applying Zauderer to the Algorithmic Pricing Law

Having examined and rejected these and related arguments by the plaintiff, the court held that Zauderer controlled its analysis of the disclosure requirement. Under that relaxed standard, the law passes First Amendment muster if it is “reasonably related” to a legitimate government interest and is not “unjustified or unduly burdensome.”


Here, the court held that the disclosure requirement is reasonably related to the state’s legitimate interest in ensuring that consumers are informed about the ways prices are set, and about how their personal data is being used. Relying on recent Second Circuit precedent, the court rejected plaintiff’s argument that a disclosure must be corrective or address an identified, existing harm to pass the Zauderer test. Rather, the court held that the government has an interest in ensuring that consumers can make informed buying decisions, and that the algorithmic pricing disclosure is reasonably related to that interest.


On the issue of undue burden, plaintiff raised the novel argument of “limited ‘pixel space.’” While the law mandates certain disclosure language, it does not prohibit any additional speech. Sellers can include whatever material they consider appropriate or necessary, as long as the disclosure appears at or near the indication of the price. Plaintiff argued that in practice the disclosure requirement nonetheless tends to displace its members’ own speech because there is limited pixel space on any given online product page and the Act requires plaintiff’s members to sacrifice a meaningful portion of that limited space to the mandatory disclosure.

The court rejected this argument. While some courts have held that a disclosure requirement can be unduly burdensome if it has the effect of entirely drowning out the regulated party’s own speech, that is not the case here. The court noted that the disclosure is one sentence long and can be in any font or format so long as it is easily visible and appears near the price. It also noted that, unlike physical packaging, where a disclosure may consume much of the available surface, an online page is not inherently so constrained. Under Zauderer, a regulation is not unduly burdensome merely because the seller would prefer to use the space for another purpose, especially when the space can be a scrolling field or pop-up window.


Having found that plaintiff could not sustain its First Amendment challenge to the disclosure requirements under the law, the court denied injunctive relief and dismissed the case.


The First Shot in the War

New York is the first state to enact a law seeking to regulate this technology in any way, and the regulation is as light as it can possibly be: disclose the information and let consumers choose. Yet the industry pushed back hard against the law, filing suit through a major trade organization with amicus briefs from the U.S. Chamber of Commerce and others. These briefs envision an array of disasters arising from the law, ranging from customer confusion to higher prices to the collapse of the entire e-commerce ecosystem, all from a simple disclosure. The court was unimpressed by these arguments, but it is worth asking why the industry was so alarmed by a simple disclosure requirement.


One reason may be that any regulation makes it easier for the next regulation to pass. Several states are considering regulations on algorithmic pricing technology, including not only disclosure requirements but also curbs on the kind of data that can be used and the kind of price differentiation that can be offered. A bill in California that would have banned “surveillance pricing” failed to advance this year, but the state passed a measure to counter the use of algorithmic pricing technologies for anticompetitive price collusion. In this environment, it is not surprising that the industry is wary.


More fundamentally, the addition of a disclosure requirement is undesirable for sellers because it represents “friction,” an industry term for anything that stands between the consumer and the easy completion of an electronic transaction. Industry studies demonstrate that any additional pop-up, button press, or other distraction creates a measurable loss of sales, even for products that consumers desire, at least in the moment, and have placed in their carts. The state’s interest in informed choice is somewhat at odds with the frictionless e-commerce experience, and in the coming years that balance will likely play out in the legislatures and the courts.

This article first appeared in the New York Law Journal on December 15, 2025.


© 2025 Dewey Pegno & Kramarsky LLP

In some jurisdictions, this may be considered attorney advertising.
