In its June 27, 2025, decision in Free Speech Coalition, Inc. v. Paxton, 606 U.S. ___ (2025), the United States Supreme Court upheld Texas House Bill 1181, a statute requiring age verification via government-issued identification or comparable transactional data before users may access online materials considered harmful to minors. Writing for the majority, Justice Thomas described the law as a constitutionally permissible extension of a state’s power to protect children from obscene material, reasoning that it imposed only “incidental” burdens on the rights of adults to access protected speech. The Court held that this burden did not warrant strict scrutiny and that the statute survived the intermediate scrutiny applicable to laws incidentally affecting First Amendment rights.
What the Court overlooked, however—almost entirely—is the dangerous confusion at the heart of the law between identity and authorization. In modern cybersecurity and privacy architecture, these are separate concepts. Identity refers to who a person is; authorization refers to what that person is allowed to do. A website may need to know that a visitor is over 18 before granting access, but it does not need to know the visitor’s name, address, driver’s license number, or the precise contours of their government-issued biometric profile to make that determination. By conflating these two functions, the Court has endorsed a regime that not only over-collects deeply personal information but also places that information in environments that are both insecure and unregulated.
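The distinction is easy to see in code. The sketch below, written in Python with hypothetical type and function names that are illustrative rather than drawn from any real verification product, contrasts an access check that consumes a full identity record with one that consumes only the single attribute the law actually cares about.

```python
# Illustrative sketch: gating access on identity versus gating on an attribute.
# All names and fields here are hypothetical.

from dataclasses import dataclass
from datetime import date

@dataclass
class IdentityRecord:
    # What identity-based verification tends to collect and retain.
    full_name: str
    date_of_birth: date
    license_number: str

@dataclass
class AgeAttribute:
    # What the authorization decision actually needs.
    over_18: bool

def years_since(born: date, today: date) -> int:
    # Whole years elapsed, accounting for whether the birthday has passed.
    return today.year - born.year - ((today.month, today.day) < (born.month, born.day))

def grant_access_identity_based(record: IdentityRecord) -> bool:
    # The site learns, and must now protect, everything in `record`
    # just to answer a single yes/no question.
    return years_since(record.date_of_birth, date.today()) >= 18

def grant_access_attribute_based(attr: AgeAttribute) -> bool:
    # The site learns only the one fact it is entitled to act on.
    return attr.over_18

print(grant_access_attribute_based(AgeAttribute(over_18=True)))  # True, nothing else disclosed
```

Only the first design forces the site to hold, secure and eventually account for a trove of identifying data; the second answers the same question with a single bit.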
The result is a system that compels users to surrender far more than proof of age. It creates digital dossiers linking users’ sexual preferences and website behavior to their real-world identities. And it does so without offering the basic privacy safeguards that would typically accompany data collection in contexts like banking, healthcare, or even voting.
The Court seems to imagine that H.B. 1181 operates much like an ID check at the entrance to a strip club or liquor store—a brief visual inspection, forgotten as soon as the transaction ends. But this analogy collapses in the context of digital surveillance and persistent data storage. Unlike a bouncer at the door, a website must record each verification, retain that record and be prepared to prove, on pain of ruinous fines, that it performed age verification correctly. As a result, websites are forced to retain identity verification logs, transactional records and often scans or photographs of users’ government IDs. In short, they become data custodians of information far more sensitive than the content they host.
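What such retention plausibly looks like is sketched below. The field names and values are hypothetical, not taken from the statute or from any particular vendor, but they reflect the categories of data a site would need in order to prove compliance after the fact.

```python
# A hypothetical compliance record, sketched to show why "be prepared to
# prove it" turns a website into a custodian of sensitive identity data.
# Every field name and value here is illustrative.

import json
from datetime import datetime, timezone

verification_log_entry = {
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "user_ip": "203.0.113.7",                  # example address (RFC 5737 range)
    "id_document_type": "TX_DRIVER_LICENSE",
    "id_document_number": "12345678",          # hypothetical
    "full_name": "Jane Doe",
    "date_of_birth": "1990-01-01",
    "id_scan_object": "s3://verifications/2025/06/abc123.jpg",  # retained ID image
    "verification_result": "over_18",
    "content_session_id": "f81d4fae-7dec-11d0-a765-00a0c91e6bf6",
}

# Persisted indefinitely, entries like this link a real-world identity to
# specific sessions on an adult site: precisely the dossier described above.
print(json.dumps(verification_log_entry, indent=2))
```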
Even when third-party services perform the verification, users’ data remains in play. Most modern age verification solutions begin with scanning a driver’s license or passport. A verification token may then be issued, which in theory could be used anonymously. In practice, however, such tokens are rarely secure from tracking or re-identification. A token might include a hash of the user’s name, or be associated with device fingerprints, IP addresses, or cookies—metadata that allows advertisers, law enforcement, or hackers to link the token to a real-world identity. Even where the token does not explicitly identify the user, it may become functionally identifying when aggregated with browser type, time zone, screen resolution and the distinctive fingerprints of modern web sessions.
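A minimal sketch of that aggregation risk, using invented values, follows: two visits to different sites, one carrying only an "anonymous" age token and one carrying a logged-in identity, become trivially joinable once both record the same ordinary browser metadata.

```python
# Illustrative sketch of re-identification through metadata aggregation.
# All tokens, addresses and accounts here are made up.

import hashlib

def session_fingerprint(user_agent: str, timezone_name: str,
                        screen: str, ip_prefix: str) -> str:
    # Hashing does not anonymize: the same browser produces the same digest
    # on every visit, so the digest acts as a persistent identifier.
    raw = "|".join([user_agent, timezone_name, screen, ip_prefix])
    return hashlib.sha256(raw.encode()).hexdigest()[:16]

visit_with_token = {
    "age_token": "tok_9f3a",   # "anonymous" verification token
    "fingerprint": session_fingerprint(
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64)", "America/Chicago",
        "1920x1080", "203.0.113."),
}

visit_on_identified_site = {
    "account_email": "jane@example.com",   # e.g. a shopping or news login
    "fingerprint": session_fingerprint(
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64)", "America/Chicago",
        "1920x1080", "203.0.113."),
}

# Anyone holding both datasets (an ad network, a civil litigant with a
# subpoena, or an attacker) can join them on the shared fingerprint and tie
# the "anonymous" age token back to a named person.
assert visit_with_token["fingerprint"] == visit_on_identified_site["fingerprint"]
```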
This becomes especially troubling when one considers the nature of the content at issue. A statute that requires identity verification to read news articles or shop for groceries would be problematic; one that does so for pornography is catastrophic. The data trail created under H.B. 1181 is not simply one of access, but of stigma. It risks revealing not just that a person visited an adult site, but what content they accessed, at what times and from which location. In a world where law enforcement and civil litigants routinely subpoena metadata, where foreign and domestic threat actors constantly seek compromising information for extortion, and where reputational ruin can follow from the mere suggestion of illicit online activity, this is not a hypothetical concern.
The dangers are not new. The 2015 Ashley Madison data breach exposed the names, emails and personal preferences of users of an extramarital dating service, leading to blackmail, suicides, divorces and widespread reputational destruction. See Andy Greenberg, Hackers Finally Post Stolen Ashley Madison Data, Wired (Aug. 18, 2015). Similarly, concerns have been raised over MindGeek (the parent company of Pornhub) and its data practices, especially the lack of transparency around user tracking and the retention of session data that can be matched to IP addresses and browsing history. See Isabella Higgins, Call for Porn Sites to Have Stricter Age Verification after MindGeek Revelations.
These incidents illustrate the profound risks associated with linking identity to the consumption of adult content. They also highlight the inadequacy of the legal framework under which these laws operate. Texas’s H.B. 1181 does not require websites to adopt data minimization strategies, impose time limits on record retention, or encrypt identity data to any particular standard. Nor does it limit what third-party age verification providers may do with the information they collect. The law is silent on whether providers may sell, analyze, or repackage the data for other commercial or government use.
There are alternatives—technical and legal—that could achieve the state’s objectives without destroying anonymity. Attribute-based credentials and zero-knowledge proofs could allow users to demonstrate they are over 18 without revealing any other information. The European Union’s GDPR and the NIST Cybersecurity Framework both recommend architectures that separate identification from authorization, limiting the unnecessary exposure of personal data. But these solutions require affirmative design choices and legal mandates that value privacy. H.B. 1181 reflects no such concern. Instead, it mandates a “commercially reasonable” method of age verification—language so vague it virtually guarantees that identity will be the default currency of access.
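A minimal sketch of the attribute-only pattern follows. It is not a real anonymous-credential or zero-knowledge scheme: the shared-key HMAC merely stands in for the issuer's signature, and production systems would use constructions such as BBS+ signatures or SNARK-based range proofs so that not even the issuer can link a credential to later visits. All names in it are hypothetical.

```python
# Sketch of "prove the attribute, not the identity": a trusted issuer that
# has already checked an ID signs only the claim over_18=True, and the
# website verifies that claim without ever seeing identity data.

import hmac, hashlib, json, secrets

ISSUER_KEY = secrets.token_bytes(32)  # held by the verification provider

def issue_age_credential() -> dict:
    # Called after the provider has inspected an ID; the ID itself is
    # discarded and never appears in the credential.
    claim = {"over_18": True, "nonce": secrets.token_hex(8)}
    payload = json.dumps(claim, sort_keys=True).encode()
    tag = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def website_accepts(credential: dict) -> bool:
    # The site checks the integrity of the claim; no name, birthdate, or
    # document number is ever presented to it.
    payload = json.dumps(credential["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["tag"]) and credential["claim"]["over_18"]

print(website_accepts(issue_age_credential()))  # True, with zero identity disclosed
```

The point is architectural: the party that inspects the ID and the party that serves the content never need to be the same, and the credential that passes between them need not carry identity at all.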
In this way, the law does not simply burden speech. It rewrites the architecture of online access, turning adult content websites into surveillance tools and placing users at risk of catastrophic exposure. By treating identity as a prerequisite to access, and by endorsing the long-term storage of that identity data as a compliance obligation, Texas has erected a system that is both technologically dangerous and constitutionally suspect.
The Supreme Court’s decision in Free Speech Coalition v. Paxton missed the opportunity to grapple with these realities. It viewed the law as a modest extension of historical norms, failing to recognize that digital access creates digital records—and that digital records, especially of private conduct, do not remain private for long.
In doing so, the Court has not merely upheld a content restriction. It has greenlit the creation of a surveillance infrastructure for adult speech, one in which users must permanently link their names, faces and licenses to their most intimate online behaviors. The chill on speech is not incidental. It is foundational.
And the cost is not theoretical. It is borne, every day, by those who must now choose between exercising their rights and preserving their privacy.