Image: The Supreme Court Just Decided AI Can't Own Anything.
The justices declined to hear Thaler v. Perlmutter, the copyright case widely covered as Thaler's challenge to the Copyright Office, on Monday. In doing so, they quietly closed the most consequential legal door in AI history - without writing a single opinion.
Stephen Thaler has been fighting the same war for years. In 2018, he sought copyright registration for "A Recent Entrance to Paradise" - a visual work depicting train tracks passing through a luminous portal flanked by purple and green foliage, generated autonomously by his AI system, the Creativity Machine. The U.S. Copyright Office said no. A federal district court said no. The D.C. Circuit said no in 2025. And on Monday, the Supreme Court said: we're not even going to hear about it.
The court's denial of certiorari carries no written opinion. No majority ruling, no dissents, no binding legal reasoning. Just silence. And yet that silence is now the law of the land.
Courts have read the Copyright Act of 1976 as making human authorship a "bedrock requirement" of copyright. Every court that has looked at Thaler's case has agreed on that point. With SCOTUS refusing to revisit it, the principle stands: if a human didn't create it, it isn't protectable.
The downstream implications are significant. The Copyright Office has already separately rejected images generated via Midjourney, even in cases where human artists argued they were the creative drivers behind AI-assisted works. That territory remains contested - but the pure "AI created it independently" argument is now dead in U.S. law with no clear path for revival.
"Even if it later overturns the Copyright Office's test in another case, it will be too late. The Copyright Office will have irreversibly and negatively impacted AI development and use in the creative industry." - Thaler's legal team, in their SCOTUS filing
That's a dramatic framing, but it points at something real. The AI creative industry is now operating in a legal environment where the outputs of its core product - generative AI - have zero inherent IP protection. Anyone can copy, redistribute, or commercialize purely AI-generated content. The entity that spent the compute generating it has no recourse.
Most coverage of this ruling focuses on Thaler's specific loss. That's the wrong frame. The more important question is what this does to the business models quietly being built on top of AI generation.
Startups across creative AI - stock image generators, AI music platforms, automated video production tools - have been implicitly assuming some form of downstream IP protection for their outputs. That assumption is now formally incorrect in the U.S. When a company generates 10,000 AI images and posts them across its platform, a competitor can legally scrape, repackage, and resell every single one. There is no copyright to enforce.
This is Thaler's second SCOTUS defeat. The court previously refused to hear his argument that AI-generated inventions should qualify for patent protection - following USPTO rejections on parallel human-inventorship grounds. The pattern is consistent: U.S. IP law does not recognize non-human creators, and the court is not inclined to force the question.
The Trump administration, notably, urged the court not to hear the appeal. That's worth sitting with. An administration that has been publicly aggressive about AI deregulation and U.S. AI dominance sided with the Copyright Office's restrictive position here. The rationale is presumably economic: if AI outputs were copyrightable, it would primarily benefit large AI companies and their corporate clients - not the human artists who vote and lobby.
There is a legal crack left open. Thaler's case was extreme: his machine created the work with no claimed human creative input. The Copyright Office and courts have suggested, without fully deciding, that substantially human-directed AI work might qualify for protection. "AI-assisted" may land differently than "AI-generated."
This is where the real legal battle is moving. How much human direction is enough? A detailed prompt? Selecting from generated options? Iterative refinement over dozens of generations? These are unanswered questions, and they are questions that actually matter for how creative AI tools get built and sold going forward.
The Thaler ruling doesn't answer them. But it does make it more urgent that someone does.
Congress could act. The Copyright Act hasn't had a major overhaul relevant to authorship since the 1998 DMCA, and calls for AI-specific IP legislation have been growing for two years. Whether there's political will to resolve an issue this complex in a divided Congress is another matter entirely.
More likely, the vacuum gets filled by contracts. Companies deploying generative AI will increasingly rely on Terms of Service provisions, trade secrets, and technical obfuscation to protect their outputs - not copyright. That's a fragile foundation, and it disadvantages smaller players who lack legal infrastructure.
Thaler's lawyers called the court's continued silence "no longer helpful." They're right, but for the wrong reasons. The silence isn't primarily a problem for Stephen Thaler and his machines. It's a problem for the entire generative AI economy - which has been building on an IP assumption that just got quietly demolished without a single word of explanation.