The federal government is running a live experiment on 30 million Americans. An AI program called WISeR is delaying and denying medical care for seniors. The nation’s top health official says AI could make the FDA “irrelevant.” And somehow, nobody in charge sees the contradiction.
The Quiet Launch That’s Ruining Lives
In January 2026, CMS launched the Wasteful and Inappropriate Service Reduction Model — WISeR — introducing AI-powered prior authorization to traditional Medicare for the first time at scale.
Six states are the guinea pigs: Arizona, New Jersey, Ohio, Oklahoma, Texas, and Washington. Thirteen categories of elective procedures — spinal surgeries, nerve stimulators, epidural injections — now require pre-approval before Medicare pays.
The gatekeepers aren’t doctors. They’re not even government agencies. They’re six private tech companies using AI to decide whether your grandmother gets her spinal procedure.
The kicker? These companies earn a percentage of the savings from denied care. They literally profit when they say no.
Seniors Are Already Paying the Price
Early data from Washington state is brutal. A report commissioned by Senator Maria Cantwell found Medicare patients subject to WISeR are waiting two to four times longer for procedures their doctors recommended. Average wait times ballooned from two weeks to between four and eight.
“AI is being used as a denial device for the CMS system,” Cantwell told HHS Secretary Robert F. Kennedy Jr. during an April 22 Senate hearing.
An 83-year-old man was denied coverage for a spinal procedure to treat debilitating nerve pain — a case Kennedy himself called “unacceptable.” But unacceptable things keep happening when the system is designed to produce them.
Andrew Jones, CEO of Wenatchee-based Confluence Health, called the program “pure waste.” Traditional Medicare’s greatest advantage was always simplicity and low overhead. WISeR destroys that. If a provider skips the prior authorization? The claim gets auto-flagged for pre-payment review anyway. Catch-22 by design.
Pay-for-Denial Is Exactly What It Sounds Like
The most alarming part of WISeR isn’t the AI. It’s the incentive structure.
Private companies earn revenue based on how much Medicare spending they prevent. This is a bounty system for claim denials. The parallels to Medicare Advantage are immediate — private insurers there faced years of lawsuits over using algorithms to inappropriately deny care. A 2022 Inspector General report found MA plans denied 13% of requests that actually met Medicare criteria.
Now the same dynamic is being grafted onto traditional Medicare, which historically operated without these gates.
CMS points to real numbers: healthcare waste accounts for up to 25% of total U.S. spending, and Medicare faces projected insolvency by 2033. The agency estimates WISeR could address $134 to $185 billion in annual waste and fraud.
Fair enough. But there’s a difference between cutting waste and paying companies to deploy opaque algorithms that deny care to elderly people. One is reform. The other is a perverse incentive wearing a lab coat.
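The perverse incentive is easy to see in miniature. Here is a toy model of a savings-share contract; every parameter (request volume, claim cost, savings share) is an illustrative assumption, not an actual WISeR contract term:

```python
# Toy model of a pay-for-denial incentive. All numbers are
# illustrative assumptions, not real WISeR contract terms.

def vendor_revenue(requests: int, avg_claim_cost: float,
                   denial_rate: float, savings_share: float) -> float:
    """Revenue a vendor earns as a share of spending it prevents."""
    denied_spending = requests * avg_claim_cost * denial_rate
    return denied_spending * savings_share

# Revenue grows linearly with the denial rate, whether or not
# the denials are clinically appropriate.
for rate in (0.05, 0.13, 0.25):
    revenue = vendor_revenue(10_000, 8_000.0, rate, 0.10)
    print(f"denial rate {rate:.0%}: ${revenue:,.0f}")
```

The point is structural: the revenue formula is monotone in the denial rate and contains no term that rewards accuracy. A gold-card exemption or response deadline changes the paperwork, not that gradient.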
RFK Jr.’s FDA Fantasy
The same week these hearings happened, Kennedy told lawmakers: “AI is going to revolutionize medicine, and it may — some day, at some point — make FDA even irrelevant.”
He then launched into an anecdote about AI developing a cancer treatment for a dog. Lawmakers looked bewildered. Cable news panels had a field day.
The juxtaposition is staggering. One arm of HHS deploys AI to restrict healthcare access for seniors. The same secretary fantasizes about AI replacing the agency that ensures drug safety for 330 million Americans.
AI still hallucinates. It produces biased outputs. It cannot conduct clinical trials. The idea that it could replace the FDA isn't visionary — it's reckless. But it reveals the philosophy: AI as a tool for shrinking government, period.
This Is Part of a Much Bigger Pattern
WISeR didn’t emerge from nowhere. Private insurers have deployed AI for claim denials for years:
- UnitedHealth Group’s NaviHealth algorithm faced a class-action lawsuit for systematically denying coverage to elderly patients in post-acute care
- Cigna was accused of using AI to reject over 300,000 claims in two months — doctors spent an average of 1.2 seconds per case
The difference now is that the federal government is doing it on the public Medicare program. If the pilot is deemed successful on cost savings alone — an outcome the pay-for-denial structure virtually guarantees — it could expand nationwide.
CMS has built in some safeguards: “gold-card” exemptions for compliant providers, 72-hour response requirements. But the fundamental tension remains. The entities making approval decisions profit from saying no.
The Real Question Nobody’s Answering
When an AI denies a Medicare claim, who’s responsible? The tech company? CMS for hiring them? The algorithm itself?
Cantwell’s report found patients receiving denials with no explanation. A black box standing between seniors and their doctors.
AI has legitimate, even exciting healthcare applications. AI-assisted diagnostics catch cancers earlier. Machine learning accelerates drug discovery. NLP reduces physician paperwork.
But there’s a hard line between AI that helps doctors provide better care and AI that blocks patients from getting care their doctors already recommended. WISeR plants itself firmly on the wrong side.
The Bottom Line
The collision of WISeR and Kennedy’s FDA comments paints a clear picture: an administration that views AI primarily as a cost-cutting and government-shrinking tool, even when that means inserting algorithmic gatekeepers between elderly Americans and their doctors.
The technology isn’t the villain. The incentive structure is. A system that pays companies to deny care will deny care. A secretary who dreams of AI replacing drug safety regulators isn’t a visionary — he’s a liability.
If we’re going to let algorithms make life-altering healthcare decisions for millions of seniors, the absolute minimum is designing the system to help them. Right now, it’s designed to save money. And those are not the same thing.