Sunday, February 15, 2026

Dr. Oz wants AI avatars to replace rural health workers. Critics are wary : NPR

Centers for Medicare and Medicaid Services Administrator Mehmet Oz speaks at an Action for Progress event about plans to transform behavioral health, on Feb. 2, 2026, in Washington, D.C.

Heather Diehl/Getty Images



Dr. Mehmet Oz is pitching a controversial fix for America's rural health care crisis: artificial intelligence.

"There's no question about it, whether you want it or not, the best way to help some of these communities is gonna be AI-based avatars," Oz, the head of the Centers for Medicare and Medicaid Services, said recently at an event focused on addiction and mental health hosted by Action for Progress, a coalition aimed at improving behavioral health care. He said AI could multiply the reach of doctors fivefold, or more, without burning them out.

The AI proposal is part of the Trump administration's $50 billion plan to modernize health care in rural communities. That includes deploying tools such as digital avatars to conduct basic medical interviews, robotic systems for remote diagnostics, and drones to deliver medication where pharmacies don't exist.

Oz even suggested replacing in-person obstetric care with AI-guided devices.

"We can use robots to do ultrasounds on pregnant women," Oz said. "You take a wand, you don't even see the image, you just get digitized insights that tell you whether the child's OK. And frankly, I don't have to see the image. I just have to know if the image is good enough to tell me the child doesn't have a problem."

In a statement to NPR, the Centers for Medicare and Medicaid Services said Oz was emphasizing the need to "responsibly explore tools" that can extend the reach of licensed clinicians, not replace them altogether. It also said that CMS supports the use of AI-enabled tools when they are evidence-based, patient-centered and used appropriately under clinical oversight.

What rural America is already facing

Oz's comments came as rural hospitals have faced steep cuts under the One Big Beautiful Bill Act that President Trump signed last year, a reconciliation law that cuts federal Medicaid spending by about $1 trillion over 10 years, heavily impacting rural hospitals.

Those hospitals already were grappling with financial pressures. According to the nonpartisan research group KFF, more than 190 rural hospitals shut down between 2005 and early 2024, about 10% of all rural hospitals in the country, because of budget shortfalls and related challenges. Some communities have lost their only hospital, leaving residents having to drive long distances for basic and emergency medical treatment, or skip it altogether.

Across the United States, people living in rural counties are more likely to die early from five leading causes (heart disease, cancer, chronic lower respiratory disease, stroke and unintentional injuries) than those in urban areas, according to a report published by the Centers for Disease Control and Prevention in 2024. Many of those deaths are preventable with timely, quality care, according to the report.

The CDC research pointed to several culprits: limited access to providers, longer travel times, fewer emergency services, higher poverty rates and lower insurance coverage.

A health care system with fewer people?

Carrie Henning-Smith, associate professor at the University of Minnesota and co-director of its Rural Health Research Center, says the use of AI avatars would strip away something essential: human connection.

"Health care has always been about humanity and relationship," she said. "If your first and only provider is an avatar, we're removing trust, comfort, and continuity."

Henning-Smith also raised concerns about testing unproven technology on already underserved populations.

"I don't like the idea of rural populations being treated as guinea pigs," she said. "If this is where we're testing AI in health care, there's a lot that could go wrong."

She also pointed to logistical concerns such as unreliable broadband, low health literacy and fragile transportation systems. If AI systems can't function without a stable digital backbone, she said, they could deepen existing gaps.

Supporters say AI could help expand access

But some health tech leaders argue that AI tools could help rural communities, not by replacing doctors, but by taking on the administrative burdens that keep clinicians from seeing patients.

Matt Faustman is the co-founder and CEO of Honey Health, a company that develops AI tools designed to automate tasks for providers, including managing fax inboxes, processing prior authorizations and retrieving patient records.

Many providers are overwhelmed by paperwork, Faustman said, and the burden is especially heavy in rural settings where clinics may not have large administrative teams.

"Thirty to forty percent of physician or provider time can really get absorbed with administrative work," he said.

Faustman said automating those tasks could free up clinicians to focus on patient care, and allow small hospitals and clinics to scale faster without hiring more back-office staff.

He also said AI could play a role on the patient-facing side, especially in areas where the right provider isn't immediately available.

"It can serve as an initial triage or even an early access opportunity for those patients to then get diverted to the right providers," he said.

Can AI really replicate a human clinician?

Henning-Smith argues that even if AI tools can handle basic tasks, they can't replicate the core of what health care requires.

"AI can't read facial expressions, tone of voice, or body language," she said. "And those things matter. That's where the relationship between a patient and provider is built, in the nuance."

Even if AI tools are accurate, she said, they can't offer the reassurance or cultural sensitivity that comes from a trusted clinician. And in communities where trust in the medical system is already fragile, that loss could be especially damaging.

Henning-Smith also raised concerns about the economic consequences of replacing local jobs with AI technology.

"When a nurse or doctor is employed in a rural town, their salary stays there," she said. "But when you replace that job with an AI tool built in Silicon Valley, that money leaves."

Public backlash

Online reaction to Oz's comments was swift.

"You think rural communities want AI doctors? They're still trying to get reliable internet," one user wrote on X.

Another added, "Dr. Oz: 'We replaced your nurse with a cartoon. You're welcome.'"

Still, a few voices defended the idea, noting that some care is better than no care at all.

"It isn't ideal," one post read, "but it's better than nothing."

Oz has not offered a full implementation plan, and CMS has not confirmed whether AI avatars will become a formal part of the agency's rural health strategy.

But Henning-Smith hopes the conversation doesn't end with cost savings.

"I would be curious if Dr. Oz would want an avatar treating his own family," she said. "This feels like a two-tiered system, one for those with resources, and another for those without. And I don't think we should be okay with that."
