4.12.24
14 min. Read

Poking holes in the PHTI diabetes report.

Issue 064
Digital health research from Brian Dolan

E&O: Employers

Welcome back to E&O’s Selling to Employers, the enrollment-focused digital health newsletter from Exits & Outcomes — for paying subscribers only. This newsletter digs into digital health companies that sell to self-insured employers, fully insured plans, and other payers. It’s digital health as an employee benefit.

This edition of E&O: Employers is entirely focused on pushback against the Peterson Health Tech Institute's digital diabetes report. Was this forwarded to you? Increasingly, E&O is a covered benefit from many forward-looking digital health-focused employers. Why not yours? Consider a Business or Enterprise subscription today. Click this link to become a paying subscriber (there are personal plans available too).

Poking holes in the PHTI digital diabetes report

Like just about everyone I spoke to about the PHTI report these past few weeks, I believe the conversation this report has inspired is good for digital health. Even some of the people who strongly felt that the report wasn't done well told me it was still a positive force that will push digital health companies to think more critically about evidence generation.

It’s important to keep that in mind. PHTI’s mission is not to publish sobering reports about digital health — its ambition is to create a new high watermark for evidence in digital health. And to be the arbiter of that. This is an activist organization intent on changing the industry, and the decisions it made in its first report make that clear.

As noted below in a few places, just about every previous meta-analysis of digital health evidence ends with a call for more evidence. That would have been a logical conclusion for the PHTI report too. But making the call that digital diabetes programs don’t work is a much more powerful way to get the industry to pay attention — both to PHTI and its assessment. Naming names helps. Tearing open old wounds (like diabetes management vs nutritional ketosis) stirs up opposing factions and helps get the word out too.

I’d expect the forthcoming MSK-focused report from PHTI to follow a similar pattern. One way to go would be for PHTI to re-ignite the kerfuffle the American Physical Therapy Association (probably with a little help from its partner Sword Health) had with Kaia and Optum. APTA successfully shamed Kaia and Optum into dropping the term “physical therapy” when marketing their MSK pain program, which typically does not include licensed physical therapists. I can’t imagine a better way to stir up the companies working in MSK than taking a side in that debate. Especially if PHTI calls for broader adoption of less expensive, non-PT-led MSK care. The fallout from the MSK report might have far-reaching consequences since Hinge Health, Sword Health, and Omada Health are all likely IPO contenders this year.

Anyway, that’s next — back to diabetes.

Here are the questions that I’m still wrestling with after going down innumerable rabbit holes with E&O readers about various aspects of the PHTI digital diabetes report. The nine questions below correspond to the headlines of the articles in this newsletter — scroll down to whatever is of interest or read them all. And, as always, hit reply and let me know what you think.

  • Do the “Purchasers” on PHTI’s advisory council actually need this report?
  • Why are some of the “Purchasers” advising PHTI also offering their own competing digital diabetes management programs?
  • Should that single Teladoc study have been the only input into PHTI’s clinical effectiveness and economic impact assessment?
  • Why would PHTI use an economic impact model that uses a cost savings number plucked from the medical literature?
  • How can PHTI claim all digital programs are the same?
  • What about Omada’s argument that it should not have been included?
  • Did the report really only look at programs from 8 companies?
  • Is the focus on A1c reduction a non-starter when evaluating the RPM category?
  • Why didn’t PHTI just declare a need for more evidence?

Do the “Purchasers” on PHTI’s advisory council actually need this report?

Perhaps the most under-explored aspect of the PHTI report is the non-profit’s “Purchaser Advisory Council”. The names of the 15 health plans, self-insured employers and health systems that make up this Council aren’t a secret — they are listed on the org’s website. The health plans and health systems include UnitedHealthcare, CVS/Aetna, Evernorth, Elevance, Humana, Highmark Health, Federal Employee Program, Blue Shield of California, Blue Cross Blue Shield of Massachusetts, Geisinger, and Kaiser Permanente. The self-insured employers include Salesforce, UC Davis (or is this a provider?), Delta Air Lines, and Morgan Stanley.

Why does this group of “Purchasers” matter? PHTI’s leader Caroline Pearson said on a recent webinar:

“I want to share with you why we decided to start PHTI. That is because we heard again and again from healthcare purchasers that they’ve lost confidence in this industry. Those health plans, employers and providers have heard too many promises about the benefits of digital health technologies. And at this time, they have had ample opportunity to see these solutions in action, and they are deeply skeptical about whether the benefits that they’ve been promised are going to materialize for patients and the members they serve.”

Does it stand to reason that UnitedHealthcare, one of the biggest healthcare companies in the world, needs PHTI to evaluate digital diabetes tools for it? As Pearson points out above, these health plans have had “ample opportunity to see these solutions in action.” (Check out the bottom of this article for a quick list of their current digital diabetes partners.) It’s been 10 years since Glen Tullman took the CEO helm at Livongo. Is there any chance that UnitedHealthcare doesn’t know exactly how cost effective programs like Livongo are or aren’t? And if there was a world where UnitedHealth didn’t know this ten years later, and they suddenly decided that they wanted to… would their best course of action be to help a white knight non-profit create a report that uses a single data point from a single RCT along with a handful of metrics from the medical literature to model out their effectiveness? No, they would look at their own claims data.

They know already.

The better question is: Why do the big health plans want a report like this out in the public?

I think most of this has to do with one group of “purchasers” — self-insured employers. Employers helped jumpstart the digital diabetes management market and they are most responsible for its rise. If Peterson’s report was intent on serving a public need, it should have focused entirely on the employer perspective. If any “purchaser” is unclear on whether these programs have delivered on the benefits promised, it would be them. Instead of a laser focus on A1c reduction and a tight definition of cost effectiveness, PHTI could have built out a more robust accounting of the benefits that employers expect from these programs rather than a universal throughline that appeals to its current mixed bag of “purchasers”. If self-insured employers continue to demand programs like this, their health plan partner ASOs will continue to source them (and take their cut). At the same time, however, some health plans have been building out their own competitive digital health programs — none of which gets a mention in this Peterson report. Nearly all of them offer their own non-digital diabetes management programs too.

More on that below.

But first, a quick rundown of PHTI’s Purchaser Advisory Council’s current digital diabetes partners/offerings based on E&O research:

  • Blue Cross Blue Shield Massachusetts offers most of its members Virta Health and, depending on their PBM, LifeScan’s OneTouch glucose monitor and Aetna’s Transform Diabetes Care program. It also offers a more general care management program powered by HealthEdge’s Wellframe.
  • Blue Shield of California offers Virta Health as well as its own homegrown diabetes management program.
  • Self-insured employer Delta Air Lines offers Verily’s Onduo program to its employees.
  • CVS-Aetna offers its members its own homegrown digital diabetes program called Transform Diabetes Care 2.0. The health insurance company used to work with Livongo on this program, but it broke up with the company around the time the COVID pandemic was just beginning in the spring of 2020.
  • Anthem/Elevance Health used to offer Livongo but appears to have stopped partnering with outside digital diabetes management companies. It now offers digital diabetes care through its own homegrown virtual primary care offering, delivered via its Sydney Health app and run by Carelon.
  • Evernorth has a preferred reseller agreement with Omada Health, which replaced Livongo/Teladoc as the preferred offering a few years ago. It still offers Teladoc/Livongo along with Lifescan. Evernorth also offers its own Diabetes Care Value program, which includes CGMs and pharmacist support. Unclear if it also includes any of the programs above.
  • Federal Employee Program BlueCross Blue Shield offers its plan members Teladoc/Livongo.
  • Employer – University of California – Davis doesn’t appear to offer any digital diabetes management programs currently. Its benefits documents reference one called UC Care, which may be a program through the university’s affiliated health system.
  • Employer – Morgan Stanley – Since 2021 Morgan Stanley has offered its employees Teladoc/Livongo’s diabetes management program.
  • Employer – Salesforce – The company currently directs employees to a care navigator and second opinions service from Included Health if they need help managing their diabetes. No diabetes-specific programs appear to be featured in Salesforce’s benefits documents.
  • Humana offers Virta Health to its members.
  • UnitedHealth Group offers various diabetes management programs including its own homegrown one called Level2. For many years diabetes management was one of the many tracks available through UHG’s Rally Health platform.

Why are some of the “Purchasers” advising PHTI also offering their own competing digital diabetes management programs?

It’s a best practice to list out conflicts of interest in research reports like the one PHTI published last month. It’s puzzling why PHTI wouldn’t mention that some of the big names on its Purchaser Advisory Council offer their own competing solutions to the ones criticized in the report. It’s even more puzzling why PHTI doesn’t include some of these programs in the analysis itself.

Aetna’s Transform Diabetes Care program probably has wider adoption than most of the eight programs named in the PHTI report. Originally, Aetna’s digital diabetes program was powered by Livongo, but — as E&O reported back in 2020 — the health insurance company unceremoniously booted Livongo out just as the pandemic began to surge in the US. There is no question that it is a digital diabetes management program. And there is no good reason it was excluded from the PHTI report. At the very least, PHTI should have disclosed that CVS/Aetna, a member of its Purchaser Advisory Council, offers its own competing digital diabetes management program to the ones mentioned in the report. Aetna certainly stands to benefit if this group of digital health unicorns begin to lose customers because of this report.

Similarly, UnitedHealth developed its own digital diabetes monitoring program, Level2, over the past few years. Level2’s offering typically includes a continuous glucose monitor, however, which would make it ineligible for analysis in the PHTI report. Still, like CVS/Aetna, United would likely benefit if the big name digital health companies’ position in the digital diabetes market was weakened by a report like the one it helped PHTI put out. Level2 could step in.

Just about every health plan in the US has some kind of chronic condition management program set up with a nurse-staffed hotline. These legacy, typically non-digital services are what many digital diabetes management programs replaced in some benefits books. I’d expect every health plan on the PHTI purchaser council offers competing programs like these.

As everyone now knows, health systems quickly adopted telemedicine services during the pandemic. While some of them have since partnered with digital health companies, others may see the bigger ones as competitors. Two of the providers listed on PHTI’s purchaser council offer services that appear somewhat competitive with the programs evaluated in this report: UC Davis Health offers telehealth consults to help adult patients with Type 2 diabetes manage their diet and lifestyle, and Geisinger offers telemedicine diabetes care to patients.

Given all of that, it would have been cleaner for PHTI to focus on self-insured employers as its only “purchaser”. That’s the only group that isn’t conflicted here and the only one that actually needs help sorting this market out.

Should that single Teladoc study have been the only input into Peterson’s clinical effectiveness and economic impact assessment?

Reminder: The report’s core finding was that digital diabetes management programs cost more than they save — based on their ability to lower A1c — and, therefore, should no longer be adopted.

Ultimately, only one study factored into that conclusion, which puts a target on companies listed in the behavior and lifestyle category (Teladoc/Livongo, Omada, Verily/Onduo, Dario, Vida, and Perry Health). The one study that led to that determination was conducted by Teladoc/Livongo and Peterson called it “a well designed RCT”.

The Teladoc study found that the intervention group reduced A1c by 0.37 percentage points compared to the control. This was the best outcome produced out of the four studies Peterson found that used a comparator. Peterson’s economic impact model then simply took a figure from the medical literature — that a 1 percentage point reduction in A1c leads to a 1.6 percent reduction in all healthcare costs for that patient (not just diabetes costs but all) — and used that to calculate cost savings from Teladoc’s 0.37-point reduction. One of the findings this simple model led to was that digital diabetes management programs generate $109 in incremental health savings per user per year in commercial plans. That’s obviously nowhere near enough to cover the cost of these programs, which Peterson said was around $484 per user per year after you back out costs for testing supplies and the like.
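To make the model concrete, here is a back-of-envelope sketch of that arithmetic in Python. The 0.37-point A1c reduction, the 1.6 percent-per-point figure, the $109 savings estimate, and the $484 program cost all come from the report as described above; the implied baseline spend is my own derivation (working backward from the $109 figure, assuming a simple linear model), not a number PHTI published.

```python
# Inputs as described in the PHTI report (per the discussion above)
A1C_REDUCTION = 0.37       # percentage-point A1c drop vs. control (Teladoc RCT)
SAVINGS_PER_POINT = 0.016  # 1.6% of total healthcare costs per 1-point A1c drop
PHTI_SAVINGS = 109         # PHTI's savings estimate, $/user/year, commercial plans
PROGRAM_COST = 484         # PHTI's program cost estimate, $/user/year

# Fraction of total healthcare costs saved by a 0.37-point reduction
savings_fraction = A1C_REDUCTION * SAVINGS_PER_POINT   # ~0.00592, i.e. ~0.6%

# Baseline annual spend implied by PHTI's $109 figure (my derivation, not PHTI's)
implied_baseline = PHTI_SAVINGS / savings_fraction     # roughly $18,400/user/year

# Net impact under PHTI's model: savings minus program cost
net_impact = PHTI_SAVINGS - PROGRAM_COST               # -375, i.e. $375 net cost

print(f"savings fraction: {savings_fraction:.4%}")
print(f"implied baseline spend: ${implied_baseline:,.0f}")
print(f"net impact: ${net_impact}/user/year")
```

The point of the sketch: once you fix the 1.6 percent literature figure and a sub-half-point A1c reduction, the savings can only be a fraction of a percent of total spend, so the negative net-impact conclusion is essentially baked into the model's inputs.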

It’s worth considering whether the entire category of digital diabetes management programs should be assessed based on the results of a single study by a single vendor. Here’s a quick thought experiment: If that particular study didn’t exist, Peterson would have used the one with the next best outcome, which Peterson called “Yang 2020”. Yang 2020 followed patients across 13 primary care clinics in Seoul, South Korea. Here’s how the study was done:

“Every month, participants in both groups attended face-to-face physicians’ consultation for the management of diabetes in the clinic. For the intervention group, participants were required to upload their daily self-monitoring of blood glucose (SMBG) results using the mobile phone app in addition to outpatient care for 3 months. The results were automatically transmitted to the main server. Physicians had to check their patients’ SMBG results through an administrator’s website and send a short feedback message at least once a week.”

Does that sound anything like Omada Health’s program or Livongo’s? Should this study really be in the mix here? The study with the next best outcome was even more irrelevant: Conducted way back in 2013, it followed elderly patients in a senior home center in Hong Kong, where professional caretakers tracked their blood sugar and received feedback from remote providers via “netbooks”. No self-management of diabetes whatsoever.

PHTI assures us the single outcome from the Teladoc study is representative because all four of the outcomes from the few studies with comparators were tightly clustered. But those studies focus on wildly different programs.

Why would PHTI use an economic impact model that uses a cost savings number plucked from the medical literature?

This is one of two PHTI decisions that annoyed digital health companies the most. When evaluating new technologies with little real world adoption it makes sense to use modeling to try to estimate what their impact might be in real world settings. What else can you do? However, some of the companies named in the PHTI report have been in the market for a decade now. As PHTI stated on its own webinar, health plans have had ample opportunity to see these programs in action. Why didn’t PHTI lean on its health plan advisors to provide claims data? Was there really no way to use any of the economic impact studies that digital health companies submitted to try to determine real world cost savings?

A few digital diabetes companies told me that they believed PHTI used the 1.6 percent reduction in total healthcare costs for every 1 percentage point drop in A1c specifically because it would show far less savings than any of the real world studies would have.

I’d guess it was more likely that PHTI thought this was a cleaner way to do the analysis, but this one contentious decision is where a critical reader might lose faith in the analysis. It’s weak. If PHTI has ambitions to step up and create a higher level of cost-effectiveness evidence generation for digital health, it can’t take shortcuts like this.

How can PHTI claim all digital diabetes programs are the same?

One E&O reader wrote in to say that PHTI’s claim that all of the digital diabetes management programs in the behavioral and lifestyle group are more or less the same would probably be the most aggravating part of the report for many of those companies’ founders. That rings true.

It’s also likely part of PHTI’s strategy to rile up the industry and get it to pay attention to its assessments.

While it’s not super illuminating, here’s the specific line PHTI uses to smooth over the entire category of digital diabetes management:

“This evaluation is conducted at the category level. Based on the similarity of approaches and the consistency of clinical outcomes, it is likely that individual solutions perform in line with the category.”

Of course, these programs aren’t all the same, but if the only way you are evaluating them is by their A1c impact then it may not matter how they’re different. Also, for some of the companies mentioned in the report, their clinical impact wasn’t actually evaluated at all. It’s just assumed their impact would be the same because their approach looks to be about the same.

What about Omada’s argument that it should not have been included?

Omada raised an interesting argument about why it should not have been included in the PHTI analysis.

Remember: Omada is first and foremost a company focused on diabetes prevention. When it added a diabetes management program, it specifically chose to base its program’s content on an established framework called Diabetes Self-Management Education & Support (DSMES). Because DSMES is so well-studied, Omada didn’t conduct many studies to support its diabetes management program the way it did for its more novel intervention in diabetes prevention. Instead it sought (and secured) accreditation like every other DSMES service provider.

As Omada’s Chief Medical Officer Dr. Carolyn Bradner Jasik wrote in response to the PHTI report:

“This report applies methodology created by ICER that conducts ‘evidence-based reviews of health care interventions, such as drugs, devices, and diagnostics.’ The field of digital health innovation is diverse. It includes solutions that deliver a single intervention, and it also includes health services providers who provide care per guidelines that are delivered by licensed/credentialed professionals and validated through standard accreditation processes. Omada Health is the latter.”

Also worth recalling: Omada’s diabetes prevention program fared pretty well through its ICER review way back in 2016. Omada might be the only company mentioned in this PHTI report that has tangled with an ICER-like review before and lived to tell the tale.

Did the report really only look at programs from 8 companies?

The first thing that caught my eye when I read the PHTI report was that the list of eight companies included seven familiar names and one clear outlier: Perry Health.

PHTI’s curious inclusion criteria required companies to have raised at least $25 million. Perry has raised $26 million. (Phew, just made the cut.) Every other company mentioned had raised hundreds of millions. It’s an odd grouping. And where was Welldoc? It has raised north of $80 million over its long history. It arguably created the category of digital diabetes management, and PHTI didn’t see fit to mention it? (I asked, and neither PHTI nor Welldoc knew why it wasn’t included.)

Another odd decision: PHTI found no studies focused on either Vida or Verily/Onduo’s diabetes programs, but it name-checked both companies throughout its analysis.

PHTI’s list of companies is inscrutable (and indefensible) but PHTI clearly intended to focus on those with the most funding.

Still, if you ask PHTI it will tell you this is a category-level assessment of digital diabetes management solutions. That becomes clear if you take a look at the report’s appendices. The researchers reviewed studies focused on many diabetes management programs beyond those from the eight companies listed.

So, PHTI went with a confusing approach of trying to be sweeping in its assessment while still naming names. I think the effect is that any company not mentioned (like Welldoc) can simply tell hesitant purchasers that the PHTI report is irrelevant since they weren’t mentioned.

And for exactly that reason: if PHTI didn’t name those eight companies, then it would have been a lot easier for the industry to simply ignore the assessment.

Is the focus on A1c reduction a non-starter when evaluating the RPM category?

The RPM subcategory, which is enabled almost entirely by a set of fee-for-service CPT codes, deserves its own separate report built on a more tailored set of considerations. Selling to providers is very different from selling to self-insured employers, which is the frame PHTI should have used for the other two categories in this report.

By jumbling together three different “purchasers” with (at least) three different categories of programs, PHTI muddied its analysis.

One way to think about the digital health companies selling remote patient monitoring to providers is as a spectrum: at one end, companies that offer connectivity tools and no care services; at the other, companies that offer connectivity tools plus a full suite of fully outsourced care services.

Given that spectrum: Can PHTI fairly evaluate an RPM offering that leaves the care component entirely up to the purchaser? Is that RPM vendor in any way responsible for whatever clinical outcomes are achieved? At the other end of the spectrum, the digital health company might be entirely responsible for the clinical outcomes. Given that range, evaluating RPM companies based on A1c reduction seems like a non-starter to me. In some cases the outcome will depend entirely on the purchaser’s actions.

Why didn’t PHTI just declare a need for more evidence?

The more appropriate conclusion for Peterson’s report is that the evidence is — as always in digital health — too thin. Two people I spoke to actually assumed PHTI had called for more evidence — RCTs specifically — even though it didn’t. It’s just baked into digital health culture that a report like this would always include that call to action.

The reality is that the companies in this category have not had to invest in rigorous studies with comparators because their go-to-market didn’t require it.

This is also the problem Peterson is trying to fix, which is one reason this first round of reports will be so awkward. PHTI is attempting to become the hurdle that requires this higher level of evidence.

However, instead of spotlighting and declaring a lack of evidence for digital health programs (which many health tech meta-analyses have done in the past to no great effect) Peterson went the more inflammatory route and deemed the evidence sufficient to conclude that these programs don’t work.

Peterson’s emotional approach might get better results than the more logical one. I am sure the agita the PHTI report created was not at all unexpected back at Peterson HQ; it was surely a crucial part of the plan to wake up the industry and get digital health companies to pay attention to its brand-new, self-declared arbiter of evidence.

Links to E&O’s reports, databases, newsletters

Click below for dedicated pages for each of these categories:

  • Want to read through past editions of E&O’s Selling to Employers newsletter? Check out the new archive for this topic right here.
  • Read through the long-form E&O research reports here.
  • Search and sort the E&O databases here.
  • Skim more than 300 past issues of E&O newsletters here.

And so ends Issue 064 of E&O: Employers. If you learned something from today’s issue, help me out and forward this newsletter to a friend or two.