Five reasons why the Alexander Group defense looks exceedingly weak

Photo credit: Mario Moretto | BDN. Rhode Island welfare consultant Gary Alexander (left) and Maine Department of Health and Human Services Commissioner Mary Mayhew prepare Jan. 14 to answer questions from the Health and Human Services Committee at the Cross Office Building in Augusta.

After health policy analyst Kathy Gifford identified a problem with a figure in the Alexander Group’s report on Medicaid expansion, several members of the group came to its defense.

The issue revolved around a $575 million discrepancy and a federal reimbursement percentage cited in the report that differed from the one used in its calculations. The rate the Alexander Group used for federal reimbursements decreased the amount of money Maine would receive, thus increasing the projected cost of Medicaid expansion.

There are five reasons why the defense offered by members of the Alexander Group looks rather weak.

First, two different members of the Alexander Group gave reporter Mario Moretto two different explanations. 

Since I’ll go through each explanation below, we can leave this point, except to note that offering different explanations casts doubt on what, exactly, generated the figure used. Failing to provide the same explanation doesn’t exactly inspire confidence in the defenders’ claims about the basis of the figure.

Second, one of the two explanations doesn’t make sense mathematically. 

Randolph said that because different areas of Medicaid spending receive different levels of federal funding, you can’t distill a match rate down to a single number.

“I’m kind of surprised she did it that way,” he said. “She should know better that you can’t calculate it that way.”

However, none of the estimated match rates identified by the Alexander Group in its report dipped below the 61.55 percent traditional Medicaid match rate projected for 2014. That left Gifford puzzled by how the average federal match across all categories could be 60 percent.

Based on rather basic math, you can’t average or weight items that are all 61.55 percent or higher and end up with 60 percent. But that’s what Mr. Randolph says the Alexander Group did.
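To spell out the arithmetic (only the 61.55 percent floor and the 60 percent result come from the report; the category weights are left unspecified), a weighted average can never fall below the smallest value being averaged:

\[
\text{If each category rate } r_i \ge 61.55\% \text{ and the spending weights } w_i \ge 0 \text{ sum to } 1, \text{ then } \sum_i w_i r_i \;\ge\; 61.55\% \cdot \sum_i w_i \;=\; 61.55\% \;>\; 60\%.
\]

So no choice of weights across rates of 61.55 percent or higher can produce a 60 percent average.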

Third, another member of the Alexander Group contended that the report lumped non-Medicaid spending into the Medicaid baseline; if true, that makes the combined figure useless for projecting baseline Medicaid funding.

Murray Blitzer, another member of the Alexander Group, suggested Tuesday that the figures in the study included not just Medicaid spending, but also medical services paid for with state dollars only.

“If you look at the baseline, it’s coming off the state spending occurring in MaineCare and other related services,” he said. “Most Medicaid programs in the states, it’s a combination of Medicaid and some assistance that’s state-only.”

As critic Gifford noted:

“If he’s including non-Medicaid costs in there, that would drive [the average match rate] down, I guess. But you’d have to say that if it’s a non-Medicaid program, why is it in there? A state-only program isn’t going to grow unless appropriations grow. Those aren’t entitlements.”

Thus this explanation from Mr. Blitzer, which doesn’t match the one proffered by Mr. Randolph, doesn’t work either.

Fourth, the second explanation is vague and thus can’t be assessed, since the report didn’t explain the specifics or even bother to mention that non-Medicaid programs were included.

Regarding Blitzer’s explanation:

It was unclear exactly what state-funded services were included — or why they were included at all. Those services would not necessarily grow along with Medicaid expansion.

This lack of clarity is poor research practice and makes the explanation sound as if someone just came up with it. No rationale for including non-Medicaid services is provided at all, which is highly problematic.

Good research lays out the sources of all data and explains how complex measures were created. This enables others to replicate the findings and make sure the research was done correctly. (In fact, replication standards are rising: a growing number of academic journals require that all data be made available and that every analytical step be laid out so any researcher can rerun the analysis.)

Fifth, given the conflicting explanations and missing information, it still appears that the reimbursement rate given in the text and the reimbursement rate used in calculations are different.

As I noted in my last post, this is a big no-no. Researchers are obligated to make their description of the calculations match the calculations actually performed. If they don’t match, the description is inaccurate. This goes to the report’s honesty and transparency.

Maine legislators and taxpayers deserve better, more consistent explanations than have been offered to date. If they are not available, the report should be withdrawn.

Amy Fried
