STATE OF MAINE
Superintendent of Insurance Mila Kofman issues the following Decision and Order in the above-captioned proceeding.
The adjudicatory proceeding in this matter was conducted pursuant to 24‑A M.R.S.A. § 6913(1)(C); the Maine Administrative Procedure Act, 5 M.R.S.A. chapter 375, subchapter 4; 24-A M.R.S.A. §§ 229 to 236; Bureau of Insurance Rule Chapter 350; and orders issued by me.
On August 12, 2008, pursuant to 24-A M.R.S.A. § 6913(1)(B), the Board of Directors (the “Board”) of the Dirigo Health Agency (“DHA”)1 filed its annual determination of:
24-A M.R.S.A. § 6913(1)(A).
The purpose of this proceeding and hearing is for me to review the Board’s filing and “issue an order approving, in whole or in part, or disapproving the filing.” 24‑A M.R.S.A. § 6913(1)(C). I am required to “approve the filing upon a determination that the aggregate measurable cost savings filed by the board are reasonably supported by the evidence in the record.” Id. The Board, through DHA as the moving party, has the burden of proving that its determination of aggregate measurable cost savings is reasonably supported by the evidence in the record.
“Reasonably supported by the evidence” has been previously interpreted to refer to the totality of the evidence and not to any part of the evidence taken out of context. In re Review of Aggregate Measurable Cost Savings Determined by Dirigo Health for the First Assessment Year (“Year One Decision”), No. INS-05-700 at p. 2 (October 29, 2005). Furthermore, it has been stated that “reasonably supported” is not equivalent to a preponderance-of-the-evidence standard. Id. The Board does not have to prove that its chosen alternative is the best or only alternative supported by the record, nor does it have to show that its chosen alternative is the most reasonable, but rather, the Board must show that the evidence in the record reasonably supports its alternative. Id.
The Dirigo Health Agency, through its Board of Directors, is a party to the proceeding. 24-A M.R.S.A. § 6913(1)(C). Other parties to the proceeding, pursuant to my grants of intervention, include the Maine Automobile Dealers Association Insurance Trust (“Trust”), the Maine State Chamber of Commerce (“Chamber”), the Maine Association of Health Plans (“MEAHP”), Anthem Health Plans of Maine, Inc. (“Anthem”), and Consumers for Affordable Health Care (“CAHC”).
III. PROCEDURAL HISTORY
On July 10, 2008, a Notice of Pending Proceeding and Hearing was issued, among other matters setting the intervention deadline and contingent hearing dates. That Notice also included initial procedures for the conduct of the proceeding. By Order issued August 18, 2008, September 9, 2008 was established as the date for the public hearing.
On August 12, 2008, the Board, through its counsel, Assistant Attorney General William Laubenstein, submitted the Board’s filing. The filing consists of the Board’s August 11, 2008, written decision and a copy of the complete administrative record of the proceeding before the Board, In re Determination of Aggregate Measurable Cost Savings for the Fourth Assessment Year (2009). On August 29, 2008, the Board filed an amended page 4 of the Board’s decision. This administrative record is voluminous and has been made available for public inspection at the offices of the Bureau of Insurance in Gardiner, Maine throughout this proceeding. All other filings made by the parties and the Superintendent’s interlocutory rulings and orders have been posted throughout the proceeding to the Bureau’s web page at www.maine.gov/insurance for public access and inspection.
On August 15, 2008, the Trust filed a motion to recuse Superintendent Kofman from any further participation in this matter, which was denied by Order issued August 25, 2008.
Intervention was granted by Order issued August 18, 2008, to the Trust represented by Bruce Gerrity, Esq.; the Chamber represented by William Stiles, Esq.; MEAHP represented by D. Michael Frink, Esq.; Anthem represented by Christopher Roach, Esq.; and CAHC represented by Joseph Ditré, Esq. DHA was represented by Assistant Attorney General Michael Colleran. The August 18th Order also included further procedures for the conduct of the proceeding.
An Order Regarding the Record was issued on August 29, 2008 seeking additional information from DHA and the Chamber, to which DHA and the Chamber separately responded on September 2, 2008.
A Scheduling Order was issued on September 5, 2008 setting forth the procedure for oral argument at hearing and the order of issues to be addressed at the hearing.
The hearing was held in Augusta, Maine on September 9, 2008. The hearing was conducted entirely in public session. Counsel for each of the parties presented oral argument at the hearing. At the conclusion of the hearing, written Hearing Questions for Citations to the Record were distributed, to which all parties filed separate responses on September 11, 2008.
On September 23, 2008, a Part I Decision and Order was issued setting forth the result of this proceeding with a summary of the Superintendent’s factual conclusions. That preliminary decision and order was issued pursuant to Bureau of Insurance Rule 350, § 18, which authorizes the issuance of a two-part decision in “extraordinary circumstances, including those in which a time constraint imposed by rule or statute requires the issuance of a decision by a specific date.” This Part II Decision and Order incorporates the complete statement of findings of fact supporting the decision and the detailed procedural history.
IV. DISCUSSION, ANALYSIS, FINDINGS, AND CONCLUSIONS
The Board’s filing itemizes aggregate measurable cost savings on the basis of four identified topics: three categories of savings initiatives and another category labeled “overlap” that is intended to account for savings that are double-counted between certain of the savings initiatives. The table below identifies the four areas, the amount of savings and overlap approved by the Board as contained in its filing, and the amount of savings and overlap that I find to be reasonably supported by the evidence in the record.
A. Legal Issues
As explained in the Years One, Two, and Three Decisions of the Superintendent of Insurance, my statutory responsibility in this proceeding is limited to determining whether the “aggregate measurable cost savings filed by the board are reasonably supported by the evidence in the record.” 24‑A M.R.S.A. § 6913(1)(C). In making this determination, my authority is limited to “issu[ing] an order approving, in whole or in part, or disapproving the filing.” Id. Thus, I do not sit as an appellate tribunal with the authority to review the Board’s interpretations of law. Although the payor intervenors (MEAHP, the Trust, the Chamber, and Anthem) have argued for me to make certain legal interpretations regarding the Board’s actions – for example, whether the Board’s inclusion of the Medical Loss Ratio initiative in its cost savings determination is permissible under the Dirigo laws – these issues are beyond my statutory authority and beyond the scope of this proceeding. As consistently explained by the Superintendent through the course of the previous three years’ annual review proceedings, I have not been granted the power by the Legislature to review the legal interpretations made by the Board. The breadth of my legal authority in these proceedings is prescribed by statute. My charge is to review the record to determine whether the evidence reasonably supports the aggregate measurable cost savings determined by the Board. This limited statutory jurisdiction over the Board’s cost savings determinations does not invest me with the powers of the judicial branch, in this instance to rule on the legality of substantive decisions made by the Board, a separate executive agency, under its separate statutory responsibilities. Thus, as in previous years’ proceedings, I confine my review to analyzing whether the amounts and methodologies used by the Board for determining aggregate measurable cost savings are reasonably supported by the evidence in the record. 
When claims are raised that the Board exceeded its authority or erred as a matter of law in determining the meaning of the phrase “aggregate measurable cost savings” and in creating a methodology to implement that interpretation, the task of resolving those claims is for the courts. See, generally, Maine Association of Health Plans v. Superintendent of Insurance, 2007 ME 69.
B. The Board’s Determination of Aggregate Measurable Cost Savings
To assist it in developing a methodology for calculating aggregate measurable cost savings, DHA retained the consulting firm of schramm-raleigh Health Strategy (“srHS”). The recommendations by srHS were presented in a document entitled Report to the Dirigo Health Agency, Dirigo Health Act: Aggregate Measurable Cost Savings (AMCS) for Year 4, Updated June 26, 2008 (the “srHS Report”). The srHS Report determined the aggregate measurable cost savings to total $190.2 million.2 The Board adopted all of the srHS savings initiative categories and the finding that this year’s calculations required no overlap adjustment, but rejected aspects of the savings calculations from the hospital and uninsured / underinsured initiatives, thereby approving a savings amount of $149.6 million.
1. Hospital Savings Initiatives. Board determination: $119.4 million. Amount the Superintendent finds reasonably supported by the evidence: $40 million.
The hospital savings initiative component of the Board’s filing seeks to measure savings resulting from the cost containment initiatives established in connection with the Dirigo program. This requires a comparison of actual cost levels to an estimate of what would have occurred had the Dirigo program not been implemented – a concept referred to by researchers as “the counterfactual.” Expressed in mathematical terms, Savings = Counterfactual – Actual, and the principal task at hand is to review the data on the record and the associated calculations of the counterfactual. Such a task is inherently difficult and imprecise. Perhaps for that reason, the standard of review the Superintendent is required by law to apply is whether the evidence on the record reasonably supports the Board’s findings. By necessity, the Superintendent’s task in this proceeding becomes determining whether a reasonably supported counterfactual is provided in the information placed on the record. Such a standard by its very nature implies that significant judgment is involved in making the determination of the reasonableness of the savings amount. No single formula or model should be expected to produce an answer. The totality of the evidence must be reviewed and a judgment made as to whether the recommended savings have reasonable support.
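For illustration only, the savings identity can be expressed in a few lines of Python; every dollar figure and the growth rate below are hypothetical and are not drawn from the record:

```python
# Toy illustration of the identity Savings = Counterfactual - Actual.
# All figures below are hypothetical and not taken from the record.
actual_cost = 2_000_000_000       # observed statewide hospital cost
baseline = 1_850_000_000          # cost in the last pre-Dirigo year
assumed_growth = 0.09             # assumed growth rate absent Dirigo

# The counterfactual is what costs would have been without the program.
counterfactual = baseline * (1 + assumed_growth)
savings = counterfactual - actual_cost
print(f"estimated savings: ${savings:,.0f}")
```

The entire difficulty of the exercise lies, of course, in justifying the assumed counterfactual growth rate, which is precisely the question the remainder of this section addresses.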
In the proceeding, the inherent difficulty of the task has been exacerbated by incomplete documentation of the data, methods, interpretations, and assumptions made by DHA and its consultant. The payor intervenors argue that the Board exceeded the discretion granted to it by statute by relying heavily on recommendations from DHA and its consultant that were based on poor practices and highly questionable methodology. The indiscriminate nature of the payor intervenors’ criticisms, however, has only served to muddy the waters further, as many of their charges have been made without substantiation for their materiality or their applicability to the measurement task at hand, disregarding readily available information that would allow them to assess the degree to which specific criticisms, if true, would actually affect the estimated savings. Furthermore, many of the attacks leveled at DHA and its Board by the payor intervenors are, in effect, criticisms of the law, which both the Board and the Superintendent of Insurance are obligated to follow.
As a result of the difficult statutory task and the manner in which it has been conducted by the parties to the proceeding, the record is composed of a myriad of dramatically varying assertions, in some cases unsupported, in others supported with difficult-to-interpret technical details. Fortunately, a significant amount of the data required to assess the level of savings is contained in the record, and the Superintendent and her staff and consultants have been able to work through this information to make judgments about the degree to which the recommended savings find reasonable support.
The “CMAD” Approach
In the SOP proceedings, the estimates of the savings related to voluntary cost controls by hospitals have centered around the concept of a cost or expense per case-mix adjusted discharge. The numerator of this measure is an adjusted version of total hospital expense, and the denominator is an estimate of hospital services provided, expressed in patient discharge equivalents. That denominator is called “case-mix adjusted discharges,” or CMADs, and consists of the sum of inpatient discharges (adjusted by an index of inpatient case mix intensity) and an estimated discharge equivalent for outpatient service activity (defined as outpatient charges divided by the quotient of inpatient charges and inpatient admissions). In many cases the expense per CMAD (total expenses divided by CMADs) has been referred to as “CMAD,” confusing the output denominator with the expense per output measure that is the focus of the analysis. In an effort to maintain this distinction and clarify the analysis, in this decision expense per CMAD will be labeled ECMAD.
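The arithmetic described above can be sketched as follows; the function names and the single-hospital figures are hypothetical, supplied only to make the formula concrete:

```python
def cmads(ip_discharges, case_mix_index, ip_charges, op_charges):
    # Case-mix adjusted discharges: case-mix-weighted inpatient discharges
    # plus an outpatient discharge equivalent (outpatient charges divided
    # by the average inpatient charge per discharge).
    outpatient_equivalent = op_charges / (ip_charges / ip_discharges)
    return ip_discharges * case_mix_index + outpatient_equivalent

def ecmad(total_expense, ip_discharges, case_mix_index, ip_charges, op_charges):
    # Expense per case-mix adjusted discharge (ECMAD).
    return total_expense / cmads(ip_discharges, case_mix_index,
                                 ip_charges, op_charges)

# Hypothetical single-hospital year: $100M total expense, 10,000 inpatient
# discharges at case-mix 1.2, $80M inpatient and $40M outpatient charges.
example = ecmad(100_000_000, 10_000, 1.2, 80_000_000, 40_000_000)
print(f"ECMAD: ${example:,.0f}")
```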
The payor intervenors have raised the issue as to whether the definition of ECMAD as used in the srHS Report, which is based on the same formula used in prior measurable cost savings proceedings, follows the statutory definition, as set forth in the “Act To Implement Certain Recommendations of the Commission To Study Maine's Community Hospitals,” P.L. 2005, ch. 394, § 4(1)(B).3 The difference is primarily in the manner in which outpatient activity is converted to a discharge equivalent, with the statutory definition relying on inpatient and outpatient revenue rather than inpatient and outpatient charges. No information has been placed on the record containing the required components of separated inpatient and outpatient hospital revenue, and so the impact of including such a measure cannot be determined from the record. Thus, as a factual matter, the srHS formula for calculating ECMAD is a measure of hospital costs that is reasonably supported by this record; the question of whether it is inadmissible as a matter of law is outside the jurisdiction of the Superintendent.
Analytically, revenue is a measure preferable to charges since it is less subject to distortions related to changes in charge levels (i.e., nominal prices) that are not directly related to changes in actual cost or revenue of the hospitals.4 However, this concern is less relevant for this proceeding given that the data and analysis on the record include comparison to ECMADs in other states. To the extent that phenomena like increasing charge levels influence the measurement of ECMAD, the impact on the analytical results will be much less material in a multistate analysis than in a Maine-only analysis, as long as these factors change in similar ways over time across states.
Although the use of ECMAD as the relevant measure was established early in the process that surrounded the formulation and implementation of the Dirigo Act, it is certainly not the only measure of hospital costs that could be analyzed. Total cost, for example, is relatively simple and objective. Each has its advantages and disadvantages. A per-output measure of cost such as ECMAD has the advantage of having its variable cost component less affected by extraneous factors affecting volume, such as population growth. It has the disadvantage of rising and falling due to the impact of these same factors on the volume over which fixed costs are spread. Total cost can grow or shrink in ways that are largely volume driven but are not affected by distortions in output measurement. Per capita measures might be preferable to both total cost and per-output cost. However, ECMAD is the only measure that has been analyzed on this record. The payor intervenors have disputed this measure but have not placed an analysis using different measures on the record.
Clarifying the Question to be Analyzed
It is important to define exactly what question must be answered in assessing AMCS attributed to the hospital savings initiative, particularly because many of the assertions made in the proceeding appear to be focused on different questions. Given that hospital costs and costs per CMAD have continued to rise by all measures over the period in question, determining the counterfactual cost figure (against which savings are measured) requires assessing how much more they would have grown in the absence of Dirigo.
Further, any cost reduction due to the voluntary cost control efforts related to Dirigo will be occurring simultaneously with changes in the rate of growth in hospital costs due to other factors (the technical term is the “secular trend” in cost), and at a theoretical level this secular trend could be either increasing the rate of growth in costs or decreasing the rate of growth in costs. Thus it is possible that any measured reduction to the overall rate of growth in Maine hospital costs over time overstates or understates the impact of Dirigo, depending on the size of the Dirigo-related cost impact and the size and direction of the secular cost trend. One important task, then, is to in some way attempt to separate the secular trend from the impact of Dirigo.
One can imagine polar-opposite scenarios about the relationship of secular trend and the impact of Dirigo. It could have been that cost growth in Maine was the highest in the country but still lower than it would have been in the absence of Dirigo. It also could have been that Maine hospital cost growth had dropped dramatically and by more than in any other state but that this drop was not at all attributable to Dirigo. DHA argued that all reductions in the rate of cost growth are attributable to Dirigo, while the payor intervenors argued that the existence of reductions in the rate of cost growth in other states implies that reductions in cost growth in Maine cannot have been attributable even in part to Dirigo. Neither is necessarily true, and they cannot both be true; further evidence and further analysis are essential. Increases or decreases in cost do not by themselves answer the central counterfactual question: at what rate would hospital costs in Maine have grown in the absence of Dirigo?
The Superintendent’s Prior Decisions and Guidance
Because counterfactual estimates are inherently difficult, complex, and require multiple judgments and assumptions, the quality of those estimates depends on the quality of the information presented. In order to constitute valid “measurable cost savings” for purposes of these proceedings, it is not enough for savings to exist. They must be identified and measured in some comprehensible and meaningful way. The existence of savings is important, but equally important to the savings determination is the quality of the information placed on the record to support the existence of those savings. Shortcomings in this regard have been a consistent characteristic of the cases made in each year’s proceeding.5
Notwithstanding these shortcomings, the parties have made many assertions about what prior findings of savings imply for the determination of savings in 2007. However, the level of savings approved in prior years has no direct relevance to the degree to which reasonable evidence exists on this year’s record to substantiate savings. The savings amounts approved by the Superintendent in prior proceedings are not independent best estimates; they are judgments as to whether the determination of the Board was reasonably supported by the evidence on the record, in whole or in part. In any of the prior years, it is possible that there existed real savings that were not recognized because insufficient evidence appears on the record to meet the reasonableness standard, either because the evidence demonstrating those savings was not yet in existence, because the evidence had not been introduced into the record, or because its full implications had not been adequately understood at the time. As a result, any assertion that amounts approved for prior years represent a natural limit on amounts that can be approved in the current proceeding is unfounded.
Additional assertions have been made in this year’s proceeding about the guidance provided in the Superintendent’s prior decisions, and what methodologically should or should not have been done to adhere to that guidance. In prior years, DHA has put forward a method to calculate hospital savings that relied on the series of Maine-specific ECMAD numbers in the three years prior to Dirigo’s implementation, comparing the average growth in that period to the average growth in the post-Dirigo period. The method made one minor adjustment acknowledging the need to control for secular trend, an adjustment for the rate of growth in hospital input prices in each year.6 In last year’s decision the Superintendent made the point that the rate of growth in Maine-only hospital costs in the period ending in 2003 was more relevant for its expected (counterfactual) rate of growth in 2004 than it would be for its expected rate of growth in 2007. This has been misinterpreted to mean that the pre-Dirigo period in Maine is irrelevant or necessarily misleading for the assessment of 2007 savings. The pre-Dirigo period declines in direct importance over time, but was neither the only relevant information in 2004 nor irrelevant in 2007. The further in time we are from the base period ending in 2003, the less the Maine base period information can be relied upon to make the assessment, and the more need there is for comparative information from other states and other cost-influencing factors.
An example illustrates this point. Imagine two states, State A and State B, with annual rates of growth as indicated in Table 1 below.
If we know that specific steps were taken to control costs in State A beginning in Year 4 and have additional independent evidence of its impact, the comparison of State A’s change in cost growth in the first three years is a more appropriate comparison to the fourth year growth in State A than to the rate of cost growth in State B in the fourth year.
Now, imagine the passage of three more years.
In the seventh year, the Year 1 through Year 3 experience in State A is less relevant and informative about whether the level of cost growth is lower than the counterfactual than it was in the fourth year. With the passage of time, the comparison to what has happened in other states becomes more important. In the example, the increases in State A may still reflect restrained growth if the growth in the comparison group has been even larger. Thus, a multistate comparison becomes more appropriate and important, because the impact of other forces reflected in the secular trend has more time to affect the cost growth in State A and can be controlled for partially by introducing the comparison. This comparison is further strengthened if we introduce additional information about cost drivers into the analysis. For example, if we knew that inflation had driven wage rates up dramatically in Years 5 through 7 in State B but not in State A, the interpretation of the comparison would change. This illustrates how a multivariate analysis helps to control for the secular trend. If we have a number of factors that can affect cost growth (which we do), then regression analysis is a framework that allows one to simultaneously control for the many variables that affect cost growth. If one is compelled by law to assess the counterfactual related to the Dirigo implementation several years after the implementation has begun, one can partially address the increasing effect of secular trend on the cost growth levels by using a multistate, multivariate regression framework.
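The logic of netting out the secular trend reduces to a difference-in-differences calculation. The growth rates below are hypothetical, chosen only to illustrate the subtraction; they are not the Table 1 figures or anything from the record:

```python
# Hypothetical annual cost growth rates (percentage points); illustrative only.
state_a_pre = [8.0, 8.0, 8.0]    # State A, Years 1-3 (before cost controls)
state_a_post = 6.0               # State A, Year 4
state_b_pre = [8.0, 8.0, 8.0]    # State B, Years 1-3
state_b_post = 7.5               # State B, Year 4 (secular slowdown only)

# A single-state view attributes State A's whole slowdown to the initiative.
naive_effect = sum(state_a_pre) / 3 - state_a_post

# Difference-in-differences nets out the secular change observed in State B.
secular_change = sum(state_b_pre) / 3 - state_b_post
did_effect = naive_effect - secular_change
print(f"naive: {naive_effect} pts; net of secular trend: {did_effect} pts")
```

A regression with many control states and additional cost-driver covariates generalizes this same subtraction across multiple variables at once.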
So, the guidance in last year’s decision was not and is not intended to have the experience before Dirigo in Maine ignored, but rather to have it supplemented by other information, which is now increasingly important in the assessment framework. The multistate, multivariate regression approach also has the advantage of making some of the shortcomings associated with ECMAD as the basis for the hospital cost growth calculations less important. Any factors which are changing in Maine that affect the estimated rate of growth in costs, but which also have a similar impact on other states’ measured cost growth, or which are controlled for by variables in the regression analysis, will have much less of a distorting effect in the multivariate regression analysis.
Having described the advantages of a multivariate regression, it is necessary to caution that, as with any other tool, its advantages are only realized if it is designed and executed appropriately and well. The lack of reliable documentation, and the lack of evidence in the record that the analysis was executed in accordance with sound professional principles, make the analysis inconsistent with the Superintendent’s guidance; it therefore does not provide the same degree of benefit as an analysis supported by documentation demonstrating that sound methods and reasoning were employed.
Carrying Out the Analysis and Documenting Analytical Evidence for Savings
Analyzing the change in ECMAD over time nationally requires voluminous and detailed source data, and numerous steps to clean, aggregate, calculate, analyze, and interpret those data. To provide clear evidence that would enable an objective assessment of the conclusions reached, it is essential to document all of these steps clearly, and the reasoning used to draw the conclusions. As in prior years, however, there are deficiencies in the completeness and clarity of the record before us. In the Superintendent’s hearing questions, it was requested that DHA point to those places in the record at which these types of information had been provided. While the response to this question clarified the steps taken by DHA’s consultant, it also makes clear that the information provided on the record by DHA could have and should have been more complete. For example:
However, it is not only the information provided by DHA that has been found incomplete and lacking in transparency. The payor intervenors have also in many instances failed to substantiate their own arguments. For example:
In short, both DHA and the payor intervenors have omitted important information and introduced extraneous and misleading information, the net effect of which is to make more difficult the already difficult task of determining AMCS related to the hospital savings initiative.
As in past years, the Superintendent has gone through the task of reviewing the record, and has evaluated the underlying data placed on the record and methods used so as to illuminate that which is not illuminated in the written portion of the record, to the extent possible, and to evaluate the assertions made by both parties that are contained in the written record.
Issues with the Data Used to Perform the Analysis
A central distinction between the analysis done for this year’s proceeding and those done in prior years is the use of a national data set. This difference required the use of the national Medicare Cost Report data set obtained (indirectly through AHD) from the CMS, representing over 4,000 hospitals, rather than using the detailed actual cost reports from the 36 hospitals located in Maine. The careful review efforts made by the payor intervenors in prior years were not feasible given this choice.
The payor intervenors’ witnesses testified to the data anomalies found in the srHS MCR data, but did not provide any evidence on the record as to the impact these data issues might have on the estimates produced by srHS. Given the size of the dataset and the number of observations, DHA’s position that these issues are not material to the results is plausible, but not dispositive based on this record.
There are two specific issues that definitely merit further investigation due to their clear potential for significantly influencing the AMCS results of the analysis. The first of these issues is the observed growth in ECMAD for Maine hospitals between 2000 and 2001 in this year’s analysis, a key element in the calculation of the counterfactual “baseline” rate of growth. The figure used in this year’s analysis was 11.6%, as compared to the 4.7% figure used in prior years’ analyses. This more than doubling of the 2000-2001 growth rate, in data that is sufficiently mature that it should be essentially static, is troubling. DHA’s consultants ascribe this change vaguely to the change to the national data source described above. It is a serious shortcoming in their analysis that they have not analyzed this significant inconsistency in their new data source and provided an explanation and/or correction, especially in light of the dramatic impact the new figure has on the results, as discussed below. This flaw raises serious questions about the credibility of the entire analysis.
The Superintendent will take official notice of the record of the Year Two proceeding for the limited purpose of providing some explanatory background as to the likely source of the difference in the year-to-year growth rate in ECMAD. A comparison of the figures in the Year Two record with the corresponding data used in this year’s srHS analysis elicits at least two important differences in the data. One difference is that the analysis conducted in this year’s process by srHS apparently, for the first time, calculated ECMAD on the basis of discharge totals that excluded newborns, although the documentation provided by DHA for Year Four provides no information substantiating whether that is the case. The other difference, with a more significant impact on the 2000-2001 growth rate, is that the number of discharges (without newborns) in the current year’s data file for 2000 is significantly higher than the comparable number in the Year Two file – 146,561 as presented this year vs. 142,061 in the Year Two analysis. This significantly larger inpatient discharge number by itself depresses the cost per CMAD, and this effect is magnified by the role it plays in the outpatient equivalent portion of the CMAD formula: Outpatient Charges / (Inpatient Charges / Inpatient Discharges). The (Inpatient Charges / Inpatient Discharges) figure is reduced, increasing the value of the “outpatient discharge equivalent” because these vary inversely. Since both the inpatient discharges and the outpatient discharge equivalents are larger, the total CMAD (adjusted discharges) figure is much larger, reducing the ECMAD (expense per adjusted discharge) figure for 2000 and increasing the rate of growth between 2000 and 2001.
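The mechanical effect of the larger discharge count can be verified directly. In the sketch below, the two discharge totals are those discussed above, while the expense, charge, and case-mix figures are hypothetical placeholders; only the direction and proportional size of the effect matter:

```python
def ecmad(expense, ip_discharges, case_mix_index, ip_charges, op_charges):
    # Expense per case-mix adjusted discharge, per the CMAD formula.
    outpatient_equivalent = op_charges / (ip_charges / ip_discharges)
    return expense / (ip_discharges * case_mix_index + outpatient_equivalent)

# Hypothetical statewide totals; only the two discharge counts are the
# figures discussed in the text.
EXPENSE = 1_600_000_000
CMI = 1.0
IP_CHARGES = 2_400_000_000
OP_CHARGES = 1_200_000_000

e_year_two = ecmad(EXPENSE, 142_061, CMI, IP_CHARGES, OP_CHARGES)
e_current = ecmad(EXPENSE, 146_561, CMI, IP_CHARGES, OP_CHARGES)
print(f"Year Two discharge count: ECMAD ${e_year_two:,.0f}")
print(f"current-file discharge count: ECMAD ${e_current:,.0f}")
print(f"reduction: {100 * (1 - e_current / e_year_two):.1f}%")
```

Because both CMAD terms are proportional to inpatient discharges when the other inputs are held fixed, the roughly 3% higher discharge count lowers the 2000 ECMAD by roughly 3%, which in turn inflates the measured 2000-2001 growth rate.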
As discussed above, the original CMS cost report dataset is not on the record, and so it cannot be confirmed from the record whether the difference in the discharge totals are due to a revision in the CMS file, or whether some issue with the manipulations performed by AHD and/or srHS in some way caused an error in the data.
The second significant issue with the data is the manner in which the incomplete data for 2007 were adjusted to reflect missing data. Due to differences in hospital fiscal years and annual filing dates, 14 of the Maine hospitals (and a much larger number of non-Maine hospitals) have only partial-year cost report information for SFY 2007. The missing data were filled in by srHS by computing a pro-rated and trended total for each component of each hospital. However, projecting each component of ECMAD separately may lead to inappropriate values. What is clear is that the overall ECMAD for Maine hospitals in 2007 is quite different depending on which method is used to compute an approximation of a complete year. The ECMAD figure used in the srHS analysis for Maine is $7,757, which represents an average annual rate of growth compared to 2003 of 4.5%. Altering the method to use a projection of the total ECMAD amount, rather than projecting the components and recalculating, produces an ECMAD of $7,815 or an average annual rate of growth since 2003 of 4.7%. Using the raw 2007 data (with partial year data for some hospitals) to calculate the ECMAD produces a value of $7,992, or 5.3% average annual growth since 2003. Below, we calculate the sensitivity of the AMCS to different assumptions about the 2007 hospital costs.
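The three growth figures can be reproduced from the stated 2007 values. The 2003 base of roughly $6,505 used below is not a record figure; it is backed out from the 4.5% result and serves only to check internal consistency:

```python
def annualized_growth(end_value, start_value, years):
    # Compound average annual growth rate between two values.
    return (end_value / start_value) ** (1 / years) - 1

BASE_2003 = 6_505  # implied 2003 ECMAD, an assumption for illustration
for label, value_2007 in [("components projected separately", 7_757),
                          ("total ECMAD projected", 7_815),
                          ("raw partial-year data", 7_992)]:
    rate = annualized_growth(value_2007, BASE_2003, 4)  # 2003 to 2007
    print(f"{label}: {100 * rate:.1f}% per year")
```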
As discussed in the next section, conducting an analysis of the sensitivity of the results to the issues associated with the 2000 and 2007 ECMAD values is important to arriving at a judgment about the level of savings generated by the hospital savings initiative.
Issues with the Regression Analysis
The payor intervenors asserted a number of flaws in the regression analysis, but many of those assertions were not tested for their impact on the results. Appendix 1, at the end of this section of the analysis, is a table that summarizes the objections and evaluates the merits of the criticisms. Because neither DHA nor the payor intervenors provided information for the record about the actual importance of the issues that might affect the validity of the regression results, or their impact on the estimates, the Superintendent and her consultant, as part of the review of the record, have examined the data on the record to arrive at independent judgments about the severity and impact of these issues.
To quantify the impact of Dirigo on hospital costs, srHS used nationwide hospital-level data on costs and the determinants of costs for 2000 through 2007. This information is also aggregated and adapted to state-level “virtual hospitals.” They estimate linear (ordinary least squares) regressions modeling ECMAD as a function of key cost determinants and a linear time trend (variable Y). Their model specification allows the trend to vary between Maine and other states (in both intercept [variable M] and slope [variable M:Y]) and between the periods before and after 2003, when Dirigo’s effects were presumed to begin (intercept effect: variable D; slope effect: variable D:Y). Finally, they include regressors in their model specification that allow the post-2003 portion of the linear cost trend to vary between Maine and the control states (intercept effect: variable M:D; slope effect: variable M:D:Y).
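The specification just described can be illustrated with a small, self-contained sketch. The data below are synthetic and simplified (the state count, cost levels, noise, and the omission of the control variables are all invented for illustration); the point is only to show how the M, Y, D interaction structure isolates a Maine-specific post-2003 slope change.

```python
import numpy as np

# Minimal OLS sketch of the srHS-style specification on synthetic data.
# A true M:D:Y slope effect of -32 is built into the data generator, and the
# regression should recover a coefficient near that value.
rng = np.random.default_rng(0)

rows = []
for state in range(20):                      # state 0 plays the role of Maine
    for year in range(2000, 2008):
        M = 1.0 if state == 0 else 0.0       # Maine indicator
        Y = float(year - 2000)               # linear time trend
        D = 1.0 if year >= 2003 else 0.0     # post-2003 (Dirigo) indicator
        ecmad = 5_000 + 200 * Y + 300 * M - 32 * M * D * Y + rng.normal(0, 5)
        rows.append((ecmad, Y, M, M * Y, D, D * Y, M * D, M * D * Y))

data = np.array(rows)
y = data[:, 0]
X = np.column_stack([np.ones(len(data)), data[:, 1:]])  # intercept + 7 terms
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"estimated M:D:Y coefficient: {beta[-1]:.1f}")   # near -32
```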
In their U.S. Hospitals regression, the R2 statistic (the proportion of variation in costs explained by the variables in the model) is 0.43. They interpret the coefficient on the M:D:Y variable, -32.2, as indicating that in 2007 hospital costs in Maine were $32 per CMAD lower because of Dirigo. The p-value on this coefficient for the “one-sided test” (an estimate, based on some standard assumptions, of the probability that the coefficient is greater than zero) is 0.45, which is less than 50% but much higher than the significance threshold of 0.05 traditionally used by statisticians.
The payor intervenors argue that the high p-value demonstrates that the results of this regression analysis are so unreliable that the only conclusion that could be drawn from them is that there were no savings. What the p-value actually signifies, however, is that if this regression analysis were the only information we were relying on, it would be barely more likely than not that there were any savings at all – the margin of error around the “observed” $32.2 reduction in ECMAD is so high, based on the model’s own assumptions, that zero is well within that margin of error.
The fallacy in focusing on p-values in this situation is that p-values are only relevant if the “null hypothesis” is under serious consideration as a competing analysis of the observation. The null hypothesis is that the effect that is being measured does not really exist, and that the value obtained is entirely the result of random fluctuations in the measurement. A high p-value means the measurement is too close to zero for the observations and analysis to be sufficient, by themselves, to contradict the null hypothesis in any meaningful way.
In this case, however, there is enough other evidence that savings exist that the null hypothesis is not a stronger competing theory. What this means is that if the regression were properly specified and properly executed, and if it embodied the only useful information we had, then $32.2 per CMAD would be the single most likely estimate we had for the reduction in ECMAD.
However, that is very different from saying it is a reliable estimate. It might be the most likely figure, but a wide range of other figures are almost equally likely. The reason the p-value is so high, after all, is that the margin of error is so wide. When the Superintendent’s consultant tested a refined version of the srHS model under several different assumptions, the 95% confidence interval was always more than $350 million wide. That is to say, all of the savings calculations generated by that model have a margin of error of more than plus or minus $175 million at the 95% confidence level customarily used by statisticians. It must also be kept in mind that these statistical calculations are based only on a comparison between the observed data and the equation produced by the regression model; they assume that the data entered into the model are valid. The significant data quality concerns discussed earlier further reduce the usefulness of any estimate generated by the srHS model.
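The relationship between the reported one-sided p-value and the width of the margin of error can be illustrated with the U.S. Hospitals figures. The standard error used below is not on the record; it is backed out from the reported p-value under a normal approximation, so the resulting interval is an inference for illustration only.

```python
from math import erf, sqrt

def norm_cdf(z):
    # Standard normal CDF via the error function.
    return 0.5 * (1 + erf(z / sqrt(2)))

coef = -32.2        # M:D:Y coefficient, U.S. Hospitals regression
p_one_sided = 0.45  # reported probability that the true coefficient > 0

# Back out the implied standard error by bisection from
# P(coef > 0) = norm_cdf(coef / se). This SE is inferred, not reported.
lo, hi = 1.0, 1e6
for _ in range(100):
    se = (lo + hi) / 2
    if norm_cdf(coef / se) < p_one_sided:
        lo = se   # se too small: distribution too far below zero
    else:
        hi = se
print(f"implied SE ≈ {se:.0f}")

ci = (coef - 1.96 * se, coef + 1.96 * se)
print(f"95% CI ≈ ({ci[0]:.0f}, {ci[1]:.0f})")  # zero is well inside
```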
In the srHS Cluster 1 regression, the R2 statistic is 0.98. The coefficient on the M:D:Y variable is -185.4, and the p-value for the one-sided test that the coefficient is less than zero is 0.055, slightly above the standard criterion. Again, srHS interpreted the coefficient on the M:D:Y variable as indicating that hospital costs in Maine were $185 per CMAD lower than they would have been if not for the passage of the Dirigo Act. Finally, they combine the two estimates of Dirigo’s effect on hospital costs by assigning a weight of 75% to the US Hospitals estimate ($32) and 25% to the Cluster 1 estimate ($185), arriving at an estimate of cost savings of $70 per CMAD.
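The 75%/25% blending of the two coefficient estimates reproduces the $70 per CMAD figure:

```python
# srHS blending of the two regression estimates ($ per CMAD).
us_estimate = 32.2         # U.S. Hospitals model (reported as $32)
cluster1_estimate = 185.4  # Cluster 1 model (reported as $185)
blended = 0.75 * us_estimate + 0.25 * cluster1_estimate
print(f"blended estimate: ${blended:.1f} per CMAD")  # ≈ $70
```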
Starting with the srHS state-level data and a replication of the srHS results, the Superintendent’s consultant investigated several issues with the srHS approach using the data on the record. Investigating the potential impact of the issues identified requires using the data on the record to adjust the econometric modeling approach in several ways:
Results of Respecifying the srHS Regression Model
The results of these adjustments to the analysis conducted by srHS are presented in the following Table 3, comprised of three separate tables:
[See pp. 17 – 19.]
* Maine indicator, Year 2003 indicator, Year 2007 indicator, M*Yr03, M*Yr07
** Add control variables for Total Beds, Interns per Bed, Rural indicator, Days Medicare, Uninsured, Wage Index, Critical Access Hospitals, Days Medicaid, Percent Under 100% FPL, Teaching Status, All-Payer CMI.
Column A of Table 3 is based on the original srHS data, with descriptive computations and regression analysis results using the methodological modifications described above. Because we are using the srHS data, our calculations at this stage have not yet taken into account the impact of the data problems described above on the results. Addressing each block of column A separately:
In brief, making the analysis multistate reduces the estimated reduction in cost growth because cost growth has declined generally in the United States (but not by as much as in Maine), but adding controls for some specific factors that affect cost growth (e.g., rural indicator, percent Medicaid, etc.) increases the estimated reduction in cost growth. Roughly speaking, the additional control variables offset the impact of adding the multistate comparison in such a way that the results resemble the simple Maine-only trend comparison.10 This outcome was not knowable without actually performing the analysis, and the result can be viewed with greater confidence given the rigor of the approach, though the confidence is diminished by the problems with the quality of the underlying data.
The Column A regression-based estimate of cost reductions, using the original data, is $355 million. The 95% confidence interval on this result is between $162 million and $548 million. That is, the srHS data, unadjusted, produce a cost reduction estimate with a high degree of uncertainty as to the exact amount, but with a high degree of statistical significance that the reduction is very large. We can conclude from this that if the data could be relied upon, a methodologically sound regression analysis would have found significant and large cost reductions. This suggests that the Cluster models for estimating cost growth reductions may not have been significantly biased relative to a national sample. Why a national sample at the state level was not one of the analyses presented in the srHS Report is not known, nor is it known why srHS did not include in their report the detailed assessment of the Clusters that was found in the discovery materials.
Two things must be kept in mind about the results in column A of Table 3. First, as discussed under “Clarifying the Question to be Analyzed,” the finding of significant reductions in cost growth does not tell us to what degree those reductions are attributable to Dirigo. Second, the results as just described rely on the srHS dataset as calculated by them, and thus are affected by significant data quality problems. We first turn to the data quality issues and then discuss what can be reasonably supported by evidence on the record for savings related to Dirigo, given the cost reduction findings.
The Impact of the Key Data Quality Issues on the Results
There are important data issues and modeling issues that require sensitivity analysis in order to gauge the impact on the cost reduction estimate of uncertainties in the input data and of the assumptions embedded in the calculation of the counterfactual “without Dirigo” ECMAD value. Columns B through F in Table 3 repeat the analysis just described above for different combinations of adjustments related to the data quality issues discussed earlier:
The adjustments applied to the 2000 and 2007 values are not in any real sense “precise,” in that we do not know the correct amount by which to adjust the values without a thorough review of the source data beyond what has been placed on the record (i.e., careful construction of a cleaned data set from the Medicare Cost Report file). In addition, inserting values for a key variable (e.g., 2000-2001 ECMAD growth) without making a consistent edit to the source data across the entire U.S. source file may lead to results less significant than would be the case with application of consistent edits. It is clear, however, that the results are sensitive to data quality and that the record does not contain evidence necessary to resolve the data quality issues definitively.
Scenario F is the one that addresses the data issues most directly, adjusting the 2000-2001 cost growth number downward in a way that ignores the anomalous 2000 value and avoiding arbitrary adjustments/projections to the 2007 data by using the raw 2007 data. The point estimate for cost reduction produced in this scenario is $87 million, and its 95% confidence interval is the range between $270 million and negative $96 million.
Interpreting the Results
The preceding discussion developed an estimate for cost reduction in the post-Dirigo period that has the following properties:
Although this model, taken alone, would suggest that $87 million is the best estimate of post-Dirigo cost reductions, the broad confidence interval and the questions raised about the underlying data mean that a wide range of figures might be equally reasonable. Therefore, any other evidence in the record that would shed light upon cost reductions needs to be given significant weight.
There is one competing estimate on the record advocated by some payor intervenors. This estimate was developed by a consultant for the Maine Association of Health Plans, Jack Burke, FSA, MAAA of Milliman. Mr. Burke’s methodology, which was also used to develop a competing savings estimate in Year Three, used as its starting point the general methodology developed by srHS and approved by the Board in Years One through Three: calculating the counterfactual “without Dirigo” ECMAD estimate by trending the observed Maine ECMAD figures during the base period. However, Mr. Burke made significantly more conservative assumptions, yielding lower figures. In Year Three, Mr. Burke found estimated savings of $8.2 million (as compared to the srHS estimate of $88.4 million). This year, he found estimated savings of $21.2 million.
The flaws in the trending approach underlying the Burke estimate have been exhaustively documented in Years One through Three, and Mr. Burke himself did not offer that figure as a definitive or reliable estimate of the hospital cost savings. However, the srHS estimate also has little persuasive power, even after the best efforts made to correct for its deficiencies. The srHS estimates and the Burke estimate, notwithstanding their deficiencies, are the only measurements of aggregate hospital cost savings on the record.
As in years One through Three, DHA again failed to build a conclusive and comprehensive record. Some intervenors argue that the failure of either estimate to measure savings reliably should equate to a finding of no savings. However, there is persuasive evidence on the record that savings have been realized, even if the amount of those savings is difficult to calculate with precision. The Board’s finding must therefore be approved in part and my task is to determine what part can be reasonably supported by the evidence in the record.
Given the lack of definitive information, it is reasonable to treat the Burke model as a lower bound and the best available refinement of the srHS model as an upper bound. It would not be reasonable to give the srHS model equal or nearly equal weight with the Burke model, even with the corrections that have been made to the srHS model, because the conservative assumptions underlying the Burke model implicitly recognize a secular trend that the srHS model lacks, and because data quality concerns and the absence of documentation are pervasive in the srHS model. Therefore, I find it reasonable to give two-thirds weight to the Burke estimate of $21.2 million and one-third weight to the regression estimate of $87 million. Rounding the resulting weighted average down to the nearest $10 million, because attempts at greater precision are not reasonably supported in this record, produces an estimate of $40 million for hospital cost savings.
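The weighting just described works out as follows (amounts in $ millions):

```python
import math

# Two-thirds weight to the Burke estimate, one-third to the regression
# estimate, with the result rounded down to the nearest $10 million.
burke = 21.2       # lower bound ($ millions)
regression = 87.0  # refined srHS upper bound ($ millions)
weighted = (2 / 3) * burke + (1 / 3) * regression
approved = math.floor(weighted / 10) * 10
print(f"weighted average ≈ ${weighted:.1f} million; rounded down: ${approved} million")
```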
The greater weight given to the lower estimate can be corroborated by anecdotal evidence, and the persuasive power of anecdotal evidence is not insignificant in light of the low statistical credibility and limited explanatory power of the projections from the models presented by the parties. Largely absent from this record and from the records of the prior year proceedings is testimony or other information provided by the hospitals that are the subject of this savings initiative. With the relatively indirect method of measuring changes in publicly reported cost data being the primary analytical strategy, some specific information from the institutions that are the focus of the hospital savings initiatives would be very helpful. It should be the goal of any future activities aimed at measuring the impact of the hospital savings initiative to give due weight to reports of actual savings by the hospitals themselves, and to consider detailed surveying and/or testimony from these institutions about exactly what they are doing in response to Dirigo and what they estimate it to have saved and how.
One piece of evidence on the record from hospitals this year is DHA Exhibit 7, which contains a quote from a newspaper article authored by Elizabeth Mitchell, Senior Director of Public Policy at MaineHealth. The Exhibit contains the following specific quote:
This statement by a member of senior management at Maine’s largest hospital group (Maine Medical Center, Miles Memorial Hospital, Saint Andrews Hospital, and Stephens Memorial Hospital), is a strong assertion that Dirigo has influenced cost levels and hospital margins (via price reductions), and thus represents further evidence that the savings are real. In addition, this statement provides some insight into the problem of correcting for regression to the mean and identifying actual Dirigo savings, since it was based on the hospital’s own evaluation of its cost containment efforts rather than an observation of how observed trends compared to some hypothetical baseline.
On the other hand, this figure represents only one hospital group, albeit the state’s largest, and there is no evidence that it was calculated in any rigorous manner. The true amount might be lower because of the human tendency to overstate, or it might be higher because the $40 million claimed to have been passed along to payors is not necessarily the entire amount of measurable cost savings, or because some savings might have been inadvertently overlooked when compiling the report.
Therefore, the most that can be said about the MaineHealth report from a quantitative perspective is that it is not inconsistent with $40 million in annual statewide savings, when we consider the expectation that MaineHealth represents a major proportion of statewide savings and the expectation that more of the observed savings occurred in the third year because cost containment efforts in prior years would have residual effects in later years. For example, if $20 million of the savings were in 2007 and they represented half of the statewide savings, that would be $40 million statewide.
For these reasons, the Superintendent finds that the Board’s determination of Dirigo-related savings of $119 million is not reasonably supported by the evidence in the record, but that part of that estimate, in the amount of $40 million, is reasonably supported.
2. Uninsured / Underinsured Savings Initiatives. Board determination: $23.6 million. Amount the Superintendent finds reasonably supported by the evidence: $6.1 million.
The Uninsured / Underinsured savings initiative component of the Board’s filing seeks to quantify savings to the health care system that result from the increase in coverage that can be attributed to the Dirigo reforms. When Maine residents have health coverage, the providers of their medical care will be reimbursed for those services, which will reduce the amount of bad debt incurred and charity care provided. This will, in turn, reduce the pressure to shift these costs onto private payors and will help reduce the rate of premium increases. The savings due to these reductions in bad debt and charity care were determined by srHS to be $35.7 million. The Board adopted this amount in part and approved $23.6 million.
Savings Estimates in the First Three Years
In Years One and Two, DHA’s consultant developed estimates of these savings based on a process that used the total amount of bad debt and charity care reported by Maine hospitals as a starting point, and then developed various assumptions to estimate the reduction in this amount that could be attributed to enrollment of a portion of the uninsured and underinsured population into DirigoChoice and MaineCare.
In Year Three, srHS revised this approach and derived its estimate of this savings from the actual claim costs of members enrolled in Dirigo and in the MaineCare parents expansions. The result of this approach was characterized by srHS as an estimate of “new money to the healthcare system for those previously uninsured and underinsured that are now enrolled in DirigoChoice or in the MaineCare parents expansion programs.” This approach used the actual enrollment in these two initiatives and developed assumptions about factors that included the portion of the enrollment that was previously uninsured or underinsured, and their average claim costs, to arrive at an estimate of savings that could be attributed to the Dirigo initiative. The result of this analysis was an estimate of $14.0 million of savings. A consultant for the Maine Association of Health Plans, Jack Burke, FSA, MAAA of Milliman, developed an alternative estimate that preserved many of the srHS assumptions and incorporated several additional assumptions, including an assumption about the variable costs to providers of the additional health care services for these newly insured individuals. Mr. Burke’s analysis resulted in an estimate of $6.3 million. The Board adopted the $6.3 million estimate developed by Mr. Burke and the Superintendent found that determination reasonably supported by the evidence in last year’s Decision.
The New Methodology in Year Four
This year, srHS adopted a new approach to estimating the savings in this category. Their approach attempted to measure the full, global impact of Dirigo on the rate of uninsurance in Maine, rather than measure the impact of covering the specific populations participating in the Dirigo initiatives. The way srHS11 sought to capture this global impact was through a new methodology, supported by statistical regression models similar to those used to analyze the impact of the hospital cost savings initiative.
Three multistate multivariate regression models were developed by srHS to produce counterfactual estimates of what the rate of uninsurance in Maine would have been in the absence of Dirigo. The three models were a U.S. model, a Northeast model, and a Maine model. The regression’s control variables identified by srHS included:
The srHS Report assumed a pre-Dirigo period of 1999-2002 and used the models to estimate what uninsurance rates in Maine would have been during the post-Dirigo period in the absence of Dirigo. These counterfactual estimates were then compared to actual uninsurance rates in Maine to determine the impact of Dirigo, as in the methodology used for hospital cost savings. An additional estimation step was necessary to develop a figure for 2008, because 2006 was the most recent year for which all data were available. To project the Dirigo impact out to 2008 for the purpose of determining AMCS, srHS assumed that the actual 2002-2006 annual rate of change in uninsurance would persist to 2008. The counterfactual 2008 uninsurance rate for each regression model was similarly projected based on each model’s predicted annual rate of change from 2002-2006.
The 2008 counterfactual rates of uninsurance projected from the three regression models were compared to the projected “actual” Maine uninsurance rate for 2008 to arrive at three estimates of the reduction in the uninsurance rate that could be attributed to Dirigo. Each reduction was then multiplied by the projected 2008 Maine under-65 population to arrive at an estimated reduction in the number of uninsured that can be attributed to Dirigo. This estimate was then multiplied by $893, the estimated cost of uncompensated care per uninsured person, to arrive at an estimate of the total savings attributable to Dirigo in 2008. The results from the three srHS regression models can be summarized as follows:
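The final multiplication step can be reproduced using the Maine-model figures cited elsewhere in this Decision (a reduction of about 26,500 uninsured persons and an assumed $893 of uncompensated care per uninsured person); both inputs are from the record, although the srHS worksheets themselves are not reproduced here.

```python
# Savings = (reduction in the number of uninsured attributable to Dirigo)
#           x (uncompensated care cost per uninsured person).
reduction_in_uninsured = 26_500  # Maine model, per the Board's determination
cost_per_uninsured = 893         # 2005 per capita figure trended to 2008
savings = reduction_in_uninsured * cost_per_uninsured
print(f"estimated savings: ${savings / 1e6:.1f} million")  # close to $23.6M
```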
The srHS Report recommended a savings estimate of $35.7 million for 2008, which represents a 75%/25% weighted average of the U.S. and Northeast models. The report concluded that a weighted average was appropriate to minimize any potential distortion to the results that might arise from health care reform efforts in other Northeastern states. (R. at Tab 3-61, pp. 266-67).
Comments of Payor Intervenors
During the hearing before the Board and in subsequent briefs, payor intervenors raised several concerns about the methodology used by srHS, as follows:
In addition, Mr. Burke provided his own estimate based on the method approved in Year Three. He updated the numbers used last year based on data in this year’s record. (R. at Tab 1-29, Burke Exhibit 2, Attachment II). The resulting estimate of savings was $6.1 million.
Action by the Board
The Board rejected the $35.7 million savings amount recommended by srHS and instead determined that savings were best predicted by the Maine model, adopting $23.6 million. During their deliberations, several Board members expressed concerns about aspects of the srHS proposal, including the magnitude of the increase proposed by srHS relative to the amount allowed by the Superintendent last year, the absence of any adjustment for the impact of several past enrollment shocks, and the decision to adopt the changed methodology despite the absence of any direction from the Superintendent to do so. (R. at Tab 3-62, p. 6-19).
The payor intervenors object to the introduction of a multistate multivariate regression model. Although there was no direction to do so in last year’s Decision, there is no fundamental reason not to use this approach, so long as it is well specified, correctly executed, and well documented. However, as discussed below, that was not the case this year.
As pointed out by the payor intervenors, the new methodology resulted in estimated savings several times larger than the amount approved last year. A comment from the Year Three Decision was cited repeatedly in this year’s record. The Superintendent concluded last year that:
The payor intervenors have cited that comment as evidence that the amounts determined for bad debt and charity care savings this year by srHS and the Board are unreasonable. That is not correct. “Final test” referred only to its placement in the analysis and did not imply that this was the ultimate test. An amount significantly different from past approved amounts could be approved if there were adequate support in the record for that determination.
The fact that the new methodology resulted in significantly larger savings does not in itself invalidate the methodology, but it does highlight the need to check the accuracy and reasonableness of the results, both through empirical evidence and verification of the model’s specification. As explained in the Year Three Decision:
Checking for accuracy and reasonableness is always prudent, but is crucial when both the methodology and the results are significantly different from those in prior years. However, there is no evidence on the record that srHS performed any such checks. The srHS report attributed the large increase produced by the new methodology to the global approach used, stating:
However, no empirical evidence was provided to support this assertion. To check reasonableness, srHS could have investigated the sources of the change in uninsurance rates. An analysis of the extent of changes in the various categories of insurance (such as private coverage, MaineCare, Medicare, and military coverage) and the impact of Dirigo over time on enrollment in each category might be illuminating. Census data is available from sources cited in the srHS Report to support this type of analysis.
When DHA’s witness, Dr. Kenneth Thorpe of Emory University, was questioned about this topic at the hearing, the only impacts he cited beyond those considered in prior years were increased private health insurance due to lower rates and increased enrollment in MaineCare due to publicity associated with Dirigo. (R. at Tab 3-61, p. 320-321).
With regard to private health insurance coverage, there is no indication on the record that srHS looked at enrollment changes or at how those changes compare to other states. Furthermore, evidence on the record casts doubt on the existence of any increase in private coverage. The population data and uninsurance rates shown in the srHS Report (R. at Tab 4-64, p. 70) indicate a decrease of about 17,000 in the number of uninsured in Maine from 2002 to 2006. The MaineCare enrollment numbers provided by Mr. Burke (R. at Tab 1-29, Burke Exhibit 5) indicate an increase of about 52,000 from December 2002 to December 2006. Thus, increased enrollment in MaineCare is a more plausible explanation for the decrease in the number of uninsured than any increase in private coverage. Because only a part of the MaineCare increase can be attributed to the Dirigo Act, it is essential to use methods that can distinguish between Dirigo-related MaineCare enrollment and other factors that have led to increased enrollment.
One way to check the accuracy and specification of the regression models would have been to run the models for 2002 and earlier years to see how closely the results matched actual uninsurance rates in Maine. There is no indication on the record that this was done. Due to the lack of documentation and detailed explanations for the models, the Superintendent and her staff and consultants were unable to perform this check or to validate the results of the regression models from the information in the record. However, some indication of the validity of the models can be gleaned from analyzing the results for 2003.
The payor intervenors questioned the inclusion of 2003 in the post-Dirigo period. The Dirigo Act was not effective until late in the year and enrollment in DirigoChoice and the MaineCare parents expansion did not begin until 2005. On the other hand, health care reform was certainly in the public eye throughout 2003. The Superintendent is aware that the “sentinel effect” can affect behavior even before a new law takes effect. Nonetheless, while there may have been some “Dirigo effect” on the uninsurance rate in 2003, it is unreasonable to attribute all of the 2003 change to Dirigo. There was a major decline in the rate of uninsurance in 2003 that is largely explained by a Medicaid expansion that began in 2002. While it is possible that the 2003 enrollment in this expansion was slightly higher than it would have been in the absence of Dirigo, most of it likely would have occurred anyway. If the srHS models were well-specified, one would expect them to produce 2003 uninsurance rates close to or slightly above the actual value of 11.79%. Instead, they resulted in much higher values.
Accordingly, the results of these regression models are not sufficiently reliable to serve as a basis for estimating the reduction in the number of uninsured individuals in Maine that can be attributed to Dirigo.
In addition to these serious questions about the reliability of the models, there are also problems with how the model results were used. First, the 2006 results were projected to 2008 based on the average 2002-2006 rate of change. There was no analysis, however, as to whether factors affecting the uninsurance rate in 2003-2006 would be likely to continue, or whether other factors that were not present in those years might impact the rate in 2007 and 2008. Another problem with this projection is that it inflates the effect of any error or bias in the model itself.
Similarly, the actual 2006 uninsurance rate was projected to 2008 based on the average 2002-2006 rate of change, without considering whether factors affecting the uninsurance rate in 2007 and 2008 were likely to resemble those in 2003-2006. As noted by the payor intervenors, the decrease in the uninsurance rate from 2002 to 2006 resulted largely from three sources: the 2002 implementation of the MaineCare non-categorical adult expansion, the 2005 implementation of the MaineCare parents expansion, and the 2005 implementation of DirigoChoice. Payor intervenors argued that it is not reasonable to project similar decreases in 2007 and 2008 in the absence of further “shocks” to the system. In response, Dr. Thorpe testified that “the state is not planning on abolishing those programs.” (R. at Tab 3-61, p. 276).
In reviewing the record, one must consider, however, the impact on uninsurance trends from changes in enrollment in these programs, not just their continued existence. In the case of the 2002 MaineCare expansion, the record shows that enrollment peaked in early 2005 and has been declining recently. (R. at Tab 1‑29, Burke Exhibit 5). New enrollment in DirigoChoice has been severely limited due to funding issues. Therefore, it is not reasonable to project decreases in the uninsurance rate at the same pace as the earlier period.
As discussed above, the difference between the projected 2008 uninsurance rates with and without Dirigo was multiplied by an average per capita cost of care of $893, based on 2005 values from the report: “Paying the Premium, The Added Cost of Care for the Uninsured” (R. at Tab 4-65, Document 12), trended to 2008. Mr. Burke testified that a reduction to this amount is necessary because a portion of the cost of uncompensated care was actually paid by other entities and would not become bad debt or charity care incurred by providers. He derived a 36% reduction from Maine data at page 32 of the report, and no contrary evidence in the record has been cited. Furthermore, the report also states that the uncompensated care is based on “what the privately insured would pay, on average, in the state for the same health care services.” (R. at Tab 4-65, Document 12, p. 13). To the extent that a portion of the reduction in uninsured rates that can be attributed to Dirigo is related to increases in MaineCare enrollment, there should have been an additional reduction in the assumed per capita cost to reflect the fact that MaineCare reimbursements are lower than private health insurance reimbursements. There should also have been a small adjustment to reflect another payment category cited in the report, which was other non-patient non‑government sources of revenue, including philanthropy.
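Mr. Burke's 36% adjustment, applied to the $893 per capita figure, works out as follows; the resulting dollar amount is simple arithmetic from the record figures rather than a value stated in the record, and it does not reflect the further MaineCare and philanthropy adjustments discussed above.

```python
# 36% of uncompensated care is paid by other entities and so never becomes
# bad debt or charity care incurred by providers (Burke, from Maine data).
per_capita = 893                     # per capita cost, 2005 trended to 2008
adjusted = per_capita * (1 - 0.36)   # Burke's 36% reduction
print(f"adjusted per capita cost: ${adjusted:.2f}")
```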
Also, the per capita estimates derived from this report were derived from estimated uncompensated care for an assumed 161,000 uninsured Maine residents in 2005. (R. at Tab 4‑65, Document 12, p. 8). According to the srHS Report, the actual number of uninsured Maine residents was about 130,000 in 2005. (R. at Tab 4-64, p. 70). This large difference casts further doubt on the validity of the assumed $893 per capita cost.
The determination of $23.6 million for savings due to the Uninsured / Underinsured savings initiatives is based on an average per capita cost of care of $893 and a reduction of about 26,500 in the number of uninsured Maine residents that can be attributed to Dirigo. Neither assumption is reasonably supported by the evidence in the record. While it may be possible to approximate a reasonably supported amount for the average per capita cost from the record, it is not possible to do so for the number of newly covered Maine residents that can be attributed to Dirigo unless the srHS methodology is discarded and Mr. Burke’s analysis is utilized.
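The Board's figure follows directly from the two assumptions just described. A minimal arithmetic sketch, using the approximate reduction of 26,500 stated above (the rounding accounts for the small difference from the Board's $23.6 million):

```python
# Check of the Board's Uninsured / Underinsured arithmetic:
# roughly 26,500 newly covered residents times the $893 average
# per capita cost of care (2005 values trended to 2008).
per_capita_cost = 893       # dollars, from the record
newly_covered = 26_500      # approximate reduction attributed to Dirigo
savings = per_capita_cost * newly_covered
print(f"${savings / 1e6:.1f} million")  # roughly the Board's $23.6 million figure
```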
Mr. Burke’s estimate of $6.1 million is based on enrollment of 12,050 in Dirigo and 5,600 in the MaineCare expansion. (R. at Tab 1-29, Burke Exhibit 2, Attachment II). From data in Table 1 (p. 70 of the srHS Report), it can be determined that the actual uninsured counts are 139,000 in 2002, 127,000 in 2003, and 122,000 in 2006. Given the magnitude of these changes, it appears that the enrollment assumed in the Burke report provides a credible explanation for many of the changes. The limitations of this approach, however, must be kept in mind, as it is not clear that Mr. Burke fully addresses the impact of providing formerly underinsured people with better coverage through MaineCare or through DirigoChoice. This methodology also fails to consider the potential effects of carriers holding down rates and adjusting their product designs to compete with DirigoChoice, enabling additional people to avoid the ranks of the uninsured and underinsured. However, there is a lack of evidence in this record demonstrating that those effects have been appreciable in this market at this time. Thus, despite the deficiencies observed in both the srHS and Burke models, the Burke estimate of Uninsured / Underinsured savings is the only one that is reasonably supported by this record.
Accordingly, the Superintendent finds that the Board’s estimate of $23.6 million is not reasonably supported by the evidence in the record, and that the part of that estimate that is reasonably supported by the evidence in the record is $6.1 million, as reflected in Mr. Burke’s estimate.
3. Medical Loss Ratio. Board determination: $6.6 million. Amount the Superintendent finds reasonably supported by the evidence: $6.6 million.
This year’s Board filing included, for the first time, an initiative referred to as Medical Loss Ratio, or MLR. One component of the Dirigo reform legislation was greater oversight over small group pricing and loss ratios. Dirigo legislation requires small group insurers either to file rates for prior approval by the Superintendent, or to maintain a loss ratio of at least 78% over any 36-month period and to refund any excess premiums to policyholders if needed to achieve the target loss ratio. 24-A M.R.S.A. § 2808‑B(2‑C), enacted by P.L. 2003, ch. 469, § E‑16.
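One way to read the refund mechanism described in the statute: a refund is whatever amount, subtracted from premium, brings the 36-month loss ratio up to the 78% target. The sketch below is an illustrative reading of that description, not the Bureau's prescribed calculation, and the dollar figures are hypothetical.

```python
# Illustrative reading of the 78% loss-ratio refund mechanism in
# 24-A M.R.S.A. sec. 2808-B(2-C). All figures are hypothetical.
TARGET_LOSS_RATIO = 0.78

def refund_needed(premium: float, claims: float) -> float:
    """Refund that raises the 36-month loss ratio to the 78% target.

    Solves claims / (premium - refund) = 0.78 for refund, and
    returns 0 when the target is already met.
    """
    return max(0.0, premium - claims / TARGET_LOSS_RATIO)

# Hypothetical block: $100M premium, $70M claims (a 70% loss ratio).
print(refund_needed(100e6, 70e6))  # about $10.26M owed to policyholders
# A block already above 78% owes nothing:
print(refund_needed(100e6, 80e6))  # 0.0
```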
For the first time, a refund has been paid under the terms of this legislation. The srHS Report includes a report from Aetna Life Insurance Company showing a premium refund of about $6.6 million payable in early 2008, resulting from loss ratios for the three years ended June 30, 2007. The srHS Report has asserted that this is a savings that can be attributed to Dirigo and has recommended the inclusion of this amount in the Year 4 determination. The Board has agreed and included this amount.
The payor intervenors have raised several objections to the inclusion of this amount by the Board. One issue they have raised is that this refund is payable to employers, not health care providers, and is therefore not a “savings to the Maine health care system.” Another issue raised is recoverability: because the refunds were paid to employers, they will not benefit health care providers. (Chamber brief, August 26, 2008, pp. 36-37). In addition, there was discussion in the pre-filed testimony of Mr. Burke of the possibility that insurers would include a risk charge component in their pricing to account for the additional expense of the premium refund that would be incurred in times of favorable claim levels. (R. at Tab 1-29, p. 11). Such a risk charge would offset the savings enjoyed by employers when loss ratios fell below the mandated threshold. However, there was no evidence in the record that any small group carriers in Maine had included such a risk charge in their pricing.
Any refunds provided pursuant to this provision of the Dirigo Act reduce the cost of health insurance for the employer groups that receive the refund. Indeed, the amount saved by these groups and the basis for the refunds in the Dirigo Act are not in dispute; the only question is whether savings to particular policyholders that do not necessarily have any connection with reduced costs of health care, and by their nature can never be recovered by insurers, are the kind of “measurable cost savings” contemplated by 24‑A M.R.S.A. § 6913(1)(A).12 That is a matter outside the Superintendent’s jurisdiction, for the reasons explained earlier in the general discussion of legal issues. The Board has already considered the issues raised by the payor intervenors, and has determined that the inclusion of the refund in its savings determination is appropriate.
Based on these considerations, the Superintendent finds that the amount of $6.6 million determined by the Board for the Medical Loss Ratio initiative is reasonably supported by the evidence in the record.
4. Overlap. Board determination: $0. Amount the Superintendent finds reasonably supported by the evidence: $4.0 million (this is a negative adjustment).
“Overlap” refers to the potential double-counting of savings among the various initiatives.
Overlap between CMAD and Uninsured / Underinsured Savings Initiatives
The srHS Report asserts that there is no overlap between the Hospital Cost and Uninsured / Underinsured savings initiatives because “the analysis for BD/CC includes only those costs, charges, and discharges that would have existed in the absence of Dirigo as well as in the presence of Dirigo.” (R. at Tab 4-64, p. 20). Payor intervenors dispute this. The Chamber argued that the global approach srHS used for Uninsured / Underinsured “swallows” both Hospital Cost and MLR savings. (Chamber Brief, p. 37). Since the Uninsured / Underinsured savings found reasonable in this decision are not based on the srHS methodology, this specific dispute is moot. The question that needs to be answered is whether there is overlap between the Hospital Cost and Uninsured / Underinsured savings found reasonable in this decision.
MEAHP witness Burke asserts that “Any reduction in bad debt at the hospital would in theory reduce the pressure on the cost per case for the remaining, paying customers, and would thus be reflected in the CMAD calculation.” (R. at Tab 1-29, Burke Exhibit 2, p. 10). Although the focus on customer impact suggests that Mr. Burke is discussing hospital charges, and the methodologies used in this case analyze the underlying costs, a substantially similar rationale applies to costs. If bad debt and charity care are included in hospital reports as “expenses,” then all reductions in hospital bad debt and charity care are included, dollar for dollar, within the overall reductions in ECMAD. This means that all savings in bad debt and charity care under the Uninsured / Underinsured savings were already captured within the Hospital Cost savings initiative, to the extent that those savings are attributable to hospitals. Because nothing on the record appears to refute this theory, there is overlap to whatever extent uncompensated care is reflected in the numerator of the ECMAD calculation. There is no indication in the record that the data received from AHD had been adjusted to remove uncompensated care revenue or cost.
When asked about this adjustment, Steven P. Schramm of srHS testified that he did not know whether it had been removed. (R. at Tab 2‑60, p. 125). The srHS Report lists further adjustments made to the data after it was received from AHD (pp. 44-45) but does not indicate any reduction to remove uncompensated care. Therefore, the record would not reasonably support any figure that fails to account for overlap between the Hospital Cost and Uninsured / Underinsured savings, to the extent that the Uninsured / Underinsured savings reflect hospital costs.
Mr. Burke cites sources indicating that 66% of uncompensated care is for hospital care, and they have not been contradicted by any of the parties. Id. Applying this factor to the $6.1 million of Uninsured / Underinsured savings results in an overlap of $4.0 million.
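The overlap figure is a one-step calculation from the two numbers just stated:

```python
# Overlap check: 66% of uncompensated care is hospital care, so
# that share of the $6.1M Uninsured / Underinsured savings is
# already captured within the Hospital Cost (ECMAD) savings.
hospital_share = 0.66
uninsured_savings_m = 6.1    # millions of dollars
overlap_m = hospital_share * uninsured_savings_m
print(f"${overlap_m:.1f} million")  # the $4.0 million negative adjustment
```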
Overlap between Uninsured / Underinsured and MLR Savings Initiatives
As discussed above, the Chamber’s argument that the global approach srHS used for Uninsured / Underinsured “swallows” both CMAD and MLR is moot. MEAHP argues that to the extent the MLR initiative reduces the cost of insurance, it would be reflected in the reduction in the number of uninsured. However, this does not demonstrate that Uninsured / Underinsured savings include any MLR savings. The MLR reflects savings to currently insured groups. To the extent that reduced insurance premiums resulted in newly insured people, that would reflect additional savings that are not included in the MLR component of the filing.
While the Uninsured / Underinsured savings do not include any MLR savings, the reverse is not necessarily true. The MLR savings result from the amount of medical claims for a block of small group health insurance policies being less than 78% of premium. To the extent that the amount of claims was reduced due to Uninsured / Underinsured savings that are passed on to insurers, part of the MLR savings could double-count the Uninsured / Underinsured savings, but only if insurers failed to reflect those savings when they set the premiums. Reducing premiums to reflect anticipated Uninsured / Underinsured savings is required by law, 24‑A M.R.S.A. § 6913(9), and if those savings are reflected accurately, the combined effect on the medical loss ratio of the reduced claims and reduced premiums would be neutral. The record includes no evidence to the contrary. Accordingly, there is no evidence of any overlap in this category.
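The neutrality point can be verified with simple algebra: if a block runs at loss ratio r = C/P and anticipated savings reduce claims by d while premiums are reduced by d/r, then (C − d)/(P − d/r) = r, leaving the loss ratio unchanged. A numeric sketch, with hypothetical figures:

```python
# If anticipated savings are priced into premiums accurately, the
# medical loss ratio is unchanged, so those savings generate no MLR
# refund and hence no overlap. Figures are hypothetical.
premium, claims = 100e6, 78e6        # block exactly at the 78% target
ratio = claims / premium             # 0.78
claim_savings = 5e6                  # claims reduced by anticipated savings
premium_cut = claim_savings / ratio  # premium reduction passing savings on
new_ratio = (claims - claim_savings) / (premium - premium_cut)
print(round(new_ratio, 4))  # 0.78 -- the loss ratio is unchanged
```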
Overlap between CMAD and MLR Savings Initiatives
Similarly, overlap could exist between CMAD and MLR if the MLR savings resulted in part from CMAD savings being passed on to insurers. However, here again, there would be no overlap if these savings were also reflected in premiums. Because there is no evidence on the record that any CMAD savings passed on to insurers were not reflected in lower premiums, there is no evidence of any overlap.
Based on this analysis, the Superintendent finds the Board’s determination that there is no overlap not to be reasonably supported by the evidence in the record, and finds instead that the evidence in the record reasonably supports a finding of overlap totaling $4.0 million.
By reason of the foregoing, the Superintendent ORDERS that the Dirigo Board’s determination of aggregate measurable cost savings is APPROVED IN PART and that $48.7 million of aggregate measurable cost savings approved by the Dirigo Board is found by the Superintendent to be reasonably supported by the evidence in the record.
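The approved total can be cross-checked against the component findings in this section. The Hospital Cost amount is determined earlier in the decision, outside this section, so the sketch below infers it as the arithmetic residual rather than quoting it:

```python
# Cross-check of the $48.7M approved aggregate against the component
# findings above. The Hospital Cost figure is inferred as a residual
# (it is found earlier in the decision, not quoted here).
total_approved_m = 48.7
uninsured_m = 6.1     # Uninsured / Underinsured (Burke estimate)
mlr_m = 6.6           # Medical Loss Ratio
overlap_m = -4.0      # Overlap (negative adjustment)
hospital_implied_m = total_approved_m - (uninsured_m + mlr_m + overlap_m)
print(round(hospital_implied_m, 1))  # implied Hospital Cost savings, in $M
```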
VI. NOTICE OF APPEAL RIGHTS
This Decision and Order is final agency action of the Superintendent of Insurance within the meaning of the Maine Administrative Procedure Act. Pursuant to Bureau of Insurance Rule 350, § 18, the time for appeal runs from the issuance of this Part II Decision and Order. Any party may appeal this Decision and Order to the Superior Court as provided by 24-A M.R.S.A. § 236, 5 M.R.S.A. § 11001, et seq., and M.R. Civ. P. 80C. Any such party must initiate an appeal within thirty days after receiving this notice. Any aggrieved non-party whose interests are substantially and directly affected by this Decision and Order may initiate an appeal within forty days after the issuance of this decision. There is no automatic stay pending appeal; application for stay may be made as provided in 5 M.R.S.A. § 11004.
1 The full complement of the Board consists of 9 voting members and 4 ex officio nonvoting members. 24‑A M.R.S.A. § 6904(1). Four of the voting positions, however, were vacant at the time of the agency’s decision; the members of the Dirigo Board at that time were: Robert McAfee, M.D., Chair; Jonathan Beal, Esq.; Edward David, M.D.; Mary Anne Turowski; Mary McAleney; Trish Riley ex officio, Director of the Governor’s Office of Health Policy & Finance; Rebecca Wyke ex officio, Commissioner of the Maine Department of Administrative & Financial Services; Anne Head ex officio, (then Acting) Commissioner of the Maine Department of Professional & Financial Regulation; and David Lemoine ex officio, Maine State Treasurer. Ms. Riley and Ms. Wyke recused themselves and did not participate in the deliberations.
2 The $190.2 million of aggregate measurable cost savings determined by the srHS Report consists of Hospital initiatives of $147.9 million, Uninsured / Underinsured initiatives of $35.7 million, Medical Loss Ratio of $6.6 million, with no reduction from Overlap.
3 A substantially similar definition has now been codified at 22 M.R.S.A. § 1721(1)(B), which took effect July 18, 2008, after the filing of the srHS Report.
4 Hospital charge levels are closely related to hospital revenue if, for example, the hospital is paid a percentage of its charges by a particular payor. In other cases, however, there might be no relation between revenue and nominal charge, as when the hospital is paid a predetermined fixed fee.
5 Given the sophistication and complexity of a multistate multivariate model, it is highly desirable for the Board to retain its own expert to help it analyze the validity of the results. Such additional steps would facilitate an early resolution of questions about such factors as assumptions, data, and methodology, and would enhance the record to allow for better public review.
6 Practically speaking, this adjustment had a very small effect on the answer, as input prices (wage rates, prices of supplies, etc.) are not a major determinant of hospital cost trends.
7 Microsoft Access, unlike some more full-featured database software, does not generate a record of manually changed database records.
8 This is implemented by estimating generalized linear models with a log link and gamma family distribution assumption.
9 In effect, even though multistate data are being used in the regression, this approach extrapolates the pre-Dirigo Maine-specific trend as the basis of the post-Dirigo cost savings.
10 The fact that the Cluster regressions produced similar results to our modified regression, even though the srHS specification is essentially based on a projection of Maine-only results, is due to the simple coincidence that the effect of the multistate adjustment roughly washes against the effect of introducing the control variables. There was no reason to know before performing the calculations that this would be the case.
11 The regression models were primarily developed by Dr. Kenneth Thorpe. References to srHS in this Decision and Order should be understood to incorporate Dr. Thorpe’s work, which was included as an appendix to the srHS Report.
12 In Year One, the Superintendent was faced with a similar question when reviewing the question of savings from the Voluntary Underwriting Gain (VUG) initiative, under which insurers were encouraged to limit their underwriting profit. Although the Superintendent determined in that instance that there were no measurable savings, that conclusion was based entirely on the lack of evidentiary support. The Superintendent declined to review the Board’s inclusion of VUG within aggregate measurable cost savings for the same jurisdictional reasons as apply to the Board’s inclusion of MLR this year. See discussion in section IV(A) above.
PER ORDER OF THE SUPERINTENDENT OF INSURANCE