Sherlock Company: Informed Solutions for Health Plan Finance

Quality Assurance Procedures

Since benchmarks are useful to the degree that they are reliable, we are deeply committed to their quality.

1. Voluntary. Since data providers are themselves users, participants have a stake in the quality of the information they submit. Each participant receives a copy of the report, typically shared throughout the organization and with senior management, and that report will in part reflect the quality of the plan's own submission. Conversely, since no firm is compelled to participate, a firm does so only if it is committed to the process.

2. Participatory Survey Development. The scope and data definitions of the survey reflect the input of participants. Thus, each data element collected is considered valuable, and is intended to be relied upon, by actual users of the metrics. Also, since content is collected in the form that is least difficult for most plans to submit, the likelihood of accurate reporting is increased.

3. Strong Definitions. Functions, activities and product lines are extensively defined, both as “comments” in the survey instrument itself and in the searchable Common Guidelines. Definitions typically include lists of activities performed by the functional area and examples of cost centers associated with various functions.

4. Solutions to Emerging Issues. As with the survey development, emerging issues were also resolved with participant input, promoting consistency of responses. If new activities, or ambiguities in existing activities, were found during the survey collection period, the issues were aired on periodic conference calls during the survey process. Approaches were discussed, consensus was reached, and conclusions were disseminated via email.

5. Data is Scrubbed. We employed statistical models and visual screens to identify outliers in the submissions. Where outliers were discovered, we contacted the participants to determine whether the variances stemmed from reporting errors or from true operational differences. Reporting errors were corrected for inclusion in the reports, while true operational differences were left unchanged. Certain non-recurring, extraordinary and start-up expenses were excluded from the reports for comparability. A minimal sketch of this kind of screen appears after this list.

6. Reconciliation with Financial Statements. While we do not audit the submissions, each plan was requested to provide audited consolidated financial information, which was compared to the revenue and expense information provided in the survey form. If there were differences between the data submitted in the survey form and in the audit, we asked for a reconciliation schedule. The plans’ reconciliation of their submitted data with their financials was intended to help assure the accuracy and completeness of their information. A sketch of such a check also appears after this list.

7. Review by Participants. Prior to final printing, a draft of the document was submitted to each of the participants. This draft was similar to the final SEER report in that it highlighted each plan's own results in the context of the universe as a whole. This permitted participants to identify and correct any anomalies that we might have missed prior to the report's formal distribution to the participating plans.

8. Practice Effect. A high repeat participation rate is indicative of familiarity with the survey definitions and contributes to the accuracy of submissions and the comparability of results. In the 2014 cycle, 90% of all Blue Cross Blue Shield participants and 56% of Independent / Provider-Sponsored Plans had five or more years of participation in our benchmarks.
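To illustrate the kind of outlier screen described in item 5, the following is a minimal sketch, not Sherlock Company's actual statistical models. It assumes a single hypothetical per-member-per-month metric and uses a robust modified z-score (median and median absolute deviation); the plan names, values, and cutoff are all illustrative.

    import statistics

    def flag_outliers(pmpm_by_plan, cutoff=3.5):
        """Screen one metric across plans using a modified z-score
        (median and MAD), which tolerates small peer groups better
        than a mean/standard-deviation screen."""
        values = list(pmpm_by_plan.values())
        med = statistics.median(values)
        mad = statistics.median(abs(v - med) for v in values)
        flagged = {}
        for plan, v in pmpm_by_plan.items():
            # 0.6745 rescales MAD so the score is comparable to a z-score
            score = 0.6745 * (v - med) / mad if mad else 0.0
            if abs(score) > cutoff:
                flagged[plan] = round(score, 2)
        return flagged

    # Hypothetical per-member-per-month claims-administration costs
    pmpm = {"Plan A": 2.10, "Plan B": 1.95, "Plan C": 2.25, "Plan D": 6.80}
    print(flag_outliers(pmpm))  # {'Plan D': 20.8}: query the plan, don't auto-correct

A median-based score is used here because, in a small peer group, a mean-and-standard-deviation screen can be masked by the very outlier it is meant to detect.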
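The reconciliation in item 6 can likewise be sketched as a tolerance comparison between survey totals and audited statements. The line items, dollar amounts, and 0.5% tolerance below are hypothetical assumptions for illustration.

    def reconcile(survey, audited, tolerance=0.005):
        """Compare survey totals with audited financial statements and
        return line items that differ by more than the relative tolerance."""
        discrepancies = {}
        for item, reported in survey.items():
            audit_value = audited.get(item)
            if audit_value is None:
                discrepancies[item] = "missing from audited statements"
            elif abs(reported - audit_value) > tolerance * abs(audit_value):
                discrepancies[item] = (reported, audit_value)
        return discrepancies

    # Hypothetical submission: admin expense differs from the audit by about 2%,
    # so the plan would be asked for a reconciliation schedule.
    survey = {"revenue": 412_500_000, "total_admin_expense": 37_900_000}
    audited = {"revenue": 412_500_000, "total_admin_expense": 37_150_000}
    print(reconcile(survey, audited))  # {'total_admin_expense': (37900000, 37150000)}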

© 1997-2017 Sherlock Company. All Rights Reserved.