Title V Technical Assistance Meeting

 Measuring What Matters Most

By Theresa Christner, M.A.
Manager, Policy/Program Development Section
Children's Special Health Care Services Division
Michigan Department of Health and Human Services

Measuring what matters has been the topic of conversation within our Children's Special Health Care Services (CSHCS) program at the Michigan Department of Health and Human Services for quite some time. Given the size and complexity of our program, we have been challenged by the following questions:

  • What data do we review?
  • Is the selected data meaningful?
  • Does the data measure what we think it measures? 
  • Does the data paint a complete picture of the program, capturing its accomplishments as well as providing clear indications of its strengths and weaknesses?

Needing answers to these questions, I attended the Measuring What Matters – Making Progress Through Program Evaluation workshop at the 2018 AMCHP Conference. The skills-building session provided hands-on examples of how to use the U.S. Centers for Disease Control and Prevention's Framework for Evaluation in Public Health to develop an evaluation approach that is integrated into routine program operations. The framework incorporates a take action cycle that includes the following six steps:

  • Engaging Stakeholders – getting input, participation, and sharing power with those who are invested in the program;
  • Describing the Program using a Logic Model – identifying the relationships between program elements and expected changes;
  • Using Validated Measures to Focus the Evaluation Design – planning the end goals of the evaluation and the necessary steps of the evaluation;
  • Using a Measurement Table to Gather Credible Evidence – compiling information that stakeholders perceive as trustworthy and relevant for answering their questions;
  • Justifying Conclusions – making claims regarding the program that are warranted on the basis of data that has been compared against pertinent and defensible ideas of merit, value, or significance; and
  • Using and Sharing Lessons Learned – assuring that the findings are useful by using them to make improvements.

Underpinning these six steps are four standards that are essential for good evaluation:

  • Utility – ensuring that the evaluation will meet the information needs of the intended users;
  • Feasibility – ensuring that the evaluation will be realistic, practical, sensitive, and economical;
  • Propriety – ensuring that the evaluation will be legal, ethical, and appropriate; and
  • Accuracy – ensuring that the evaluation will be correct and precise.

As part of the session, the attendees at each table worked together on developing a logic model. I found this extremely helpful. Having cut my public health teeth on the traditional work plan, I had always found constructing logic models counter-intuitive and, to be honest, somewhat illogical. After all, counting outputs is so much easier than measuring outcomes.

But after working through the process in this group setting, I see the benefit of developing logic models as a necessary step in program evaluation. That step builds the bridge between what a program does and the change we hope will result.
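As a rough illustration of that bridge, a logic model's chain from inputs through outcomes can be sketched as a simple ordered structure. The program elements below are hypothetical placeholders, not actual CSHCS content:

```python
# A minimal sketch of a logic model as ordered stages; the elements
# listed are hypothetical examples, not actual CSHCS program content.
logic_model = {
    "inputs":     ["staff", "funding", "community partners"],
    "activities": ["care coordination", "family outreach"],
    "outputs":    ["families contacted", "care plans completed"],
    "outcomes":   ["improved access to care", "better health outcomes"],
}

# Reading the stages in order traces the bridge from what the program
# does (activities and outputs) to the change it hopes to produce.
for stage, elements in logic_model.items():
    print(f"{stage}: {', '.join(elements)}")
```

The value of the exercise is in making each link explicit: every expected outcome should be traceable back to a specific activity and output.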

Now our MCH group is creating logic models as part of our maternal and child health fiscal year 2019 planning activities. I am excited to take our Children's Special Health Care Services logic model to the next step and to apply the CDC's evaluation framework and its tools to build out a measurement table for tracking outcomes. The measurement table template simplifies outcome tracking by linking each outcome to specific indicators, data sources, data collection methodology, monitoring frequency, and, most importantly, the person responsible for each task. Once in place, this framework and its tools will help us answer the big evaluation questions about overall program effectiveness.
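To make the shape of a measurement table concrete, one row can be modeled as a small record whose fields mirror the elements described above. The field names and sample values here are hypothetical illustrations, not the CDC template or actual CSHCS measures:

```python
from dataclasses import dataclass

# One row of a measurement table: each outcome is linked to its
# indicator, data source, collection method, monitoring frequency,
# and the person responsible. All values below are hypothetical.
@dataclass
class MeasurementRow:
    outcome: str
    indicator: str
    data_source: str
    collection_method: str
    frequency: str
    responsible: str

row = MeasurementRow(
    outcome="Improved access to specialty care",
    indicator="Percent of enrollees seen by a specialist within 90 days",
    data_source="Claims data",
    collection_method="Quarterly claims extract",
    frequency="Quarterly",
    responsible="Program data analyst",
)
print(f"{row.outcome} -> {row.indicator} ({row.frequency}, {row.responsible})")
```

Filling in every field for every outcome, especially the responsible party, is what turns the table from a planning document into a working monitoring tool.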