Thursday 9 February 2012

Assignment 1: Windows to Youth Health

Choose a completed evaluation; any kind, your choice. Determine the model used and identify in your mind the strengths and weaknesses of the evaluation and the approach that was taken.



Windows to Youth Health
Please note:

PHF refers to Health Canada's Population Health Fund

PPHB refers to Health Canada's Population and Public Health Branch


There are myriad definitions of program evaluation. Weiss (1998) defines program evaluation as “…the systematic assessment of the operation and/or outcomes of a program or policy, compared to a set of explicit or implicit standards as a means of contributing to the improvement of the program or policy” (p. 4). According to the Treasury Board of Canada Secretariat, program evaluation is “… the systematic collection and analysis of evidence on the outcomes of programs to make judgements about their relevance and performance, and to examine alternative ways to deliver them or to achieve the same results” (Treasury Board of Canada Secretariat, 2009, p. 1). In simplified terms, these definitions suggest that program evaluation is the assessment of a program’s identified outcomes and the communication of feedback about both the intervention and its success, for the purpose of sustaining and improving the program. To construct my understanding of this form of evaluation, I chose to analyze a completed program evaluation, Windows to Youth Health: Population Health Fund Program Evaluation for BC Region Youth Projects 2004/2005. By identifying the evaluation model used to assess the program, and by examining the strengths and weaknesses of the evaluation, I can build and clarify my knowledge and understanding of program evaluation.

One important step in assessing an evaluation of a program is to identify the assessment model. There are several models that Zena Simces and Associates, the external evaluators, could have applied to evaluate the youth projects program. One such model is Stufflebeam's CIPP model, a holistic, decision-based model; this systematic and comprehensive approach to evaluation focuses on a program's context, inputs, process and product to aid stakeholders in making decisions about program improvement. Provus' Discrepancy Model (DIPPC) and Scriven's focus on formative and summative evaluation could also have been used to evaluate the youth projects program. However, Stake’s Countenance Model, a judgement-oriented model, is clearly evident in this program evaluation. Stake identified description and judgement as the two acts, or countenances, of evaluation, and viewed program data according to their phase in the program: antecedents exist prior to the program; transactions occur during the process; and outcomes are the effects of the program (Wood, 2001). This framework provides for a comprehensive description and judgement of a program, and examines the congruency between what was intended and what was observed, as well as the links or “…contingencies, between antecedents, transactions, and outcome variables” (Wood, 2001, p. 19). By employing Stake's Countenance Model to evaluate the program, the evaluators provided the stakeholders with descriptions, learnings and recommendations to inform future program planning to support youth at risk.

The primary focus of this program evaluation was summative, as its purpose was to examine “… the overall effectiveness of the PHF regional funding for youth, and to produce a program evaluation roll-up report” (Public Health Agency of Canada, 2005, p. 7), or, in Stake’s terms, to describe and judge the program’s outcomes. The evaluation was also formative, as the PPHB intended to use the report’s recommendations for future planning. The external evaluator, Zena Simces and Associates, also identified seven key objectives for the evaluation of the ten youth projects funded by the PHF in British Columbia, which together encompassed summative, process, and formative evaluation. One objective that addressed process evaluation, or Stake’s transaction component, was evaluating the projects’ implementation over the course of the two-year funding period. The data collection methodology used to address the evaluation’s purposes and objectives included document reviews (Stake’s antecedents) and content analysis of the documents using a set of pre-determined evaluation questions, as well as interviews with key stakeholders using the same questions. Data analysis was qualitative, which allowed for in-depth descriptions and judgements of the successes and challenges of the funding program.

The evaluation Windows to Youth Health: Population Health Fund Program Evaluation for BC Region Youth Projects 2004/2005, conducted by Zena Simces and Associates, is exemplary for a number of reasons. The following bullets outline some of the evaluation’s strengths as well as suggestions for improvement:

Strengths of Evaluation:

Comprehensive, Connected, and Clear

-the 73-page evaluation included an executive summary, introduction and context, methodology including limitations and impacts, findings including learnings and recommendations, and appendices including an extensive list of documents reviewed prior to the evaluation (antecedent)
-written in accessible language for the stakeholders

Congruency and Contingency
-descriptions and judgements were based on the identified purpose of the evaluation as well as the seven stated objectives of the evaluation
-the evaluators focused on describing what was intended and what was observed, or congruency
-the evaluators described the indicators or objectives, identified what could be learned from the findings, and stated recommended actions based on the indicators and learnings
-the evaluators reviewed documents to gather data about what occurred before program funding (antecedents), examined the implementation process through a review of documents, including logic models developed by each program team, and through interviews (transactions), and evaluated the outcomes of the program (outcomes)
-a logic model was developed in consultation with the projects, and five main program outcomes were identified
-the evaluators identified the purpose of the evaluation, the stakeholders, and their intended use of the findings and recommendations
-limitations of conducting the evaluation were identified and addressed: for example, there were challenges around the availability and quality of documentation by the projects, so the evaluators collected additional data to address the void

Suggestions for Improvement:
Logic Model:
-perhaps the evaluators could have suggested that the PHF BC logic model, which was developed in consultation with the project leaders, be revised to address the challenges the project leaders experienced in developing their own logic models based on the BC logic model


Cost:
-inclusion of a cost analysis of the evaluation in terms of time and money
-the projects were government-funded, so for transparency and accountability, taxpayers should know how their tax dollars are allocated
-as this was an exemplary evaluation, other program evaluators could benefit from knowing the amount of time and money required to conduct this type of evaluation, and then use it as a benchmark for their own evaluation context


Zena Simces and Associates provided a descriptive and judgement-based evaluation in Windows to Youth Health: Population Health Fund Program Evaluation for BC Region Youth Projects 2004/2005. The comprehensive evaluation report was written in the language of the stakeholders and focused on addressing and responding to the identified objectives of the evaluation. The evaluators’ recommendations addressed both the strengths of the funding program and areas for improvement. Stake’s Countenance Model provided a framework for the evaluators to effectively examine the congruency between what was intended and what was observed, as well as the contingency between the variables, resulting in an exemplary evaluation. As with all assessment and evaluation, the most important piece is what the stakeholders do with the information provided.


References

Kennedy, M. (2001). Race matters in the life/work of four white female teachers. Unpublished doctoral dissertation, University of Alberta, Edmonton, Canada.

Public Health Agency of Canada. (2005). Windows to youth health: Population Health Fund program evaluation for BC Region youth projects 2004/2005.

Treasury Board of Canada Secretariat. (2009). Policy on evaluation. Government of Canada.

Weiss, C. H. (1998). Evaluation: Methods for studying programs and policies (2nd ed.). Upper Saddle River, NJ: Prentice Hall.

Wood, B. B. (2001). Stake's countenance model: Evaluating an environmental education professional development course. Journal of Environmental Education, 32(2), 18-27.

[Figure: Stake's Countenance Model]

1 comment:

  1. Great choice, Shelly! I agree that there are a number of models mixed into this evaluation, and it is common to see this happen. I also agree that anything funded by taxpayers should be comprehensive in the reporting of the details.

    Jay
