Post-Doc Blogpost: On Explicit Evaluation Reasoning

Evaluation reasoning leading from information and analyses to findings, interpretations, conclusions, and judgments should be clearly and completely documented (Yarbrough et al., 2011, p. 209). This standard exists not solely to ensure that one’s conclusions are logical; it also functions as both a final filter and an ultimate synthesizer of the results of all the other accuracy standards. Standard A7, Explicit Evaluation Reasoning, as defined in The Program Evaluation Standards, makes visible the process by which conclusions are reached. Of this standard, Yarbrough et al. (2011) continue, “If the descriptions of the program from our stakeholders are adequately representative and truthful, and if we have collected adequate descriptions from all important subgroups (have sufficient scope), then we can conclude that our documentation is (more) likely to portray the program accurately” (p. 209). This holism leaves us with a critical imperative: to serve the program we are evaluating well, and to serve the negotiated purposes of the evaluation to their utmost.

On the need to ensure clarity, logic, and transparency of process, Booth, Colomb, and Williams (2008) write, “[Research] is a profoundly social activity that connects you both to those who will use your research and to those who might benefit – or suffer – from that use” (p. 273). We therefore have a responsibility, as evaluators and as researchers, to conduct ourselves and to document our process explicitly. Doing so preserves attributes essential to quality research: reproducibility, generalizability, and transferability. Yet there are also more specific considerations at play. To illustrate this standard’s importance to current and future professional practice, consider a recent job posting for a Program Evaluator with the State of Connecticut Department of Education. The position description includes the following: “A program evaluation, measurement, and assessment expert is sought to work with a team of professionals developing accountability measures for educator preparation program approval. Key responsibilities will include the development of quantitative and qualitative outcome measures, including performance-based assessments and feedback surveys, and the establishment and management of key databases for annual reporting purposes” (AEA Career, n.d., para. 2). The position spans a wide range of AEA responsibilities, and its second paragraph alone makes clear the sheer scope of the role. And while the required qualifications mention expertise in program evaluation, qualitative and quantitative data analyses, and research methods, the posting more importantly concludes with the need to ‘develop and maintain cooperative working relationships’ and to demonstrate skill in working ‘collaboratively and cooperatively with internal colleagues and external stakeholders’.
What is required, then, is not solely a researcher with broad technical expertise, nor simply a methodologist with a program evaluation background, but a member of the research community who can meet the palpable need to produce defensible conclusions from explicit reasoning in a way that connects with a broad audience of users and stakeholders.

Explicit reasoning, expressed in a way digestible by readers, defensible to colleagues, and actionable by program participants, requires that researchers be comfortable with where they are positioned in relation to the research itself when communicating both process and results. This is known in the literature as positionality. Andres (2012) speaks to this in saying, “This positionality usually involves identifying your many selves that are relevant to the research on dimensions such as gender, sexual orientation, race/ethnicity, education attainment, occupation, parental status, and work and life experience” (p. 18). Why so many disclosures solely for the purpose of locating oneself within the research? Because positionality has as much to do with the researcher as it does with the researcher’s position and its impact on program evaluation outcomes. An example of this need for clarity comes from critical action research. Kemmis and McTaggart (2005) describe it thus: “Critical action research is strongly represented in the literatures of educational action research, and there it emerges from dissatisfaction with classroom action research that typically does not take a broad view of the role of the relationship between education and social change… It has a strong commitment to participation as well as to the social analyses in the critical social science tradition that reveal the disempowerment and injustice created in industrialized societies” (p. 561). With this in mind, it stands to reason that one can succeed in such a position only if the researcher’s identity is made clear, the researcher’s position relative to the research is clear, the researcher’s stance on justice (to take only one example) is considered, the process by which the research is conducted is clear, and the way this person, in relation to this research, renders judgment on the data collected is clear.
For this Program Evaluator role, like many others, the evaluator must be permitted to serve as both researcher and advocate, exercising objective candor throughout.

American Evaluation Association. (n.d.). Career. Retrieved October 9, 2013, from http://www.eval.org/p/cm/ld/fid=113

Andres, L. (2012). Designing & doing survey research. London, England: Sage Publications Ltd.

Booth, W. C., Colomb, G. G., & Williams, J. M. (2008). The craft of research (3rd ed.). Chicago, IL: The University of Chicago Press.

Kemmis, S., & McTaggart, R. (2005). Participatory action research. In N. K. Denzin & Y. S. Lincoln (Eds.), The Sage handbook of qualitative research (3rd ed., pp. 559–604). Thousand Oaks, CA: Sage Publications, Inc.

Yarbrough, D. B., Shulha, L. M., Hopson, R. K., & Caruthers, F. A. (2011). The program evaluation standards (3rd ed.). Thousand Oaks, CA: Sage Publications, Inc.
