Resources

This page collects a selection of resources you may find useful, along with information on recent papers and presentations by the EvalStars team.

Resources

Below is a selected list of resources, including tools and literature that may be helpful at the relevant ‘plan, monitor, and change’ phases when you are conducting an evaluation study or running a results-focused monitoring and evaluation system. We’ve also included literature to inform your understanding of evaluation capacity building and of ethics, standards, and competencies.

Planning

  • Morra Imas, L. G. & Rist, R. C. (2009). The Road to Results: Designing and Conducting Effective Development Evaluations. Washington, DC: The World Bank.
Models & Assumptions

Logic models provide a ‘picture’ of how your program, system, policy or organisation ‘works’, including the theory and assumptions that underpin the intended outcomes. This tool underpins all the subsequent phases of your planning, monitoring or research, and reporting.

  • W.K. Kellogg Foundation. (2001). Logic Model Development Guide. Battle Creek, Michigan.
  • Nkwake, A. M. (2013). Why are assumptions important? In Working with Assumptions in International Development Program Evaluation (pp. 93-111). New York: Springer.
  • Hurworth, R. (2008). Program Clarification: An Overview and Resources for Evaluability Assessment, Program Theory and Program Logic. Evaluation Journal of Australasia, 8(2), 42-48. This article provides an extensive list of resources that either discuss program theory and program logic generally or describe their application to a range of program fields.

Context
  • Rog, D. J. (2012). When background becomes foreground: Toward context-sensitive evaluation practice. In Rog, D. J., Fitzpatrick, J. L., & Conner, R. F. (Eds.), Context: A framework for its influence on evaluation practice. New Directions for Evaluation, 135, 25-40.

Monitoring & Research

  • Bamberger, M. (2011). RealWorld Evaluation (2nd ed.). Thousand Oaks, CA: Sage.
  • McDavid, J. C. & Hawthorn, L. R. L. (2005). Program Evaluation and Performance Measurement: An Introduction to Practice. Thousand Oaks, CA: Sage.
  • Davidson, E. J. (2004). Evaluation Methodology Basics: The Nuts and Bolts of Sound Evaluation. Thousand Oaks, CA: Sage.
  • Turner, D. W., III. (2010). Qualitative interview design: A practical guide for novice investigators. The Weekly Qualitative Report, 3(2), 7-13. Retrieved from http://www.nova.edu/ssss/QR/WQR/qid.pdf (This short paper sets out a step-by-step process for conducting effective qualitative interviews, aimed at novice investigators.)
  • Funnell, S. (2000). Developing and using a program theory matrix for program evaluation and performance monitoring. In Rogers, P. J., Petrosino, A., Huebner, T. A., & Hacsi, T. A. (Eds.), Program Theory in Evaluation: Challenges and Opportunities. New Directions for Evaluation, 87, 103-112. San Francisco: Jossey-Bass.

Results- or Performance-Based Monitoring and Evaluation Systems

For a recent collection of articles discussing the relationship between evaluation and performance measurement and management, see the 2013 issue of New Directions for Evaluation (No. 137), along with the articles below.

  • Nielsen, S. B. & Ejler, N. (2008). Improving performance? Exploring the complementarities between evaluation and performance management. Evaluation, 14(2), 171-192.
  • Nielsen, S. B. & Hunter, D. E. K. (2013). Challenges to and forms of complementarity between performance management and evaluation. In Nielsen, S. B. & Hunter, D. E. K. (Eds.), Performance management and evaluation. New Directions for Evaluation, 137, 115-123.
  • Van Helden, G. J., Johnsen, A. & Vakkuri, J. (2012). The life-cycle approach to performance management: Implications for public management and evaluation. Evaluation, 18(2), 159-175.
  • Mayne, J. (2004). Reporting on outcomes: Setting performance expectations and telling performance stories. The Canadian Journal of Program Evaluation, 19(1), 31-60.
  • Mayne, J. (1999). Addressing Attribution through Contribution Analysis: Using Performance Measures Sensibly. Discussion paper. Ottawa: Office of the Auditor General of Canada.
  • MacDonald, G. (no date). Criteria for Selection of High-Performing Indicators: A Checklist to Inform Monitoring and Evaluation. Retrieved from http://www.wmich.edu/evalctr/wp-content/uploads/2013/03/Indicator_checklist.pdf

Change & Reporting

  • Patton, M. Q. (2008). Utilization-Focused Evaluation (4th ed.). Thousand Oaks, CA: Sage.

Evaluation Capacity Building 

Evaluation capacity building is an intentional process to increase individual motivation, knowledge, and skills, and to enhance a group or organisation’s ability to use evaluation (Labin et al., 2012).

  • Preskill, H. & Russ-Eft, D. (2004). Building Evaluation Capacity: 72 Activities for Teaching and Training. Thousand Oaks, CA: Sage.
  • Labin, S. N., Duffy, J. L., Meyers, D. C., Wandersman, A. & Lesesne, C. A. (2012). A research synthesis of the evaluation capacity building literature. American Journal of Evaluation, 33, 307-338.
  • Bourgeois, I. & Cousins, J. B. (2013). Understanding dimensions of organizational evaluation capacity. American Journal of Evaluation. Published online 2 May 2013. DOI: 10.1177/1098214013477235.

Evaluation Standards, Ethics and Competencies

Evaluation can be a tricky business for practitioners, participants and users of evaluation. A variety of standards, ethical guidelines, and evaluator competencies are available to help you plan evaluations that are useful, feasible, accurate, and proper (ethical and legal). The AusAID standards provide a practical planning guide for producing a quality evaluation plan. The JCSEE standards offer a particularly useful guide to the four dimensions of evaluation quality. The evaluator competencies developed by anzea are specific to quality evaluation practice in New Zealand.

Presentations

2017. Australasian Evaluation Society International Conference. 4-6 September 2017. Canberra, Australia. Scally-Irvine, K. Who owns the data? Considerations of governance, ethics, and use of data for evaluators in 2017.

2017. Australasian Evaluation Society International Conference. 4-6 September 2017. Canberra, Australia. Averill, K. Integrated evaluation capital creation in a low capital environment: The design and use of an IT platform for evaluative management in the land of the unexpected (PNG).

2017. Australasian Evaluation Society International Conference. 4-6 September 2017. Canberra, Australia. Averill, K. He Kāinga Kōrerorero participatory evaluation.

2014. Australasian Evaluation Society International Conference. 8-12 September 2014, Darwin, Australia. Struwig, A. Evaluation as an agent for development sustainability: a real-world example

2014. Australasian Evaluation Society International Conference. 8-12 September 2014, Darwin, Australia. Struwig, A. Designing and embedding strategic learning and management systems for organisations, programmes, and projects

2014. Aotearoa New Zealand Evaluation Association Conference. 7-10 July 2014, Wellington. Brown, R., Peterson, G., and Renwick, J. Laying the solid foundation to build effective evaluation practice

2013. Australasian Evaluation Society International Conference. 4-6 September 2013, Brisbane, Australia. Averill, K. Embedding results-focused evaluative monitoring as a “Business as Usual” management approach within organisations, teams and programmes, and building internal capability

2013. Aotearoa New Zealand Evaluation Association Conference. 22-24 July 2013, Auckland. Averill, K. and Peterson, G. Building planning, monitoring and evaluation capacity within a bilateral aid partnership programme

2013. Aotearoa New Zealand Evaluation Association Conference. 22-24 July 2013, Auckland. Scally-Irvine, K. Demystifying ‘systems’ and ‘systems thinking’ – a practitioner’s guide to ‘systems’ in evaluation

2013. Aotearoa New Zealand Evaluation Association Conference. 22-24 July 2013, Auckland. Souness, C. Lessons from Afghanistan

2012. DevNet Conference. 3-5 December 2012, Auckland. Averill, K. From frameworks to governance: results frameworks – emerging research findings into the principles underpinning the architecture and use of country, sector and agency frameworks

2012. Australasian Evaluation Society International Conference. 28-31 August 2012, Adelaide, Australia. Scally-Irvine, K., Averill, K. and Brace, J. Evaluative monitoring and complementary evaluative research: towards a new evaluation paradigm

2012. Aotearoa New Zealand Evaluation Association Conference. 8-11 July 2012, Hamilton. Scally-Irvine, K. and Averill, K. Evaluating for results: A practical approach linking planning, monitoring and evaluation. Pre-conference workshop: Evaluation in the real world.

2012. Thinking Recreation! Conference 2012. 18-20 July, Queenstown. Scally-Irvine, K. Putting petrol in the engine: getting reliable and regular information so good planning and decision making can happen

2011. Aotearoa New Zealand Evaluation Association Conference. August 2011, Wellington. Scally-Irvine, K. Mixing your methods: Useful learnings for the evaluation context from the frontlines of research.

2011. Aotearoa New Zealand Evaluation Association Conference. August 2011, Wellington. Averill, K. and King, S. Changing landscape of evaluation within the New Zealand public sector

2011. IDEAS Global Assembly, Amman, Jordan. Averill, K. (Evaluation Consult). Using results and outcomes frameworks to shift the focus of development and evaluation to a strategic level

2010. DevNet Conference, Palmerston North, New Zealand. Averill, K. (Evaluation Consult). Using results frameworks to connect development outcomes, management, aid, monitoring and evaluation: emerging research on the principles underpinning country and sector results and outcomes frameworks

2009. Australasian Evaluation Society (AES) Conference, Canberra, Australia: Averill, K. (Evaluation Consult), William Sent (Papua New Guinea Department of National Planning and Monitoring) & Jennifer Rush (Coffey International Development). Using a logic model as the framework for an evaluation in Papua New Guinea

2008. American Evaluation Association Conference, Denver, Colorado, USA: Kate Averill (Evaluation Consult) and Paul W Duignan (Parker Duignan Consulting). Building a ‘world-centric’ rather than ‘program-centric’ logic model for a national problem gambling strategy: using logic modelling software

2008. anzea (Aotearoa New Zealand Evaluation Association) Conference: Kate Averill (Evaluation Consult) and Paul W Duignan (Parker Duignan Consulting). Using logic model software for a national problem gambling strategy

2008. anzea Conference: Lloyd Jowsey and Kate Averill (Evaluation Consult). Communicating performance stories – using multimedia and story-based narratives to communicate evaluation findings to stakeholders

2007. anzea Seminar on evaluation reporting strategies: Lloyd Jowsey and Kate Averill (Evaluation Consult).

Papers

2009. Australasian Evaluation Society (AES) Conference paper, Canberra: Averill, K., Sent, M. W. & Rush, J. Using a logic model as the framework for an evaluation in Papua New Guinea

2007. Australasian Evaluation Society (AES) Conference: Doing Evaluation Better. 5-7 September 2007. Jowsey, L. and Averill, K. Evaluation reporting strategies.

Reports

The following reports completed by Evaluation Consult are available to view:

2011. Development West Coast: Averill, K., Simpson-Edwards, M., Smith, P. & Hains, J. West Coast Tourism Major Regional Initiative Outcomes Evaluation Report.

2008. Problem Gambling, Ministry of Health: Averill, K., Dowden, A., Mitchelmore, K. & Jones, M. Final monitoring and evaluation plan.

2006. Smokefree Pregnancy, Ministry of Health: Averill, K., Dowden, A., et al. Final Overview: Smokefree Pregnancy Services. Wellington, New Zealand.