ANZEA Conference 2022
10 - 12 October 2022 | Te Papa, Wellington
Monday, October 10 • 4:00pm - 5:30pm
C.01 If evaluation is a science, then robust evaluation uses a single core methodology - with plenty of contextual permutations


At its simplest, robust evaluation is evaluation that has a clear and logical set of questions, an explicit evaluation framework that supports value judgements, and a transparent, defensible analysis that is shared via a reporting process or report. So, what do robust questions, evaluation frameworks and reporting look like?

This workshop will explain what good looks like and how these three components each guide robust evaluation. It will also describe common pitfalls and their solutions, with practical tips and tricks to help recognise and avoid them.

This will be useful for evaluators, and those managing evaluations, who have some prior experience. Good questions, frameworks and reporting look like:

  1. A concise set of evaluation questions that includes priority questions and some evaluative (not only descriptive) questions, each of which is answered in the evaluation reporting.  
  2. An evaluation framework that describes the evaluand and its intent, provides agreed definitions of ‘good’ or what is valued, and (where applicable) references standards.  
  3. A reporting process or report that presents explicit and defensible conclusions that are acceptable and believable, and therefore useful for decisions about proving, improving, expanding, or ceasing evaluand activities.  
Good use of an evaluation framework includes using it as a lens to guide the evaluation design, to refine the evaluation throughout, and to support analysis, sense-making and value judgements. Robust reporting is an explicit and defensible presentation of findings and judgements: each finding is based on cohesively presented evidence, and judgements build on these findings, making explicit use of the evaluation framework. A number of common pitfalls will be described and solutions shared. Three examples are:

a) Asking too many or too vague evaluation questions.
b) Not defining ‘good’ or what is valued, even in draft form.
c) Making indefensible conclusions or judgements.


Anne Dowden

Anne Dowden REWA
Anne is an evaluation consultant. She runs her own consultancy providing services to Government agencies and other entities. Anne collaborates and partners with other evaluation experts for most evaluation projects. Anne is part of a large Australian-New Zealand family and has a partner...

Monday October 10, 2022 4:00pm - 5:30pm NZDT
Te Huinga Centre - Rangimarie Room 2