DOME™

EvalStars supports organisations in planning and measuring development outcomes, and in using this information for decision-making and further planning. Its strategic, practical and straightforward approach is underpinned by DOME™ (Development Outcomes Monitoring and Evaluation). This cyclical, embedded, systematic evaluative management approach underpins all EvalStars' work across strategic planning; designing and implementing results-based performance management systems; monitoring; evaluation, impact studies and reviews; advisory and support services; and DOME™ training and capability building.

What is DOME™?

DOME™ is both an innovative approach to evaluation and a management strategy, based on international best practice and solid theoretical foundations. As an approach to embedding evaluation, DOME™ links planning, evaluative monitoring/research, reporting, and change processes through a three-phased cycle. This practical approach provides timely results, accommodates uncertainty and change, and supports regular feedback and reporting cycles for on-going decision-making. It enables internal and external managers and evaluators to work collaboratively through an integrated evaluative management approach.

The systematic DOME™ approach uses practical tools and templates, supported by an associated capability-building programme. It is applicable, scalable and relevant across multiple programmes, teams, and organisations, from small to large, and it enables organisations to grow the internal capability to tell their own performance story.

The DOME™ approach is unique in that it provides a structure for undertaking stand-alone evaluations (a single cycle) and on-going monitoring (repeat cycles as part of results-focused management) to enable organisations and stakeholders to:

  • Respond to emerging results
  • Identify and track outputs and outcomes/impacts
  • Tailor their activities based on evidence gathered
  • Identify learning opportunities on an on-going basis
  • Deliver transparent accountability
  • Support active participation in the development of an intervention.

As an evaluative management approach, DOME™ links planning, management, and the measurement of results to support feedback and decision-making and to enhance outcome effectiveness. It is a systematic and practical approach to planning, monitoring, evaluation, governance, and reporting that can be adopted by organisations and personnel at organisation, sector, programme or project level.

DOME™ provides a management strategy that:

  • Places evaluation as a strategic management function
  • Supports evaluative thinking and learning through all stages of the development cycle
  • Includes tools and processes at each stage that are robust, practical and straightforward
  • Is applicable and relevant for national, sector, agency, programme and project contexts
  • Is adaptable and flexible for use with a range of evaluation approaches and methodologies
  • Includes capability building for business ownership and sustainability

The core values underpinning DOME™ are collaboration, contextual sensitivity, and adaptation. The principles under which the DOME™ methodology is delivered include:

  • Developing capacity together
  • Engaging for collaborative learning
  • Making informed decisions
  • Being accountable for collective results.

One of the key aspects of DOME™ is supporting organisations to undertake evaluative activities around the cycle themselves. This requires sufficient organisational and individual capacity to ensure staff at all levels have the key skills and knowledge to implement all phases of the DOME™ cycle. Capability training and support are incorporated into implementation to develop managers' capability at individual, organisation, sector and country-wide levels.

The development of DOME™ was informed by management and evaluation theorists, including:

Plan Phase:

  • Readiness assessment – Rist 1
  • Strategy – Argyris 2, Mintzberg 3
  • Theory-based evaluation – Chen 4, Weiss 5
  • Realistic evaluation – Pawson and Tilley 6
  • Accountability and performance management – Picciotto 7, Wholey 8

Monitor Phase:

  • Monitoring – Kaplan and Norton (Balanced Scorecard) 9, 10
  • Rapid Evaluation Assessment Methodology (REAM) – Beebe 11
  • Mixed methodology (qualitative and quantitative) – Tashakkori and Teddlie 12

Change Phase:

  • Learning – Schon 13
  • Decision-making – Kaplan 14
  • Results – Drucker 15, Rist 16
  • Participatory and building capacity – Fetterman 17
  • Utility of evaluation/Developmental – Patton 18
  • Systems thinking – Checkland 19

References

  1. Rist, R. (2009). Country-led monitoring and evaluation systems: Better evidence, better policies, better development results. New York: UNICEF.
  2. Argyris, C. (1999). On organizational learning (2nd ed.). Malden, MA: Blackwell Publishing.
  3. Mintzberg, H. (1994). The rise and fall of strategic planning. New York: Free Press.
  4. Chen, H. (1990). Theory-driven evaluations. Newbury Park, CA: Sage.
  5. Weiss, C. H., & Barton, A. H. (Eds.). (1980). Making bureaucracies work. Beverly Hills: Sage Publications.
  6. Pawson, R., & Tilley, N. (1997). Realistic evaluation.
  7. Picciotto, R. (2009). Evaluating development: Is the country the right unit of account? In Country-led monitoring and evaluation systems: Better evidence, better policies, better development results (pp. 32-55). New York: UNICEF.
  8. Wholey, J. (1999). Performance-Based Management: Responding to the Challenges. Public Productivity and Management Review, 22(3), 288-307.
  9. Kaplan, R. S., & Norton, D. P. (1992). The balanced scorecard: Measures that drive performance. Harvard Business Review, January–February, 71-79.
  10. Kaplan, R. S., & Norton, D. P. (1996). The balanced scorecard: Translating strategy into action. Boston, MA: Harvard Business School Press.
  11. Beebe, J. (2001). Rapid assessment process: An introduction. Walnut Creek, CA: Alta Mira Press.
  12. Tashakkori, A. & Teddlie, C. (2003). Handbook of Mixed Methods in Social & Behavioral Research. Thousand Oaks: Sage.
  13. Schon, D. A. (1983). The reflective practitioner: how professionals think in action. New York: Basic Books.
  14. Kaplan, M. F., & Miller, C. E. (1987). Group decision making and normative versus informational influence: Effects of type of issue and assigned decision rule. Journal of Personality and Social Psychology, 53(2), 306.
  15. Drucker, P. F. (1964). Managing for results: Economic tasks and risk-taking decisions. New York: Harper & Row.
  16. Rist, R. (2006). Building a results-based management system to measure development results. Washington, D.C.: The World Bank.
  17. Fetterman, D. M. (2004). Branching out or standing on a limb: Looking to our roots for insight. In M. C. Alkin (Ed.), Evaluation roots: Tracing theorists' views and influences. Thousand Oaks, CA: Sage Publications.
  18. Patton, M. Q. (2002). Qualitative research & evaluation methods (3rd ed.). Thousand Oaks, Calif.: Sage Publications.
  19. Checkland, P. B. (1981). Systems thinking, systems practice. Chichester, UK: John Wiley (republished 1999 in paperback, with a 30-year retrospective).