2024 Project Monitoring And Evaluation

Overview

This seminar familiarizes participants with project monitoring and evaluation (M&E) systems and tools that focus on results in international development. The seminar offers participants both a conceptual framework and practical skill development.

Format

Seminars are currently offered both in person and online simultaneously; participants choose their format, which must be indicated at the time of registration. A small number of courses are delivered exclusively in person or exclusively online and are indicated as such in the 2024 schedule.

In-Person Only seminars usually start at 9:30 am Washington, D.C. time, and daily sessions usually end at 4:00 pm, with breaks (including the lunch break) allocated as appropriate. Online Only seminars are delivered through five (for a 1-week course) or ten (for a 2-week course) live online sessions via a videoconferencing platform; each session lasts approximately 3.5 hours and is scheduled to start between 7:00 am and 8:30 am Washington, D.C. time. Hybrid In-Person/Online seminars start at a time convenient for both in-person and online participants and generally follow the In-Person seminar format.

We expect the classes to be highly interactive; they may include presentations, case studies, and exercises.

Course Outline

Results-Based Management (RBM) in International Development

  • Understanding and distinguishing between monitoring and evaluation in the context of RBM
  • Problem identification
  • Development of causal hypotheses (inputs, outputs, outcomes and impacts)
  • Feeding monitoring and evaluation findings into decision-making
  • Role of partners and stakeholders
  • Significance of “soft” assistance

Planning for and Executing the Monitoring and Evaluation Processes

  • Key principles for overall work planning
  • Purpose and timing (including ex-post) of monitoring and evaluation
  • Involving key partners and stakeholders
  • Building teams with defined roles and strong capabilities
  • Establishing a hierarchy of project objectives
  • Defining the scope of monitoring and evaluation
  • Selecting analytical tools, methodologies or approaches enabling measurement and attribution
  • Importance of data quality and collection, and baseline data
  • Developing indicators to measure progress and identify gaps
  • Development and selection of evaluation questions and teams
  • Budgeting for monitoring and evaluation
  • Managing monitoring and evaluation processes
  • Anticipating and resolving problems

Tools, Methods and Approaches Facilitating Monitoring and Evaluation

  • Performance indicators and common rating systems
  • Logical framework approach (LogFrame) and results framework approach
  • Qualitative and quantitative data collection methods
  • Formal surveys
  • Rapid appraisal methods
  • Participatory methods
  • Field visits
  • Public expenditure tracking surveys
  • Economic analysis, including cost-benefit and cost-effectiveness analysis
  • Performance and process evaluation design
  • Impact evaluation design and purpose
  • Evaluation and tracking plans
  • Annual reviews and reports
  • Comparative overview of other tools, methods and approaches used by leading global institutions

Knowledge and Learning

  • Learning from evaluative evidence and applying recommendations from feedback
  • Improving evaluation feedback
  • Knowledge management
  • Institutionalization of learning

Course Advisor

Ms. Danielle de García is the Vice President, Strategy, Performance and Learning at Social Impact (SI). She leads SI’s technical team, which provides services in monitoring, performance evaluation, learning, organizational development, and strategic planning, and she oversees a portfolio of staff and projects, providing leadership, quality assurance, and technical direction. She has extensive experience with both management and implementation, and has worked in more than 25 countries implementing a variety of monitoring, evaluation, and learning approaches.

Dani’s recent work includes leading and participating in a number of evaluations, learning events, and strategic planning engagements for USAID, Carter Center, MasterCard Foundation, MCC, US Department of State, and MacArthur Foundation initiatives around the world. Throughout her career, she has provided performance management and organizational development assistance to the World Bank, CATHALAC, the United States Institute of Peace, and international NGOs, and has trained hundreds of NGO and USG staff on M&E designs and approaches. Through her work, she has helped a number of organizations institutionalize sound monitoring, evaluation, and learning practices, and she has facilitated M&E and strategic planning sessions for many USAID Missions around the world.

Skilled in social network analysis and participatory approaches, Dani focuses on enhancing the results-based management capacity of people and organizations through evidence-based decision making and relevant innovations. Ms. de García holds an MPA in International Management and a certification in Development Project Management. She is a Certified Performance Technologist (CPT) for human and institutional capacity development and is fluent in Spanish.