Evaluate Impact


Evaluation of integrated SBCC programs should focus on extracting lessons learned, identifying challenges, and explaining successes that would not have occurred in the absence of integration. In other words, integration can be considered effective if it offers advantages over a singular or vertical SBCC program approach. Impact evaluation is crucial for understanding whether these additional gains took place and the degree to which they can be attributed to the integration effort. Evaluation questions may revolve around:

  • Whether integration adds value over non-integrated SBCC programs, for example, in cost-effectiveness, health outcomes, quality of services, intra- or inter-organizational collaboration, or reduced redundancy of effort

  • Whether the planned amplified effects or operational benefits of integration were realized

  • The strength of the integration model deployed compared with other integration approaches

  • The effect of communication on its own and in combination with other intervention strategies, such as increased availability of products or services, and how this differs in integrated approaches

  • The impact of topic-specific messages delivered separately and in combination across topics

  • The combined effect or dose effect of the particular mix of cross-sectoral topics/messages or branding

  • Changes in social norms and social capital, and how integration amplifies these effects

  • The capacity of government or other stakeholders to design, implement, monitor and evaluate integrated SBCC programs, and how that capacity changes over the course of the intervention

  • The degree to which the processes and outcomes of integration are sustained or sustainable

  • The potential for scale-up, and/or the effectiveness of integrated programs that have moved from a pilot to a scale-up phase

Evaluation of integrated SBCC programs should also seek to capture any unanticipated positive and negative consequences of integration. The more (and more varied) data that can be collected linking program inputs to outputs and outcomes, and the more systematically these are tracked over time, the better the chances of detecting negative consequences that reveal unanticipated costs or burdens the new approach places on existing systems. For example, given the connectedness of topics in an integrated campaign, change in one sector may have unexpected consequences in another, such as shifting priorities and resources. The addition of a new topic could also distract audiences from other important topics or behaviors. Conducting frequent site visits and holding consultations with stakeholders will also help you discover unintended consequences of your efforts.