UNICEF Innocenti
Office of Research-Innocenti

Publications

UNICEF Innocenti's complete catalogue of research and reports
SPOTLIGHT

Places and Spaces: Environments and children’s well-being

Report Card 17 explores how 43 OECD/EU countries are faring in providing healthy environments for children. Do children have clean water to drink? Do they have good-quality air to breathe? Are their homes free of lead and mould? How many children live in overcrowded homes? How many have access to green play spaces, safe from road traffic? The data show that a nation’s wealth does not guarantee a healthy environment. Far too many children are deprived of a healthy home, irreversibly damaging their current and future well-being. Beyond children’s immediate environments, over-consumption in some of the world’s richest countries is destroying children’s environments globally, threatening both children worldwide and future generations. To provide all children with safe and healthy environments, governments, policymakers, businesses and all stakeholders are called upon to act on a set of policy recommendations.

Ghana LEAP 1000 Impact Evaluation: Overview of Study Design

AUTHOR(S)
Richard de Groot

Published: 2016 Innocenti Research Briefs

Sharing of good, practical research practices and lessons learned from development and humanitarian contexts is in high demand, not only within UNICEF but also in the broader international development and humanitarian community. ‘Impact Evaluation in the Field’ complements other methodological briefs by discussing how textbook approaches are applied in often challenging, under-resourced development contexts, as well as the innovative solutions needed to ensure that practical demands do not compromise methodological rigour. The series will grow over time, allowing UNICEF staff and partners to share new experiences and approaches as they emerge from applied research. The overarching aim is to help strengthen capacity in research and evaluation, improving the ability of UNICEF and its partners to provide evidence-based, strategic, long-term solutions for children. This brief documents the impact evaluation design of the Ghana Livelihood Empowerment against Poverty (LEAP) 1000 programme, which is being piloted in ten districts in two regions and initially targets about 6,000 households.

Utilizing Qualitative Methods in the Ghana LEAP 1000 Impact Evaluation

AUTHOR(S)
Michelle Mills; Clare Barrington

Published: 2016 Innocenti Research Briefs

Sharing of good, practical research practices and lessons learned from development and humanitarian contexts is in high demand, not only within UNICEF but also in the broader international development and humanitarian community. ‘Impact Evaluation in the Field’ complements other methodological briefs by discussing how textbook approaches are applied in often challenging, under-resourced development contexts, as well as the innovative solutions needed to ensure that practical demands do not compromise methodological rigour. The series will grow over time, allowing UNICEF staff and partners to share new experiences and approaches as they emerge from applied research. The overarching aim is to help strengthen capacity in research and evaluation, improving the ability of UNICEF and its partners to provide evidence-based, strategic, long-term solutions for children. This methodological brief focuses on the qualitative component of the evaluation of the Ghana Livelihood Empowerment against Poverty (LEAP) 1000 programme. Quantitative measures will indicate whether LEAP 1000 reduces child poverty, stunting and other measures of well-being, while qualitative research explores in more depth why and how this may or may not be happening.

Evaluative Criteria: Methodological Briefs - Impact Evaluation No. 3

AUTHOR(S)
Greet Peersman

Published: 2014 Methodological Briefs
Evaluation relies on a combination of facts and values to judge the merit of an intervention. Evaluative criteria specify the values that will be used in an evaluation. While evaluative criteria can be used in different types of evaluations, this brief specifically addresses their use in impact evaluations.
Overview: Strategies for Causal Attribution: Methodological Briefs - Impact Evaluation No. 6

AUTHOR(S)
Patricia Rogers

Published: 2014 Methodological Briefs
One of the essential elements of an impact evaluation is that it not only measures or describes changes that have occurred but also seeks to understand the role of particular interventions (i.e., programmes or policies) in producing these changes. This process is known as causal attribution. In impact evaluation, there are three broad strategies for causal attribution: 1) estimating the counterfactual; 2) checking the consistency of evidence for the causal relationships made explicit in the theory of change; and 3) ruling out alternative explanations, through a logical, evidence-based process. The ‘best fit’ strategy for causal attribution depends on the evaluation context as well as what is being evaluated.
Randomized Controlled Trials (RCTs): Methodological Briefs - Impact Evaluation No. 7

AUTHOR(S)
Howard White; Shagun Sabarwal; Thomas de Hoop

Published: 2014 Methodological Briefs
A randomized controlled trial (RCT) is an experimental form of impact evaluation in which the population receiving the programme or policy intervention is chosen at random from the eligible population, and a control group is also chosen at random from the same eligible population. It tests the extent to which specific, planned impacts are being achieved. The distinguishing feature of an RCT is the random assignment of units (e.g. people, schools or villages) to the intervention or control groups. One of its strengths is that it provides a very powerful response to questions of causality, helping evaluators and programme implementers to know that what is being achieved is a result of the intervention and not of anything else.
Quasi-Experimental Design and Methods: Methodological Briefs - Impact Evaluation No. 8

AUTHOR(S)
Howard White; Shagun Sabarwal

Published: 2014 Methodological Briefs
Quasi-experimental research designs, like experimental designs, test causal hypotheses. Quasi-experimental designs identify a comparison group that is as similar as possible to the intervention group in terms of baseline (pre-intervention) characteristics. The comparison group captures what would have been the outcomes if the programme/policy had not been implemented (i.e., the counterfactual). The key difference between an experimental and quasi-experimental design is that the latter lacks random assignment.
Comparative Case Studies: Methodological Briefs - Impact Evaluation No. 9

AUTHOR(S)
Delwyn Goodrick

Published: 2014 Methodological Briefs
Comparative case studies involve the analysis and synthesis of the similarities, differences and patterns across two or more cases that share a common focus or goal, producing knowledge about causal questions – how and why particular programmes or policies work or fail to work – that is easier to generalize. They may be selected as an appropriate impact evaluation design when it is not feasible to undertake an experimental design, and/or when there is a need to explain how context influences the success of programme or policy initiatives. Comparative case studies usually use both qualitative and quantitative methods and are particularly useful for understanding how context influences the success of an intervention, and how to better tailor the intervention to the specific context to achieve the intended outcomes.
Developing and Selecting Measures of Child Well-Being: Methodological Briefs - Impact Evaluation No. 11

AUTHOR(S)
Howard White; Shagun Sabarwal

Published: 2014 Methodological Briefs
Indicators provide a signal to decision makers by indicating whether, and to what extent, a variable of interest has changed. They can be used at all levels of the results framework from inputs to impacts, and should be linked to the programme’s theory of change. Most important at the lower levels of the causal chain are monitoring indicators such as inputs (e.g., immunization kits supplied), activities (e.g., immunization days held) and outputs (e.g., clinics built). For higher-level indicators of outcomes and impact, however, monitoring tells us what has happened but not why it happened. To understand this, impact evaluation must be used to increase our understanding of the factors behind achieving or not achieving the goal.
Présentation de l'évaluation d’impact : Note méthodologique - Évaluation d'impact n° 1
Published: 2014 Methodological Briefs
Impact evaluation provides information about the effects produced by an intervention. It can be carried out for a programme, a policy or upstream work – for example, capacity building, policy advocacy and support for creating an enabling environment. It goes beyond examining goals and objectives alone to also consider unintended impacts.
Présentation des stratégies d'attribution causale : Note méthodologique - Évaluation d'impact n° 6

AUTHOR(S)
Patricia Rogers

Published: 2014 Methodological Briefs
One of the essential elements of an impact evaluation is that it not only measures or describes the changes that have occurred but also seeks to understand the role played by particular interventions (programmes or policies) in those changes. This process is known as causal attribution. There are three broad strategies for causal attribution in impact evaluations: 1) estimating the counterfactual; 2) checking the consistency of evidence for the causal relationships made explicit in the theory of change; and 3) ruling out alternative explanations through a logical, evidence-based process. The best-suited strategy for causal attribution depends on the evaluation context and on what is being evaluated.
Présentation des méthodes de collecte et d'analyse de données dans l'évaluation d'impact : Note méthodologique - Évaluation d'impact n° 10

AUTHOR(S)
Greet Peersman

Published: 2014 Methodological Briefs
Impact evaluations should not be limited to determining the magnitude of effects (i.e., the average impact); they should also identify who has benefited from the programmes or policies in question, and how. What constitutes “success”, and how the data will be analysed and synthesized to answer the key evaluation questions, should be specified from the outset. Data collection must indeed yield the full body of evidence needed to make sound judgements about the programme or policy.
Sinopsis de la Evaluación de Impacto: Síntesis metodológica - Sinopsis de la evaluación de impacto n° 1

AUTHOR(S)
Patricia Rogers

Published: 2014 Methodological Briefs
Impact evaluation provides information about the impacts produced by an intervention. An impact evaluation can be carried out for a programme, a policy or preliminary work, such as capacity building, policy advocacy and support for creating an enabling environment. This means examining not only goals and objectives but also unintended impacts.
Publication

Return on Knowledge: How international development agencies are collaborating to deliver impact through knowledge, learning, research and evidence

Effective collaboration around knowledge management and organizational learning is a key contributor to improving the impact of international development work for the world’s most vulnerable people. But how can it be proven? With only 10 years remaining until the target date for the Sustainable Development Goals, nine of the world’s most influential agencies set out to show the connection between the use of evidence, knowledge and learning and a better quality of human life. This book – a synthesis of stories, examples and insights that demonstrate where and how these practices have made a positive impact on development programming – is the result of the Multi-Donor Learning Partnership (MDLP), a collective effort to record the ways each of these organizations has leveraged intentional, systematic and resourced approaches to knowledge management and organizational learning in their work.
Publication

Gender Solutions: Capturing the impact of UNICEF’s gender equality evidence investments (2014–2021)

UNICEF has undertaken hundreds of gender evidence generation activities, supporting programmatic action, advocacy work and policymaking. The Gender Solutions project aims to draw together the knowledge, innovations and impacts of gender evidence work conducted by UNICEF offices since the first UNICEF Gender Action Plan was launched in 2014. A desk review identified over 700 gender-related UNICEF research, evaluation and data evidence generation activities since 2014. Twenty-five outputs were shortlisted because of their high quality and (potential for) impact and three were selected as Gender Evidence Award winners by an external review panel. By capturing the impact of this broad body of work, Gender Solutions aims to showcase UNICEF’s evidence investments, reward excellence and inform the rollout of the UNICEF Gender Policy 2021–2030 and Action Plan 2022–2025.
Publication

Annual Report 2021

The UNICEF Innocenti Annual Report 2021 highlights the key results achieved in research and evidence to inform policymaking and programming.
Publication

Responsible Innovation in Technology for Children: Digital technology, play and child well-being

Digital experiences can have a significant negative impact on children, exposing them to risks or failing to nurture them adequately. Nevertheless, digital experiences can also yield enormous benefits for children, enabling them to learn, to create, to develop friendships, and to build worlds. While global efforts to deepen our understanding of the prevalence and impact of digital risks of harm are burgeoning – a development that is both welcome and necessary – less attention has been paid to understanding and optimizing the benefits that digital technology can provide in supporting children’s rights and their well-being. Benefits here refer not only to the absence of harm, but also to the creation of additional positive value. How should we recognize the opportunities and benefits of digital technology for children’s well-being? What is the relationship between the design of digital experiences – in particular, play-centred design – and the well-being of children? What guidance and measures can we use to strengthen the design of digital environments to promote positive outcomes for children? And how can we make sure that children’s insights and needs form the foundation of our work in this space? These questions matter for all those who design and promote digital experiences, to keep children safe and happy, and to enable positive development and learning. They are particularly relevant as the world shifts its attention to emerging digital technologies and experiences, from artificial intelligence (AI) to the metaverse, and seeks to understand their impact on people and society.
To begin to tackle these questions, UNICEF and the LEGO Group initiated the Responsible Innovation in Technology for Children (RITEC) project in partnership with the Young and Resilient Research Centre at Western Sydney University; the CREATE Lab at New York University; the Graduate Center, City University of New York; the University of Sheffield; the Australian Research Council Centre of Excellence for the Digital Child; and the Joan Ganz Cooney Center. The research is funded by the LEGO Foundation. The partnership is an international, multi-stakeholder and cross-sectoral collaboration between organizations that believe the design and development of digital technology should support the rights and well-being of children as a primary objective – and that children should have a prominent voice in making this a reality. This project’s primary objective is to develop, with children from around the world, a framework that maps how the design of children’s digital experiences affects their well-being, and to provide guidance as to how informed design choices can promote positive well-being outcomes.
