UNICEF Innocenti
Office of Research-Innocenti

Event

Ethics in Humanitarian Research: A Practical Discussion

(Past event)

Event type: Webinar

Related research: Ethical research and children

20 May 2020
Please save the date for the next Humanitarian Monitoring, Evaluation & Learning (HuMEL) meeting on Wednesday, May 20!

About this Event

Humanitarian Monitoring, Evaluation & Learning (HuMEL) is an emergency M&E learning network centered around the increasingly complex humanitarian context. The purpose of the network is to share lessons learned, methodologies, and experiences between implementers, donors, and other stakeholders with the goal of improving programming.

Join us for a lively discussion on ethical reviews for humanitarian research and monitoring. The webinar will kick off with a panel presentation from several implementing organizations that have in-house ethical oversight processes and procedures, including ERCs/ERBs and other less centralized mechanisms for ensuring ethical research and MEAL. We will then break into small groups to discuss challenges and successful approaches to conducting ethics reviews for humanitarian research and monitoring. Topics for small-group discussion will include research ethics during COVID-19; balancing local and international IRB approval timelines with the humanitarian imperative for rapid implementation; the pros and cons of in-house ethics reviews; and a selection of other topics proposed by participants during registration.


Experts

Gabrielle Berman

UNICEF Innocenti

Contact

Related Content

Ethical Considerations for Evidence Generation Involving Children on the COVID-19 Pandemic
Publication

This paper identifies key ethical considerations when undertaking evidence generation involving children during the mitigation stage of the pandemic (emergency phase), on subject matter relating to COVID-19 once the pandemic has been contained, and once containment policy measures, including lockdowns, have been lifted (post-emergency phase). While the COVID-19 pandemic is undoubtedly a global crisis, with evidence generation activities raising critical ethical issues that have been captured in the literature and relevant guidelines, there are specificities of this emergency that must be taken into account when unpacking potential ethical issues. Hence, while ethical issues pertaining to evidence generation involving children in emergencies and humanitarian contexts remain relevant, the factors that define this ‘special case’ must be considered from the outset, as they inform the core ethical considerations that need to be addressed.
Ethical research and children
Project

UNICEF is committed to ensuring that all research, evaluation and data collection processes undertaken by UNICEF and its partners are ethical. To this end, procedures and guidelines have been created to embed ethical principles and practices in all our evidence generation programmes. UNICEF recognizes the critical importance of children’s voice in evidence generation and is developing tools to support and advocate for ethical evidence generation involving children.
New Technologies: Rich Source of Data or Ethical Minefield for Researchers?
Blog Post

We sat down with Gabrielle Berman, our expert on research ethics, to chat about her two new discussion papers which explore the ethics of using new technologies to generate evidence about children. The papers, written collaboratively with UNICEF’s Office of Innovation, highlight the advantages and risks of using these technologies to gather data about children. They also provide useful guidance for researchers – especially those unfamiliar with technology – on the questions they should be asking in order to protect children’s rights.

What inspired you to produce these two discussion papers?

During staff trainings, we kept getting requests from staff who wanted technical advice around technology and the use of technologies for data collection and evidence generation. Most didn’t know where to start or what to consider when thinking about using these technologies. This was the initial foray into an incredibly complex and important area for UNICEF around using technology for evidence generation.

Why focus on social media and geospatial technology specifically?

We started with social media and geospatial technology because these were the two that were most prevalent in the organisation at the time, and there was the most demand for guidance. We should now also be considering the ethical implications of technologies like biometrics, blockchain and wearable technology. We have already started receiving requests from staff for technical advice around the ethical implications of these new technologies.

The papers were written in collaboration with the Office of Innovation. What were the benefits of the process?

Without this collaboration, the papers wouldn’t have the same status. The dialogue and relationships that were established through the collaboration of the two offices are just as important as the papers themselves. Working together also meant that we could establish an advisory group which spanned everything from ICT for Development, to communications, to data and analytics. In this way, people with all sorts of expertise across the organization could input into the papers.

What are some of the most common misconceptions about the use of these technologies for evidence generation?

It depends on what side of the fence you’re on. From my perspective, one of the biggest misconceptions is that technology is unequivocally good, meaning these technologies could be used without appropriate reflection on the implications and potential impacts. However, for those on the technology side, one of the biggest issues is that technology won’t be used for fear of the complexity of the ethical implications. The benefit of collaborating with the Innovation office was that we had two different perspectives, but both are equally valid. Through dialogue, we acknowledged the benefits as much as the risks and agreed that what was required was reflective practice. It’s not an absolute yes or no, but rather an “if”, a “how can we do this?” and “what strategies do we need to consider?”

What is the biggest challenge with regard to ensuring ethical compliance when using these technologies?

The biggest challenge is understanding the implications of the technology when you’re not a native technocrat. It’s incredibly difficult for staff in the field to understand the type of questions they should be asking. They also have to receive responses in simple English in order for them to start thinking through the ethical issues and the potential mitigation strategies. Part of the challenge is to change thinking and empower those who aren’t tech natives to feel comfortable enough to ask questions and interrogate the potential implications of the technology. We must avoid abdication of responsibility to tech experts and remind staff that they are in fact the experts on potential implications for children. To be a child advocate it’s incredibly important to ask the right questions, to understand and to take responsibility for the technology and its implications.

Adolescent girls look at social media posts while attending a "Lifeskills" event in Union Development & Culture Community Centre in Djibouti.

How can we empower people to feel like they can ask the right questions?

Firstly, we need to provide them with guidance. But importantly, we need to get stakeholders around the same table – including the social media companies, the data scientists, and the communities we work with. We should bring these people with different perspectives together, acknowledging everyone’s expertise, and engaging in dialogue on what the potential ethical issues are and how they can be mitigated. This joint risk assessment is a key way to start a constructive dialogue on the issues and potential mitigation strategies.

How can we mitigate the threat of data and algorithms informing policy without appropriate engagement and dialogue?

Firstly, it’s important to appreciate the implications of the data sets on which algorithms are based and to be aware that the algorithms may have built-in biases. For example, certain populations are more likely to be monitored, and so arrest data are likely to be higher for these groups. Following from this, we need to understand that certain populations may be excluded from the data sets. Algorithms are based on training data, so unless all communities are included in the data, the outcomes and predictions are not going to be representative of these communities. For example, when data is gathered via smartphones, those who don’t have a smartphone are excluded. Thirdly, we must recognize that modelling looks at trends only and does not consider individuals. Algorithms may see these trends as a whole but, like any type of quantitative data, they will miss the qualitative nuances underpinning these findings. While it may be very easy to adopt big data sets and use them to determine policies, we must not forget that there are very real risks in making decisions based on quantitative trends alone. The Convention on the Rights of the Child very clearly says that a child has the right to have a voice on matters that affect them. If we start basing policy exclusively on quantitative data, we are not giving voice to the nuances that may explain the data or the nuances of the individual who may differ from the broader findings. It’s very important that we acknowledge the value of big data, but we must also acknowledge that individuals still need to have a voice. Listening to children’s voices should never be replaced by a purely quantitative approach, so while data is a very valuable tool, it is only one tool.

Are there any other ethical risks that stem from using this type of data?

With social media data in particular, if you run algorithms against the data you may start using it for purposes other than those originally intended when people submitted this data. The moment you start taking data out of context, it may lose its contextual integrity, in which case we have to ask whether it is still relevant. This idea of contextual integrity needs to be interrogated.

Adolescent girls use cellphones and tablets in a solar kiosk providing internet connectivity in the Za’atari camp for Syrian refugees, Jordan.

Does ensuring ethical compliance also provide an opportunity to educate people on technology and online privacy?

I don’t see it as an opportunity but rather an obligation stemming from our use of these technologies. If you’re using third party data in a public way, it must also be announced publicly, in the spirit of full disclosure. We must recognise the importance of transparency in the work we do, particularly when it may be difficult if not impossible to secure informed consent. With social media projects, where you’re actively using the platform to engage young people, it’s incredibly important that you actually provide information around privacy. You cannot guarantee that children have child-friendly explanations, and so it’s our responsibility to educate and to be clear about the risks involved.

Are there any additional potential risks specifically associated with geospatial technology?

Geospatial technology has been invaluable in both development and humanitarian contexts, but we need to think about where we source the data, how useful it is, whether it’s a two-way dialogue, and whether we can respond to any requests for help in humanitarian contexts. We particularly need to be concerned with the potential risks involved and the security of this data in humanitarian contexts. In these situations, we’re often dealing with vulnerable populations. Because of this, we must be very careful to ensure that this data is limited to those who absolutely need to access it, particularly any raw and individual data, with specific consideration for the safety of the populations that may be captured by the technology. For this reason, we really need to think about the interoperability of geospatial data systems: which partners are we working with? Who in these organisations has access to the data? Why do they have access? Do we have sufficient security measures? These types of reflections are necessary to fulfill our obligations to protect the communities that we work with.

You can download both papers on ethical considerations when using geospatial technology and social media on unicef-irc.org now.
Ethical collection of data from children during the COVID-19 pandemic
Blog Post

Our need to understand, quantify, forecast, track and unpack the COVID-19 pandemic fuels an insatiable need for data. While children are not the primary victims, they are significantly impacted in most areas of their lives, and will continue to be well after the pandemic is contained. Understanding the impact on children is critical. Understanding their circumstances will be necessary for current and future predictions of the impacts of the crisis on them. Collecting information that helps us determine how best to respond to similar future outbreaks is essential. There is so much we don’t know, and our children’s futures depend on us knowing.

We need children’s data, and we need it yesterday. We need data about them, and we will need to get data directly from them. This is necessary to secure the rights of children, to ensure that they have a voice, are safe and protected, and that their basic needs are met. Where physical distancing is in place, we will look to use both old and new tech to gather the data online or by phone. We will explore pre-existing big data sets and try to create new ones with new tech. When the crisis is over, we may revert in whole or in part to face-to-face interviews. Whether using primary or secondary data, we will be collecting information on children because we need to.

Despite restrictions on movement due to coronavirus containment measures, health workers in Al-Hasakeh city, Syria, continue to provide guidance to a refugee mother at the shelters on infant and young child feeding practices.

However, and this is a big however, we need to take care. We need to ensure that our desire to help, to understand, to learn and to do all of this quickly doesn’t overshadow the basic principle of “do no harm.” Whether we are considering using apps for contact tracing, or thinking of asking children via social media platforms about their day-to-day lives in lockdown, we need to do so with a critical lens on our belief that we will do good through the data collection.

Read the new Discussion Paper: Ethical Considerations for Evidence Generation Involving Children on the COVID-19 Pandemic

The risk is obvious: if we don’t consider issues like equity, justice, respect, privacy, purpose limitations and the value added by the data, then we risk negatively impacting those that we are looking to support. Without appropriate ethical reflection throughout and beyond the pandemic, a number of negative outcomes for children will likely ensue, including:

Significant exposure to risk of traumatization due to inappropriate questions and timing, and an inability to determine where children may be within trauma and healing cycles;

Difficulties responding to and ensuring an appropriate duty of care during the emergency and immediately after, if observation or disclosure of abuse occurs and/or if significant psychological/physical needs are evidenced, given limited access to services;

Perceived and actual privacy and confidentiality violations, and data collection in excess of requirements and without appropriate and truly informed consent, impacting the child and eroding the trust of children and their communities;

Data obtained for one purpose, such as contact tracing, being misused for political/social surveillance;

Potential reprisals against child participation, or even consequent to attempts at recruitment in evidence generation, heightened during the mitigation stage of the outbreak in contexts where children are in lockdown;

Poorly designed evidence generation that produces unreliable data, including a) poorly designed instruments that make incorrect assumptions relating to the impacts, needs, experiences and homogeneity of children, and b) the use of technologies that may not be accessible to disadvantaged children, resulting in poorly informed policies and future risk mitigation in outbreaks that fail to equitably meet children’s needs and long-term development;

Missed opportunities to obtain children’s perspectives and insights – not just on those considered ‘children’s issues’ – and/or prioritizing subject matter that fails to take into account children’s priorities in relation to support during COVID-19 and other future outbreaks.

A young girl in Cairo, Egypt, does schoolwork on a tablet while staying at home during the COVID-19 pandemic.

In all the above examples, our desire to help and to understand becomes subverted by benign neglect of the ethical imperative. So whenever primary or secondary data collection from or about children is undertaken, explicit reflection is required on the timing, approach, necessity and transparency of the process. Consideration should also be given to privacy, representation, consent and, importantly, the circumstances of children. As always, it will be the most disadvantaged and marginalized children who suffer the greatest. Whether through exclusion, surveillance, lack of access to resources or cramped conditions, these cohorts are at heightened risk of multiple deprivations, psychological distress and exploitation both before and after the pandemic is contained. In these and all instances, our response needs to be continuous vigilance, reflection and development of mitigation strategies. In some instances, the data collection should simply not go ahead, in either the short term or indeed ever. We have to be prepared to put principles ahead of our ambitions, to swap idealism for the realities of context, and to be ever vigilant.

Gabrielle Berman is Senior Advisor, Ethics in Evidence Generation at UNICEF Innocenti. This blog builds on a new Innocenti Discussion Paper highlighting ethical considerations involving children in research on the COVID-19 pandemic. For more, visit UNICEF Innocenti's COVID-19 and children rapid research response microsite.