UNICEF Innocenti
Office of Research-Innocenti
Gabrielle Berman

Senior Advisor - Ethics in Evidence Generation (Former title)

Gabrielle Berman is responsible for providing advisory and technical support to ensure the highest ethical standards within UNICEF’s research, evaluation, and data collection and analysis programmes globally. Her role includes developing relevant guidance and resources and advocating for ethical practices in evidence generation involving children in different contexts, for different cohorts, and utilizing existing, new and emerging technologies. Prior to this role, she worked as a consultant to UN agencies, governments and NGOs, providing research, policy and programming advice on a range of issues including ICT and young people in developing countries, human rights, migrant health and homelessness. Her professional experience includes working as a strategic consultant and senior policy advisor in government and academia. She has undertaken three post-doctoral fellowships, in the areas of not-for-profit economics, social policy and population health.

Publications

Encryption, Privacy and Children’s Right to Protection from Harm
Publication


This working paper provides a short overview of the challenges and opportunities related to child protection and the use of encryption technology. While it does not constitute the UNICEF organizational position on the topic, it is meant to inform UNICEF on the issue and to reach and engage professionals, including non-experts, within and between the child rights and privacy rights sectors. The paper outlines the debate around encryption and its possible impact on children’s right to protection from harm, and reflects on the pros and cons of some proposed solutions.
Digital Contact Tracing and Surveillance During COVID-19: General and child-specific ethical issues
Publication


The response to the pandemic has seen an unprecedented, rapid scaling up of technologies to support digital contact tracing and surveillance. This working paper explores the implications for privacy, as the linking of datasets increases the likelihood that children will be identifiable, increases the opportunity for (sensitive) data profiling, and frequently involves making data available to a broader set of users or data managers.
Digital Contact Tracing and Surveillance During COVID-19: General and child-specific ethical issues
Publication


The response to COVID-19 has seen an unprecedented rapid scaling up of technologies to support digital contact tracing and surveillance. This means that we need to establish clear governance processes for these tools and the data collection process and engage with a broader set of government and industry partners to ensure that children’s rights are not overlooked.
Ethical Considerations for Evidence Generation Involving Children on the COVID-19 Pandemic
Publication


This paper identifies key ethical considerations for evidence generation involving children during the mitigation stage of the pandemic (the emergency phase) and, once the pandemic has been contained and containment measures such as lockdowns have been lifted, on subject matter relating to COVID-19 (the post-emergency phase). While the COVID-19 pandemic is undoubtedly a global crisis, and evidence generation activities raise critical ethical issues that have been captured in the literature and relevant guidelines, there are specificities of this emergency that must be considered when unpacking potential ethical issues. Hence, while ethical issues pertaining to evidence generation involving children in emergencies and humanitarian contexts are relevant and should be considered, the factors that define this ‘special case’ must be considered from the outset. These will inform the core ethical considerations that need to be addressed.

Blogs

Ethical collection of data from children during the COVID-19 pandemic
Blog


Our need to understand, quantify, forecast, track and unpack the COVID-19 pandemic fuels an insatiable demand for data. While children are not the primary victims of the virus, they are significantly affected in most areas of their lives, and will continue to be well after the pandemic is contained. Understanding the impact on children is critical. Understanding their circumstances will be necessary for current and future predictions of the crisis’s impacts on them. Collecting information that helps us determine how best to respond to similar future outbreaks is essential. There is so much we don’t know, and our children’s futures depend on us knowing.

We need children’s data, and we need it yesterday. We need data about them, and we will need to get data directly from them. This is necessary to secure the rights of children, to ensure that they have a voice, that they are safe and protected, and that their basic needs are met. Where physical distancing is in place, we will look to both old and new technologies to gather data online or by phone. We will explore pre-existing big data sets and try to create new ones with new tech. When the crisis is over, we may revert in whole or in part to face-to-face interviews. Whether using primary or secondary data, we will be collecting information on children because we need to.

However, and this is a big however, we need to take care. We need to ensure that our desire to help, to understand, to learn and to do all of this quickly doesn’t overshadow the basic principle of “do no harm.” Whether we are considering apps for contact tracing, or thinking of asking children via social media platforms about their day-to-day lives in lockdown, we need to apply a critical lens to our belief that we will do good through the data collection.

Read the new Discussion Paper: Ethical Considerations for Evidence Generation Involving Children on the COVID-19 Pandemic

The risk is obvious: if we don’t consider issues like equity, justice, respect, privacy, purpose limitation and the value the data adds, we risk negatively impacting those we are looking to support. Without appropriate ethical reflection throughout and beyond the pandemic, a number of negative outcomes for children are likely to ensue, including:

- Significant exposure to risk of traumatization due to inappropriate questions and timing, and an inability to determine where children may be within trauma and healing cycles;
- Difficulties responding to and ensuring an appropriate duty of care during the emergency and immediately after, if observation or disclosure of abuse occurs and/or if significant psychological or physical needs are evidenced, given limited access to services;
- Perceived and actual privacy and confidentiality violations, and data collection in excess of requirements and without appropriate and truly informed consent, impacting children and eroding the trust of children and their communities;
- Data obtained for one purpose, such as contact tracing, being misused for political or social surveillance;
- Potential reprisals against child participation, or even consequent to attempts at recruitment in evidence generation, heightened during the mitigation stage of the outbreak in contexts where children are in lockdown;
- Poorly designed evidence generation that produces unreliable data, including: a) poorly designed instruments that make incorrect assumptions about the impacts, needs, experiences and homogeneity of children, and b) technologies that may not be accessible to disadvantaged children, resulting in poorly informed policies and future risk mitigation that fail to equitably meet children’s needs and long-term development;
- Missed opportunities to obtain children’s perspectives and insights – not just on those considered ‘children’s issues’ – and/or prioritizing subject matter that fails to take into account children’s priorities in relation to support during COVID-19 and future outbreaks.

In all the above examples, our desire to help and to understand is subverted by benign neglect of the ethical imperative. So whenever primary or secondary data collection from or about children is undertaken, explicit reflection is required on the timing, approach, necessity and transparency of the process. Consideration should also be given to privacy, representation, consent and, importantly, the circumstances of children.

As always, it will be the most disadvantaged and marginalized children who suffer the most. Whether through exclusion, surveillance, lack of access to resources or cramped conditions, these cohorts are at heightened risk of multiple deprivations, psychological distress and exploitation, both before and after the pandemic is contained. In these and all instances, our response needs to be continuous vigilance, reflection and the development of mitigation strategies. In some instances, the data collection should simply not go ahead, either in the short term or indeed ever. We have to be prepared to put principles ahead of our ambitions, to swap idealism for the realities of context, and to be ever vigilant.

Gabrielle Berman is Senior Advisor, Ethics in Evidence Generation at UNICEF Innocenti.

This blog builds on a new Innocenti Discussion Paper highlighting ethical considerations involving children in research on the COVID-19 pandemic. For more, visit UNICEF Innocenti's COVID-19 and children rapid research response microsite.
New Technologies: Rich Source of Data or Ethical Minefield for Researchers?
Blog


We sat down with Gabrielle Berman, our expert on research ethics, to chat about her two new discussion papers which explore the ethics of using new technologies to generate evidence about children. The papers, written collaboratively with UNICEF’s Office of Innovation, highlight the advantages and risks of using these technologies to gather data about children. They also provide useful guidance for researchers – especially those unfamiliar with technology – on the questions they should be asking in order to protect children’s rights.

What inspired you to produce these two discussion papers?

During staff trainings, we kept getting requests from staff who wanted technical advice on the use of technologies for data collection and evidence generation. Most didn’t know where to start or what to consider when thinking about using these technologies. This was the initial foray into an incredibly complex and important area for UNICEF around using technology for evidence generation.

Why focus on social media and geospatial technology specifically?

We started with social media and geospatial technology because these were the two that were most prevalent in the organisation at the time, and where there was the most demand for guidance. We should now also be considering the ethical implications of technologies like biometrics, blockchain and wearable technology. We have already started receiving requests from staff for technical advice on the ethical implications of these newer technologies.

The papers were written in collaboration with the Office of Innovation. What were the benefits of the process?

Without this collaboration, the papers wouldn’t have the same status. The dialogue and relationships that were established through the collaboration of the two offices are just as important as the papers themselves. Working together also meant that we could establish an advisory group which spanned everything from ICT for development, to communications, to data and analytics. In this way, people with all sorts of expertise across the organization could input into the papers.

What are some of the most common misconceptions about the use of these technologies for evidence generation?

It depends on what side of the fence you’re on. From my perspective, one of the biggest misconceptions is that technology is unequivocally good, meaning these technologies could be used without appropriate reflection on the implications and potential impacts. However, for those on the technology side, one of the biggest issues is that technology won’t be used for fear of the complexity of the ethical implications. The benefit of collaborating with the Office of Innovation was that we had two different perspectives, both equally valid. Through dialogue, we acknowledged the benefits as much as the risks and agreed that what was required was reflective practice. It’s not an absolute yes or no, but rather an “if”, a “how can we do this?”, and “what strategies do we need to consider?”

What is the biggest challenge with regard to ensuring ethical compliance when using these technologies?

The biggest challenge is understanding the implications of the technology when you’re not a native technocrat. It’s incredibly difficult for staff in the field to understand the type of questions they should be asking. They also have to receive responses in simple English in order to start thinking through the ethical issues and the potential mitigation strategies. Part of the challenge is to change thinking and empower those who aren’t tech natives to feel comfortable enough to ask questions and interrogate the potential implications of the technology. We must avoid abdicating responsibility to tech experts and remind staff that they are in fact the experts on the potential implications for children. To be a child advocate it’s incredibly important to ask the right questions, to understand, and to take responsibility for the technology and its implications.

How can we empower people to feel like they can ask the right questions?

Firstly, we need to provide them with guidance. But importantly, we need to get stakeholders around the same table – including the social media companies, the data scientists, and the communities we work with. We should bring these people with different perspectives together, acknowledging everyone’s expertise, and engage in dialogue on what the potential ethical issues are and how they can be mitigated. This joint risk assessment is a key way to start a constructive dialogue on the issues and potential mitigation strategies.

How can we mitigate the threat of data and algorithms informing policy without appropriate engagement and dialogue?

Firstly, it’s important to appreciate the implications of the data sets on which algorithms are based and to be aware that the algorithms may have built-in biases. For example, certain populations are more likely to be monitored, and so data on arrests are likely to be higher for these groups. Secondly, we need to understand that certain populations may be excluded from the data sets. Algorithms are based on training data, so unless all communities are included in the data, the outcomes and predictions are not going to be representative of these communities. For example, when data is gathered via smartphones, those who don’t have a smartphone are excluded. Thirdly, we must recognize that modelling looks at trends only and does not consider individuals. Algorithms may see these trends as a whole but, like any type of quantitative data, they will miss the qualitative nuances underpinning the findings. While it may be very easy to adopt big data sets and use them to determine policies, we must not forget that there are very real risks in making decisions based on quantitative trends alone. The Convention on the Rights of the Child very clearly says that a child has the right to have a voice on matters that affect them. If we start basing policy exclusively on quantitative data, we are not giving voice to the nuances that may explain the data, or to the individual who may differ from the broader findings. It’s very important that we acknowledge the value of big data, but we must also acknowledge that individuals still need to have a voice. Listening to children’s voices should never be replaced by a purely quantitative approach; data is a very valuable tool, but it is only one tool.

Are there any other ethical risks that stem from using this type of data?

With social media data in particular, if you run algorithms against the data you may start using it for purposes other than those originally intended when people submitted the data. The moment you take data out of context, it may lose its contextual integrity, in which case we have to ask whether it is still relevant. This idea of contextual integrity needs to be interrogated.

Does ensuring ethical compliance also provide an opportunity to educate people on technology and online privacy?

I don’t see it as an opportunity but rather an obligation stemming from our use of these technologies. If you’re using third-party data in a public way, it must also be announced publicly, in the spirit of full disclosure. We must recognise the importance of transparency in the work we do, particularly when it may be difficult, if not impossible, to secure informed consent. With social media projects, where you’re actively using the platform to engage young people, it’s incredibly important that you actually provide information around privacy. You cannot guarantee that children have received child-friendly explanations, and so it’s our responsibility to educate and to be clear about the risks involved.

Are there any additional potential risks specifically associated with geospatial technology?

Geospatial technology has been invaluable in both development and humanitarian contexts, but we need to think about where we source the data, how useful it is, whether it’s a two-way dialogue, and whether we can respond to any requests for help in humanitarian contexts. We particularly need to be concerned with the potential risks involved and the security of this data in humanitarian contexts, where we are often dealing with vulnerable populations. Because of this, we must be very careful to ensure that this data is limited to those who absolutely need access, particularly any raw and individual-level data, with specific consideration for the safety of the populations that may be captured by the technology. For this reason, we really need to think about the interoperability of geospatial data systems: which partners are we working with? Who in these organisations has access to the data? Why do they have access? Do we have sufficient security measures? These types of reflections are necessary to fulfil our obligations to protect the communities that we work with.

You can download both papers on ethical considerations when using geospatial technology and social media on unicef-irc.org now.
Big Data, Ethics and Children
Blog


“We are now at a point where we must educate our children in what no one knew yesterday, and prepare our schools for what no one knows yet.” – Margaret Mead

In a matter of years, the recording of a child or young person’s activities within the public sphere has gone from being consequent to an act of god (or heroics) to a relatively ubiquitous phenomenon, slowly conquering continents and reflected in the statistical estimate that 1 in 3 internet users are children (over 2 billion children). In this context, how can we fathom the day-to-day lived reality of those 1 in 3 any more than they could conceive of my own childhood, where data and information were found via little multi-coloured cards in wooden library catalogues? Further, and more importantly, what is the future of those populating and being shaped by this statistic?

The answers to these questions are complex, and the solutions largely unknown. Like all ethical and philosophical conundrums, there are frameworks that provide some guidance, but rarely specifics. The devil is in the details, and the details need to be understood before we even begin to move forward. The facts are as follows:

1. This generation of children have had their lives ‘datafied’: their digital footprints have been captured over their entire lifespans, and will continue to be.
2. The information contained on the internet and held within big data sets is pervasive and has the potential to substantially influence their opportunities as well as their ‘digital’ and ‘offline’ identities, with significant implications for their life course.
3. Provision, creation, ownership and utilisation of this data involves a complex chain of actors, with varying degrees of understanding of the implications, risks and potential mitigation strategies.
4. We have not yet imagined future data applications.
5. Children’s rights are enshrined in international and national legislation, and we have a duty of care to protect them and to respect and uphold their rights as their capacities evolve.

So what does this all mean? First and critically: conversations about children and big data need to be had. A recent working paper from UNICEF Innocenti adds its voice to the movement to get these issues on the table and to push the discussion further. Beyond the written word, we need knowledge exchange between all the stakeholders in the data chain. We need metaphorical group study rooms where communities, data analysts, child advocates and tech giants can share knowledge and reflect on data impacts, legacies and children’s futures. We also require technological solutions, as well as systematic efforts to embed critical thinking on big data and children both in generic educational programmes and in programmes specifically targeted at data analysts.

Most importantly, but frequently omitted, is that we need to listen. We need to understand children’s understanding of privacy, their perspectives on how their data should be treated, who should have access, and what controls they would like. It would be as absurd to ask a contemporary pre-teen’s perspective on the Dewey system as it is for institutions and governments to omit children’s voices in a context where, frequently, our perspectives are the anachronisms.

In saying this, there are likely to be more solutions to protecting the security, wellbeing and data of children than those mentioned here, and there will certainly be more in the future. However, it is not yet tomorrow. Today we should be laying the foundations and undertaking the preparations for a future where the rights of children and the generations that follow them are respected, recognising that no one yet knows what the future will look like... but we all need to start learning and teaching now.

Gabrielle Berman is responsible for providing advisory and technical support to ensure the highest ethical standards within UNICEF’s research, evaluation and data collection and analysis programmes globally. Subscribe to UNICEF Innocenti emails on any web page. Follow UNICEF Innocenti on Twitter @UNICEFInnocenti. Access our research catalogue here.

Events

Ethics in Humanitarian Research: A Practical Discussion
Event


Join us for a lively discussion on ethical reviews for humanitarian research and monitoring.
Safeguarding and Ethics in Evidence Generation
Event


This webinar will connect issues relating to child safeguarding with those relating to ethics in evidence generation. It will underline key considerations in evidence generation and in the planning and clearance process.
Ethics, data & technologies in evidence generation
Event


Reflecting on the use of social media and geospatial technologies in evidence generation, UNICEF Innocenti's expert on ethics in evidence generation, Gabrielle Berman, contributed to UNICEF's Webinar on Ethics, Data and Technologies, held on 11 September 2019.

Podcasts

Ethics, data & technologies in evidence generation
Podcast

Gabrielle Berman on ethical research on children in humanitarian situations