Shauna Concannon (Giving Voice to Digital Democracies project, CRASSH) introduces ‘Children and artificial intelligence’, a workshop that brings together experts from academic disciplines such as sociology, psychology, computer science and linguistics, as well as leading figures from regulatory bodies, the charity sector, and child-focused agencies.

‘Children and artificial intelligence: risks, opportunities and the future’ takes place on 25 April 2022.


Q: Shauna, what is your event about? What drew you to the subject?

This workshop explores the various ways in which children’s lives are affected by artificial intelligence (AI). From smart toys and virtual personal assistants, through to recommender systems and educational technologies, many young people already interact with AI on a regular basis. These interactions present a range of opportunities across creative, playful, educational and social domains. However, the benefits are not experienced equally, with some groups having greater access than others, and these engagements can also have negative impacts at both the individual and societal level.

Recent initiatives have begun to examine the ways that AI systems and online interactions may have harmful consequences for children. For example, interactions with virtual personal assistants have been shown to reinforce gendered stereotypes, and on social media pre-teens and teens may be exposed to cyberbullying or to content that is sexually explicit, racist, sexist, homophobic or transphobic. DCMS reported that 80% of 6–12 year olds have accessed harmful content online, but different groups are subject to distinct risks, and developing our understanding in this area is critical.

The draft Online Safety Bill sets out the government’s plans to protect children from harmful content online. However, it has also sparked a broader debate about how harm and safety are conceptualised and defined. While legal and regulatory frameworks are still being established, now is the time to collectively, and critically, examine the future of online safety.

Q: What do you find particularly interesting about the topic?

Promoting online safety raises important questions about agency. For adults, exercising agency over how we access content and engage with technology is not always straightforward, and this is further complicated for children. How young people engage online may involve complex and blurred boundaries between risk and reward. These issues require nuanced treatment, as there are also important benefits, which are especially valuable for certain groups. Consequently, we need more research that helps us understand these behaviours and examine how direct and indirect harms manifest in practice.

In addition, it is important to consider how we can support young people in cultivating the agency and skills needed to develop positive strategies for prioritising their own wellbeing and safety in digital contexts. Discussions around online harms and young people can come across as quite paternalistic. I’ve worked with young people on previous projects about technology and data, and I’m often impressed by the sophistication and thoughtfulness they bring to discussions of complex topics. As well as assessing the potential of technical and regulatory approaches, I’m interested in how children can be involved more directly in the decisions that are made about them.

Q: In your view, what is the event’s main contribution to our understanding of how technologies impact the lives of children?

The event aims to enrich our collective understanding of how we conceptualise harm and enable us to think through what a duty of care framework should practically entail. While many organisations are beginning to develop strategies for addressing emergent safeguarding issues that affect children, a collaborative effort to share best practice more widely could lead to more effective handling of these challenges across sectors.

Developing and implementing robust strategies for identifying problematic content is a complex technical challenge. However, it is just as fundamental to understand what constitutes harm, how it manifests in actual interactions, and what best practice looks like. Access to data that would allow researchers to examine harmful interactions and interrogate edge cases more thoroughly is often difficult to obtain. A more joined-up approach to these challenges would be highly beneficial.

Q: Around which themes did you decide to structure the event, and to what end? 

The event is structured across two sessions. In the first session, the focus is on projects that involve the design and evaluation of technologies for and with young people. The speakers will present case studies that surface key ethical and practical considerations. The second session considers the future of online safety, with some opening remarks from individuals working on child safety, online harms, and digital wellbeing, followed by a Q&A and discussion.

In addition to the workshop, attendees are invited to join our lunchtime work-in-progress session. This will provide an opportunity to meet others working on related topics and to share current projects in short, informal presentations. Anyone interested in taking part should contact Bianca Schor with the title of their presentation.

Q: What are the key questions you hope to address at the event?

Key questions we hope to explore in the workshop include:

  • How do AI systems impact on children’s development and wellbeing?
  • How can natural language processing be usefully applied for creating safer online environments for young people?
  • How should harm be conceptualised and what are the challenges involved in automatically classifying content as harmful?
  • How can technology developers be incentivised to prioritise a duty of care framework and what role should regulation and legislation play in ensuring this is implemented effectively?
  • How can young people be supported to develop a critical understanding of AI and how it may impact their lives?
  • How can a cross-sector approach be implemented to address this issue more effectively?

Q: Who did you have in mind when you organised the event? Would it make sense to someone outside your field?

This is an emerging challenge area that will benefit from a collaborative and interdisciplinary approach, so we hope the event will be accessible to a wide range of audiences – from technology developers and policymakers, through to individuals working in children’s rights or digital wellbeing. Anyone interested in how technology impacts the lives of children and in the ethical implications of AI is very welcome to join!

Q: Where might one find more details about the event? 

More information about speakers and timings for the event can be found on the CRASSH event page and registration is open on the Eventbrite page.


Shauna Concannon is a researcher on the Giving Voice to Digital Democracies project, which looks at the societal and ethical implications of artificially intelligent communication technologies. She is interested in human behaviour, sense-making practices with data, and technology-mediated interactions.

CENTRE FOR RESEARCH IN THE ARTS, SOCIAL SCIENCES AND HUMANITIES

Tel: +44 1223 766886
Email: enquiries@crassh.cam.ac.uk