Convenors Inigo Howe and Naomi van den Berg reflect on the organising and running of the Cambridge misinformation hackathon which took place in November 2023.


In a recent report, the World Economic Forum highlighted misinformation and disinformation as the predominant global risks for the next two years. This isn’t surprising to many, as we’ve increasingly witnessed how misinformation amplifies various crises, from armed conflicts to pandemic responses. Tackling this information distortion isn’t straightforward: it demands an interdisciplinary strategy, intertwining policy, education, and technology. Recognising the complexity of this challenge, we founded ‘Students Against Pseudoscience’ at Cambridge University. At Cambridge, we are lucky to have access to a highly talented and motivated student body, and we saw a great opportunity to bring together students from different disciplines to think about practical ways to tackle misinformation. This led us to the idea of organising a hackathon.

Organising the hackathon was both challenging and enlightening. With invaluable support from CRASSH in the form of planning assistance, funding, and hosting, we embarked on a mission to organise a hackathon that would encourage both innovative thinking and realistic solution generation. The interdisciplinary nature of misinformation required a hackathon design that accommodated a wide array of perspectives and solutions, from technological innovations to policy drafts and literature reviews. This made it challenging to strike a balance between giving participants the freedom to pursue their own ideas and giving everyone a similar direction, let alone producing a set of projects that the judges could easily compare.

The hackathon was structured to give participants a full understanding of misinformation and its ramifications. It consisted of two main phases:

  1. Analytical phase: Participants were tasked with identifying a misinformation issue, contextualising it within a broader societal framework, brainstorming possible interventions, and outlining a detailed implementation plan for a chosen solution. This phase was designed to foster critical thinking and ensure a deep comprehension of the misinformation challenge. Teams were also asked to consider how their ideas fit alongside existing solutions and what potential pitfalls might exist.
  2. Implementation phase: Teams had the freedom to further develop their ideas, putting theoretical plans into practice. This segment of the hackathon was aimed at encouraging creativity and practical engagement with proposed solutions.

The Hackathon gets underway. All images © Judith Weik

This format was chosen to ensure participants not only engaged with the problem of misinformation on a theoretical level but also considered the practicality and scalability of their solutions.

Seven teams of four to six people signed up for the hackathon. Fuelled by pizza and tea, they set about brainstorming solutions.

The diversity of projects presented at the hackathon was a testament to the wide-ranging approach taken towards tackling misinformation. For instance, one team developed a plan for a digital platform aimed at enhancing children’s critical thinking and digital literacy through engaging, game-like learning experiences. Another team produced a plugin tailored to address the issue of misleading numerical data in news articles, making complex information more accessible and understandable to the general public. This plugin stood out for its practical applicability and its potential to make a significant impact on media literacy.

Other notable projects included a sophisticated system designed to combat video misinformation through a community-driven evaluation process. One team prototyped a tool that leveraged AI to detect and clarify manipulated images by analysing historical data via a novel reverse image search approach, while another proposed a colour-coded system aimed at simplifying the evaluation of medical information on social media platforms. Each solution reflected an impressive engagement with the challenge of misinformation.

The event concluded with presentations from each team, followed by a rigorous judging process. The plugin for clarifying numerical data in news articles was ultimately awarded the top prize, a decision that highlighted not just the solution’s ingenuity but also its potential for real-world application. We were delighted that the judges expressed their admiration for the high level of thought and creativity evident in all the projects.

What’s next?

We hope that the participants have left with a heightened awareness and a resolve to contribute to the ongoing dialogue and solution-finding process regarding misinformation. To build on the foundations laid by the hackathon, we are looking to implement three key strategies:

  • Look inward: We’re taking a reflective approach to examine our practices within the Cambridge community. This involves critically assessing how internal attitudes, current communication styles and aspects of academic culture might stand in the way of effectively addressing misinformation. We want to make sure that we’re not just part of the conversation but also part of the solution, by understanding where misinformation starts and how it spreads.
  • Centralise: Our goal is to create a central hub of resources and interdisciplinary expertise relevant to mapping and tackling misinformation. By doing so, we aim to facilitate better communication and collaboration among those of us dedicated to this cause across the University. A central repository of information and a platform for exchanging useful tools will also help to elevate the importance of addressing misinformation.
  • Lead: As a leading institution, Cambridge is in a prime position to pioneer efforts in enhancing information resilience. We intend to draw upon our rich research culture to develop resources and guidelines that can aid in institutional and individual resilience against misinformation. We will continue to push people in positions of influence at Cambridge to step up to this challenge.

Watch a short video summary

 
