My final blog post will explore the potential for WhatsApp to both spread misinformation about coronavirus – and to combat it through chat-bots and sign-posting.
“I thought this was good info. It’s from a member of the Stanford hospital board. This is their feedback for now on the Coronavirus,” said a message from my apartment block WhatsApp group, usually used to complain about rent increases, offer up unwanted items and share funny coronavirus memes. The message went on to say that the virus could be killed by warm weather or by taking a few sips of water every 15 minutes. The information looked legitimate. It was purportedly from a member of a trusted academic body and used medical language. It was sent during a time when people were thirsty for information about how to protect themselves and their networks from COVID-19. A week later, the original sender responded to the group with an embarrassed blushing emoji, telling us that he had just found out the message was fake. But the damage was done. It was the sixth time my partner and I had received that message in a week.
Since novel coronavirus was first identified in Wuhan, China last year, the world has scrambled to contain, and control, the facts around it. The sheer amount of information flooding social media and the wider web has led to the World Health Organisation (WHO) labeling the outbreak as an ‘Infodemic’.
Over the past three months, analytics firm Brandwatch has recorded over 300 million online mentions of coronavirus. In the early days of the outbreak, when information was suppressed and little was known about the causes and effects of the virus, the information vacuum was rapidly filled with rumours. Even as more has become known about how to contain and control coronavirus, official voices risk being drowned out by the sheer number of alternative information sources. These have included a host of hoaxes and misinformation that could make even experts do a double take.
With 2 billion users in over 180 countries, WhatsApp is an important player in the global coronavirus response. ‘Scaled sociability’ describes the ability of social media to link the most private groups to the most public, and the smallest groups to the largest (Miller et al., 2016, p. 3). In the same way that WhatsApp enables scaled sociability by connecting people within my apartment block, it also scales up the ability for misinformation to be shared across large groups of people. Social media has made us hyperconnected. The WhatsApp group is more than just a ‘group’; it’s a network of connections that spans the world.
Why we share
We are programmed as humans to share information that ensures our survival. We also attribute hierarchies to this social learning, trusting certain sources over others (Christakis, 2019, p. 13). Problems occur when the information is wrong or misleading, and when trusted sources of information, like medical staff and international organisations, are appropriated or misrepresented.
Misinformation can be defined as ‘incorrect information shared in the misguided belief that it is correct’ (Social Sciences in Humanitarian Action, 2020). One fake message purporting to be from UNICEF, telling people to ‘stay away from ice cream’ to avoid coronavirus, was translated into multiple languages and shared across the world. UNICEF country offices have reported the same forwarded message in Haiti, Uganda, Greece, Panama, Fiji, Myanmar, the Philippines, Nepal and India. Not only is this information misleading – it also erodes precious trust in organisations when the information is found to be wrong.
WhatsApp’s infrastructure, allowing for groups of up to 256 members, and its easy sharing functionality enable information to course through ‘distributed networks’ of individuals and groups (Rettberg, 2008, p. 69). Imagine you are a member of five WhatsApp groups of 20 to 50 members each, and you share misinformation to all of them. At the touch of a share button, 100–250 people have this message – with the same ability to share it amongst their networks too. In this way, misinformation on WhatsApp spreads much like a highly contagious virus, multiplying each time it hits a new cluster of contacts.
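This viral multiplication can be illustrated with a toy simulation. The parameters below – five groups per sharer, group sizes of 20 to 50, a 5% chance that a recipient forwards the message on – are my own illustrative assumptions, not measured WhatsApp figures:

```python
import random

def simulate_spread(generations=4, groups_per_sharer=5,
                    group_size=(20, 50), forward_prob=0.05, seed=42):
    """Toy model of a forwarded message on a chat app.

    Each sharer posts to several groups; a small fraction of
    recipients become sharers themselves in the next generation.
    (People may be double-counted across groups -- this is a sketch,
    not an epidemiological model.)
    """
    random.seed(seed)
    sharers = 1          # the original sender
    total_reached = 0
    for _ in range(generations):
        # each sharer posts to their groups, of random size
        reached = sum(random.randint(*group_size)
                      for _ in range(sharers * groups_per_sharer))
        total_reached += reached
        # some recipients forward the message on
        sharers = sum(1 for _ in range(reached)
                      if random.random() < forward_prob)
        if sharers == 0:
            break
    return total_reached
```

With forwarding switched off (`forward_prob=0.0`), the reach stays in the 100–250 range described above; with even a small forward probability, each generation multiplies the audience.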
End-to-end encryption also makes misinformation much harder to track on WhatsApp. This means that although the app can deactivate accounts and groups which consistently violate the platform’s policies (including sharing misinformation), no-one, not even WhatsApp, can truly track the virality of messages sent on the platform. Because of the depth of ‘shares’ on WhatsApp, tracking down the genesis of a particular post can be near impossible. And when left unchecked, rumours proliferate (Social Sciences in Humanitarian Action, 2020). Once viral on WhatsApp, misinformation easily crosses over to other platforms, with the same UNICEF misinformation reportedly shared on Twitter, Facebook and Viber. The infrastructure of mobile phones and internet connectivity, paired with the sharing functionality of social media platforms, helps information spread at breakneck speed.
The WHO is working against this tide of information to make sure distributed networks such as social media platforms, media outlets and influencers refer back to them as a centralized source of factual information. This helps reduce misinformation by ensuring everyone is consistently sharing science-based facts in a rapidly evolving situation. The more users armed with the correct information, theoretically, the more people can break the chain of misinformation by actively challenging it within their networks.
One innovative way the WHO is centralizing information on WhatsApp is through an automated chat-bot. Users click a sign-up link which directs them to a WHO WhatsApp number. All you have to do is say ‘hi’ to trigger a series of options, ranging from symptoms and prevention advice to the latest numbers and press releases. The information is clear, simple and engaging, and can be updated at any time with the latest developments.
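The greeting-then-menu pattern behind this kind of bot can be sketched in a few lines. The menu text and canned responses below are illustrative stand-ins, not the WHO bot’s actual content:

```python
# A minimal sketch of menu-style chat-bot routing, in the spirit of
# the WHO WhatsApp bot. Options and answers here are invented
# placeholders for illustration.
MENU = ("Reply with a number:\n"
        "1. Symptoms\n"
        "2. Prevention advice\n"
        "3. Latest numbers")

RESPONSES = {
    "1": "Common symptoms include fever, dry cough and tiredness.",
    "2": "Wash hands often, keep your distance and avoid touching your face.",
    "3": "For current case numbers, see the latest situation reports.",
}

def reply(message: str) -> str:
    """Route an incoming message: a greeting triggers the menu,
    a known option returns its answer, anything else a fallback."""
    text = message.strip().lower()
    if text in ("hi", "hello"):
        return MENU
    return RESPONSES.get(
        text, "Sorry, I didn't understand. Say 'hi' for the menu.")
```

Because the answers live in one dictionary, the content can be updated centrally at any time – which is exactly what makes a chat-bot suited to a rapidly evolving situation.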
Other organisations are using chat-bots too. Within weeks of the outbreak, UNICEF’s U-Report, which uses SMS and chat apps like Facebook Messenger and WhatsApp, had mobilised to provide timely and factual information about coronavirus to young people in countries across the world. Unlike the WHO’s broadcast model, U-Report is based on two-way, participatory C4D principles and is designed both to share information and to receive feedback. Since the outbreak, UNICEF Indonesia has been using U-Report to poll young people on their knowledge of the coronavirus. Knowing that fewer than a quarter of respondents knew that coronavirus is spread through infected respiratory droplets from coughing and sneezing was important intelligence for UNICEF Indonesia and the Indonesian government in creating targeted, data-driven content across other channels.
Social media is more than a platform; it’s content too. By being on the app through chat-bots, trusted organisations such as UNICEF and the WHO can share factual content which can in turn be spread through WhatsApp’s distributed network of groups. Official content from the WHO on WhatsApp helps improve the self-efficacy of users by arming them with the facts and empowering them to share trusted sources.
Social media sign-posting
When you search for information about coronavirus, or engage with coronavirus-related content on Facebook, Instagram, Twitter, Google, YouTube or TikTok, an advisory button appears, giving you the option to find out more through reliable sources such as the WHO or government health organisation websites. Adding ‘choice’ into social media platform infrastructure increases the availability of information, which is the first step towards ‘knowledge acquisition’ – an important development outcome under Alsop and Heinsohn’s dimensions of choice theory (Kleine, 2010, p. 679). On TikTok, say you’ve just watched a funny video about coronavirus-related toilet paper shortages. With the red button below the video, you now also have the choice to find out how to protect yourself from the virus by visiting the CDC website.
Access to information is being engineered through moderation and advertising too. Mark Zuckerberg, founder of Facebook, recently announced sweeping measures to fight coronavirus: removing or reducing misinformation through teams of moderators, providing unlimited ad credits to the WHO, and giving millions to other frontline organisations. While moderation hides harmful information from viewers, advertising helps target audiences with the right message at the right time.
WhatsApp, also owned by Facebook, has recently followed suit by setting up a coronavirus information hub providing educators, health workers and governments with information on how to use the platform to connect with users and to fight misinformation. It has also donated $1 million to fact-checking organisations which WhatsApp users can use to verify information on the platform. Additionally, it is disrupting information flows by reducing the number of times a message can be shared.
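The effect of capping forwards can be seen in a back-of-the-envelope calculation. The numbers here – average group size, forward rate – are illustrative assumptions of mine, not WhatsApp’s figures:

```python
def reach(generations, groups_per_sharer, group_size=35,
          forward_prob=0.05):
    """Deterministic reach estimate: each sharer posts to
    `groups_per_sharer` groups of average `group_size`, and a
    fixed fraction of recipients forward the message again."""
    sharers, total = 1.0, 0.0
    for _ in range(generations):
        recipients = sharers * groups_per_sharer * group_size
        total += recipients
        sharers = recipients * forward_prob
    return round(total)

unlimited = reach(4, groups_per_sharer=5)  # share freely to 5 groups
capped = reach(4, groups_per_sharer=1)     # forwarding limited
```

Under these assumptions the capped reach is orders of magnitude smaller after a few generations, though it still grows – consistent with reporting that forwarding limits slow the spread of fake news but don’t stop it (New Scientist, 2019).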
Due to its end-to-end encryption, WhatsApp can’t target users in the same way other social media platforms do. This has forced organisations such as the WHO and UNICEF to warn about WhatsApp misinformation through other social media channels and through the media. A more effective approach would be for WhatsApp to provide users with an in-app ‘information choice’ – warning people to check the facts about coronavirus before sharing. As people spend more time on social media platforms and chat apps, the brokering of what information is seen, and what is not, will become increasingly important during public health emergencies.
Even better, WhatsApp could introduce push notifications in high risk areas – as the current WHO/WhatsApp chat-bot necessitates proactive engagement from the user. This would help reach far more people, much more quickly.
This blog post is techno-optimist in its approach. As much as social media platforms like WhatsApp have fanned the flames of misinformation in the coronavirus pandemic, they have also provided a significant conduit for helpful, and sometimes life-saving, information. Sign-posting and chat-bots are efficient ways to ‘scale’ access to trusted information in an automated way. Before WhatsApp and the WHO had even announced their chat-bot, over half a million people had signed up. Just since yesterday, I’ve seen the link to the chat-bot in three different WhatsApp groups. I hope to see it in many more.
Since January – and the start of the New Media, ICT and Development course – I’ve been coordinating UNICEF’s coronavirus social media response. It’s still very much a whirlwind. So far we’ve flooded our channels with helpful coronavirus content on topics ranging from misinformation correction and stigmatisation to virus prevention, child protection and learning from home. We’ve worked with Facebook to distribute free advertising credit to country offices and helped develop a Facebook hub which directs anyone searching or posting about coronavirus to reliable sources like the WHO and UNICEF websites. We’ve created analytic reports which provide vital insights into online conversations and misinformation on coronavirus across high-risk countries. We’ve filmed viral TikToks, creatively using dance and colourful stickers to convey key coronavirus prevention messages. We’ve created informational videos which have gone viral on WhatsApp and have been used as part of CNN news reports. We’ve worked with YouTube influencer and therapist Kati Morton to talk about dealing with coronavirus-related anxiety. We’ve collaborated with the WHO, IFRC and UNICEF’s C4D and U-Report teams to share information and best practice. We interviewed UNICEF’s coronavirus lead to answer our audience’s pressing questions through Facebook, Twitter and YouTube lives. And most relevant to this blog, we’ve been working with WhatsApp to ensure factual information about coronavirus reaches more young people than ever through the U-Report chat-bot. This is just the start of much more to come.
So it is by no means a surprise that my final blog piece is an exploratory look at the pitfalls and potential of technology and social media in public health emergencies. My decision to focus on WhatsApp was driven by my own professional need to grapple with the implications of the private platform, having seen similar issues with social-media-driven misinformation during the Ebola response in DR Congo. The process of writing the blogs while dealing with a major health emergency helped me pause and reflect on the theory behind the practice. Moreover, this blog has been an important ‘mediated’ way of representing my personal reflections on work to a public audience (Rettberg, 2018).
This is an expansion and elaboration of earlier blogs on the coronavirus infodemic and TikTok, coronavirus and the power of movement. As part of a group blog, we also looked at the impact of coronavirus on this year’s Commission on the Status of Women event. I took a break from writing about coronavirus with one blog exploring diversity in the youth climate movement, and one on practical ways to improve accessibility on social media.
The views in this blog are personal, and do not reflect those of UNICEF.
Know the facts about coronavirus by visiting the WHO website.
Brandwatch. 2020. COVID19 Brandwatch Resource Centre. Retrieved from https://www.brandwatch.com/cv19-resources/.
Christakis, N. 2019. Blueprint: The Evolutionary Origins of a Good Society. Little, Brown Spark. New York.
Golebiewski, M.; boyd, d. 2018. Data Voids: Where Missing Data Can Easily Be Exploited. Data & Society. Retrieved from https://datasociety.net/library/data-voids/.
Kleine, D. 2010. ICT4WHAT?—Using the choice framework to operationalise the capability approach to development, Journal of International Development 22:5, 674–692.
Miller, D.; Costa, E.; Haynes, N. et al. 2016. How the World Changed Social Media. London: UCL Press.
New Scientist. 2019. WhatsApp restrictions slow the spread of fake news – but don’t stop it. Retrieved from https://www.newscientist.com/article/2217937-whatsapp-restrictions-slow-the-spread-of-fake-news-but-dont-stop-it/.
Rettberg, J. 2008. Blogs, Communities and Networks, in Blogging. Cambridge: Polity Press.
Rettberg, J.W. 2018. Self-Representation in Social Media, in: Burgess, J., Marwick, A. & Poell, T. (eds.): SAGE Handbook of Social Media. London: Sage.
Social Sciences in Humanitarian Action. 2020. Key considerations: online information, mis- and disinformation in the context of COVID-19. Retrieved from https://www.socialscienceinaction.org/.
UNICEF Office of Innovation. 2019. U-Report Dossier. Retrieved from https://www.unicef.org/innovation/media/4171/file.