It happened in a matter of weeks. All of a sudden, the world faces a challenge that affects all of us in several ways. From an ICT perspective, the virus spreading across the world also has interesting consequences for the tech world. People are increasingly using digital platforms to adjust to quarantine life and social distancing: ordering food, medicines and other necessities, entertaining themselves, working and interacting with other people. We are consuming information and news online like never before; every minute there is a new update on the global pandemic. Just imagine: what would we do without the Internet these days? In this final blog post, I will revisit some of the themes from earlier texts, reflecting on the current global situation. I will also look into language as a point of departure for ICT4D-related issues.
The different faces of Big Data
On this blog I have presented a few of the different faces of technology, finding both positive and negative aspects of these digital tools. I have gained a deeper understanding of the potential power that programming, coding and platforms like social media still hold for development and social change. Simultaneously, technology is being used to counteract civil movements and journalists in their search for justice and truth, and it also presents major challenges regarding discrimination and bias.
Technology is ever developing, constantly taking on new contexts, issues and challenges. Today we find ourselves in new territory. With more than a billion people forced to stay inside, the Big Data mining business must be booming. This is alarming, remembering what can be done with Big Data if it lands in the wrong hands. As with the dark side of social media that we discussed in earlier blog posts, the power of Big Data is undeniable. Once collected and analyzed, it can be used with malicious motives to manipulate, to control and to change behaviors. But like most things, it has two sides. Big Data can also be used to fight the pandemic (for good or bad); one example of tech in the fight against Corona is mentioned in Dovile’s post on AI and the corona virus. It could help predict the spread of the virus and find commonalities that reveal important new information. But it is not always without sacrifice.
Big Data to save lives – worth violating civil rights?
According to the news, the EU Commission now wants European cellphone companies to extract anonymized cellphone data with the objective of tracking and controlling the spread of the virus. Some EU countries are already using this data to track movement patterns and to make sure that people are staying in quarantine. This presents a dilemma: protecting people from getting sick and ultimately saving their lives on one hand, and utilizing private data – consequently sacrificing civil rights to control the population – on the other. Drones are flying over European cities, surveilling people from above. All of this has already been put into action in many of our neighboring countries (not just China). These breaches of privacy are seen as temporary measures, applied in a time of crisis to help and protect citizens, but what happens after the crisis is over? It seems to me that we are quickly entering a new reality, where fear might make us sacrifice our rights and accept a new level of governmental control that could later become normalized. What will the consequences be in the long run? Can we be sure that we will regain our freedom and our human rights? If this is happening in countries like Norway, what will happen to more fragile democracies?
In the words of Milan and Treré:
“In sum, what happens to those individuals and communities ‘at the bottom of the data pyramid’ (Arora 2016), be it for class, racial, legal, or sociocultural reasons?” (2019, p. 321)
Will certain governments exploit this new power to control their citizens for reasons other than the pandemic? Will this window be used to counteract human rights defenders? Dr. Linnet Taylor describes the problems of Big Data as follows: “These granular data sources which allow researchers to infer people’s movements, activities and behavior have ethical, political and practical implications for the way people are seen and treated by the state and by the private sector.” (2017, p. 2) Taylor continues with the observation that data-related discrimination in its different forms is advancing as fast as Big Data itself, while awareness and the mechanisms to combat these issues unfortunately are not. In her article, she asks: what are data justice and digital rights? She argues that visibility, engagement with technology and non-discrimination are the three pillars for reaching data justice (2017, p. 9).
Privilege and power structures in the tech world
Concerning the consequences that Big Data presents to the individual and to civil society, there seem to be two major issues (out of many). One is the topic of control and surveillance, as discussed briefly above (and mentioned in our previous posts). The other is “how mechanized processing of data amplifies existing forms of discrimination” (2019, p. 4). This is about the power dynamics within these data processes. White masculinity is overwhelmingly the normative core of the tech industry; it is ultimately this group’s reality and perspective that will be reproduced. Creating diversity within the tech industry is vital to counteract the discrimination that is being reproduced in several ways through vessels like applications, social media platforms, AI, face scans and so on. Technology is not neutral; it reflects the values of its creators.
This applies not only to the male gaze, but to the Western or Eurocentric gaze as well. Unfortunately, on many occasions we Westerners forget to be reflexive – to critically see ourselves from the outside and acknowledge the specifics of our situation, our culture and its values, and the prejudices and stereotypes we carry. We are often blind to the fact that certain aspects of our reality are not universal, that our knowledge is not neutral and that we are not as objective as we might believe.
Milan and Treré discuss the popular view of technology and Big Data: that they are innovative wonders, brimming with some sort of objective knowledge and endless possibilities. They instead describe data and tech as a “mythology” and an “ideology” (2019, p. 322). But the tide is turning. What was once surrounded by tech optimism is now often the target of critique. It is clear that we should continue to critically interrogate and expose the different aspects of Big Data which, although praised through modernization narratives, are definitely not without risks and threats.
Social distancing and critical tech glasses
Greater knowledge leads to new reflections and greater awareness. The process of creating and maintaining this blog together with my teammates has been very rewarding. Investigating ICT topics has definitely opened my eyes to issues within the digital world to which I had not dedicated much thought before. One example of wearing my new critical tech glasses came when I started to learn French. In these days of social distancing, I decided to download a language application. The more I used it, the more I reacted to the example sentences that appeared on my screen. Sentences like “my wife cooks”, “the man knows”, “the boy is strong”, “the girl is beautiful” and “we respect his wife” (to mention a few) evoked a concern: I started connecting the content of the application to the bias and discrimination that manifests itself in different forms throughout the tech world. Could this be a display of stereotyping and sexism in data, from a linguistic point of view? As an application with the objective of teaching a language, it is a clear example of presenting and reproducing a certain discourse. I started doing some research on the history and content of the application, called Duolingo.
The creator of Duolingo is a Guatemalan man, Luis von Ahn, who saw the need for a platform for learning languages in a context where conventional (and expensive) language courses were economically out of reach for many. His team for the project consisted of seven men and one woman. The application does practice diversity and inclusion in other forms, for example by being free of charge and by including smaller endangered languages like Navajo and K’iche’. But the gender perspective seems to be something of a blind spot – and how about including an LGBTQ perspective? The application has 300 million registered users across the world, influencing the way people use languages and ultimately consume values through those languages. The program asks me to repeat the sentences out loud, to embody and express these underlying values. In many cases it seems to reproduce a heteronormative and sexist narrative. To avoid this type of content it might be a good idea to add a gender analysis, or simply to examine critically what discourse the app is reproducing to its students. Could it be that the overrepresentation of “white guys in programming”, coding and tech in general is influencing this program? Or maybe it is all just coincidence.
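One could imagine making this kind of critical examination slightly more systematic: counting how often gendered subjects co-occur with stereotyped roles in a sentence corpus. The sketch below is purely illustrative – the sentences, word lists and category names are my own assumptions for the sake of the example, not Duolingo’s actual data or any established auditing method.

```python
# A minimal, hypothetical sketch of auditing a set of course sentences
# for stereotyped gender-role pairings. All sentences and word lists
# here are illustrative assumptions, not real app data.

SENTENCES = [
    "my wife cooks",
    "the man knows",
    "the boy is strong",
    "the girl is beautiful",
    "we respect his wife",
    "the woman writes code",
]

FEMALE_WORDS = {"wife", "girl", "woman", "she"}
MALE_WORDS = {"man", "boy", "husband", "he"}

# Trait words we (hypothetically) treat as stereotype markers.
DOMESTIC_OR_APPEARANCE = {"cooks", "beautiful", "cleans", "pretty"}
AGENCY_OR_STRENGTH = {"knows", "strong", "leads", "code"}

def tally(sentences):
    """Count how often each gendered subject co-occurs with each trait group."""
    counts = {
        ("female", "domestic/appearance"): 0,
        ("female", "agency/strength"): 0,
        ("male", "domestic/appearance"): 0,
        ("male", "agency/strength"): 0,
    }
    for sentence in sentences:
        words = set(sentence.lower().split())
        genders = []
        if words & FEMALE_WORDS:
            genders.append("female")
        if words & MALE_WORDS:
            genders.append("male")
        for gender in genders:
            if words & DOMESTIC_OR_APPEARANCE:
                counts[(gender, "domestic/appearance")] += 1
            if words & AGENCY_OR_STRENGTH:
                counts[(gender, "agency/strength")] += 1
    return counts

if __name__ == "__main__":
    for (gender, trait), n in tally(SENTENCES).items():
        print(f"{gender:6} + {trait:20}: {n}")
```

A skewed table of counts (female subjects appearing mostly with domestic or appearance words, male subjects with agency words) would not prove intent, but it would make the discourse a course is reproducing visible and countable rather than anecdotal.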
Language as a way of resistance
Speaking of languages, there is a particular project related to data and development I would like to highlight.
Today there are many digital initiatives to save and spread knowledge of languages and cultures that, because of colonial history, have long been marginalized. Milan and Treré touch upon these issues in their investigation of Big Data from the South.
“Western modernity has supplied the knowledge underlying colonialism, and later global capitalism: both historical processes have marginalized and devalued the knowledge as well as the specific ways of knowing of the Global South.” (2019 p.321)
This continues to be true in a tech industry that steadily steers development and its dynamics in the direction of Western concerns. During the project of evangelization and Hispanization that was meant to make the native population part of the Mexican nation, Spanish was the official language, and governments denied the native languages the status of valid languages. Today, the Mexican government recognizes 68 national languages. One initiative to lift native languages in Mexico is called 68 voces (68 voices), a non-profit project that produced a series of short stories representing the different groups and languages of Mexico. Mexican illustrators then turned the stories into short animated films, all under the premise “No one can love what they don’t know”. The objective is to show new narratives and representations of a pluri-national Mexico: taking a step away from the stereotypical portrayal of Mexican people, giving space for reinterpretation and empowering the indigenous communities that have been discriminated against for centuries.
Documenting data, sharing knowledge and changing the narrative are all part of the post-colonial project for social justice, taking back what has been lost during years of repression. Racism and classism still mark the context of the country, but through projects like these, the narrative is slowly changing. This could be an example of data from the margins, as mentioned by Milan and Treré, where data is utilized for resistance and creativity. Another example could be the Arab Digital Expression Foundation, where young people have the possibility to express themselves through tech in a context of repression. Digital expression played a central role in the Egyptian revolution. The world needs more alternative data, created in the Global South.
Writing, reading and commenting on blogs and social media is still an important part of how international development is discussed on the Internet. It continues to be a vibrant space for discussion, for getting the latest news and for sharing different perspectives. For me, the assignment of creating a blog has been an educative way to combine theory and practical skills. We have practiced communication and teamwork within the group, inspiring and supporting each other. We overcame technical issues and put time and effort into aesthetics and design to make the blog as accessible as possible for our readers. I have gained new insights into social media strategy through the simple practice of trying – sometimes failing, sometimes succeeding – in the challenge of reaching readers and involving them on our different platforms. I have certainly gained knowledge about the effects of our digital world from new perspectives.
Technology is part of us as humans. It is part of our lives and, maybe more importantly, it is a product of us, our ways and our values. Just as in everyday life, we will meet inequalities, injustice and discrimination. We will also encounter initiatives for change and development, and creative ways of using these new tools for good. There is still much need to examine how we can improve our relation with the Internet, making it a more inclusive and equal world. Big Data and digital tools need to be monitored so they are not easily utilized for criminal purposes. It is a new, fast-moving world in which we need to keep up. I believe it is important to look beyond the wonders of technology and apply a critical perspective in order to improve the way we interact with and manage this new world of tech, AI and Big Data. It is here to stay, and we need to learn how to live with it in the best possible way. It will be interesting to see how our digital world develops. Hopefully we will be able to mend the digital divides that also exist in the real world. The corona crisis is a huge challenge, also in the digital world, but it is also a time when tech can show its potential. Hopefully digital technologies can prove their importance in a situation like this, and truly make a difference for people across intersectional and geographical borders.
Beraldo, D. & Milan, S. (2019): From data politics to the contentious politics of data, Big Data & Society, first published November 2019.

Milan, S. & Treré, E. (2019): Big Data from the South(s): Beyond Data Universalism, Television & New Media, April.

Couldry, N. & Mejias, U. (2018): Data Colonialism: Rethinking Big Data’s Relation to the Contemporary Subject, Television & New Media.

Taylor, L. (2017): What is data justice? The case for connecting digital rights and freedoms on the global level, Big Data & Society, 4:2.