“We are more than just users or customers of these services. We are generators of data – they need us to like them for them to be indispensable” – Ali Rae, Multimedia Journalist and Filmmaker at Al Jazeera
The awareness of new technology is increasing as our everyday lives become more and more centred around digital solutions. The rules and ethics we play by in face-to-face interactions seem to be slowly catching up with the online space, as discussions about online behaviour, fact versus fiction and privacy fill our feeds, our learning curricula and government halls. I confess I am no expert in these areas. I tried to apply the rule “if I wouldn’t share it with an acquaintance, why share it online”, overlooking the fact that I would have to use my Facebook account to log in on other platforms. I quickly came to realise that the strangers outside my circle of acquaintances were not individuals, as in daily life, but companies. What does it mean when a platform asks for my personal details? What does it mean when an app asks to see my location? Why does every platform collect “cookies”, and why do they ask for my agreement beforehand? An agreement implies that there is something of mine that you want access to, but to understand exactly what, I would need to read through lengthy pages of legal text – it kind of kills the buzz when you’re in a rush to look up that cake recipe.
In times of digital advancement, data is what drives innovation and new technology forward. My individual data might not mean much on its own, but collectively, a data set covering hundreds of thousands or millions of people can help identify behavioural patterns and needs. Companies can use this data to improve their services and to develop something that meets our needs, but also to categorise people under certain criteria. Data in itself is neutral; how we interpret it, and what we use it for, can turn out for good or bad. When we provide our data, companies have to state what they will use it for (this is the case in the EU, but not everywhere in the world has this transparency), but they don’t have to explain how it will be interpreted. Simply stating “we will use it to improve our services” does not reveal whether the formula or algorithm used to improve those services harbours any biases.
“We are often told that data are the new oil. But unlike oil, data are not a substance found in nature. It must be appropriated.” – Nick Couldry
As a new enthusiast of many things development and digital, I recently came across the notion of ‘data colonialism’, used to describe the situation many find themselves in today. “The capture and processing of social data unfold through a process we call data relations, which ensures the “natural” conversion of daily life into a data stream. The result is nothing less than a new social order, based on continuous tracking, and offering unprecedented new opportunities for social discrimination and behavioural influence,” Nick Couldry and Ulises A. Mejias explain in their publication. They draw the reference to colonialism not from the violence and brutality of slavery, but from the way land was conquered in a quest to rule territory and extract resources. In their view, the same logic applies to the way big corporations move in the online space, on a quest to extract data and to influence and appropriate behaviour.
Online spaces have long been under-regulated, and in the last couple of years many governments around the world have been trying to find ways to regulate them. But just as with oil, many find it a complex issue: how does one regulate for the right to individual privacy and, at the same time, for the future opportunities of technology? Al Jazeera’s mini-doc series ‘All Hail the Algorithm’ has looked into this specific situation and the many layers of the discussion. In one of its episodes, it explores the situation in Kenya, where big tech companies have had the upper hand in the chase for connectivity in exchange for data. It looks at how some big tech companies provide cheap solutions in exchange for data: they manufacture cheap mobile phones and provide connectivity to a range of well-selected, data-driven apps that are mainly their own – the more you use them, the more data you produce. And should you want access to a wider range of applications and information, you have to pay more. In Kenya’s case, and in many other countries, this is problematic: the big tech companies are often foreign companies, and they answer neither to the government nor to the people who use their services.
“… what is mostly interesting in what I’ll call ‘techno-politics’ is the rush to ‘connect the unconnected’ and the rush to retain the connected in very specific platforms. A lot of these actors will do anything and everything to make sure at some point or others these users go through their platforms because it is all about the data,” says Nanjira Sambuli, Digital Rights Advocate at the World Wide Web Foundation, interviewed in the mini-doc series.
In many countries around the world, the question remains open as big tech continues to develop services and apps aimed at our everyday needs. I believe it needs to become a more frequent exercise to ask ourselves: on which terms are we connecting? To what extent is big tech the driver behind establishing digital social orders, and how influenced are we by those tweaks? What standards of accountability are there, and where do we draw the line in the chase? Just as many may feel the collective pressure to connect online, the online platforms are nothing without our collective connections.
🤔 Give your two cents. Do you do anything to stay aware of what data is being collected about you? How tech-positive, tech-critical and/or tech-aware are you? Share your situation below and let us know what policy advancements exist in your country.