Rise of platforms for data collection & analysis: Making the right choice
After blogging about the broad ideas around big data for development over the last few weeks, it’s now time to look at some of the nitty-gritty details of data platforms on the rise, and at the realities of how organizations are transitioning into this ‘data-driven’ era.
How can organizations choose the right data-gathering and analysis tool (if there is such a thing)? What are the trends in practice, and what has been learned from experience so far? These were some of the topics discussed at the NetHope ICT4D webinar on October 16th – let’s jump right in!
Following on from Emanuel’s previous coverage of this event, we begin with Natasha Beale (Program Manager, Center for Effective Global Action, University of California, Berkeley), who offers tips on how colleagues, partners and end users can be trained in new data platforms and methods without getting bogged down in technical jargon. Here are Natasha’s top five tips:
1) Allot enough time for training. Never assume that a one-day seminar will be enough to master a new tool – try to put aside a couple of days and cater to different learning styles to reach your audience effectively.
2) Be ready to spend on training. There’s not much point in budgeting for tools but not for the training to implement them successfully. Keep groups small, too: no more than 15 people in a session.
3) Use the local language. Prepare properly by training trainers and preparing devices in the local language. Translating in real time is a bad idea!
4) Ask lots of specific questions. As a rule of thumb, assume that nobody knows anything, and have participants demonstrate their learning by doing.
5) Get your processes in place. Ensure that supervisors are trained to handle problems that come up and understand the escalation protocol when technical issues arise.
It’s hard to argue with Natasha’s tips, which encourage us to localize and focus on the human elements of training.
On a topic that is perhaps the most topical and worrisome all at once, Priscilla Chomba-Kinywa (Global Tech & Innovation Advisor, DanChurchAid / Folkekirkens Nødhjælp) discusses the key considerations for securing privacy and compliance, and perhaps leaves us with more food for thought than answers.
Firstly, Priscilla advises us to cover our bases when using mobile data collection tools by ensuring there is a reliable backup and by checking the quality of encryption. Could someone crack the code with a simple Google search? If so, the encryption is clearly not secure; to this end, Magpi is suggested as a tool that encrypts data well.

Echoing the concerns of academics, Priscilla outlines the challenge of helping international partners understand the importance of data privacy, especially since many lower- and middle-income societies do not yet have legal data protection and privacy frameworks (Taylor & Schroeder, 2014; Spratt & Baker, 2015). This is crucial in settings where so much sensitive data is being collected, for example in volatile situations or refugee camps. Priscilla asks us to question how we can collect less personal data, how we retain data, and how we can anonymize it better – and confesses that this keeps her up at night.
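To make the “collect less, anonymize better” idea concrete, here is a minimal Python sketch. It is purely illustrative – the field names, the key handling and the specific techniques are my assumptions, not anything prescribed in the webinar. It drops fields that are not needed for analysis (data minimization) and replaces a direct identifier with a keyed hash (pseudonymization), so records can still be linked across surveys without the name itself leaving the device:

```python
import hashlib
import hmac

# Hypothetical secret key; in practice this would live in a secure store,
# held separately from the dataset, and never be hard-coded.
SECRET_KEY = b"example-only-key"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).

    The same person always maps to the same token, so follow-up surveys
    can be linked, but the raw name is never stored.
    """
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()[:16]

def minimize(record: dict) -> dict:
    """Keep only the fields needed for analysis, pseudonymizing the rest."""
    return {
        "respondent_id": pseudonymize(record["name"]),
        "district": record["district"],  # coarse location only, no GPS point
        "response": record["response"],
    }

raw = {"name": "Jane Doe", "phone": "+000-00-000-0000",
       "district": "Lusaka", "response": "yes"}
safe = minimize(raw)
```

Note that pseudonymized data is not fully anonymous – anyone holding the key can re-link records – which is exactly the kind of trade-off Priscilla is asking us to weigh.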
Questions of power and consent when sharing data come into play when aid is given but personal information is required first. What does consent mean in different contexts, and how can we be sensitive to this? Policy issues for data rights are not clear cut, and some argue that ‘while people have the right to privacy, they also have a right to food and health’ (Spratt & Baker, 2015, p. 29). This example illustrates the reciprocal effects of what can be considered the positive and negative values of big data in practice and, as Priscilla discussed, the importance of being aware of these, especially when there is an imbalance of power.
Priscilla leaves us with some final tips:
1. Try to standardize your data collection tools in order to reduce the burden of data collection and to maximize security, but remember that no one data collection tool works in every context.
2. Don’t forget paper! It’s not either/or: always keep pen and paper on hand in addition to your digital data collection tools/platform.
Our webinar ends with some words from Bryan Sobel, Technical Advisor for Catholic Relief Services, who shares his experience of keeping data platforms as decentralized as possible while retaining a key point of centralization for analysis. Leaning as far as possible towards decentralization, Catholic Relief Services has a policy that higher levels do not get involved in a task that can be done by people on the ground. To maintain some data standardization, the organization uses what it calls a ‘minimum standard of the data model’, which in essence ensures that surveys across all projects share a set of core questions that do not change, plus a customizable section that makes each survey unique and relevant. Sending data back to headquarters for analysis and more nuanced understanding is key, as those in the field do not always have the time or opportunity to step back and do this. Given that the organization follows hierarchical structures, decentralizing data as much as possible is a step towards participation and development ownership.
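The ‘minimum standard of the data model’ idea can be sketched in a few lines of Python. This is my own illustration of the concept, not Catholic Relief Services’ actual implementation – the question texts, IDs and structure are all invented. Every project’s survey starts from the same fixed core block and appends its own project-specific section:

```python
# Shared core block: identical across all projects, so headquarters can
# aggregate and compare answers to these questions organization-wide.
CORE_QUESTIONS = [
    {"id": "core_01", "text": "Household size?", "type": "integer"},
    {"id": "core_02", "text": "District?", "type": "text"},
]

def build_survey(project_name: str, custom_questions: list) -> dict:
    """Combine the unchanging core block with a project's custom section."""
    return {
        "project": project_name,
        "questions": CORE_QUESTIONS + custom_questions,
    }

# A hypothetical project adds only the questions unique to its context.
water_survey = build_survey("water-access-2018", [
    {"id": "wa_01",
     "text": "Distance to nearest water point (km)?",
     "type": "decimal"},
])
```

Because the core question IDs are identical everywhere, centralized analysis at headquarters can join data across projects on those fields, while each field team keeps full ownership of its custom section.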
For me, this webinar was a fairly good example of communities of practice discussing ICT4D as part of the larger puzzle of development (Walsham, 2017) while highlighting the current tensions within data for development. I hope these tips have been as informative for you as they have been for this researcher, who loves getting the down-low on real life examples in (almost) real time.
Stay tuned for part 3 of this series where Emanuel and I cover the top data for development trends discussed in this webinar!