Sunday 10 May 2020

Demystifying data in the social sector


Every good data analyst knows the “garbage in, garbage out” principle: the analysis is only as good as the data behind it. One cannot expect actionable insights from data that has quality issues, integrity issues, or simply does not capture the outcome indicators.
In the social sector, there is no dearth of data. Every activity could be a step towards possible interventions, policy ideas and partnerships. Every day spent on the ground is an experiment in the making, leading the organization to pave new paths to solve a complex problem. When this is the case, how can the organization know if it is onto something great? Through evidence. Always. By recording each activity, by repeating that activity over time, and by capturing it in both quantitative and qualitative terms. This sounds too basic, doesn’t it? Yes, it is. And yet “capturing the activity” is not something everyone gets right; in fact, how you capture the activity is what most non-profits get wrong.

I’ve had the opportunity to ponder some of the challenges with data in the social sector, since I spend ~40% of my day working as a data specialist at a non-profit organization. Here is what I have learned about how non-profits can become more “data-smart”:
1) Understanding data only in the abstract: The phrase “Data is the new oil” is tossed around so often that people feel pressured to believe it. Yet most people don’t relate to data in their everyday lives, so they do not find it relevant. Those who do believe it is important often do so only because of the donor reporting norms in their organization.
2) Data is an afterthought: In most organizations, proposals are the appetizer, programs are the main course, and data and monitoring & evaluation (M&E) are the dessert. For best results, proposals, programs, data and M&E should be baked together: data should inform the proposals, data should inform the program design, and M&E should never be an afterthought.
3) Quick solutions over deliberate decision-making: Changes on the ground may force changes in program strategy, but those decisions should not be made in isolation. Temporary data solutions that cater only to immediate program needs can harm the long-term data strategy. Deliberate planning on how the newly captured data will integrate with the larger program data saves time and effort later.
4) Assuming systems built for pilots will last forever: New programs have different data scoping needs than well-established ones. Pilots are highly volatile and require agile solutions. Organizations should learn from their evolving programs and, in time, set up a more sustainable data system. In short, sustainable data solutions can only be built once the programs themselves become stable.
5) What data architecture?: Most data collection happens via Google Forms or SurveyMonkey, which leaves data dispersed across unmanageable places and structures. Dashboards are built on this scattered pile, more data keeps getting added to the sheets, and one day the delicately balanced data system collapses. A robust data architecture comes from planned data engineering: identifying data sources, stringing the data together, and building a structure that allows for visualization (a rough sketch of what this can look like follows this list). Visualization should be thought about only after the data engineering is well in hand.
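
To make that data engineering step a little more concrete, here is a minimal sketch in Python with pandas. The file names and column names are purely illustrative assumptions, not taken from any real system: the idea is simply to identify the sources, string them together into one table, and leave a single structure that a dashboard can read.

# A minimal sketch, assuming hypothetical CSV exports named
# "google_form_responses.csv" and "surveymonkey_responses.csv".
import pandas as pd

# Identify the data sources: one export from each collection tool.
forms = pd.read_csv("google_form_responses.csv")
survey = pd.read_csv("surveymonkey_responses.csv")

# String the data together: align the (assumed) date columns and stack
# the records, tagging each row with its source for traceability.
forms = forms.rename(columns={"Timestamp": "collected_on"})
survey = survey.rename(columns={"Date": "collected_on"})
forms["source"] = "google_forms"
survey["source"] = "surveymonkey"
combined = pd.concat([forms, survey], ignore_index=True)

# Build a structure that allows for visualization: one tidy table,
# kept in a single agreed-upon place, that dashboards can draw from.
combined.to_csv("program_activity_master.csv", index=False)

The point of the sketch is the order of operations: the sources are consolidated into one structure first, and only then does visualization enter the picture.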

Data architecture is the first step towards refined impact measurement. Baking data strategy into organizational strategy is familiar to most organizations in the private sector, but it is yet to become the norm in the non-profit world. With most activities now moving online due to the pandemic, we have more data at our disposal than ever. Using this opportunity to make the shift towards becoming more “data-smart” should be the way forward.
