What Does “Impact” Really Mean Beyond the Data? A Student Explores the Project Behind the Numbers
What does “impact” really mean — and how much of it can data truly capture? At the University of Virginia School of Data Science, students don’t just learn how to analyze data; they apply it to real-world challenges alongside faculty and community partners.
Through a collaboration with the Center for Nonprofit Excellence (CNE), M.S. in Data Science student Yani Iben worked with assistant professor Terence Johnson and Rebecca Schmidt, director of impact and applied learning, to help produce CNE’s 2026 Economic Impact Report. The project centered on analyzing IRS 990 data to quantify the role nonprofits play across Virginia. But when that work extended beyond the dataset and into Virginia’s General Assembly, it revealed a more complex story about what “impact” truly means. Read on to learn how this experience reshaped the student’s understanding of data, advocacy, and impact.
As a student in the M.S. in Data Science residential program at UVA, I spend a lot of my time focusing on how to quantify real-world phenomena and make sense of messy data. My interest in data science began with a desire to use data to uncover underlying patterns. For example, how can we understand the extent to which various factors drive diabetes readmissions, or the statistically meaningful commonalities among customer segments?
Similarly, while I was working alongside assistant professor Terry Johnson and Rebecca Schmidt on the 2026 Economic Impact Report, my world was defined by numbers. We were deep in the messiness of IRS 990 forms, synthesizing data to quantify the economic footprint of nonprofits across Virginia. We were looking for the hard facts: job creation numbers, healthcare support statistics, and the sheer financial weight these organizations carry.
This semester, I brought this research context with me to Virginia’s General Assembly for CNE’s Nonprofit Advocacy Day, where our report served as evidence grounding conversations with state stakeholders and lawmakers. Stepping into the General Assembly complicated my perspective; it showed me the friction between statistical “impact” and lived reality. This experience highlighted how much of the nonprofit community’s ripple effect remains unquantifiable.
Meeting the CNE staff and other nonprofit leaders at the General Assembly gave the data a human context. In a dataset, “education” is just a category or a string of characters. After talking to leaders who spend their days coaching and mentoring Charlottesville youth, those data points carried actual weight. While working on the report helped quantify the capacity of these organizations to provide jobs and healthcare, the personal impact of their work is immeasurable.
One of the most grounding moments happened while I was sitting on the General Assembly balcony. Next to me was a group of nurses from UVA Health, who were there specifically to advocate for maternal health. Their presence was a quiet reminder that behind every policy debate is a constituency of real people.
In data science, there is often an assumption that if something is important, it can be measured. But sitting there, I realized how much of a nonprofit's "ripple effect" isn't explicitly captured in a tax filing. You can quantify the number of prenatal visits a grant covers, but you can’t easily quantify the shift in a family’s long-term stability or the sense of security a mother feels.
Rebecca and I wrapped up the day by grabbing lunch with nonprofit leaders in the General Assembly cafeteria. As we spoke with leaders from various nonprofits, I realized how deeply interconnected the nonprofit sector is. This experience was a reminder that no single nonprofit exists in a silo; instead, there is a significant overlap among initiatives that work together to address the various needs of their communities.
The "impact" we highlight in reports is often just the measurable part of a much larger picture. The real work involves a lot of context that doesn’t fit into a cell on a CSV but shows up in the room when people advocate for their communities. It was a reminder that while our analysis of IRS 990s provided the evidence base, the true value of these nonprofits lies in human connections that data can only ever approximate.