Q&A: Aaron Martin on Coming Home to Virginia, Data Governance, and More

January 16, 2024
Aaron Martin is an assistant professor of data science and media studies at the University of Virginia (photo by Cody Huff).

Aaron Martin joined the faculty of the University of Virginia in fall 2023, with a joint appointment in the School of Data Science and the Department of Media Studies.

It was, in effect, a homecoming for the Virginia native after many years in Europe. Martin spoke recently about what brought him back to his home state, his research interests, and what he’ll be monitoring in the critical area of data governance in the year ahead.  

Q. What drew you to UVA and the School of Data Science?

A. My reasons are both personal and professional. I grew up in Virginia. I lived in Bedford County during my formative years, and growing up in Appalachia meant that UVA was always a place I aspired to be a part of. It represented academic excellence.

The fact that UVA is a public university, and a fine one at that, was also very enticing. I spent the better part of the past 20 years in Europe, where public universities are the norm and public institutions are revered, and I wanted to reconnect with that here.

With respect to the School of Data Science, I was very intrigued by the fact that it explicitly commits to incorporating societal values in its model and in its approach to building a data science curriculum.

Q. Let’s turn to one of your research interests: data and artificial intelligence governance. For those who haven’t followed this issue closely, what can you tell them about where things stand?

A. It’s certainly been an onslaught lately. People are increasingly concerned about the unintended consequences of what we call “datafication,” as well as automation, in their lives. These concerns are wide-ranging. They include the tangible harms that communities, and vulnerable people in particular, experience with respect to, for example, the use of data and AI in medical decisions, when applying for housing or a job, or when trying to access food in humanitarian crises.

That is a key focus: the ways in which AI may get decisions wrong or may be biased in different ways that result in real harm today. 

There’s also been another set of concerns, longer-term ones, about so-called existential risks that relate to AI’s supposed increasing capacity to threaten the human race. This discourse has emerged primarily in Silicon Valley and among an elite group of stakeholders, who probably have a vested interest in that framing. But I think it’s important for us, as academics, as scholars, to appreciate the differences in these perspectives and how the priorities that they point to might be reflected in emerging regulation. 

Another area that is very topical is the intersection of copyright and intellectual property protections and the development of AI, in particular generative AI.

The New York Times just filed a lawsuit against OpenAI claiming rampant infringement of its copyrights in the training data used to build ChatGPT. So, we should keep an eye on what the courts decide. And there’s a lot at stake: billion-dollar industries will either prosper or be throttled based on these court decisions. 

Finally, it will be very important to monitor geopolitical dynamics around data and AI. We often think about these issues as domestic problems, but AI is certainly a global technology, and countries like China are pursuing the development of AI and regulations for AI in parallel to the European and American approaches.

Q. Data governance is a rapidly evolving issue. What else will you be watching for over the next year?

A. Top of mind is the European Union’s AI Act, which is very quickly being finalized. This year, the final details will be ironed out, and the Act will become law in the EU in the coming months.

So, I’ll certainly be monitoring what the final text covers and certain of its clauses: for example, how it regulates generative AI; how risks are conceived; which technologies are deemed especially risky by the European legislature; and then, beyond the EU context, how the rest of the world reacts to European developments.

What we’ve seen in the past is that Europe is very much a standard-setter in the development of rules for the digital economy. It’s expected that the rest of the world will follow the European approach to AI regulation (the so-called Brussels effect). But maybe not, right? So, I will certainly be monitoring how the rest of the world approaches the regulation of these new technologies.

Q. You’ve also done a lot of work examining how historically marginalized communities, including refugees, understand technology. What are some of the biggest challenges policymakers and others face in this area?

A. This is my favorite body of work that I’m involved in. I’ve been working with the UN Refugee Agency since 2018 on different projects.

We were initially looking at how to expand access to digital connectivity among refugee populations in different parts of the world, and we examined the role that different laws and regulations play in creating barriers to access. We did a scan of a large number of refugee-hosting countries to understand which regulations were favorable to connectivity for refugees and which were less conducive, and we worked with governments to try to improve the policy frameworks for access by these communities. That’s been super rewarding because you can see the practical impacts of the research.

In terms of the challenges facing policymakers in this area, I think one of the main challenges is that we often see only the negatives of technology use, so we focus on misinformation, identity theft, and other harms. But in fact, when you talk to refugees about what matters to them, you hear that they are really desperate for reliable, legal, affordable access to the internet and various digital services, in particular because it facilitates things like gig work and other economic opportunities. In other words, access is still a major policy issue.

Q. What are some research areas you’re excited to explore in the next year or two?

A. The first area is a project that we’re kicking off with the Bosch Foundation. It’s a digital identity project with a specific focus on migration. The European Union has passed new legislation that essentially requires all member states to offer a digital wallet to their citizens, and it hasn’t thought through the implications for migrants.

We are researching what these design requirements mean for broader populations in the EU context, but, of course, we will also try to feed into policymaking across the humanitarian sector as well, which has its own systems and policies for digital identity.  

My research these days is largely in the area of digital identity. And what I’ve discovered is, with the explosion of AI, there is this increased anxiety about verifying that the entities that we are interacting with online are actually humans when they claim to be. I’m interested in this intersection of artificial intelligence and digital identity and what the popularization of AI means for our notions of personhood and questions of digital identity online. 

I also think there’s a lot more to research on technology and sovereignty. How will digital technology, and in particular AI and machine learning, reshape notions of sovereignty? I’m trying to write at least part of my book on this subject, and that should hopefully be ready for 2025. 

Q. Finally, can you share something about yourself that people might not know?

A. My mother was a huge inspiration for my becoming an educator. She lives in Bedford County and still teaches in the area. I spent an extended period outside of academia after finishing my Ph.D. I worked in the intergovernmental and private sectors, including at a large multinational bank. And I want to give credit to my mom for inspiring me to return to academia and to pursue a career in education.

To hear more from Aaron Martin on the digital risks that refugees and host communities face, listen to a recent podcast he recorded on this topic with the International Labour Organization.
