While data science may be a relatively new academic discipline, Mar Hicks believes that learning lessons from the past can be a critical part of this emerging field’s future. An author and historian, Hicks spoke recently about coming to the University of Virginia’s School of Data Science, the importance of history to understanding rapid technological developments, and more.
Q. When did you first learn about UVA’s School of Data Science, and what about it made you decide this was the right fit for you?
I found out about the School of Data Science through folks at UVA who reached out and talked to me about it. And honestly, it was hearing from those folks about what the new School of Data Science hoped to do that convinced me it would be a really interesting place to pursue my work in an interdisciplinary school.
Q. Your book “Programmed Inequality” looks at how the British fell behind in computing after pushing women computer workers to the side. Talk about some of the lessons tech leaders and other countries could glean from this story.
One important lesson is that personpower is a key element in all technological systems, and it’s often overlooked or misunderstood to the detriment of whole countries and societies. We see that every time new automating technologies come onto the scene. We’re seeing some of it in the turmoil surrounding generative AI and large language models – for instance, automatic text generators that are being marketed as a panacea for writing and for programming even as they’re shown to be inaccurate and biased, to rely on people’s copyrighted work without permission, and to be so energy- and water-hungry that even casual use may be significantly environmentally destructive.
And yet, there are CEOs who are going all in on the idea that these technologies can be the answer and replace workers. And that’s an idea that isn’t actually about efficiency; it’s about maintaining a certain kind of power, the power of certain people over everyone else and everything else. So, one transferable lesson from the history I’ve written is that when people put too much faith in technology, it doesn’t end well for pretty much anyone involved, other than maybe the few people at the top who stand to make a fast fortune from marketing an untested technology as a catch-all solution and as a replacement for workers.
Q. You’re a historian of technology, gender, and computing – discuss how data science informs your work and research.
This question does sort of imply that my work is outside of data science – that real data science is only statistical or quantitative or has to do with machine learning. And I think that’s because we understood it that way for a long time. I hope that my work isn’t seen as an add-on or as something outside real data science that can only be influenced by it. I hope that my work is seen as a fundamental part of building this academic discipline, because this discipline is very quickly being positioned to do many and varied things – to take on problems and research questions that come from across the whole of academia and society. And I think any discipline that wishes to be that expansive has got to include within it researchers who focus on historical and social context.
Q. With the rapid emergence of AI and amid a constantly changing technological landscape, why do you think understanding history is so important to engaging in current debates?
I think one reason that, oftentimes, people don’t really like history is because it can be a real bummer. Horrible things have happened and been perpetrated in the past and into the present. But the idea of studying the mistakes of the past is to try to not continue to replicate them in the present and in the future. Unfortunately, one thing that we see over and over again with most technologies and their creators – not all, but most – is that they’re very committed to an ahistorical framework. These technological leaders don’t really want to hear anything about history because it’s easier to do what they want if they ignore what’s come before.
From another angle, when ordinary people and users aren’t well informed about history, the marketing that comes along with new technologies – pushing them as the next great thing, almost like a secular savior – makes those people more likely to uncritically adopt, accept, and use technologies that aren’t really fit for purpose. And this is especially true of AI throughout its pretty long history. In the process, what we see is that very flawed and problematic technologies can take hold. So, what I would say is that history leads to a greater awareness of this cycle and its harms, but then it’s up to people to decide what to do with that knowledge.
Q. You’ve mentioned you had a rabbit that appeared in Time magazine. What is the story behind that?
At the beginning of our still ongoing COVID-19 pandemic, Time magazine did a fluffy story – pun intended – about how some people who were able to work from home were getting along with their animal “co-workers” and how their pets felt about their people being home more than usual. And so, the reporter asked me how my rabbit was responding to my being home. I was teaching online that year, and my rabbit was pretty pleased because he generally likes a lot of attention.