Exploring Big Data Through a Humanities Lens
Research & Inquiry
Published April 25, 2018
When he needs inspiration for his new research project on big data, Associate Professor of Government Brent Durbin says he can simply turn to recent headlines about the unforeseen uses of Facebook algorithms, including efforts to sway the U.S. presidential election.
Durbin, who teaches courses in U.S. foreign policy and strategic intelligence, has long been interested in the science of forecasting—how different types of organizations make decisions about the future. Now, with support from a prestigious New Directions Fellowship from The Andrew W. Mellon Foundation, he is launching a project that aims to explain more fully how the 21st-century “big data revolution” is affecting politics and social change.
Durbin’s project, “Grasping @ the Future: The Promise, Peril and Politics of Modern Forecast Organizations,” will combine studies of data mining and its applications with research on the history and philosophy of science.
“In a lot of ways, our technology is moving faster than our ability to understand it,” says Durbin, who will spend next year conducting research at Stanford and the University of California, Berkeley. “There’s a lot of attention being paid to new digital tools, but we’re only just starting to explore their social impact.”
Here’s what else Durbin had to say about his research project.
Why is it important to study the politics of big data?
“There are so many examples of how big data is having an impact—algorithms being used to make decisions about mortgage lending, parole, hiring and firing, surveillance—and hundreds of other things. Data ethics is something that Silicon Valley philosopher-types talk about. But there’s not a lot of talk about it in political science. My interest is to try to understand what questions are being asked when these data tools are being created. How do developers think about their work? How are they being trained and evaluated?”
How are you focusing a humanities lens on these issues?
“I want to look at the history and philosophy of science to better understand previous technological transformations—not just what happened, but what people expected to happen and how they managed the changes. When he saw the flight of an early hot-air balloon, Benjamin Franklin reportedly wondered whether the new technology would make war obsolete. That’s what he thought might happen. A lot of people think they know what big data will mean for society, but most of them are probably wrong.”
How far back in history will you go?
“I don’t know yet. There are so many transformations to look at—telephony, radio, the steam engine. There are always those who overreact to the dangers of new technology. But there is something different about what’s going on now. These digital technologies are so tied into human psychology and culture. Facebook’s [former] motto, ‘move fast and break things,’ doesn’t take into account what could be broken. So, how do you develop a type of ethics within the tech industry that’s akin to the Hippocratic Oath?”
How do you feel about having to learn new math during your fellowship?
“It’s daunting! The last time I did that was about 20 years ago. I’m brushing up on my statistics now so I can be ready for the program in the fall at Stanford. The great benefit of this fellowship is that it allows me to pause in my normal trajectory to learn new things.”
How will your project benefit your work at Smith?
“With the creation of the Statistical and Data Sciences Program at Smith, there are some real synergies to take advantage of. I’m hoping to be able to develop some course modules about politics and data science—especially where these intersect with social justice issues. Also, data analytics is a growing area within political science, and I’m excited to incorporate this new knowledge into my courses.”
Mellon’s New Directions Fellowships are designed to promote interdisciplinary research by helping faculty members in the humanities and humanistic social sciences learn about subjects outside their areas of special interest.