Winter 2023 Issue

Data Revolution

W&M fuses liberal arts-style critical thinking with digital fluency

By Noah Robertson '19
Illustrations By Carl Wiens

If you live in the Chesapeake Bay watershed — home to six states, multiple military bases and much of the federal government — Tim Carroll ’87 wants to know whether your septic system will back up in the next 10 years.

Carroll is the head of Microsoft’s Climate Portfolio, which means he spends his time feeding enormous data sets into computer models that help forecast the consequences of global warming.

Take rising sea levels as an example. Even a modest amount — say half an inch — can matter when it flows into the Chesapeake Bay. Higher tides can combine with heavier rainfall and push water ashore in a process called “compound flooding.” Consistently wetter ground absorbs less moisture, including — you guessed it — from the septic system that may sit in your backyard. “When people talk about these lofty topics like climate change, the very real impact to individuals may not be that their house gets flooded and washes away, but that their septic system backs up two to three times a month,” says Carroll.

His work is essentially a gigabyte-driven game of connect the dots, and it wouldn’t be possible without supercomputers and software models that can process vast troves of data — from flood maps to the United States census.

But the coding and engineering components of his team’s data-driven work are a part, not the whole. Carroll isn’t a meteorologist, a scientist or even a top-notch coder. Instead, he’s more of a data interpreter, collecting information on opaque issues such as climate change and making it accessible to the public. After all, regardless of its breadth, information isn’t useful unless people know how to use it.

“You can’t just take a bunch of maps, throw them in front of people and say, ‘See what’s going to happen,’” says Carroll. “What I’ve enjoyed doing in the tech field is helping people translate problems.”

He, like many others, learned to translate those problems, from general to specific, at William & Mary. From Microsoft to Google, university alumni already work in some of the world’s leading data and tech firms and are helping to solve some of the world’s biggest challenges. Applied science programs on campus are booming and data stands as one of the initiatives in the university’s Vision 2026 strategic plan, so more students will soon follow these alumni’s lead.

And they’re entering the workforce at a crucial time. The World Economic Forum (WEF) argues the global economy is currently in a Fourth Industrial Revolution, driven by a boom in data and artificial intelligence technology. In five years, according to the WEF, there will be 150 million new technology jobs. By 2030, some analysts project more than three-fourths of all positions will require digital skills.

While technical skills matter, though, they only go so far on their own. Rapidly advancing technology involves a host of practical and ethical questions around its use, from privacy to accuracy to prejudice. Hence the need for programs at a university like William & Mary, which fuses liberal arts-style critical thinking with digital fluency, says Dan Runfola, assistant professor and director of graduate studies in the Department of Applied Science. People with that combination of skills, he says, are exactly what the world needs in a time of intense change.

“Not only do I think we are the right place to be doing this,” says Runfola, “I think we’re pretty much the only place doing this.”


But when Carroll entered the workforce, no one was doing it at all. In the late 1980s, data and computing hadn’t become cornerstones of business or education. Even when Carroll graduated and moved to San Francisco, things were antiquated by today’s standards.

His first job was at a startup delivering now-clunky VHS tapes to people’s homes. His boss was a computer scientist who was so paranoid that Carroll and the other young employee would break something that he put MacGyver-esque duct-tape boundaries on the floor to keep them away from anything important.

(The other employee later went on to work in IBM’s cybersecurity division, by the way.)

“It was great, and I learned an extraordinary amount right up until I was flat broke and had to move home,” Carroll says.

In fact, that experience was what first encouraged Carroll to continue in the tech industry. Back in Maryland, he called up the three biggest computer dealers in his area, and one of them gave him a chance. Carroll began to learn the tech side of the business, moving to other companies such as Compaq, and eventually immersing himself in the worlds of supercomputing and cloud technology. He joined the founder of a software startup that was later acquired by Microsoft, which led him to his current role.

“My differentiator was not that I was a computer scientist or a better coder, because I wasn’t,” says Carroll. “But I am fortunate to be able to translate technical capability into a language that non-technical decisionmakers and policymakers require.”

That same skill set, dating back to his days in government and history courses at the university, helps him now. There are two main pillars in climate security — reducing greenhouse gases and adapting to climate change. Carroll works in the second, and predictive modeling is crucial to his work. Governments, companies and regular people can’t adapt until they know what they’re adapting to.

“The first thing we do is to help people get their arms around the scale of the problem,” Carroll says.

Analysts can mix and match climate and census data to understand the challenges that underserved populations face in a warming world. Companies can mix highway data with sea level rise projections to predict the resiliency of supply chains. States in the Chesapeake Bay watershed can watch trends in the bay to adapt their infrastructure.

“Once you’ve collected the data one time, the number of things that you can use it for, if you understand how, are extraordinary,” he says.


Extraordinary too is the role data has in our daily lives — which so often orbit around social media and smartphones — and in solving the world’s biggest problems. Accurate and available data has been crucial in the world’s response to the COVID-19 pandemic. Granular economic data facilitates the Federal Reserve’s fight against inflation. Information sharing from national intelligence services has helped sustain Ukraine’s self-defense.

For a snapshot of why data plays such a big role in the world today, look no further than Runfola’s office desk in the Integrated Science Center.

Atop it stand eight monitors, two computers and one laptop, arranged as if he were a football fan trying to stream every Sunday game. At any given time, the displays show computer models processing terabytes of data. Runfola watches for mistakes, since the programs — depicting everything from the effect of climate interventions to the likelihood of a protest — take hours or even days to run.

His desk, next to a bookshelf with artwork from his two young kids and ball caps from every university where he’s worked, is a microcosm of the data revolution. Computers now, including some of his, are doing things scientists theorized decades ago but that didn’t become possible until the last 10 years. Meanwhile, students in Williamsburg are watching and learning.

“It’s all about how we teach a computer to think,” he says. “There are mathematical, engineering and philosophical parts to that question, but the fundamental piece is how we represent information that we, as humans, understand in a way that computers can understand also.”

Consider a straight line in a painting, he says. When someone looks at vertical lines, different neurons fire than when someone looks at horizontal ones. Easy for a human, he says, but hard for a computer, which can’t come close to replicating the vast neural networks in our brains.

Unlike Runfola’s toddler, who’s currently learning to walk through trial and error, machines don’t by nature observe and assess new information. They have to be taught to do that, through software models that input and analyze data sets. The more sophisticated the model, and the larger the data set, the better the results. Case in point: scientists are just now learning to teach machines to walk — like Spot, a robot dog developed by Boston Dynamics.
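Runfola’s example of vertical versus horizontal lines — easy for human neurons, hard for machines — can be made concrete with a toy filter. The sketch below is hand-built for illustration (a real neural network would learn its filters from data): a tiny image containing a vertical bar, and two simple 3×3 edge detectors slid across it.

```python
# A tiny 5x5 "image": a vertical white bar (1s) on a black background (0s).
image = [[1.0 if col == 2 else 0.0 for col in range(5)] for _ in range(5)]

# Hand-built 3x3 filters that respond to vertical and horizontal edges --
# a simple stand-in for the filters a trained network would learn.
vertical_filter = [[-1, 0, 1],
                   [-1, 0, 1],
                   [-1, 0, 1]]
horizontal_filter = [list(row) for row in zip(*vertical_filter)]

def filter_response(img, kernel):
    """Slide the kernel over the image and sum the absolute responses."""
    k = len(kernel)
    total = 0.0
    for i in range(len(img) - k + 1):
        for j in range(len(img[0]) - k + 1):
            response = sum(img[i + r][j + c] * kernel[r][c]
                           for r in range(k) for c in range(k))
            total += abs(response)
    return total

# The vertical filter "fires" strongly on the vertical bar;
# the horizontal filter doesn't respond at all.
print(filter_response(image, vertical_filter))    # 18.0
print(filter_response(image, horizontal_filter))  # 0.0
```

Swap the bar’s orientation and the responses flip — two simple detectors for two orientations, a crude analogue of the specialized neurons Runfola describes.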

“It’s all still pretty janky,” says Runfola. “As a field, we’re not very good at it yet.”


Modeling may sound inscrutable, says Evgenia Smirni, chair and Sidney P. Chockley Professor in William & Mary’s computer science department. So instead of getting lost in a labyrinth of data jargon, she suggests people just think about getting a sandwich.

Start by thinking like a restaurant owner. There are multiple steps in almost every customer’s visit: going to the counter, ordering, paying and waiting. Owners want two things: to turn a profit and to make sure customers enjoy their visit. So each of those steps has to be managed in a balance, without too long a wait for food or too many employees working at once. In this case, a model would use data — such as the number of line cooks and cashiers and the demand based on time of day — and then project the average time to order, pay and wait. 

This might seem like the difference between a few minutes at the register, but over time it could actually mean the difference between a stable and unstable business. “It’s a very complicated problem, but a simulation allows you to solve problems like this,” says Smirni.
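Smirni’s sandwich-shop example can be sketched as a short simulation. The code below is a minimal single-server queue with made-up arrival and service rates — an illustration of the idea, not a model of any real restaurant:

```python
import random

def simulate_shop(n_customers, arrival_rate, service_rate, seed=0):
    """Minimal single-server queue simulation.

    arrival_rate: average customers arriving per minute
    service_rate: average customers one cashier can serve per minute
    Returns the average minutes a customer spends waiting plus being served.
    """
    rng = random.Random(seed)
    clock = 0.0            # time the current customer arrives
    server_free_at = 0.0   # time the cashier finishes the current order
    total_time = 0.0
    for _ in range(n_customers):
        clock += rng.expovariate(arrival_rate)   # next arrival
        start = max(clock, server_free_at)       # wait if the cashier is busy
        service = rng.expovariate(service_rate)  # time to order and pay
        server_free_at = start + service
        total_time += server_free_at - clock     # wait + service
    return total_time / n_customers

# Lunch rush (1.5 customers/minute) vs. a quiet afternoon (0.5/minute),
# with one cashier who serves about 2 customers per minute.
print(round(simulate_shop(5000, 1.5, 2.0), 2))
print(round(simulate_shop(5000, 0.5, 2.0), 2))
```

Rerunning with different rates shows how a small change at the register ripples into average wait times — the profit-versus-patience balance Smirni describes.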

Simulations like these can help solve exponentially more complicated problems — such as the number of COVID-19 tests an area needs to monitor infection rates or the fastest ways to decarbonize the world economy. Those may seem like apples and oranges (Ryans and Reubens?) compared to a model about sandwiches. But it’s essentially the same process.

No wonder, then, that “everybody and their cousin is hiring in computer and data science,” says William & Mary Provost Peggy Agouris.

Almost 90% of companies surveyed by the online learning platform DataCamp say increasing data fluency is a moderate or high priority. A third of all global jobs — more than one billion — will be transformed by technology over the next 10 years, according to the Organisation for Economic Co-operation and Development, and by 2030 roughly the same number of people will need to reskill. Accenture, a consultancy, estimates that G-20 countries — the world’s 20 largest economies — could put $11.5 trillion of potential gross domestic product growth at risk over the next decade if those skill demands aren’t met.

A microcosm of that demand-driven progress is at play at William & Mary, says Smirni. “We have been experiencing a tremendous interest in offering more computer science classes at the undergraduate level,” she says. Perhaps unsurprisingly, she points at the data.

Ten years ago, the total number of available seats each semester for undergraduate computer science classes was 842. Now it’s 2,119. In the same period, the number of faculty in the department has increased from 14 to 23. There are 277 students majoring in computer science — 34% of whom are women, which is 50% more than the national average. Seven percent of the computer science majors are Black, which is double the national rate. The average starting salary for university graduates with a computer science degree is $94,000. Those who work in big tech start around $129,000.

“We have a fantastic reputation and are following national trends,” says Smirni, who has published papers simulating the spread of COVID-19. “Everything is moving toward high-tech industries.”

Agouris lists multiple initiatives to encourage that growth: work by the student government to incorporate data literacy across the College (COLL) Curriculum, funding from the state to graduate more students and hire more faculty in computer science, and a proposal for a new school of computing and data science.

The Raymond A. Mason School of Business already has a business analytics program. William & Mary Law School and the university’s computer science department are participating in the Commonwealth Cyber Initiative, with $1 million in funding.

“The wave of data has come in and has the ability to infuse new techniques and new skills in our teaching, labs and research,” Agouris says. “We’re not starting from scratch, but enriching areas where we’re already strong.”




Perhaps the university’s greatest strength, when it comes to this field, is its synthesis of hard sciences and the humanities. Students studying computer science, for example, are taught critical thinking. That combination can be an indispensable asset in the workforce.

Just ask Nami Choe ’98.

Choe is Google’s director of marketing data science, which means she leads a team of analysts who help clients advertise with Google's data and advertising technology, or AdTech, platforms. But she didn’t start her time in college, or even her career, thinking she was on track to work at the world’s biggest search engine. Instead, she thought she wanted to be a doctor.

She started down the pre-med track, but it took a year and a half to realize “I stunk at it,” she says. “I had this existential moment because I didn’t know what I was going to do with my life. I was like, you know what, I’m going to change majors and I’m going to bust my butt to do this.”

So she did, switching from biology to sociology. She liked numbers, she liked math and she liked how people could be creative with those two things to learn about the world. It felt like a perfect fit. And near the end of her time at William & Mary, a new professor encouraged Choe to start thinking through data like a narrative.

“She was one of the best professors I’ve ever had because she was a good storyteller with data and information,” Choe says. “I thought, OK, that’s what I want to do.”

Choe earned a graduate degree in biostatistics and went to work in public health. She figured she’d end up in epidemiology at the Centers for Disease Control and Prevention or doing research for a pharmaceutical company, but the economy was in a downturn and she eventually landed a consultant job for the Department of Defense. The rigidity there, in part, taught her how much she relished the ability to think through information in her own way.

“You have to be creative in how you use the data to tell stories, because it’s never going to be perfect,” says Choe. “Data’s always messy.”

She later worked for Ogilvy and then Ralph Lauren (no, she says, employees do not get free merchandise a la “The Devil Wears Prada”). A former colleague reached out about an opportunity at Google, and after some initial doubts Choe interviewed. She got the job and is now in her eighth year.


Three days a week, Choe puts on sneakers and takes the subway to Google’s New York office. There, she changes into more professional shoes, stashed underneath her desk. (At one point, she had about a dozen pairs at the office, she admits.) A coaster from Newport, Rhode Island, sits atop the desk: “Well-behaved women rarely make history.”

That’s her kind of bric-a-brac, she says.

“What I find fascinating about my job and why I love it so much is that the AdTech world changes constantly,” Choe says. Privacy regulations, in particular, are growing more common across the world. So-called “cookies,” which track and sort users as they surf the web, are slowly being phased out.

The upshot for Choe is that constant change demands constant creativity. Clients as different as Adidas and Constant Contact — all with limited marketing resources — want to spend their budgets wisely. Users, meanwhile, don’t want overly invasive ads. “With the industry changing so much,” she says, “how do we keep up, maximize our return on investment, and make sure our customers and users have the best digital experience online?”

Recently, she’s been working in the emerging market of connected TV services such as Roku or Amazon Fire TV. These are embedded in television sets to make content more accessible, and for someone like Choe, that means a huge rise in the number of new ways users can access and enjoy that content.

A host of variables affect her work. People access these services at home and out of town, on their TVs and other devices, on schedule and randomly, for anything from “Better Call Saul” to Sunday church services. “There is no one magic bullet to solve that,” she says, “so we have to come up with a creative way to understand how clients should reach their customers.”

So she, like Carroll, plays the elaborate game of connect the dots. Customers usually use their devices all in one geographic area. That’s one dot. They usually express interests or hobbies in their activity. That’s another. Then, sometimes days or weeks after seeing an ad, they buy a product. Choe and her team have to draw that causal path and show their clients whether ads are working — and if not, how to improve them.
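Connect-the-dots attribution of this kind can be illustrated with a toy example. The event log, channel names and last-touch rule below are all hypothetical — real AdTech attribution is far more elaborate — but the basic bookkeeping looks like this:

```python
from collections import Counter

# Hypothetical event log: (user, event, channel), in time order.
events = [
    ("u1", "ad", "search"), ("u1", "ad", "video"), ("u1", "purchase", None),
    ("u2", "ad", "display"), ("u2", "purchase", None),
    ("u3", "ad", "video"),   # saw an ad but never bought: no credit
]

def last_touch_attribution(events):
    """Credit each purchase to the last ad the user saw beforehand."""
    last_ad = {}          # most recent ad channel seen, per user
    credit = Counter()    # purchases credited to each channel
    for user, event, channel in events:
        if event == "ad":
            last_ad[user] = channel
        elif event == "purchase" and user in last_ad:
            credit[last_ad[user]] += 1
    return credit

print(last_touch_attribution(events))
```

Here the “video” and “display” dots each connect to a purchase, while “search” gets no credit — exactly the kind of causal path, drawn over days or weeks of activity, that tells a client whether an ad is working.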

But in a meeting with clients, Choe and her team don’t often start with data or analytics. “We ask a lot of questions, because sometimes what they think their problem is, isn’t really the problem.” First, they have to come up with a clear problem statement and then hypothesize how to solve it, even while acknowledging that the hypothesis will likely take tweaking.

“There is no perfect solution,” Choe says. “There’s no perfect score or perfect result.”


In the same way, there’s no perfect technology, says Runfola. Data, artificial intelligence, supercomputing — these tools have enormous promise to solve problems, but they also have pronounced flaws.

The advances in semiconductors and the spread of cellphones over the last decade now mean that the world, more than ever, is online. Among other issues, online platforms can raise privacy concerns and make it easier to spread disinformation, for example via deepfakes or other doctored images. At an ethical level, says Runfola, there needs to be a debate around artificial intelligence and how increasingly advanced robotics will change society. That’s hard when the world of tech often has a develop-first, think-later mindset.

“That’s problematic in a lot of ways because it pushes you to develop these algorithms without really thinking about the potential societal implications,” he says.

For Carroll, too, the advances can be bittersweet. His path toward Microsoft was a process of “stumbling forward,” he says, enabled by hard work but also a lot of luck. That probably wouldn’t be possible in today’s hyper-competitive job market, in which application screening software can be a gatekeeper.

“In order to keep up with the speed that’s required, you have to depend on the technology to a certain extent,” he says. “Then it’s a question of how much of that dependency impacts the true human connection.”

Technology alone can’t give someone a meaningful life, says Chet Thaker P ’17, P ’19, CEO of TeleBright, an expense management company, and a trustee on the William & Mary Foundation Board. “That sense of skepticism is entirely appropriate, and frankly we put too much trust in what comes out of machines,” he says.

Instead, he suggests, people could try to use these new capabilities for the things humans don’t do well or just don’t like — such as combing through massive amounts of information or routine tasks that take little attention. TeleBright, for instance, collects invoices from its many clients and then loads them onto a platform for those clients to review. “As you can guess,” says Thaker, “ingestion of these invoices from 250 different providers is not a uniform task.”

The process often results in irregularities or small errors, which are time consuming to identify and then correct. So TeleBright is exploring the use of robotic process automation to better catch and correct those flaws. It would save a massive amount of time.

In addition, Thaker wants to use machine learning to scan and study client expenditures. “What we want the AI to do is to look through clients’ phone bills and observe the patterns that are otherwise not visible to a single analyst,” he says. TeleBright may have all this data, he says, but it may not be able to identify the most nuanced ways to interpret it. Having artificial intelligence available, says Thaker, enables “seeing the forest around the trees.”

This technology is increasingly coming to campus. Already, Thaker convinced UiPath, a robotics company, to donate $4 million worth of bots to the school of business, one for every undergraduate student.


Thaker belongs to a tennis gym, where each Sunday at 7 a.m., the website opens reservations for the week ahead. Everyone, including him, wants their preferred times. But most people, also including him, don’t want to wake up so early on a weekend. So he’s working on a bot that will automatically log him in at 6:59 a.m., enter his password and reserve his choice slots.
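The scheduling half of a bot like Thaker’s is easy to sketch. The helper below just computes the next Sunday at 6:59 a.m.; the login and reservation steps are placeholders, since the gym’s website isn’t a public API:

```python
from datetime import datetime, timedelta

def next_run(now, weekday=6, hour=6, minute=59):
    """Return the next Sunday at 6:59 a.m. (Monday = 0, Sunday = 6)."""
    target = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    target += timedelta(days=(weekday - now.weekday()) % 7)
    if target <= now:
        target += timedelta(days=7)  # already past this week's slot
    return target

# A real bot would sleep until this moment, then log in and book the court.
# (Those steps are omitted -- the gym's site and login are not a public API.)
print(next_run(datetime(2023, 1, 4, 12, 0)))  # 2023-01-08 06:59:00
```

From a Wednesday at noon, the bot would wake the following Sunday at 6:59 — one minute before reservations open.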

“What do I get out of this? Amusement,” he says. And maybe an extra hour or so of sleep.

Choe sees a similar mixture of work and play in advanced technology. On one hand, it’s incredibly fun to solve complicated problems. On the other, it’s important to understand technology’s limitations. Tech powered by data can connect billions of people and identify patterns no one human could notice. But it can’t replace human interaction or get around foundational matters of trust.

Just as she and her team focus on finding the right question at the outset, they also can benefit from the process of building trust. “The marriage of observation plus modeling equals better outcomes,” she says. “There are ways to keep AdTech platforms accountable and transparent.”

William & Mary students will have a hand in that. They already do.

One of Runfola’s undergraduate honors theses this year focuses on the issue of bias and satellite imagery. Whether from race or class or sex, it’s easy to think of how algorithms might be prejudiced when using a picture of a face, he says. It’s much more difficult to think of that when the algorithm is looking at a top-down image of a home. But that doesn’t mean prejudice doesn’t exist. Neighborhoods, cities and whole areas have certain characteristics — from the quality of the infrastructure to the building styles — and can be subject to some of the same biases as individuals. The student Runfola is working with wants to probe that.

“We’re kind of uniquely situated here to look into these ethical issues,” he says. “It’s what we already do.”

There are growing pains associated with new technology disrupting the world, says Choe. Change can be scary, but it doesn’t have to be. Not if people are prepared.

“The way we work is going to change and you’ve got to be OK with that,” says Choe. “But as long as data is there we’re going to be OK.”