
This Penn researcher is exploring how ChatGPT fits into the social sciences

Desmond Upton Patton collaborated with fellow University of Pennsylvania academics on a paper about how generative AI, and ChatGPT in particular, can be used ethically in social sciences work.

SAFELab's Desmond Upton Patton. (Photo courtesy of the University of Pennsylvania)
ChatGPT has seemed to be part of every conversation about artificial intelligence since the chatbot was released in November 2022, including those happening in academia.

The work done in Desmond Upton Patton’s SAFELab at the University of Pennsylvania focuses on AI, computational social sciences and ethics, so when ChatGPT came out, he and his team quickly started playing around with it as a research tool.

Upton Patton, a professor at Penn’s School of Social Policy and Practice (SP2) and director of SAFELab, published a paper earlier this summer about the potential opportunities and challenges of generative AI in the social sciences, specifically social work. His co-authors are fellow researchers at SP2 and Penn’s Annenberg School for Communication, Aviv Y. Landau and Siva Mathiyazhagan.


Upton Patton wanted to create a conversation around the tool in academia, he told Technical.ly, before it got lost in a sea of negative use cases.

“What we’re trying to do is to anticipate, before mass use, some of the challenges of the application for societal use, but also encourage positive use and uses that can really help us answer questions that are hard to answer in social science,” he said.

Upton Patton likes that the tool can identify gaps in his thinking, he said. For example, he may ask ChatGPT to recommend scholars of color or interdisciplinary research he could include in a paper or syllabus. He has also asked it to act as an editor for his writing.

ChatGPT is not replacing his thinking process or his knowledge, but supplementing it and helping him make his work more thorough and efficient, he said, describing the tech as something like a “virtual research assistant.”

Users can also make the tech work for them, by automating or expediting the more tedious tasks of their job. As Plymouth Meeting-based consultant and designer Beth Brodovsky said during her talk “AI: The Ultimate Sidekick” at this month’s MILLSUMMIT in Wilmington: “I’m not a programmer, but now I can be.”

Upton Patton thinks ChatGPT can be used to ask bigger and better questions about social sciences. The tool can help come up with new questions and pull data and context that hadn’t been considered before.

“How does ChatGPT affect our future research? How might ChatGPT reimagine ethics in social science spaces?” he said. “And I think we need to be considering new and deeper, more refined questions of how these tools affect the kind of work you want to do in communities.”

Upton Patton said he encourages his team at SAFELab to use ChatGPT in their workflow as well. In the research paper, he and his co-authors note that ChatGPT can be particularly helpful with qualitative research by identifying trends and suggesting other sources.

“With my team, it’s really important for us to identify authors of color, to identify research that comes from interdisciplinary spaces, from other disciplines,” he said. “[We] want to make sure that we’re using it ethically and as a resource and a tool. It’s not intended for us to be using it to write or to produce ideas.”

As for ChatGPT’s challenges, Upton Patton said the main thing to note is that it can cite unverifiable sources. That makes it more important than ever to check sources and use proper citations for information. To get valuable responses from ChatGPT, the user still needs to understand the topic deeply enough to write clear prompts and judge the quality of what comes back.

(Another important consideration: who’s developing the tech. “I think it has to be a stewardship — and it requires all of us, because otherwise there’d be too much bias in the system, if it’s only being stewarded by a very select set of people,” Carolyn Yap, Google’s director of AI Practice, said during PACT’s AI-focused Phorum conference in May. “The more perspectives, the more communities, the more cultures are brought into this conversation, the better the guardrails, the systems, the safeguards, the policies, and even the responsibility matrixes can be.”)

For social work in particular, Upton Patton said the field tends to be slow to adopt new technologies, and there isn’t yet a strong template for how to use generative AI in the social sciences. But the introduction of ChatGPT, he said, is an opportunity to create that blueprint for engaging with new technology.

“We see in the hard sciences, we see in computer science, how these tools revolutionize how work is done and how people collaborate and what types of problems people are able to tackle,” Upton Patton said. “We really need these types of insights in social work and social science because we’re dealing with the everyday human experience.”

Sarah Huffman is a 2022-2024 corps member for Report for America, an initiative of The Groundtruth Project that pairs young journalists with local newsrooms. This position is supported by the Lenfest Institute for Journalism.