Students and faculty gathered March 26 for a dialogue about the impact of ChatGPT on workplaces and the university’s own learning.
Titled “ChatGPT, Gen Z, and the University,” the talk was sponsored by the Pedro Arrupe Center for Business Ethics.
Panelists who shared their experiences regarding the steady advancement of AI tools included Jacqueline Wise, Ph.D., professor of finance and associate director of the Arrupe Center; Ann-Marie Smith, Ed.D., computing science professor at Delaware County Community College; John Keller, Ph.D., associate professor of philosophy; Alan Kahlenbeck ’21, M.S. ’25; and Colin Biddle ’25, a member of the University Student Senate.
Keller said he feels the biases and blind spots of users shape the algorithm’s limits just as those of developers do.
“One important thing that I think is at least still true is that AI really doesn’t have what we would normally call ‘understanding,’” Keller said in an interview with The Hawk. “It’s very good at getting back to you what it predicts you want to hear, but there are very simple sorts of questions where it will get them completely wrong because it’s predicting you want to hear something.”
The discussion focused on both the concerns and the opportunities that the recent prominence of this technology has posed, and on the changes that will be necessary, at both a policy and a personal level, to determine which of its uses are ethical.
Smith, who served on the panel as a technical expert, had concerns primarily about human input in these systems. During the panel, Smith discussed how the failure to regulate where these algorithms source their information makes them apt to provide false or misleading information.
“AI systems are only as good as the data that they’re trained on,” Smith said in an interview with The Hawk. “So that would probably be an ongoing concern: understanding that there could be biases and inaccuracies. That said, AI systems, in my opinion, have proven to be a great resource for teaching and learning.”
Several panel members described ChatGPT as a development it would prove more beneficial to work with, or around, than against.
“I think we have a duty as faculty to introduce AI, talk about it, work with students and become AI-literate with them,” Smith said.
Biddle said fears regarding ChatGPT and similar applications could produce a stigma detrimental to students looking to enter the workforce.
“I think it’s important [that] the university, however it’s going to handle its AI policy as an institution, [does] not instill a sense of fear about AI,” Biddle said. “I think that it’s going to be a tool that we’re expected to use.”