CUNY’s Hunter College Film & Media Department has established policies on ChatGPT and other artificial intelligence (AI) programs, spelling out the specific ways in which journalism students can and cannot use them.
“AI is going to be a tool like transcription services,” says Sissel McCarthy, director of the journalism program. “I think that journalists have to look at this as a tool like all the other tools that we have from technology, not a replacement for verification.”
Students in Hunter’s Department of Film & Media Studies may use AI programs such as ChatGPT, Google Bard, Llama or Grammarly for feedback on their writing, but they are strictly prohibited from using them to write any assignments.
AI pulls information from many different sources. Whether information comes from AI, another author, a book or an article, there is an obligation to credit the source. “Using these AI tools to generate and create work that they are passing off as their own is a violation of the policy,” says Colleen Barry, the director of student conduct.
The Office of Student Conduct has a strict policy against students presenting another person’s ideas, research or writing as their own. In simple terms, it’s plagiarism. Proposed changes to the academic integrity policy that specifically address artificial intelligence will be up for a vote before the end of the month. “I think having the specific language about artificial intelligence and being very transparent is always the way anybody wants to understand, and I do believe when students understand what the expectations are, they abide by them,” says Barry.
Launched in November 2022 by OpenAI, ChatGPT stands for Chat Generative Pre-trained Transformer. The chatbot interface has revolutionized the way information is processed. It can write essays and articles, generate code and much more. It’s been trained to produce answers to follow-up questions, challenge incorrect premises, reject inappropriate requests and admit its mistakes.
But ChatGPT has glitches. “Journalists have to fact check everything that comes off of AI because it’s a large language model,” says McCarthy. “It predicts the answer based on its training. It’s not thinking. It’s not reasoning. So yes, it can write very well but sometimes it says things that are false or biased. Bias is a real problem when it comes to AI because it’s been trained on the Internet.”
AI presents legal issues for journalists as well. It’s not clear who would be liable for errors that AI makes: whether accountability falls on OpenAI or on the news outlet that publishes the story. McCarthy says, “Ultimately, the journalist is accountable.”
As a tool, artificial intelligence helps writers with grammar, spelling, sentence restructuring and language recommendations. “Students are doing a great disservice to themselves if they’re not [using college as an] opportunity to learn how to become better writers and better readers,” Barry says. “If you’re just using a tool that creates and generates, you’re not thinking, you’re not using your ability, so a faculty member is not grading you but work that was [generated by a computer], which is a violation of policy.”
Students can become overwhelmed with all the pressures of work and life, but rather than relying on the quick fix of AI, Barry and McCarthy advise students to use the resources available to them at Hunter. The Rockowitz Writing Center, for example, helps students improve their critical reading and academic writing skills. “I just think sometimes they’re underutilized,” Barry says.
Most importantly, if they’re struggling, Barry says that students need to communicate with their professors. Instead of cheating with AI, she says, maybe they just need a little more time, or to ask for a bit more support.