After a year of struggle with OpenAI's ChatGPT in their classrooms, writing center faculty at Pasadena City College kicked off the academic year by training other faculty on strategies to adapt to the presence of generative AI.
Why it matters: Drawing on its training data, a generative AI tool like ChatGPT creates content on command that some students may pass off as their own. Rather than condemning the use of ChatGPT, faculty leading this training say students should be allowed to use AI, but to a limited extent. The faculty want to see students learning, building skills and producing work that goes beyond what ChatGPT can offer.
What strategies? The Pasadena City College faculty suggest colleagues check to see how ChatGPT responds to assignments before giving them to students. Other strategies include assignments that ChatGPT can't do — tasks that require self-reflection or drawing from personal experience.
Keep reading for faculty tips on adapting to AI.
How College Faculty Are Learning To Embrace ChatGPT — Or Just Accept It
Older and wiser after one academic year with OpenAI's ChatGPT, faculty at community colleges have recognized that their students are no strangers to generative artificial intelligence tools.
With a new academic year beginning, faculty fall somewhere between two sentiments: optimism about what generative AI can do for students, and serious concern about the damage it could cause.
In the absence of official guidance, faculty members are left to figure this out on their own. Those leading these discussions share one thing in common: They want to make sure students are learning.
And so at the start of the fall semester, in an auditorium at Pasadena City College, a talkative group of just over 30 faculty and staff gathered to talk about the inevitable new presence in their classrooms and offices.
You can feel that the room is ready to engage. Even before the session kicks off, instructors swap stories about how new technology enters the classroom. One laughs about whether to borrow a playbook from another professor and ban all electronics in the classroom.
Writing center faculty Giselle Miralles and Genesis Montalvo address the crowd.
“We just want to do a really quick reflection on this AI tsunami that we have suddenly experienced, as we have heard more and more about AI and its use in the classroom,” Montalvo says, kicking off the training with a partner exercise.
“Where are you now? Are you still in the same place that you were when you first learned about ChatGPT and other AI? Or have you changed directions, changed your viewpoint?”
Not a replacement for learning
Miralles and Montalvo tell LAist their own thinking has developed over time. The English instructors learned about ChatGPT last year through TikTok. They say faculty feared their jobs could be replaced, and many had planned to prevent or catch students using the generative AI for assignments.
“It became this really negative conversation. And that kind of brought into question, well, I don't want my job to be just policing students. Like, ‘ha, gotcha,’” Montalvo says. “What we want to do with this presentation is let professors know that this is here to stay. And rather than treating it as an enemy, how can we treat it as a learning tool?”
Giselle Miralles (Veronica Lechuga / for LAist)
Genesis Montalvo (Veronica Lechuga / for LAist)
Montalvo and Miralles have both personally found ChatGPT useful. Montalvo has used it to start the mundane task of summarizing author biographies before adding her own take. She says she's not out to stop students from becoming digitally savvy, noting that friends in the legal and medical fields have used the AI in their professions.
For Miralles, ChatGPT can reach students in a way that she might not. She reached that conclusion when she asked ChatGPT to explain how camera exposure works, in simple terms, hoping it could do better than her husband. (This took a few tries.) For students, Miralles reasons, that would be like having someone available to explain concepts when faculty are not: on weekends, 24/7.
Full of tradeoffs
Over the course of the training, faculty explore all kinds of questions and comments.
One offers at the outset that he finds ChatGPT is very good at writing code. Another says she could see that the AI could enhance accommodations to students. One notes that ChatGPT could give students instant feedback on their work. A counselor shares that generative AI can help first-generation students construct resumés and write cover letters. Another attendee complains that ChatGPT is "usually giving me C-level work."
Genesis Montalvo presents during a faculty training about AI at Pasadena City College. (Veronica Lechuga / for LAist)
There are thornier issues, too. If ChatGPT learns from the material it can access, and a lot of that material is prejudiced or otherwise bad, does that mean ChatGPT might replicate those problems? Another participant says he's seen on the news that ChatGPT might cite sources that don't actually exist.
And as much as instructors may recognize the benefits of AI, they still express concern about students completing assignments without contributing their own work.
Set a starting point
Montalvo supports students using ChatGPT as a starting point for their assignments, a brainstorming tool. Nonetheless, she expects students to revise the output for accuracy and to reflect their own voices.
“If you're using AI to just completely generate without you, yourself, engaging with it, you are hurting yourself in the long run because you're taking yourself out of that equation and solely relying on the AI to create everything for you,” Montalvo says. “So, it's finding that balance of empathy and understanding, but also being firm and holding the student accountable for their own actions and their own learning.”
Miralles agrees. The worst outcome would be students cutting and pasting directly from ChatGPT and presenting it as their own work. Miralles thinks it's important to remind students that faculty are there to help them learn, not just tell them what to do.
“You know, I think it's out of respect to tell students like, this is why I'm having you do it, because I think it's valuable for your learning, right?” Miralles says. “This is why I have you understand how to vet your sources, right? Because you want to understand what kind of sources are credible, right? Like, that you can understand what information is something that you can believe, right? So that is a skill you need in everyday life, I think, right?”
Assignments only humans can do
Software to detect AI has proven to be unreliable, and questioning students about cheating has been difficult. Reflecting on the initial faculty response to ChatGPT, which had struck a policing tone, Montalvo realized there must be underlying reasons students were using ChatGPT instead of producing their own work.
Advice for Community College Faculty Adapting to AI
Pasadena City College writing center faculty Giselle Miralles and Genesis Montalvo recommended strategies for college faculty trying to integrate generative AI into their classrooms:
• Educate students on how to use generative AI, particularly ChatGPT:
• What ChatGPT can do: explain concepts, generate ideas for brainstorming
• What ChatGPT can't do: represent diverse perspectives (ChatGPT is biased toward dominant culture) or ensure accuracy (it pulls from a fixed dataset)
• Ask students to disclose how they use generative AI in their work
• Ask students who use generative AI to build upon and revise the output
• Test your assignments with ChatGPT: see how it responds to your own prompts
• Assign non-generic, higher-level tasks that ChatGPT can't do
• Assign self-reflective or metacognitive tasks: What did students learn?
• Get students to respond personally: What parts of their own experience can they bring?
• Remind students why doing their own work matters and what skills they're learning
“It's like why students plagiarize for any reason. It's never because of like, ‘Oh, it's easy. I don't wanna do this.’ It's usually because they're stressed, they're unsure about the topic, they don't really understand the topic, like there's a bunch of other reasons behind the plagiarism,” Montalvo says.
Miralles says it's the instructor's responsibility to adapt material to ChatGPT and to consider how to motivate students to engage with the work. Faculty say the way to push students to create original work is to emphasize what only students can do: that which is human.
Montalvo, who teaches poetry, points out the times when she suspects a student is using ChatGPT to write a poem. The poems are boring — they lack artistry, depth and personality. Montalvo says she tells them, “I want to know you.”
Miralles encourages assignments that are metacognitive, which is defined as an awareness of one’s own thinking. “As we know, ChatGPT does not have human emotions, so it can't recreate that.”
She suggests asking students about their learning process. “How are they learning? What did they learn from this assignment? What parts of their personal experience can they bring, right?” Miralles says.
Faculty are also discussing how to assess students’ performance in the context of AI. Miralles, as an English instructor, expects that grading criteria can assess students for originality, creativity, and even for the “energy” of their work.
Collaborate on guidance
The faculty trainers at Pasadena City College did not fault their administration for the lack of guidance. They say it has taken time: the technology is still so new, and they're still figuring out how to respond to AI.
There's some evidence to suggest the conversation will need to happen sooner rather than later. A survey of students in grades 6-12, released Wednesday by the nonpartisan think tank Center for Democracy & Technology, found that students with special needs are more likely than their peers to use generative AI, and to be disciplined for doing so. That survey also found widespread differences in how educators use AI and discipline students for using it. The center suggests that administrative guidance can provide consistency and protect vulnerable students.
For the moment, the Pasadena City College trainers say faculty-driven guidance is more valuable than the administrative kind. Administrative guidance could run the risk of taking a more punitive stance, policing students and generating fear.
Eventually, the trainers say, faculty can open discussions with the administration or bring it proposals they hope will be supported.
“I think that with any change on any campus, I do think it should always be faculty driven, just because faculty are the ones that are engaging with students,” Miralles says. “So the fact that it's faculty talking about it, I think that's great.”