Cal State says staff AI use ‘resulted in errors’ in legal filing
A legal document filed on behalf of California State University was riddled with faulty quotes and other telltale signs of being AI-generated, an administrative law judge said this week, prompting Cal State to acknowledge that artificial intelligence had been used to help create the document.
Administrative Law Judge Bernhard Rohrbacher ordered that a CSU legal filing featuring “phantom quotations” from a 1981 court decision be struck from the record of a case pending before the California Public Employment Relations Board.
The order filed Monday is part of a proceeding pitting the nation’s largest public four-year university system against the CSU Employees Union, which is seeking to represent an estimated 1,400 students serving as resident assistants in college housing around the state.
Rohrbacher wrote that while “there is no proof that AI was, in fact the author” of a Cal State brief, the document “bears all the hallmarks of the hallucinations associated with AI-generated texts” and contains a series of misquotes Cal State failed to explain.
A Nov. 10 case filing by Cal State said that the mistakes were due to a “failure to double-check correct page numbering” and “erroneously included quotation marks around paraphrasing statements.” But in a written statement on Wednesday, a university system spokesperson acknowledged the brief had been written with AI assistance.
“The CSU is aware that a staff member used artificial intelligence, without conducting due diligence, to assist with creating a brief that resulted in errors undermining the integrity of their work,” said CSU spokesperson Jason Maymon. “This action does not align with the CSU’s ethical and responsible use of AI, and we are taking appropriate steps to address this matter.”
The order comes as Cal State aims to become a leader in integrating generative artificial intelligence into higher education. It could also have implications for the bid to unionize CSU’s resident assistants, who typically receive benefits like free housing and a campus meal plan but no salary in exchange for helping manage more than 67,000 dorm beds at Cal State campuses.
The CSU Employees Union in March moved to absorb resident assistants — whose wide-ranging responsibilities can span everything from organizing dorm socials to responding to student emergencies — into an existing unit of more than 17,000 student workers. Cal State has opposed the effort, saying resident assistants are not employees but “live-in student leaders.”
“If students submit assignments with AI-generated half-truths and fabrications, they face consequences. And yet the CSU is doing exactly what we tell students not to do,” said Catherine Hutchinson, president of the CSU Employees Union, in a written statement. “Resident assistants provide valuable services to their campus communities, so it is shameful that the CSU would waste the time and resources of California’s Public Employment Relations Board in an attempt to quash their right of union representation.”
Maymon said Cal State “is proud to lead the adoption of AI in higher education. But just like every institution that has embraced navigating this new terrain, challenges will surface. This presents an opportunity for the CSU to fine-tune our AI trainings so that our students, faculty, and staff receive the information and professional development to fully leverage the benefits of this new technology.”
Judge called misquote ‘a stretch’
A Nov. 3 brief from Cal State repeatedly quoted from a single federal appellate court decision to support its argument that resident assistants, typically called RAs for short, should not be considered employees. But Rohrbacher, the administrative law judge, said he could not find a series of quotes and page citations in the original decision, which stems from a decades-old lawsuit involving Regis, a private college in Denver.
For example, in passages that purported to quote the court’s decision, the Cal State brief said that the relationship between RAs and their college is “primarily educational rather than economic in nature” and that RAs, therefore, are not employees.
But that statement and several others do not appear in the appellate court decision CSU cited, Marshall v. Regis Educational Corporation. Rohrbacher said the misquote “is certainly a stretch” and undermines Cal State’s argument for citing the Marshall case. California law in 2018 dropped language defining certain students as employees “only if the services they provide are unrelated to their educational objectives” or if educational goals are secondary to such services, Rohrbacher noted.
“It is therefore curious that the University interprets Marshall to imply such a requirement, making that case inapplicable here,” Rohrbacher wrote. “That assumes, of course, that generative AI was not the author.”
In striking the Cal State brief, Rohrbacher stopped short of direct accusations. But the administrative law judge said Cal State had failed to explain its mistakes, writing that regardless of whether AI wrote the brief, “these are not the kind of ‘errors’ or acts indicating a run-of-the-mill ‘lack of diligence’ … that could be ignored.”
Rohrbacher did not immediately respond to a request seeking comment for this story. According to his LinkedIn profile, he previously served as general counsel for the California Faculty Association, the union representing CSU employees including professors, librarians and coaches.
Vying to be the ‘first and largest AI-empowered university’
Cal State in February announced a $16.9 million deal with OpenAI to purchase enterprise access to ChatGPT, part of a strategy to become “the first and largest AI-empowered university system.” It has also launched a new CSU board that includes state officials and industry representatives from companies like Anthropic and Nvidia. Campuses around the system have recently moved to add a cluster of AI-related degree programs.
While some within CSU have embraced those steps, they have also provoked worries about how the technology will impact the way students learn and professors teach. Critics have also raised concerns about protecting student and faculty data privacy, and the cost of CSU’s investments in artificial intelligence at a time of difficult budget cuts and job losses.
As part of its work on AI, CSU has published guidelines about the ethical use of the technology. Rohrbacher quoted a section in a footnote of the order striking CSU’s brief.
“Content generated by AI can sometimes be inaccurate, deceptive, or completely fabricated, also known as ‘hallucinations,’ and might inadvertently include copyrighted material,” the guidelines read. “It is your responsibility to vet any AI-generated content before dissemination.”