Students fight back over course taught by AI

Original link: https://www.theguardian.com/education/2025/nov/20/university-of-staffordshire-course-taught-in-large-part-by-ai-artificial-intelligence

Students at the University of Staffordshire say they feel "cheated" after discovering that a coding module, part of a government-funded apprenticeship programme, was taught largely with AI-generated materials. Students such as James and Owen had hoped the course would launch their digital careers, but it was instead delivered through AI voiceovers and presentations that AI-detection tools flagged as very likely AI-generated, with inconsistent and generic content. Despite student complaints, and despite university policy barring students from submitting AI-generated work, the university defended lecturers' use of AI as a supporting tool and even published a framework for its use. Students voiced their concerns during recorded lectures, questioning why they were forbidden from *using* AI yet were being *taught* by it. The case highlights universities' growing adoption of AI even as students push back. Although the university maintains that academic standards were upheld and arranged for a human lecturer to deliver the final session, students feel the damage has been done, expressing frustration at the wasted time and degraded learning experience. They believe the course failed to provide the depth of knowledge needed to successfully change careers.

## AI in education sparks student backlash

A recent Guardian article, discussed on Hacker News, highlights students' frustration with a university course taught largely by AI. The discussion reveals broader criticisms of modern higher education. Commenters note that teaching quality is declining, with many professors lacking the enthusiasm or time to teach effectively, leading universities to lean increasingly on AI as a cost-cutting measure. Some argue that accessible, high-quality educational content already exists on platforms such as YouTube, making traditional lectures look inferior by comparison. Others defend dedicated professors and stress the value of face-to-face interaction and community building, which remote, AI-delivered courses lack. A key point is that effective teaching requires two-way feedback, something current AI struggles to provide, resulting in a poor learning experience. Ultimately, the conversation suggests that universities increasingly prioritise the business side over genuine education, and AI may accelerate that trend.

Original article

Students at the University of Staffordshire have said they feel “robbed of knowledge and enjoyment” after a course they hoped would launch their digital careers turned out to be taught in large part by AI.

James and Owen were among 41 students who took a coding module at Staffordshire last year, hoping to change careers through a government-funded apprenticeship programme designed to help them become cybersecurity experts or software engineers.

But after a term of AI-generated slides being read, at times, by an AI voiceover, James said he had lost faith in the programme and the people running it, worrying he had “used up two years” of his life on a course that had been done “in the cheapest way possible”.

“If we handed in stuff that was AI-generated, we would be kicked out of the uni, but we’re being taught by an AI,” said James during a confrontation with his lecturer recorded as a part of the course in October 2024.

James and other students confronted university officials multiple times about the AI materials. But the university appears to still be using AI-generated materials to teach the course. This year, the university uploaded a policy statement to the course website appearing to justify the use of AI, laying out “a framework for academic professionals leveraging AI automation” in scholarly work and teaching.

The university’s public-facing policies limit students’ use of AI, saying students who outsource work to AI or pass off AI-generated work as their own are breaching its integrity policy and may be challenged for academic misconduct.

“I’m midway through my life, my career,” James said. “I don’t feel like I can now just go away and do another career restart. I’m stuck with this course.”

The Staffordshire case comes as more and more universities use AI tools – to teach students, generate course materials and give personalised feedback. A Department for Education policy paper released in August hailed this development, saying generative AI "has the power to transform education". A survey last year (pdf) of 3,287 higher education teaching staff by the educational technology firm Jisc found that nearly a quarter were using AI tools in their teaching.

For students, AI teaching appears to be less transformative than it is demoralising. In the US, students post negative online reviews about professors who use AI. In the UK, undergraduates have taken to Reddit to complain about their lecturers copying and pasting feedback from ChatGPT or using AI-generated images in courses.

“I understand the pressures on lecturers right now that may force them to use AI, it just feels disheartening,” one student wrote.

James and Owen said they noticed the use of AI in their Staffordshire course “almost immediately” last year when, during their first class, the lecturer put on a PowerPoint presentation that included an AI version of his voice reading off the slides.

Soon after, they said, they noticed other signs that some course materials were AI-generated, including American English inconsistently edited to British English, suspicious file names, as well as “generic, surface-level information” that occasionally referred inexplicably to US legislation.

Signs of AI-generated material continued this year. In one course video uploaded to the website, a voiceover presenting the material suddenly morphs into a Spanish accent for about 30 seconds, before switching back to a British accent.

Voiceover accent changes mid-lesson in alleged AI-generated course – video

The Guardian reviewed materials from the Staffordshire course and used two different AI detectors – Winston AI and Originality AI – to scan course materials from this year. Both of them found that a number of the assignments and presentations had “a very high likelihood of being AI-generated”.

Early in the course, James said, he brought his concerns to the student representative during a monthly meeting. Then, in late November, he aired them during a lecture, which was recorded as a part of the course materials. In the recording, he asks the lecturer to not bother with the slides.

“I know these slides are AI-generated, I know that everyone in this meeting knows these slides are AI-generated, I would rather you just scrap these slides,” he says. “I do not want to be taught by GPT.”

Soon after, the student representative on the course chimes in, saying: “We have fed this back, James, and the response was that teachers are allowed to use a variety of tools. We were quite frustrated by this response.”

Another student says: “There are some useful things in the presentation. But it’s like, 5% is useful nuggets, and a lot is repetition. There is some gold in the bottom of this pan. But presumably we could get the gold ourselves, by asking ChatGPT.”

The lecturer laughs uncomfortably. “I appreciate people being candid …” he says, then he changes the subject to another tutorial he made – using ChatGPT. “I’ve done this short notice, to be honest,” he says.

Eventually, the course head told James that two human lecturers would be going over the material for the final session, “so you don’t get an AI experience”.

In response to a query from the Guardian, the University of Staffordshire said “academic standards and learning outcomes were maintained” on the course.

It said: “The University of Staffordshire supports the responsible and ethical use of digital technologies in line with our guidance. AI tools may support elements of preparation, but they do not replace academic expertise and must always be used in ways that uphold academic integrity and sector standards.”

While the university brought in a non-AI lecturer for the last lecture in the course, James and Owen said this was too little, too late, especially because the university appears to have used AI in this year’s teaching material as well.

“I feel like a bit of my life was stolen,” James said.

Owen, who is in the middle of a career change, said he had chosen the course to get the underlying knowledge, not just the qualification – and felt it was a waste of time.

“To be sat there with this material in front of you that is just really not worth anyone’s time, when you could be spending that time actually engaging with something worthwhile, is really frustrating,” he said.
