Is AI killing universities? A professor's ChatGPT-generated teaching drives a student over the edge, and she demands $8,000 in tuition back


A student at Northeastern University in the United States discovered that a professor had used ChatGPT to generate course materials and, furious, filed a formal complaint demanding a refund of $8,000 in tuition. Students feel they are paying premium fees to be taught by an algorithm; professors counter that AI is a tool that frees up their time. From cheating aid to teaching assistant, how has ChatGPT stirred up American campuses?

What has ChatGPT brought to education?

When it first appeared, it made cheating extremely easy: a history paper or a literary analysis could be churned out in seconds.

Even now, professors are still being driven to distraction by students' AI-generated assignments.

Video from the internet

However, the situation has now taken a dramatic turn. It's not just students using AI, but professors as well.

The use of AI by professors in class has been causing a huge stir in the United States.

Students are complaining, filing formal grievances, and one even demanded that her school refund more than $8,000 in tuition.

Just in the past few days, The New York Times published a detailed report on this matter.

Ella Stapleton graduated from Northeastern University in the United States this year.

In February, while going over the lecture notes for her organizational behavior course, she noticed something odd: was this a prompt her professor had typed into ChatGPT?

Halfway through the slide deck, which her business professor, Rick Arrowood, had prepared for a lesson on leadership, Stapleton found an instruction addressed to ChatGPT: "Expand to all areas. More detailed, more specific."

Following this were a series of positive and negative leadership traits, each with a bland definition and an example.

Ella Stapleton said she was surprised to find a professor using ChatGPT to generate course materials

Stapleton sent a message to a friend in her class.

"Did you see the slides the professor posted on Canvas?" she asked, "He made them with ChatGPT."

"Oh my God! Don't say anything," her classmate replied, "What the hell?"

So, Stapleton decided to investigate further.

She combed through the professor's slide decks and found more traces of AI: distorted text, photos of office workers with extra body parts, and glaring spelling errors.

Stapleton was very unhappy.

Given the school's tuition and its reputation, she had expected a first-rate education.

The class was a required course for her business minor, and its syllabus prohibited "academic misconduct", including the unauthorized use of AI or chatbots.

"He told us not to use it, but he was using it himself," Stapleton said.

Increasingly angry, she filed a formal complaint with Northeastern's business school, citing the undisclosed use of AI along with other objections to his teaching style, and demanded a refund of the tuition for that course.

The course fee amounted to a quarter of her tuition for the semester: just over $8,000.

A spelling error in the slides of an organizational behavior course

On sites like Rate My Professors, students are complaining about instructors' over-reliance on AI and combing course materials for ChatGPT's favorite words, such as "crucial" and "delve".

Beyond accusing professors of hypocrisy, students also raise an economic objection: they pay steep tuition to be taught by human beings, not by an algorithm they could use for free themselves.

On the other hand, professors believe that AI is a tool for providing better education.

Professors interviewed by The New York Times said that AI helps them cope with heavy workloads, saves time, and can serve as an automated teaching assistant.

The number of teachers using AI continues to rise.

Last year, a nationwide survey of over 1,800 college teachers showed that 18% claimed to be frequent AI users.

The consulting firm Tyton Partners, which conducted the research, repeated the survey this year, and the percentage has nearly doubled.

AI is here to stay, and professors are learning to adapt, like Stapleton's professor, feeling their way through the technology's flaws and their students' disdainful looks.

A high grade from ChatGPT?

Last fall, 22-year-old Marie wrote a 3-page paper for an online anthropology course at Southern New Hampshire University.

When she checked her grade, she was excited to find she had received an A.

However, in the comments section, her professor accidentally pasted a conversation with ChatGPT.

It included the grading rubric the professor had asked ChatGPT to use, along with a request for "very good feedback".

"In my view, the professor didn't even read the paper I wrote," Marie said.

Marie could understand the professor's position; instructors may have hundreds of students to manage.

Even so, Marie felt short-changed, and she confronted the professor during a video call.

The professor told Marie that she had indeed read the students' papers but used ChatGPT as a reference - something the school allowed.

Robert MacAuslan, Vice President of AI Affairs at Southern New Hampshire University, stated that the school believes in "the transformative power of AI in education". They have also established guidelines for faculty and students to "ensure that this technology enhances rather than replaces human creativity and oversight".

"These tools should never be used to complete their work," Dr. MacAuslan said, "but can be seen as an enhancement to existing processes."

As for Marie, after discovering that another professor also used ChatGPT to provide feedback, she chose to transfer.

After hearing about Marie's experience, Paul Shovlin, an English professor at Ohio University, said he could understand Marie's frustration.

Dr. Shovlin also serves as an AI faculty researcher, tasked with finding sound ways to integrate AI into teaching and learning.

"As mentors, our value lies in the feedback we give students," he said, "because as people who read and are touched by students' writing, we establish a human connection with students."

Paul Shovlin, Ohio University professor and AI faculty researcher

Dr. Shovlin advocates integrating AI into teaching, but he believes the purpose is not just to make instructors' work easier.

He believes students need to learn to use AI responsibly and "cultivate moral standards in the AI era" because they will certainly use AI at work.

Shovlin gave an example: in 2023, administrators at Vanderbilt University's education school sent an email to the community calling for solidarity after a mass shooting at another university.

The email spoke of creating a "culture of care" through "building close relationships", but its final line revealed that it had been written with ChatGPT.

After students criticized the move as outsourcing empathy to a machine, the officials responsible temporarily stepped back from their roles.

But things are not always so black and white.

Dr. Shovlin says establishing AI usage rules is complex because reasonable usage may differ across disciplines.

The teaching, learning, and assessment center where he works has not established rigid rules but has proposed some "principles" for AI use, such as avoiding a one-size-fits-all approach.

The New York Times interviewed dozens of professors whose students had flagged their use of AI online.

These professors said they used ChatGPT to design computer science programming assignments or create quizzes about required reading.

They also use it to organize their feedback to students, or to soften its tone.

They state that as domain experts, they can tell when ChatGPT is "talking nonsense" or getting facts wrong.

However, opinions differ on how to correctly use AI.

Some professors admit to using ChatGPT to help grade assignments, while others strongly oppose this approach. Some emphasize transparency when using AI with students, while others do not disclose its use due to students' concerns about the technology.

Even so, most of them felt that what Stapleton encountered at Northeastern, a professor using AI to generate course materials and slide decks, was perfectly acceptable.

Dr. Shovlin agrees, on the condition that professors edit and vet the AI-generated content so that it reflects their own expertise.

He compares this to long-standing practices in academia, such as using teaching plans or case studies from third-party publishers.

He says it's ridiculous to treat a professor as a "monster" just because they use AI to create PowerPoints.

"In my view, that's absurd."

An Enhanced Calculator

Shingirai Christopher Kwaramba, a business professor at Virginia Commonwealth University, describes ChatGPT as a time-saving partner.

He says lesson plans that previously took days can now be completed in just a few hours.

For example, he uses it to generate datasets for fictional chain stores, which students use in practice to understand various statistical concepts.
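For readers curious what such a prompt-generated practice dataset might look like, here is a minimal Python sketch of the general idea; the store names, sample sizes, and sales figures below are illustrative assumptions, not Dr. Kwaramba's actual course materials.

```python
# Illustrative sketch only: a synthetic dataset for a fictional chain store,
# of the kind a professor might ask ChatGPT to produce for statistics practice.
# All names and parameters here are made up for demonstration.
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=42)

stores = ["Downtown", "Airport", "Suburban Mall", "University Ave"]
n_weeks = 52

data = pd.DataFrame({
    "store": np.repeat(stores, n_weeks),
    "week": np.tile(np.arange(1, n_weeks + 1), len(stores)),
    # Weekly sales drawn from a normal distribution, one mean per store.
    "weekly_sales": np.concatenate([
        rng.normal(loc=mean, scale=1500, size=n_weeks)
        for mean in (22000, 31000, 27000, 18000)
    ]).round(2),
})

# Summary statistics students might compute: mean, std, and a 95% CI per store.
summary = data.groupby("store")["weekly_sales"].agg(["mean", "std", "count"])
summary["ci95_half_width"] = 1.96 * summary["std"] / np.sqrt(summary["count"])
print(summary)
```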

"I see it as an era of a 'super calculator'," Dr. Kwaramba says.

He says he now has more time for student consultations.

Other professors, like Harvard's David Malan, say that AI has meant fewer students coming to office hours for help.

Dr. Malan, a computer science professor, has integrated a custom AI chatbot into his introductory computer programming course.

Hundreds of his students can seek help from this AI for coding assignments.

Dr. Malan has had to keep adjusting the chatbot to refine how it teaches, ensuring that it offers guidance rather than complete answers.
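The report doesn't say how the chatbot is constrained, but one common way to get this "guidance, not answers" behavior from a general-purpose model is a restrictive system prompt. The sketch below shows that generic pattern using the OpenAI Python SDK; the model name and the prompt wording are assumptions for illustration, not details of Dr. Malan's actual course bot.

```python
# A minimal sketch of a "hints only" tutoring bot built on a restrictive
# system prompt. This is a generic pattern, not the actual course chatbot;
# the model name and prompt wording are assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You are a teaching assistant for an introductory programming course. "
    "Guide the student toward the answer with questions, hints, and pointers "
    "to relevant concepts. Never write complete solutions or full functions, "
    "even if the student asks for them directly."
)

def ask_tutor(student_question: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; any chat model would do here
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": student_question},
        ],
    )
    return response.choices[0].message.content

print(ask_tutor("My loop never terminates. What am I doing wrong?"))
```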

In 2023, the first year of the chatbot's launch, most of the 500 surveyed students found it helpful.

Dr. Malan says he and his teaching assistants no longer need to answer "tedious questions about introductory materials" during office hours, and can instead focus on weekly lunch meetings and hackathons that create "more memorable moments and experiences".

Katy Pearce, a communication professor at the University of Washington, developed a custom AI chatbot trained on her previously graded assignments.

Now, this AI can provide feedback in her personal style at any time, day or night.

She says this is especially beneficial for students who are not typically proactive in seeking help.

"In the foreseeable future, will most of the work done by graduate teaching assistants be replaced by AI?" she says, "Absolutely."

So, what will happen to the career path from teaching assistant to future professor?

"This will definitely become an issue," Dr. Pearce says.

Harvard campus in Cambridge, Massachusetts

This article is from the WeChat official account "New Intelligence", author: New Intelligence, published with authorization from 36Kr.
