TRUTH BE TOLD
WHO IS DOING YOUR CLASS ASSIGNMENTS?
By DR VICKI BISMILLA
Several years ago, while I was a college Academic Vice-President, our teaching teams were starting to grapple with the issue of trust when it came to marking assigned essays.
At that time, the temptations for time-crunched students to harvest paragraphs came in the form of online research papers, legitimate peer-reviewed journal articles, library books and the much-maligned Wikipedia.
Occasionally, students handed in essays with sections lifted from published works and no citation of the borrowed pieces, which amounted to plagiarism. Our teams caught these infractions either through their own expertise and trained eyes or with tools such as Turnitin, which highlighted the plagiarized sections of an essay. In such cases the student was given a zero.
Fast-forward a decade to today, and the mind boggles at the way artificial intelligence has completely upended teacher-student trust when it comes to assigned essays.
ChatGPT, for example, is an AI tool that can now write about almost anything you instruct it to write. You could ask it, “How does fossil fuel impact the world?” and it will spit out the pros and cons in paragraph form in a matter of minutes. The answers, though stilted, can often be satisfactory. Scholars such as Brady Lund and Ting Wang, writing in Library Hi Tech News, call it a sophisticated chatbot as they interview it on its potential impact on academia, libraries, and ethics. It is interesting that the answers the chatbot gave these researchers about its own ethics actually sounded like a robot.
While its information is limited by its training data, the chatbot itself cautions about the possibility of personal security breaches, especially when it is used to generate answers in sensitive areas such as health and medicine.
Other researchers, such as Debby Cotton, Peter Cotton and Reuben Shipway in their paper Chatting and Cheating: Ensuring Academic Integrity in the Era of ChatGPT, maintain that ChatGPT can actually increase student engagement, but that educational institutions need to develop specific policies and procedures to deal with it. The tool is touted especially for tasks such as language translation, yet caution is warranted because it also has the potential to create fake news that sways public opinion. Student plagiarism on essays, though, remains a significant concern. When students use ChatGPT to research an assigned topic, it draws on the vast body of online material it was trained on, processes it and harvests paragraphs on behalf of the student doing the asking. So, when a student hands that in as an essay, what is the teacher to do?
The best response to this dilemma that I have read came from Anil Verma, professor emeritus at the University of Toronto’s Rotman School of Management. He maintains, sagely and calmly, and I quote from the Toronto Star:
“We have seen waves of new technology since the invention of the steam engine. In the wake of every wave we have adapted, I would argue, rather successfully. Sure, new technologies transform legacy jobs but in the final outcome, they serve as tools to serve humanity.”
He does, however, understand our dilemma as teachers concerned about the negative sides of this AI tool. He says he would tell his students in the very first class of the semester that if they use ChatGPT they should master the technology, because they will need that skill on their career journeys. They must learn the tool’s advantages and limitations. For example, while it can scan a huge amount of online material (scholarly and mundane) and produce grammatically acceptable paragraphs, it does not cite the materials it has used. Its output can also be garbled, because it can jump from one source presenting one argument to another presenting a counter-opinion without explaining the shift. So students cannot simply hand in what ChatGPT gives them. They need to read, understand, separate the pieces and the conflicting opinions, and explain them. They would need to state upfront that they have used ChatGPT, and they need to track down the sources and cite them.
Furthermore, Professor Verma would require his students to give him the exact prompt they gave ChatGPT, so that he could determine whether they actually understood the class discussions and the shape they hoped their essays would take. He would expect to see the students’ own opinions and thoughts about the topic and about the materials they used. He would also require them to follow essay-writing conventions: proper introductions, systematically developed arguments, stated results and logically reached conclusions. Their essays would need to contain evidence, threaded throughout, that leads to the conclusions they reach. In other words, he expects his students to be ethical, rigorous, and transparent.
This seems like a sensible approach to a technology that is here to stay.