Artificial intelligence (AI) has been a buzzword in the tech industry for quite some time now, and its impact on education is undeniable. With the rise of chatbots, virtual assistants, and language models, education is no longer limited to traditional teaching methods. ChatGPT, a large language model developed by OpenAI, is one example of how AI is revolutionizing education in academic settings.
ChatGPT is a state-of-the-art language model that can answer questions, generate text, and write essays. It has been trained on massive amounts of data, allowing it to interpret natural language much as a human would. This ability sets ChatGPT apart from traditional chatbots and virtual assistants: unlike chatbots that rely on pre-defined scripts, ChatGPT can understand and respond to almost any question or query.
In academic settings, ChatGPT can be a game-changer. It can provide students with quick and mostly accurate answers to their questions and can even help them with their homework or assignments. Imagine asking ChatGPT a question about a complex topic in engineering or mathematics and getting an instant, clear, concise response. This can save students the time and effort of searching for the correct answer or puzzling out a difficult concept.
Furthermore, ChatGPT can also help teachers and professors. It can assist them in creating content for their lectures or writing research papers. ChatGPT can generate coherent and grammatically correct text, which can be a boon for academics struggling to find the right words to express their ideas.
Despite gaining immense popularity in recent years, with many individuals and organizations turning to it for various tasks, ChatGPT's use in educational settings has raised some eyebrows among university academics. Some thought leaders in higher education have serious reservations, saying the use of AI-generated content in higher education raises difficult questions about the nature of plagiarism.
In a recent interview with the Greek Reporter, Noam Chomsky, a US linguist, author, and well-known public intellectual, did not candy-coat his disdain for AI-generated content being allowed in higher education settings.
“ChatGPT is high-tech plagiarism; it undermines education. For years there have been programs that have helped professors detect plagiarized essays. Now it’s going to be more difficult because it’s easier to plagiarize,” said Chomsky. He dismissed ChatGPT as a way for students to avoid learning, saying they would learn “absolutely nothing.”
Other voices in higher education have also raised concerns. Philosophy professor James Stacey Taylor of the College of New Jersey said he caught 14 of his 163 students using ChatGPT. Some professors have also noted a rise in the use of the AI chatbot for essays and assignments.
But what about engineering education in the era of ChatGPT, especially when considering that one of the most significant drawbacks of using the chatbot in academic settings is its potential to produce unreliable or inaccurate responses?
Indumathi V, the Deputy Dean of the Engineering Institute of Technology (EIT), said ChatGPT may not fully understand the context of certain engineering concepts, discipline-specific terminology, or technical jargon, which could lead to incorrect or incomplete responses.
“It also relies on existing data and information to generate responses, which means it may struggle to provide creative or original solutions to complex engineering problems. Being a language model, it cannot perform hands-on tasks or experiments, potentially limiting its ability to provide practical advice on engineering problems or projects,” she pointed out.
As it is, the chatbot’s ability to generate responses depends on the quality and quantity of the data it has been trained on. If that data is unbalanced or insufficient, the machine can produce equally flawed or biased responses. This can be particularly problematic in academic settings, where accuracy and reliability are paramount.
“It is also primarily designed for text-based input. It may struggle with non-textual inputs such as diagrams, images, or videos, which are often used in engineering courses,” said the Deputy Dean.
Another disadvantage of using ChatGPT in academic settings is its inability to consider the context of the question or the answer. ChatGPT’s algorithms are based solely on patterns and probabilities, and they cannot understand the nuances of language, culture, or even subject matter. As a result, the machine may provide contextually inappropriate or misleading responses, leading to confusion and misunderstanding on the user’s part.
Using ChatGPT in academic settings can also discourage critical thinking and independent learning. While the machine can provide answers to specific questions, it cannot guide students through the thought process required to arrive at those answers. As such, students may become overly reliant on ChatGPT and fail to develop the critical thinking skills essential for academic success and professional growth.
On whether overreliance on technology like ChatGPT could decrease students’ critical thinking skills and problem-solving abilities, Indumathi V said, “Absolutely!” She added that when students rely too heavily on AI language models to provide answers and solutions, they may not develop the skills to think critically, analyze information, and solve problems independently.
“For example, if a student uses ChatGPT to generate an answer to a homework problem without fully understanding the underlying concepts, they may struggle to apply those concepts to new and different problems.
“Similarly, suppose a student relies on ChatGPT to provide them with solutions to complex problems. In that case, they may miss the opportunity to develop their problem-solving skills and strategies. Students miss out on the opportunity to fail and learn from mistakes, which sometimes is the best way to learn!”
Another potential disadvantage of using ChatGPT in academic settings is its lack of emotional intelligence. ChatGPT’s algorithms are designed to generate responses based on logic and probability, and they cannot understand emotions or empathize with individuals.
This can be particularly problematic in academic settings, where students may need emotional support or guidance from their teachers or peers. The machine’s inability to provide emotional intelligence can create a sterile and uninviting learning environment, ultimately hindering student engagement and success.
For now, though, there appears to be no definitive verdict on using ChatGPT in educational settings. Not everyone is as concerned as academics such as Noam Chomsky. Some lecturers have welcomed the development of AI as an opportunity to introduce a new teaching tool, while others think the quality of AI-generated content is not yet good enough to pose a serious threat to academia.
And when it comes to students who have used the chatbot, it is an equally mixed bag of opinions and experiences, as the New York Times discovered from a mini survey it conducted. For the survey, the publication invited teenagers to say how schools should respond to ChatGPT.
Many respondents concluded that the chatbot was a powerful, if sometimes unreliable, tool. Some expressed concern that the AI tool would deprive them of their creativity, critical thinking, and motivation, while others felt it would lead to rampant cheating.
Jonathan, a student at PACE High School in Texas, said, “I believe that using chatbots and AI in school is dangerous for motivation and knowledge. Why write if a bot does it for me? Why learn when a bot does it better? I find this similar to the lack of motivation faced in math classes worldwide when the portable calculator was invented, and it is plausible that the same can happen in English classes if this AI is used.”
Conversely, though, several teenagers argued that AI is the future, that schools should embrace it rather than restrict it, and that the alarm was an overreaction. “Everyone needs to chill out!” one student told the New York Times. “ChatGPT is certainly not the end of the world, nor the eradication of writing as a whole.”
EIT’s Deputy Dean agrees that, as an AI language model, ChatGPT can provide students with quick access to information and help them understand complex concepts more quickly. She said it could also help them develop their language skills and articulate their thoughts more clearly.
“It has the ability to level the playing field for international students from diverse backgrounds with English as their second language. However, students must use ChatGPT responsibly and ethically. They should use the tool to supplement their learning and understanding rather than rely on it exclusively.”
This, noted Indumathi V, means using ChatGPT to help them answer specific questions or provide additional information about a topic while still taking the time to process and analyze the information provided by the tool.
In conclusion, while ChatGPT is a powerful tool with many benefits, its use in academic settings can also have significant drawbacks. These disadvantages range from unreliable or inaccurate responses to ethical concerns around data privacy and ownership.
As such, educators and institutions must carefully weigh the benefits and drawbacks of using ChatGPT in academic settings before adopting it as a teaching tool. Ultimately, the goal should be to create an environment that fosters critical thinking, independent learning, and student success while utilizing technology responsibly and ethically.
NB: Readers should know that the last two paragraphs of this article (the conclusion) were written by ChatGPT when asked to “list the disadvantages of using ChatGPT in academic settings.”