Education News Canada

YORK UNIVERSITY
How will AI tools such as ChatGPT shape teaching and learning?

January 30, 2023

By Angela Ward  

ChatGPT, an artificial intelligence (AI) tool that has dominated the headlines of late, has been labelled as a transformational force in academia. How are York faculty harnessing this powerful tool?

Of all the emerging AI tools, ChatGPT has drawn the most attention lately. The chatbot, developed by OpenAI, interacts with users in dialogue, answering their prompts with detailed responses. Despite the uncertainty that comes with this new technology, it offers exciting possibilities for education.  


Angela Clark

"We always knew this was on the horizon in the academic integrity world," said Angela Clark, an academic integrity officer in the Office of the Vice-Provost Academic. "This is a new generation of AI tools that represents a big leap from the AI tools in existence prior to OpenAI releasing DALL-E and ChatGPT last November. We're still in the beginning stages of learning about what these tools can do and their uses in education."  

Robin Sutherland-Harris, educational developer in the Teaching Commons (TC) at York University, adds that the use of these tools is already a reality in the working world. "As educators, it makes sense to adapt to these AI tools because students will graduate into this world with AI as part of the landscape. We should think about how we can incorporate them into assessments, strategies and ways that we think about disciplinary writing within academia. AI tools are going to change all of these.  

"I think the process of writing academically will probably shift with the integration of these AI tools into existing software, especially with predictive text completion." 

Sutherland-Harris is interested in how tools like ChatGPT could help educators reflect on the nature of disciplinary writing and assessments. She said, "I'm excited by the possibilities for thinking about disciplinary writing, thinking about what kind of assessments are robust and how we're asking students to do what AI is not able to do, such as taking multiple sources in combination and analyzing them for new conclusions.  

"My understanding is that ChatGPT is good at comparing one thing at a time but less adept at using deeper evidence to construct new arguments. How are we building assessments that addresses this, instead of getting bogged down in the mechanics of the writing?" 

ChatGPT can also encourage critical thinking when it comes to fact-checking content in classroom activities since it's not always accurate with answers and citations. Sutherland-Harris said, "It's helpful to start with an example text, which can be used to workshop ideas or interrogate what the AI is getting right in terms of a specific thinker, period of history or analysis and ask where the AI is falling short. It gives a useful starting point to push conversations into quite a deep level to really engage with content and discuss how writing should conform to the norms of the discipline, such as English or history."  


Robin Sutherland-Harris

Although there are exciting opportunities with AI tools, there are also challenges and concerns within this new terrain. ChatGPT can produce AI-generated essays, programming code and math solutions, which raises concerns about academic integrity.  

"When it comes to academic integrity, there will be some upheaval. It will be challenging as we all adapt to ChatGPT and come up with ways to integrate it into learning. In the short term, there may be more suspicion that students have engaged in academic misconduct. This may lead to more security measures being put in place such as having students write assessments by hand in the classroom or changing assessments from written work to oral. This may happen in order to be cognizant of professors' time, so they don't have to scramble to completely restructure how they assess," said Clark.

In response to ChatGPT, some educators are already changing how they approach assessments and what they're planning for the semester, said Sutherland-Harris. They are also searching for strategies around course-level policy that can protect academic integrity. Both Sutherland-Harris and Clark agree that this creates an opportunity for open discussion in classrooms, where educators might speak with students in-depth about the ideas they're presenting or develop a charter with students on academic integrity and the use of AI tools.  

This open communication "engenders trust amongst the learning community," Clark said. "Given that there are currently no citation guidelines for the content that these tools produce, instructors might even ask students how they think material should be cited." 

When it comes to ensuring academic honesty, some educators will encourage transparency from students when they submit written work, Sutherland-Harris said. "Professors might ask students if they've used any AI or assistive writing technology. What was it and how did you use it? Students might use it to create an outline or draft an introduction before rewriting it. The use of AI for some educators is already being normalized as part of the writing process."

Reflecting on citations, she notes that there is a gap when it comes to the norms around citing AI. "How do we cite and recognize the use of assistive technologies in the same way we cite other people's ideas?" 

In terms of what tools like ChatGPT mean for the future, Clark said, "We now have ChatGPT 3.5, which has been shown to make mistakes at times. It can't really synthesize information from different sources or show evidence of critical thinking, and it makes errors when asked to generate programming code or solve math problems. As such, there are ways to detect when it has been used and, in the short term, we can maintain our current practices. But GPT-4 will be released soon and who knows what that will bring? It also keeps improving as more people use it, prompting it to 'learn' and evolve."  

"I think the ways that people think about, and structure assessment are already changing and will continue to change," Sutherland-Harris added. "I wonder about the implications that AI which is good at writing will have on scholarly writing over time, which will affect how we educate students." 

An upcoming workshop in February, facilitated by Sutherland-Harris and Clark, will give professors the opportunity to connect on this topic. Instructors who are interested can register here. Additional academic integrity resources for instructors and students are in development to help provide more clarity on these tools and their use.   

This story was originally featured in YFile, York University's community newsletter.

For more information

York University
4700 Keele Street
Toronto, Ontario
Canada M3J 1P3
www.yorku.ca