
Don’t want students to trust ChatGPT? Make them use it



When I first caught students trying to use ChatGPT to write their essays, it felt like an inevitability. My initial reaction was frustration and irritation, not to mention pessimism and doom about the slow collapse of higher education, and I suspect most educators feel the same way. But as I thought about how to respond, I realized there might be a teaching opportunity. Many of these essays used sources incorrectly, either citing books that did not exist or misrepresenting ones that did. When students started using ChatGPT, they seemed to have no idea what could go wrong.

I decided that every student in my religious studies class at Elon University would use ChatGPT to generate an essay based on a prompt I gave them and then grade it. I had anticipated that many of the essays would contain errors, but I did not expect all of them to. Many students expressed shock and dismay upon learning that the AI could fabricate false information, including the page numbers of nonexistent books and articles. Some were confused, amazed, and disappointed all at once. Others raised concerns about how over-reliance on such technology could induce laziness or encourage misinformation and fake news. Closer to the bone were fears that this technology could take people’s jobs. Students were alarmed that major tech companies had pushed out AI technology without ensuring that the general population understood its drawbacks. The assignment met my goal, which was to teach them that ChatGPT is neither a functional search engine nor a foolproof writing tool.

Other educators tell me they have tried similar exercises. One teacher had students write essays and then compare them to one that ChatGPT wrote on the same topic. Another produced a single ChatGPT essay that every student graded. Future versions of this assignment could focus on learning how to prompt the AI, to tell it more precisely what to do or not do. Educators could also have students compare ChatGPT to other chatbots, such as Bard. Teachers could test ChatGPT by asking it for a specific argument, requiring it to use at least three sources with citations and a bibliography, and then showing the results to the class. The prompt could be tailored to the content of each class so that students are more likely to catch any errors.

When I tweeted about this assignment, some of the more zealous AI supporters were upset that I didn’t mandate the use of GPT-4 or teach students how to use plugins or refine their prompts, which (supposedly) would have given them better, more accurate essays to evaluate. But this misses the point of the assignment. Students, and the general population, don’t use ChatGPT in these sophisticated ways because they don’t know such options exist. The AI community seems unaware of how little information about this technology’s flaws and inaccuracies, as well as its strengths, has made its way into the open. Perhaps AI literacy can later be extended with assignments that incorporate these strategies, but we must start from the absolute baseline. By demystifying the technology, educators can reveal the fallible Wizard of Oz behind the curtain.

Students and educators alike seem to have internalized the oppressive idea that human beings are substandard, relatively unproductive machines, and that superior ones, perhaps AI, will supplant us with their perfect precision and round-the-clock work ethic. Showing my students just how flawed ChatGPT is helped restore their confidence in their own minds and abilities. No chatbot, even a completely trustworthy one, can take away my students’ understanding of their value as people. Ironically, I think bringing AI into the classroom reinforced this for them in a way they hadn’t grasped before.

My hope is that having my students grade essays generated by ChatGPT will inoculate them against over-reliance on generative AI and increase their immunity to misinformation. One student has since told me that she tried to discourage a classmate from using AI for her homework after learning of its propensity for confabulation. Perhaps teaching with and about AI can really help educators do their job, which is to enlighten the minds of young people, help them work out who they are and what it means to be human, and ground them as they face the challenge of a future in flux.


WIRED Opinion publishes articles by outside contributors representing a wide range of viewpoints. Submit an op-ed at ideas@wired.com.



