Search the Community
Showing results for tags 'taught'.
https://sg.style.yahoo.com/quit-teaching-because-chatgpt-173713528.html

I Quit Teaching Because of ChatGPT

This fall is the first in nearly 20 years that I am not returning to the classroom. For most of my career, I taught writing, literature, and language, primarily to university students. I quit, in large part, because of large language models (LLMs) like ChatGPT.

Virtually all experienced scholars know that writing, as historian Lynn Hunt has argued, is “not the transcription of thoughts already consciously present in [the writer’s] mind.” Rather, writing is a process closely tied to thinking. In graduate school, I spent months trying to fit pieces of my dissertation together in my mind and eventually found I could solve the puzzle only through writing. Writing is hard work. It is sometimes frightening. With the easy temptation of AI, many—possibly most—of my students were no longer willing to push through discomfort.

In my most recent job, I taught academic writing to doctoral students at a technical college. My graduate students, many of whom were computer scientists, understood the mechanisms of generative AI better than I do. They recognized LLMs as unreliable research tools that hallucinate and invent citations. They acknowledged the environmental impact and ethical problems of the technology. They knew that models are trained on existing data and therefore cannot produce novel research. However, that knowledge did not stop my students from relying heavily on generative AI. Several students admitted to drafting their research in note form and asking ChatGPT to write their articles.

As an experienced teacher, I am familiar with pedagogical best practices. I scaffolded assignments. I researched ways to incorporate generative AI in my lesson plans, and I designed activities to draw attention to its limitations. I reminded students that ChatGPT may alter the meaning of a text when prompted to revise, that it can yield biased and inaccurate information, that it does not generate stylistically strong writing and, for those grade-oriented students, that it does not result in A-level work. It did not matter. The students still used it.

In one activity, my students drafted a paragraph in class, fed their work to ChatGPT with a revision prompt, and then compared the output with their original writing. However, these types of comparative analyses failed because most of my students were not developed enough as writers to analyze the subtleties of meaning or evaluate style. “It makes my writing look fancy,” one PhD student protested when I pointed to weaknesses in AI-revised text.

My students also relied heavily on AI-powered paraphrasing tools such as Quillbot. Paraphrasing well, like drafting original research, is a process of deepening understanding. Recent high-profile examples of “duplicative language” are a reminder that paraphrasing is hard work. It is not surprising, then, that many students are tempted by AI-powered paraphrasing tools. These technologies, however, often result in inconsistent writing style, do not always help students avoid plagiarism, and allow the writer to gloss over understanding. Online paraphrasing tools are useful only when students have already developed a deep knowledge of the craft of writing.

Students who outsource their writing to AI lose an opportunity to think more deeply about their research. In a recent article on art and generative AI, author Ted Chiang put it this way: “Using ChatGPT to complete assignments is like bringing a forklift into the weight room; you will never improve your cognitive fitness that way.” Chiang also notes that the hundreds of small choices we make as writers are just as important as the initial conception. Chiang is a writer of fiction, but the logic applies equally to scholarly writing. Decisions regarding syntax, vocabulary, and other elements of style imbue a text with meaning nearly as much as the underlying research.

Generative AI is, in some ways, a democratizing tool. Many of my students were non-native speakers of English. Their writing frequently contained grammatical errors. Generative AI is effective at correcting grammar. However, the technology often changes vocabulary and alters meaning even when the only prompt is “fix the grammar.” My students lacked the skills to identify and correct subtle shifts in meaning. I could not convince them of the need for stylistic consistency or the need to develop voices as research writers.

The problem was not recognizing AI-generated or AI-revised text. At the start of every semester, I had students write in class. With that baseline sample as a point of comparison, it was easy for me to distinguish between my students’ writing and text generated by ChatGPT. I am also familiar with AI detectors, which purport to indicate whether something has been generated by AI. These detectors, however, are faulty. AI-assisted writing is easy to identify but hard to prove.

As a result, I found myself spending many hours grading writing that I knew was generated by AI. I noted where arguments were unsound. I pointed to weaknesses such as stylistic quirks that I knew to be common to ChatGPT (I noticed a sudden surge of phrases such as “delves into”). That is, I found myself spending more time giving feedback to AI than to my students. So I quit.

The best educators will adapt to AI. In some ways, the changes will be positive. Teachers must move away from mechanical activities or assigning simple summaries. They will find ways to encourage students to think critically and learn that writing is a way of generating ideas, revealing contradictions, and clarifying methodologies. However, those lessons require that students be willing to sit with the temporary discomfort of not knowing. Students must learn to move forward with faith in their own cognitive abilities as they write and revise their way into clarity. With few exceptions, my students were not willing to enter those uncomfortable spaces or remain there long enough to discover the revelatory power of writing.
Very interesting article. https://jaxenter.com/learn-to-code-its-harder-than-you-think-122738.html I totally agree with it. I joined the computing club when I was in Secondary 1. Found it too tough; I just couldn't do even simple programming. My brain just isn't wired correctly for this. My wife did computer engineering. By her own admission, she's at best an average programmer. But even then I think her logical processing, at least in terms of computing, is way beyond mine. So good coders/programmers are still in very short supply, and it's a skill like learning surgery or being a pilot.
36 replies
Tagged with: education, programming (and 4 more)
I learnt this the hard way. Packaging of work effort is very important. It's not enough being an excellent worker... it's more important how you package it. I wasted much of my youth (20s to 30 years old) thinking that so long as I did a good job, I would be recognized. It couldn't be further from reality. What I learnt and apply constantly (with some level of success) is that if you are doing a piece of work, you need to show:

1. What is the business rationale behind your work. What problems are you solving and why are they so important. In short, I need to identify business stakeholders. I don't waste my time on work if there are no clear business stakeholders, or no business stakeholders that matter.

2. What are the various steps you take. Always talk about what your strategy is and what tactics were taken. Never confuse strategy with tactics. Tactics can fail... but strategy should not. Who comes up with the strategy? Me. Who executes the tactics? My directs or vendors. You get the drift, I hope.

3. What is the positive impact to the business as a result of the steps you undertook.

And I make sure that, at the end of the day, my direct boss looks very good as he/she presents it to the higher-ups. I heard some of the younger folks complaining about how their managers only know how to talk. Well guys, it's a very important skill if you wanna move up the ladder. I wish someone had told me this when I was younger. Ok... just a Friday rant.
Not talking about passing your driving test, but how to survive and not be a nuisance on the roads, in real life. In my time, life was simple and the roads were half deserted, so I have no complaints about the lessons he taught me... but what about today?
170 replies
Tagged with: think, instructor (and 2 more)
Hi, just curious to find out from you guys: have you wondered why the centre teaches parking using the black and white guide poles? I understand it was the trend way back, but now it seems like it's not really the most effective method. They should fix up some fake car bumpers or whatever to simulate actual scenarios, i.e. vertical or parallel parking in between other vehicles. I believe most of us who passed from the centres had to spend some time orientating ourselves to park in car parks or parallel lots without the guide poles.

I guess they do that to squeeze more money from the trainees, as when we pass the test, the sales personnel approach us with the freshie driver orientation course that teaches us parking without poles and driving up multi-storey carparks, and the course costs another couple of bucks to register. Shouldn't this be included in the lessons?

BTW, anyone have any comments to share, or any effective methods for parking without the aid of guide poles? Cheerios