Hacker News

The definition of "cheating" will change... because those students are, in fact, learning.

They are learning how to accomplish a lot more, a lot faster, with AI.

That's a valuable skill -- one of the most valuable skills that students can learn today.

It's not the only valuable skill they should learn, but it's certainly one of the most valuable.

Educators will have to adapt.



Wishful thinking if you read the article. Most of these kids aren't learning anything about how to use AI. They aren't a Simon Willison or Janus or anything like that. They're not learning about confabulations, they're not learning about scaling laws or tokenization, they're not learning to use the API, they're not running multiple models to get a feel for how they think differently. They're doing none of the things you wishfully imagine, because they're cheating to minimize effort so they can go do something else, like watch TikTok. (Even Lee, who is by far the most successful and sophisticated user profiled in OP, is just dumping out stuff as fast as he can to try to make a buck.)

They are copy-pasting the assignment from Blackboard to ChatGPT and vice-versa. As the article points out, they usually aren't even reading the outputs to check that the essay doesn't suddenly start bloviating about Aristotle.

And no one needs to pay a college wage premium, or any wage, for that:

> He worries about the long-term consequences of passively allowing 18-year-olds to decide whether to actively engage with their assignments. Would it accelerate the widening soft-skills gap in the workplace? If students rely on AI for their education, what skills would they even bring to the workplace? Lakshya Jain, a computer-science lecturer at the University of California, Berkeley, has been using those questions in an attempt to reason with his students. “If you’re handing in AI work,” he tells them, “you’re not actually anything different than a human assistant to an artificial-intelligence engine, and that makes you very easily replaceable. Why would anyone keep you around?” That’s not theoretical: The COO of a tech research firm recently asked Jain why he needed programmers any longer.


I read the article. Those kids are definitely learning how to think, just not in the way educators wish. One of the girls quoted said she loves writing, but finds it helpful to get structure and initial content from generative AI. The kid at Columbia has already launched two startups. Clever kids always figure out how to leverage the tools they have at hand.

With the advent of new technology, humanity has no choice but to move up the value chain -- in this case, from executing tasks to judging, supervising, and curating AI work.


BTW, this is how it's going with the 'clever kids' and hiring for their cheating startup: https://x.com/im_roy_lee/status/1920206362996343261

    it's really fucking hard to find competent engineers. 

    i've interviewed about 20 cs majors who claim to be experts at typescript and only 3 of them have been able to explain what the purpose of useState is (not trolling).
(There are at least 2 ways this tweet is embarrassing.)
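(For readers outside React land: `useState` is React's basic hook for holding per-component state; it returns the current value and a setter that updates it and re-renders the component. A minimal, non-React sketch of the idea, where `makeState` is an illustrative stand-in and not React's actual implementation:)

```typescript
// Sketch of the useState idea: a value paired with a setter that replaces it.
// (makeState is hypothetical; real React hooks also trigger a re-render.)
function makeState<T>(initial: T): [() => T, (next: T) => void] {
  let value = initial;
  const get = () => value;            // read the current state
  const set = (next: T) => { value = next; };  // replace the state
  return [get, set];
}

const [getCount, setCount] = makeState(0);
setCount(getCount() + 1);
console.log(getCount()); // 1
```

In real React the getter is just the value itself, captured fresh on each render, which is exactly the kind of detail the tweet expects a "TypeScript expert" to be able to explain.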


> One of the girls quoted said she loves writing, but finds it helpful to get structure and initial content from generative AI.

Which shows she's not learning from the AI or even improving on it. All she's doing is filling out the 'structure and initial content' - oh, is that all???

> The kid at Columbia has already launched two startups.

Which doesn't require learning anything much from the LLMs, as he did that on his own before them (note the Leetcode). Also, considering that the second one is fraudulent and, if it works, exists largely to abet fraud, this is not a ringing endorsement.

> With the advent of new technology, humanity has no choice but to move up the value chain -- in this case, from executing tasks to judging, supervising, and curating AI work.

ChatGPT couldn't have written that ending any better. (I will charitably assume it did not, unless you took the trouble to replace that em-dash with a double hyphen just to be cute.) It could have written it about 1000x cheaper, however.


I'm not sure there's a choice. The economic incentives are too strong. Like it or not, those individuals who figure out (i.e., learn) how to leverage AI to do more, faster, and (eventually) better will have an advantage over those individuals who don't. Many kids intuitively seem to understand this.

Believe it or not, I write all my comments the old-fashioned way -- without the help of any AI!


there is no choice. my kid uses about 11 different AI tools and she is about to turn 12. it is on parents to guide this but any kid who is not all over this at a deep level will be at an insurmountable disadvantage


Maybe they're "learning" how to use AI [1], but in at least some cases they're not learning what they're supposed to be learning from the class they're enrolled in [2]. The point of the assignments was never to produce the answers as fast as possible by any means necessary [3], but to build strong foundations so that you can later do much more challenging/useful things.

If your assignment is "program X in assembly," and you just copy/paste the prompt into ChatGPT, you aren't learning assembly, you aren't learning important concepts relevant for low-level performance optimization, etc. etc. That was the point, not to produce X.

What would be useful is specific classes with assignments that are designed to be worked on with the assistance of AI. "AI-enabled SWE" etc. Lessons on how to effectively utilize it for research and learning. But those things aren't replacing learning the fundamentals, and probably can't unless AGI is achieved and actual knowledge is just economically irrelevant.

[1]: For a lot of undergrad assignments you can probably just copy/paste the description into ChatGPT, which I don't think qualifies as being difficult enough to merit "learning" how to do.

[2]: I guess this is "fine" if you're in the "AI will imminently replace all white-collar work" camp, but then it's just dumb to be enrolled in college in the first place, unless maybe you're going for free.

[3]: In some sense, fully outsourcing your thinking to AI is the same as just paying someone to do your homework, which has always been possible but less accessible.


Yes, prompting an LLM is a kind of skill. But relying on it has been shown to hinder development of the most important skill of all--critical, independent thinking. And ultimately that's the only "accomplishment" that matters when it comes to education. Completed homework assignments are not a product, they're a process for making someone think, and that's what's being avoided with LLM usage.


Right, until dependency on the tools makes them unable to solve problems that the tools can't solve for them.


Educators are already adapting. I teach CS and we do verbal assessments of students on their code. According to my students, a certain percentage slip through, but all in all, it's nowhere near the problem that this article makes it out to be (at my college).


Now what will said cheaters do when they don't have access to AI? Like if the internet goes down, or if they work in an environment where they can't access it? Or if the company says no, because it's a security risk?


“You won’t always have a calculator” never convinced kids in school who didn’t want to learn math.


Wait until the internet comes back up? Work at a different company?


'Sorry boss, I can't work because I don't have internet and don't have access to my AI crutch' isn't exactly a stellar argument. That's someone I would immediately regret hiring.


'Sorry boss, I can't work because I don't have internet and don't have access to GitHub/CI/license server/email/any of our SaaS tools/any of my files' happens every time the internet goes down, and bosses are quite understanding.


If your internet is going down regularly enough for this to be an issue you have bigger problems.


If you have an architecture office and the power goes out, do you have your workers give up on AutoCAD and go to the drafting boards and slide rules till the power comes back up? What's the point of going backwards with technology? You'll be outcompeted by those who embrace modern technology.

I'm not aware of any modern engineering profession that doesn't use electronic crutches regularly. Nobody keeps absolutely everything in their head. We're paid for critical thinking, not for having everything memorized just in case the internet is out. Even before the internet, SW devs had stacks of books on their desks they'd refer to often.

Remember when IBM wanted to pay devs per line of code written? Yeah, exactly. The skill is problem solving, not writing code.


What if your pencils break? Won’t hire a dev unless they can manufacture their own writing utensils.


Wait til they argue it’s a mandated disability accommodation.


Maybe some college degrees could have an endorsement at the bottom: "Earned without AI."


For some definition of ‘accomplish’ that has nothing to do with what studying is for.


why is it always the high karma users with the... weirdest takes?



