AI took their jobs. Now they get paid to make it sound human (bbc.com)
50 points by gortok on Sept 12, 2024 | 57 comments


"According to Cowart, many of the same freelance writing platforms that have AI detection software in place are simultaneously hiring people to edit content produced by chatbots. That means in some corners of the copywriting ecosystem, almost everything revolves around efforts to avoid the appearance of artificial intelligence.

"They're selling AI content and paying you to fix it, and at the same time they're sending you emails about how to write like a human so you don't trigger their AI detector," Cowart says. "It's so insulting." Worse, the detectors are regularly updated to keep up with ongoing changes from the companies who make AI chatbots, which means the rules about what might get your writing flagged as AI constantly shift. "It's frustrating, because there are a million ways to say the same thing in English, but which one is more human? I don't like the guessing," she says."

Infinite AI slop getting churned out and people getting paid a pittance to make it sound less like AI, in order to trick other people who are actively working to not be fed the slop. What an absurd and depressing state of affairs.


> actively working to not be fed the slop

By using AI-powered search to summarize everything the AIs generated! I now understand the AI gold-rush.

Dear God, what a dystopian future we're creating. "Information superhighway" was a more apt metaphor than I ever thought, way back when: as we all know by now, induced demand ends in gridlock.


It's such an aggressively obnoxious way for huge swaths of infrastructure to become useless wastelands of generative nonsense that I would have been amused if I'd read it in Snow Crash.

Billions of dollars and rare minerals and manpower all going towards drowning the greatest communication tool in human history in noise. It's like Kessler syndrome for the internet, except we are actively spending money to make it happen.

What a pointless, useless waste.


It's the Vile Offspring from Accelerando (Charlie Stross).


And my mind went to The Diamond Age (the Young Lady's Illustrated Primer, which used human actors to give AI-generated stories a veneer of humanity).


Aha, that's a good one too! I got a PineNote and thought about making Tom Riddle's diary, but for maths. Shame I never got around to doing it.


Meanwhile, countries like China are building real assets like high-speed rail and the world's richest industrial ecosystem while hooking us on it.

This is because they don't instinctively follow a principle of "market always knows best".

They might live in a dictatorship instead of a fake democracy rigged by the rich, but materially, their lives will probably be better than ours in a few decades.


Today, jobseekers are submitting AI-written resumes to resume-filtering AIs used by employers.

But why stop there? I dream of a future where not only do we fully automate the tedious process of playing golf, we also automate the tedious process of watching professional golf.


"The Electric Monk was a labour-saving device, like a dishwasher or a video recorder. Dishwashers washed tedious dishes for you, thus saving you the bother of washing them yourself, video recorders watched tedious television for you, thus saving you the bother of looking at it yourself; Electric Monks believed things for you, thus saving you what was becoming an increasingly onerous task, that of believing all the things the world expected you to believe." - Douglas Adams.


You are my Internet buddy today!

I don't know (or care) what you've posted before, or will post later... but right now, you and I are aligned.


I know of a doctor who has to write weekly reports and has been using an LLM to generate them. I’m sure the people reading them are using an LLM to summarize them.

It’s such an incredible waste of resources and time.

I’ve said it before but I have a picture I took a decade ago of this beautiful meadow I rode my bike past with the morning sun shining through the tall grasses. I remember the birds flying around and insects buzzing. That meadow was torn up and a big concrete data center surrounded by tall fences was put in. All so the tech bubble could go on.


I'm semi-convinced my current doctor is using ChatGPT to write messages to me. I'd rather get 2000s hyperabbreviated chatspeak from her than read another trite statement about how "she" "understands" my human reason for a request.


Don't know about doctors, but school reports have been done like this for decades. You select 1, 7, 8, 5, 2 in response to a series of questions about the child (e.g. "Does the child pay attention in class?"), and the software expands that into a school report.
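For the curious, here's roughly what that kind of expansion amounts to, as a minimal sketch in Python (the questions, score values, and canned phrasings below are invented for illustration, not taken from any real report software):

    # Hypothetical rubric-to-prose expansion; every question and phrasing here is made up.
    PHRASINGS = {
        "attention": {   # "Does the child pay attention in class?"
            1: "{name} is consistently attentive in class.",
            5: "{name} is usually attentive but can be distracted.",
            8: "{name} struggles to stay focused during lessons.",
        },
        "homework": {    # "Does the child complete homework on time?"
            1: "{name} always submits homework on time.",
            5: "{name} usually submits homework, with occasional lapses.",
            8: "{name} rarely completes homework on time.",
        },
    }

    def expand_report(name, scores):
        """Turn numeric rubric answers into canned report prose."""
        return " ".join(PHRASINGS[topic][score].format(name=name)
                        for topic, score in scores.items())

    print(expand_report("Alex", {"attention": 5, "homework": 1}))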


An amazing amount of what certain parts of a business, and certain actors in it, do is write tons of shit that basically doesn't matter.

It's the one thing I'm confident LLMs will be purely helpful for. Though the roles being automated are in departments and tiers of companies that I expect to resist downsizing even as large parts of their workload are automated. Plus, if that were fully embraced, it'd be the same as other automation (or, sometimes, "automation") in business: those responsibilities would still exist, but be done by people already juggling a dozen ex-roles. I don't really want to be the one who has to babysit an LLM through writing department newsletters and crap.


They paved paradise and put up an AI bot


It's economics!

Farmers plant trees, then cut them down, then send them to a paper mill, then they get printed on, then they travel hundreds of km to be spam ads in my mailbox, then I have to put them in the garbage. Then (supposedly) they go on to become paper again, travelling hundreds of km again in the process, or more likely are burnt for heat in the city garbage disposal.

All of this shows the rich people that our country is doing well and they should invest more, to produce even more spam!

Isn't capitalism amazing?


What has capitalism to do with it? It is just division of labor. And upgrading products so they become other products. And I think it's fantastic. Because, to be honest, I don't know how to cut trees, how to shred them, how to make paper out of them, so I can finally apply my skill to draw the advertisement on the spam that you receive.

(Reminds me of old communist days, where you got some gray paper. I was so happy when I got that shiny reflective paper from Wrigley's I could glue to my matchbox! :)


Division of labor?

It should just not be done! The entire process I described is a complete waste of resources. But it's "growth" from an economic point of view.

If your job is to make things so that I can drop them in the bin… can't you see there's some kind of waste?


It's not solely waste, though. Advertising works.

If you can figure out how to separate the waste advertising from the effective advertising, and only spend money on the second segment, The System will reward you with fabulous cash and prizes (in theory).


> Advertising works.

How? Touching it makes me want to buy whatever is on it? I don't even look at it. When there's enough I do a trip to the paper recycling bin.


> Advertising works.

By weaponizing psychology to manipulate people into buying shit they don't need. No one should be trying to figure out how to categorize advertisements between waste and effective. They are all waste.


> shit they don't need

Any colour you like, as long as it's black?


Depends on the point of view. I think it is awesome that everyone can do something suited to their abilities AND get paid for it. If drawing were no longer needed, all the artists would basically be out of work from one day to the next.


As a developer, this is kinda where I fear ending up: a glorified editor of crummy AI generated code for a pittance. At that point I'll probably check out of the industry.


Keep in mind that companies do not necessarily aim to increase efficiency but to de-professionalize. It will be easier to find these "glorified editors", who require minimal education, than current software developers. That will make them more replaceable, and hence they will have a weaker negotiating position when it comes to working conditions etc.

It is an old pattern, see for instance: https://www.versobooks.com/products/688-breaking-things-at-w...


Companies always want to commodify their complements, right? I wonder if there's a framing where the employees are one of the complements.


I think it's harder to verify obscure code than to write it in the first place.

I've seen perfectly legit-looking code generated by ChatGPT, except that the values of the constants were wrong.

When you have to verify every single little thing, it's easier to just do it yourself. And it certainly doesn't take any less skill.


True. Some people argue that we are quickly moving from a creator economy to a verifier economy: AI generates a bunch of crap and knowledge workers have to verify and edit it (https://www.lycee.ai/blog/the-rise-of-the-verifier-economy)


> Some people argue that we are quickly moving from a creator economy to a verifier economy.

I wonder if this is a natural development of economic optimization.

Complexity theory formalizes this intuition: for problems in NP, a proposed answer can be verified quickly, while finding the answer from scratch is believed to be much harder (that gap is essentially the P vs NP question).
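
To make that concrete with a toy example of my own (subset sum, nothing from the article): checking a proposed answer is a quick sum, while finding one from scratch means searching exponentially many candidates.

    from itertools import combinations

    def verify(nums, subset, target):
        """Checking a candidate answer: a single pass over the subset."""
        return all(x in nums for x in subset) and sum(subset) == target

    def solve(nums, target):
        """Finding an answer from scratch: brute force over all 2^n subsets."""
        for r in range(len(nums) + 1):
            for subset in combinations(nums, r):
                if sum(subset) == target:
                    return subset
        return None

    nums = [3, 34, 4, 12, 5, 2]
    print(verify(nums, (4, 12, 5), 21))  # cheap check -> True
    print(solve(nums, 21))               # expensive search -> (4, 12, 5)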


This! And I really like your analogy to complexity theory. I wonder if it applies to all economic activities. My experience so far is that verification is nice for simple knowledge tasks, but for the complex ones, doing it from scratch is often better. I guess this may change as AI systems become better and better.


Of course, brute-forcing a solution by verifying a massive amount of garbage is much less efficient than doing it from scratch.

Verification becomes more productive than synthesis only when the probability of generated content being correct is sufficiently high. So I suppose, as AI gets more accurate, verification will become more productive.
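
A back-of-the-envelope sketch of that break-even point (the costs and probabilities here are made-up numbers, just to show the shape of the trade-off):

    # If each generated candidate is correct with probability p, you expect
    # about 1/p generate-and-verify rounds before one passes review.
    def expected_cost(p, cost_generate, cost_verify):
        return (cost_generate + cost_verify) / p

    cost_from_scratch = 60  # hypothetical minutes to just write it yourself
    for p in (0.2, 0.5, 0.9):
        c = expected_cost(p, cost_generate=1, cost_verify=10)
        print(f"p={p}: ~{c:.0f} min generate-and-verify vs {cost_from_scratch} min from scratch")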

I'm in a peculiar position with Bing AI. My work provided me an enterprise account, so I can use it for work-related tasks. Often I find myself asking it to write a program for something I don't wanna do myself from scratch; then I just analyze it, see why it doesn't work, and fix it. It has gotten to the point where it's much faster for me to do that than to read the relevant documentation and figure out how to do it from scratch.


I get that. And this coding use case becomes even more interesting with o1.


This is not much different than being the senior developer managing an outsourced team of developers.


That's the interesting thought though - what happens to the industry (and all the other creative industries affected) after everyone like you checks out?


If companies are releasing shit products then you pick up a laptop and compete with them. That's always how things have gone in the tech industry.


I guess it's a lemon market problem - if other sellers are selling crap for cheap, and there's no way for buyers to verify quality before buying, then you can't make money selling quality stuff for more. Hmm, maybe "certified non-AI data" is a business idea.


That's touched on in the article. It may be possible as a niche, but there is a whole cottage industry working to bypass the AI detectors.


Performance death by over-performance of code generators? After that? Rejuvenation of dying systems by constantly generated new systems. Also selling retro software in containers because it "works"...


As in, code grows organically, and thus becomes ever more capable but also ever more buggy. Then it needs to die, reasons about the produced horror, promises never again, and does it again, forever and ever.

Thus the state of the art was produced by the art.


> He led a team of more than 60 writers and editors, publishing blog posts and articles to promote a tech company that packages and resells data on everything from real estate to used cars.

His job was to generate spam.


Great article. A bit of a contrast with another report from the US that found AI did not have any impact on recruitment (https://www.lycee.ai/blog/ai-adoption-labor-market-ny-fed-su...). So what is the key takeaway here?


A lot of people say that AI will create new jobs™. And perhaps it will. But the resulting jobs follow a general trend of technology creating new jobs: the jobs themselves are more meaningless (from a human conceptual point of view), involving things that are increasingly degrading and far removed from our basic instincts. And the resulting products/services seem to be reaching the anti-ideal of "optimizing people for squeezing other people dry" rather than creating any real increase in the overall good.

Personally, I wish computer scientists as a group would have been wise enough to leave AI technology alone and never invent it.


“Computers can now perform a bunch more tasks for us.”

”How wonderful! Now that you’ve automated away a bunch of soul-crushing drudgery, we can focus on fulfilling work, like writing and making art!”

“…”

“… you automated away the drudgery, right?”


Haha. A lot of people say that. But as an artist myself, I think automating too much can actually lead to sterility in art as well. It's a matter of moderation. Complete automation leads to bad art.


I think what’s at risk are these activities as part of ordinary—not elite—work culture, and the social value of hobbyist-level efforts. Similar to how mass reproduction of recorded music wrecked those two things in that area.

You’ll always be able to play so-so violin in your basement, just for yourself. What’s lost is the social and economic value—of participation in one or more arts being part of everyday life for normal people, and part of a tapestry of creative activity. Even if it’s compromised by commerce, or by your family wanting to sing “jingle bells” around the piano, not hear Mozart.

I worry about the same thing happening, but for writing (which is already on its last legs—paying-the-bills creative writing has been a mostly-elite career for decades, because the market’s dried up) and visual arts.


> But the resulting jobs follow a general trend of technology making new jobs

"Technology making new jobs" didn't start yesterday; it started at least a couple of centuries ago. And I take you, or even people with a meaningless job, wouldn't swap their job for sweeping chimneys or making matchsticks.

Even if we don't go that far: taking game development as an example, nowadays, thanks to many improvements in both HW and SW, the barrier to entry has decreased considerably, and people can produce (small) masterpieces even with low software engineering skills.

Here's Carmack talking tangentially about the topic: https://youtu.be/YOZnqjHkULc.


Not sure why you resort to making the only alternative "sweeping chimneys or making matchsticks". There were plenty of other jobs back then that weren't menial.


Presumably you mean generative AI here? Other forms of AI (machine learning etc) have been around for decades.


Yes, generative AI. But machine learning and neural networks were the precursors to this sort of research, so I prefer to put AI into one pile, whose ultimate aim is to make a machine work more like a human.


Even the title of this article is misleading. It sounds like the copywriters would now be doing the "humanizing" job, but no, only ONE of them does that, while the other 59 were already kicked out.


True, the jobs created by AI often seem to prioritize efficiency over meaning, leading to a sense of detachment from human purpose.


And just like that, the alienation of labour becomes ever more present.

Of course, this kind of process that abstracts away meaning and creativity from the work has been a thing for a while. After all, similar observations were already being made even by Karl Marx during the 19th century, and it'd be weird if similar stuff couldn't happen with regards to generative AI.

Of course, the way this works nowadays is substantially different than how it would have been for 1800s factory workers, but similar ideas and trends can certainly be seen here.

As for the idea that computer scientists should just have had a self-imposed moratorium on things like generative AI, I'm sceptical that it could ever have worked. And now the cat is already out of the bag, so we get to enjoy the consequences. AI slop and all.


AI can produce text, but it still needs human intervention. Still, how much time will pass before this necessity also disappears?


~∞


We fed the Big Tech that is screwing us for a profit. The funniest part is the number of people here on HN who still come to Google, Amazon, or Apple's defense, as they hope not to be laid off.


This article barely touched on Big Tech, and ChatGPT was not developed by Big Tech.

You have a problem with late-stage capitalism, not Big Tech.


ChatGPT was developed by OpenAI, which is heavily backed by Microsoft, a classic "Big Tech" company.



