A lot of people say that AI would create new jobs™. And perhaps it will. But the resulting jobs follow a general trend of technology creating new jobs: the jobs themselves are more meaningless (from a human conceptual point of view), involving work that is increasingly degrading and far removed from our basic instincts. And the resulting products/services seem to be reaching for the anti-ideal of "optimize people for squeezing other people dry" rather than creating any real increase in the overall good.
Personally, I wish computer scientists as a group would have been wise enough to leave AI technology alone and never invent it.
Haha. A lot of people say that. But as an artist myself, I think automating too much can actually lead to sterility in art as well. It's a matter of moderation. Complete automation leads to bad art.
I think what’s at risk are these activities as part of ordinary—not elite—work culture, and the social value of hobbyist-level efforts. Similar to how mass reproduction of recorded music wrecked those two things in that area.
You’ll always be able to play so-so violin in your basement, just for yourself. What’s lost is the social and economic value—of participation in one or more arts being part of everyday life for normal people, and part of a tapestry of creative activity. Even if it’s compromised by commerce, or by your family wanting to sing “jingle bells” around the piano, not hear Mozart.
I worry about the same thing happening, but for writing (which is already on its last legs—paying-the-bills creative writing has been a mostly-elite career for decades, because the market’s dried up) and visual arts.
> But the resulting jobs follow a general trend of technology making new jobs
"Technology making new jobs" didn't start yesterday; it started at least a couple of centuries ago. And I take it that you, or even people with a meaningless job, wouldn't swap their job for sweeping chimneys or making matchsticks.
Even without going that far back, take game development as an example: nowadays, thanks to many improvements in both HW and SW, the barrier to entry has decreased considerably, and people can produce (small) masterpieces even with modest software engineering skills.
Not sure why you resort to making the only alternative "sweeping chimneys or making matchsticks". There were plenty of other jobs back then that weren't menial.
Yes, generative AI. But machine learning and neural networks were the precursors to this sort of research, so I prefer to put AI into one pile, whose ultimate aim is to make a machine work more like a human.
Even the title of this article is misleading. It sounds like the copywriters would now be doing the "humanizing" job, but no, only ONE of them does that, while the other 59 were already kicked out.
And just like that, the alienation of labour becomes ever more present.
Of course, this kind of process that abstracts away meaning and creativity from the work has been a thing for a while. After all, similar observations were already being made even by Karl Marx during the 19th century, and it'd be weird if similar stuff couldn't happen with regards to generative AI.
Of course, the way this works nowadays is substantially different than how it would have been for 1800s factory workers, but similar ideas and trends can certainly be seen here.
As for the idea that computer scientists should just have had a self-imposed moratorium on things like generative AI, I'm sceptical that it could have ever worked. And now the cat is out of the bag, so we get to enjoy the consequences. AI slop and all.