- cross-posted to:
- [email protected]
- [email protected]
If you’re worried about how AI will affect your job, the world of copywriters may offer a glimpse of the future.
Writer Benjamin Miller – not his real name – was thriving in early 2023. He led a team of more than 60 writers and editors, publishing blog posts and articles to promote a tech company that packages and resells data on everything from real estate to used cars. “It was really engaging work,” Miller says, a chance to flex his creativity and collaborate with experts on a variety of subjects. But one day, Miller’s manager told him about a new project. “They wanted to use AI to cut down on costs,” he says. (Miller signed a non-disclosure agreement, and asked the BBC to withhold his and the company’s name.)
A month later, the business introduced an automated system. Miller’s manager would plug a headline for an article into an online form, an AI model would generate an outline based on that title, and Miller would get an alert on his computer. Instead of coming up with their own ideas, his writers would create articles around those outlines, and Miller would do a final edit before the stories were published. Miller only had a few months to adapt before he got news of a second layer of automation. Going forward, ChatGPT would write the articles in their entirety, and most of his team was fired. The few people remaining were left with an even less creative task: editing ChatGPT’s subpar text to make it sound more human.
By 2024, the company laid off the rest of Miller’s team, and he was alone. “All of a sudden I was just doing everyone’s job,” Miller says. Every day, he’d open the AI-written documents to fix the robot’s formulaic mistakes, churning out the work that used to employ dozens of people.
If it’s just a “toy” then how is it able to have all this economic impact?
It’s an economic bubble. It will eventually burst, but several grifters will walk away with tons of money while the rest of us have to endure the impact.
There have been bubbles around things that promised future profits even though, at the time, nobody knew exactly how. Like the dotcom bubble.
This one does bring some profit at its core right now, but that rests on a limited resource: it will become less and less useful the more the textual universe is poisoned with generated output. That’s a fundamental constraint, which is why I’m certain of it.
Because that will happen slowly, I’m not sure there will be a bubble bursting. Rather, it will slowly become irrelevant.
While I agree with most of the points you make, I cannot see a machine that is, at a bare minimum, able to translate between arbitrary languages becoming irrelevant anytime in the foreseeable future.
OK, I agree, translation is useful and is fundamentally something it makes sense for.
I disagree about “arbitrary”: you need a large enough dataset for every language, and even then it’s not going to be that much better than 90s machine translators.
It’s closer, but in practice it still requires a human to check the whole text. Which raises the question: why use it at all instead of a machine translator with more modest requirements?
It may also poison smaller languages, with translation artifacts becoming the norm. Calques are one thing; here one can expect mistakes of the “medieval monks mixing up Armorica and Armenia” kind (I fucking hate those Armenians who still perpetuate that one well-known mistake), only better masqueraded.
While I’m sure some job displacement is happening, it’s hard to get worked up over AI at this point when we’ve had decades of blue-collar offshoring that is now being somewhat reversed and replaced with white-collar offshoring.
Or here’s another one: immigration at both levels.
Currently, AI’s biggest impact is providing cover for white-collar offshoring.
Tech support and customer service tested the waters; now they’re going full force after professional services.
Bigger impact all around, and not much discussion of it.
Well, while my views on economics are still libertarian, there’s one trait of very complex and interconnected systems like market economies (as opposed to, say, a Soviet-style planned economy, which was still arcanely complex but had fewer connections to track in any analysis, even counting black markets, barter, unofficial negotiations, etc.): it’s never clear how centralized such a system really is, and it’s never clear what it’s going to become.
It’s funny how dystopian elements from every extreme of that “political compass” thing come into reality. Turns out you don’t need to pick one; it can suck in all directions at once. Which matches what you’d expect from studying world history, of course.
What I’m trying to say is that power is power, no matter what’s written in any laws. The only thing resembling a cure is keeping as much of it as possible distributed at all times. A few decades from now that lesson may be rediscovered, and then after some time forgotten again.
Whatever we’ve got is definitely not distributed in any sense of the term lol
That’s my point.
The popular idea that you can avoid giving power to the average person by handing it all to some bureaucracy, or to big companies in some industry, or to some institutions, while yelling “rule of law”, has come to its logical conclusion.
Bureaucrats become a sort of mafia/aristocracy layer, big companies become oligopolies entangled with everything wrong, and institutions sell their decisions rather cheaply.
Speculation. 100% speculation. A tool is precise; a toy is not. Guided AI, e.g. for circuit optimization or fleet optimization, is brilliant. Generative AI is not the same.
Evidently “precision” isn’t needed for the things the AI is being used for here.
Right. And neither is the investment it is attracting.
https://www.statista.com/topics/1108/toy-industry/#topicOverview
Because the toy industry is huge.