Hacker News

If an AI can do my job, why would my employer fire me? Why wouldn’t they be excited to get 200% productivity out of me for the marginal cost of an AI seat license?

A lot of the predictions of job loss are predicated on an unspoken assumption that we're sitting at a "task maximum," so any increase in productivity must result in job loss. That's only true if there is no more work to be done. But no one seems willing, able, or even aware that they need to make that point substantively: to prove that there is no more work to be done.

Historically, humans have been absolutely terrible at predicting the types and volumes of future work. But we’ve been absolutely incredible at inventing new things to do to keep busy.



> If an AI can do my job, why would my employer fire me? Why wouldn’t they be excited to get 200% productivity out of me for the marginal cost of an AI seat license?

They'd be excited at getting 100x of 100% output out of an AI for 20 dollars a month and laying you off as redundant. If you aren't scared of the potential of this technology, you are lying to yourself.


"Fixed lump of work fallacy," as noted by the commenter above. If a company can get 100% more output, it doesn't fire half its people just to stand still and forgo the productivity gain.

They produce twice as much and grow.

What forum is this???


“Fixed lump of work fallacy”

You're relying on the work employers theoretically need being unlimited. You're also assuming all of this additional work can't be handled by an LLM.

First of all, fixed lump of work is not a fallacy. We do know there is a limit, because there are limits to the amount of work human brains can even comprehend. We don't know where exactly this limit is, but a limit DOES exist, and an LLM may possibly cover it.

Second, you have to assume that this "additional work" can't be handled by the LLM. How can you be sure? Did you think about what this work actually is? My first thought was "cleaning the toilets."

>What forum is this???

I assume it's a forum of people who don't base their lives on concepts with buzzwords. "Fixed lump of work fallacy" is a fancy phrase for a fancy concept... that doesn't mean it's an actual fallacy or actually true. You just threw that quote up there as if the slightly clever wording itself proves your point.

What exactly is this additional work that will pop up once LLMs are around and so powerful they can do all human intellectual work? Can you do a concrete, real-world analysis without jumping to vague hypotheticals dressed up in fancy-worded conceptual quotations? The last guy used analogies as the baseline of his reasoning. That wasn't convincing to me.


Again: I’m only made redundant if there is no more work that my employer needs me to do.

Why should I be scared of technology that makes me more productive?


This assumes that the bottleneck to profitability is the number of software engineers they can afford to hire.

If they’re happy with current rate of progress (and in many companies that is the case), then a productivity increase of 100% means they need half the current number of engineers.


What company have you ever worked for that was happy with the current rate of progress in software development?


Is the number of developers usually the reason feature development goes slowly, though? Nowhere I've worked has that really been the case; it's usually fumbled strategic decision-making and pivots.


None, of course.

And the “current rate” is competitively defined. So if AI can make software developers twice as productive, then the acceptable minimum “current rate” will become 2x faster than it is today.


Sure, that means they need half the number of engineers to stand still. Most companies aim to grow, not stand still.


Why would there be more work left for you to do if AI can do it in seconds?

AI is trending towards a point where it makes your employer productive enough that they don't need you.


A computer already does in seconds what it used to take many people to do. In fact the word “computer” was a job title; now it describes the machine that replaced those jobs.

Yet people are still employed today. They are doing the many new jobs that the productivity boost of digital computing created.

Productivity creates jobs.


I don't know why people think analogies from the past predict or prove anything about the future. It's as if a past situation is assumed to apply completely to the current one via analogy, EVEN though the two situations are DIFFERENT.

The computer created jobs because it takes human skills to talk to the computer.

It takes very little skill to talk to an LLM. Why would your manager ask you to prompt an LLM to do something when he can do it himself? Are you going to answer this question with another analogy?

Just think reasonably and logically. Why would I pay you a 300k annual salary when ChatGPT can do the job for almost nothing? It's pretty straightforward. If you can't justify something with a straightforward answer, you're likely not being honest with yourself.

Why don't we use actual evidence-based logic to prove things, rather than justify them by leaping over some unreasonable gap with an analogy? Think about the current situation; don't pin your hope on a past situation and assume the current one will turn out the same by analogy.


My job is not to do a certain fixed set of tasks, my job is to do whatever my employer needs me to do. If an LLM can do part of the tasks I complete now, then I will leave those tasks to the LLM and move on to the rest of what my employer needs done.

Now you might say AI means that I will run out of things that my employer needs me to do. And I'll repeat what I said above: you've got to prove that. I'm not going to take it on faith that you have sussed out the complete future of business.


The future, or events that haven't happened yet, can't be proven, because they're unknown.

What we can do is make a logical, theoretical extrapolation. If AI progresses to the point where it can do every single task you can do in seconds, what task is left for you to do? And how hard is that task? If LLMs never evolve to the point where they can clean toilets, well, then you can do that, but why would the boss pay you 300k to clean the toilet?

These are all logical conjectures about a possible future. The problem here is that if AI continues on the trendline it's traveling now, I can't come up with a logical chain of thought where YOU or I keep our 300k+ engineering jobs.

This is what I keep hearing, not just from you but from a ton of people: that analogy about how technology only ever created more jobs before, with no illustration of a specific scenario for what's going on here. If LLMs replace almost every aspect of human intellectual analysis, design, art, and engineering, what is there left to do?

Clean the toilet. I'm not even kidding. We still have things we can do, but the end comes when robotics catches up and robots become as versatile as the human form. That's the true end: when the boss has ChatGPT clean the toilet.


It makes cheaper people also more productive and


It depends on who your employer is.

If they're high growth, yes. If they're in the majority of businesses that are just trying to maximise profit with negligible or no growth, then likely not.



