
> Let's say that you are only truly productive at a flat rate of 40% of your work time

That's not the case. Hourly productivity isn't flat; it declines past a certain point.

The law of diminishing returns is at play here: working longer hours increases overall output up to a point, but the marginal gain from each additional hour shrinks.

Then, at some point, you are too tired and produce bad software that will be costly to fix.

If your employer pays you a flat rate then it is profitable to make you work long hours up to the point when you start screwing up. If they pay you by the hour or if they pay you overtime, rationally they shouldn't push you beyond the point where your hourly output is too low to justify your pay.
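To make that concrete, here's a minimal toy model in Python (my own sketch, not from any study; the decay rate and hourly cost are made-up assumptions purely for illustration). Marginal output per hour falls linearly with fatigue and eventually goes negative once bug-fixing costs exceed new output:

    # Toy model: marginal productivity of the nth hour, in "units of useful work".
    # All numbers are assumptions for illustration, not measured values.

    def marginal_output(hour):
        base = 1.0      # assumed output of a fresh first hour
        fatigue = 0.08  # assumed per-hour decay from tiredness
        return base - fatigue * hour  # goes negative around hour 13

    hourly_cost = 0.35  # assumed employer cost of one hour, same units

    total = 0.0
    for hour in range(1, 16):
        gain = marginal_output(hour)
        total += gain
        note = ""
        if gain < hourly_cost:
            note = "  <- hourly pay no longer justified"
        if gain < 0:
            note = "  <- net negative: fixing the bugs costs more than the output"
        print(f"hour {hour:2d}: marginal {gain:+.2f}, total {total:5.2f}{note}")

With these made-up numbers, the flat-rate employer still profits from pushing until roughly hour 12, while the hourly-paying employer should rationally stop around hour 8, which is exactly the asymmetry described above.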



Look, I completely understand what you are saying, because it's something I believe as well. However, what I am trying to find out is whether this belief is actually correct and whether anyone has done any actual studies.


This is just the way it is. Anyone who's worked in the industry (or any industry?) knows it from experience. I'm sure there are data available.

A quick Googling for "productivity vs hours worked" led me to this:

http://bleedingheartlibertarians.com/2013/10/hours-worked-an...


Unfortunately there are quite a few problems with that report:

1. It measures productivity by GDP, which is itself not a great measure.

2. It compares Greece vs Germany, and fails to mention that they rely on completely different industries (Greece: agriculture; Germany: manufacturing).

But what we are talking about here is specifically the software development industry, where quantity does not always mean quality. So herein lies the problem: potentially a lot of our belief that "more hours worked means diminishing returns" is just based on bad data analysis.



