
> The speedup factor is order-dependent.

This is why it is very difficult to justify optimization work to management. If there are 20 things to optimize that take 10 seconds each, the change isn't really noticeable until you're getting past half-way. And once your processing already takes a few minutes, what's the harm in adding another 10 seconds?




I once shipped a CRUD app a few weeks earlier than I wanted to because of pressure from managers. I warned them that I hadn't done any optimizations and that it would quickly become unusable with real-world use. So naturally, as soon as it was deployed I was reassigned to another project. Three weeks later the users were complaining it was slow; at six weeks it was taking 20 minutes to load. They finally agreed to let me optimize it: within an hour I had it down to 5 minutes, and within a week I had it loading in 10 seconds. I could get it faster, but I was reassigned again, and at this point it is plenty fast for the users in question.

If I had done the easy change to get it to 5 minutes before deploying, it would likely still be that way today, annoying my users, but not enough to justify changing.


As a former colleague of mine was fond of saying, broken gets fixed but crappy lasts forever.


There is a similar effect, but in reverse, when adding interdependent features. Early ones don't have a big impact, but once you get to a certain point the inefficiencies add up and the program becomes bogged down.

Cache invalidations and memory swapping as you approach the limits are other examples.


Well, it depends on what you are doing. If the thing you are optimizing routinely blocks everything else, spending a lot of resources on optimization is a no-brainer. If the thing you are optimizing takes a long time but that doesn't matter much, it is not worth optimizing.

When you are creating VFX scenes for a still image you don't really care a lot about the render times as long as it is done the next day or so. If you are doing it for animation, you care a lot about render times, because anything times a thousand will be quite a duration.

The work of a programmer is always a factor in a multiplication. If the small change you make is used by a thousand people a day, each using it 100 times, then every second you shave off saves a collective ~28 hours per day. And this is only time. You could also think about electricity, about frustration, etc.
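A quick back-of-the-envelope check of that figure, using the same assumed numbers (1000 users, 100 uses each per day, one second saved per use):

    # rough arithmetic for the example above; the numbers are assumptions, not measurements
    people = 1000            # daily users
    uses_per_person = 100    # uses per person per day
    seconds_saved = 1        # time shaved off per use
    total_seconds = people * uses_per_person * seconds_saved
    print(total_seconds / 3600)  # -> ~27.8 hours of collective time saved per day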

A programmer's work is always multiplied into the world. It is a huge responsibility - and I think we all should act more like it.


This is of course very good advice and I follow it myself quite often, but there is something to be said about quantitative changes leading to qualitative outcomes, especially with performance changes of a factor of 150 and the like.

Some workflows that were considered “too long to be worth it” suddenly become easy and routine. It’s kind of like “disrupting the market” in a sense.

And more times than I can count, small changes in performance that were considered irrelevant have led to very big cultural changes in a company.

For example, we had some e2e tests running in our CI/CD pipeline that were taking ~15 minutes. People were stressed (or not very productive), as the usual excuse was "I'm waiting for the tests": not enough time to do something productive, but too long to wait patiently. I spent about a day optimizing them and brought them down to 4 minutes. Suddenly people began to accomplish more, and they started writing a lot more tests.

So a day of investment led to happier devs (less turnover), more tests (stability improvement) and faster feature turnaround. And I had to fight tooth and nail to spare the time to actually do this.


> A programmer's work is always multiplied into the world. It is a huge responsibility - and I think we all should act more like it.

Unfortunately, this same fact also means that various kinds of important software are hard to get developed.

When someone writes code to blink a mildly annoying advert on YouTube, it affects a billion people and can make millions or hundreds of millions in revenue. But the code needed to make a local car wash's robots more efficient? Unlikely to get written unless someone thinks they can sell it to a chain: a large number of programmers get snatched up by places like Google that deploy to billions of people. Ironically, in the past, when there were far fewer qualified programmers, it might have been easier to get the car wash software developed, because there simply was nowhere that could deploy software to a billion people instead (much less do so highly profitably).

The enormous leverage of software is an undeniable force for good. But it also changes the incentive structures of the world in ways that have negative effects too. :(

This isn't limited to software either: improvements in mass production have made mass-produced goods extremely inexpensive -- but by that same token custom work has become much more expensive, and the world around us has become much more homogenized and cookie-cutter as a result. But the leverage that software potentially has is vastly greater than that of other things because of its zero marginal cost of production.



