
1950: the first software goes under maintenance.
1960: some programmers wonder when the exponentially decreasing performance of software will stop.
1970: we need more core, more disk, more offline storage, more more more.
1981: oh... the Lisa will need a hard drive just to boot. Only $3500 for 5MB!
1990: just throw more RAM and hard drive at the performance problem.
2000: EVERYTHING IS C++ OH MY
2010-present: try hiding resource issues caused by bad software practices any old way you can.


In reality, your computer from 2011 could easily run today's software if we hadn't bought the myth that developer time spent optimizing software wasn't worth the effort.

Executives at companies whose business is shifting product took it as a challenge to see how tightly they could crank that refresh cycle, which is why we stupidly upgrade our fashion statements yearly (though that has less to do with poor software and more to do with greed and stupidity).

@yakkoj I agree with this. It's a problem at the heart of the DevOps CI/CD methodology.

Real performance tuning is becoming a lost art. I'm sure it's not helping certain classes of security issues either.

For comparison, ask a more experienced dev to optimize some C, Forth, or assembly and see what happens. You may want to hold their beer.
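
A minimal sketch of the kind of fix meant here (my own illustrative C, not from the thread; the function names are made up): hoisting a strlen() call out of a loop condition turns an accidentally O(n²) pass over a string into an O(n) one. It's the sort of thing an experienced C dev will spot and fix on sight.

    #include <ctype.h>
    #include <string.h>

    /* Before: strlen() is re-evaluated on every iteration, so the
     * loop does O(n^2) work on an n-byte string. */
    void to_upper_slow(char *s) {
        for (size_t i = 0; i < strlen(s); i++)
            s[i] = (char)toupper((unsigned char)s[i]);
    }

    /* After: compute the length once; a single O(n) pass. */
    void to_upper_fast(char *s) {
        size_t len = strlen(s);
        for (size_t i = 0; i < len; i++)
            s[i] = (char)toupper((unsigned char)s[i]);
    }

Modern compilers sometimes hoist this for you, but only when they can prove the loop body doesn't modify the string's length, which is exactly the kind of thing tuning by hand doesn't have to gamble on.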
