My wife @eugenialoli has been working on installing Linux on various old computers for which a lot of other options are now unsupported.
She's been finding that machines with 2GB of RAM or 16GB of storage tend to struggle, whether installing the OS, booting, installing common apps, or running those apps.
2GB of RAM is an incredibly large amount. As is 16GB of storage.
WTF are we software people doing as an industry that makes us consume so many resources?
For reference, Windows XP's official requirements were 64MB of RAM and 1.5GB of storage. Even giving ourselves a 4x margin at 256MB/6GB, that's a very far cry from 2GB/16GB.
Windows XP was honestly a vastly usable OS, with vastly usable applications running on top.
What are we doing that now requires so much more?!?
@jbqueru @eugenialoli Web browsers that are full development suites?
I started to learn some #FullStack development and I did not know about all the layout inspectors and debuggers that the web browser comes with!
Is she using #puppylinux or another lightweight distro?
@jbqueru @eugenialoli Really not all that much. Various 3D rendering, but that only justifies the extra resources the programmes themselves are taking, not what the OS is taking; and various web bloat, which is the fault of websites that maliciously pretend to be apps, and of web browsers.
@ellenor2000 @eugenialoli That honestly feels like a failure of engineering. I understand the desire to reduce friction by downloading code on demand to execute on the client itself, but that doesn't justify that a web page should be considered "light" when it uses less than 100MB of RAM.
As I'm typing this, my browser reports 478MB for my mail client, at least 160MB each for the 5 documents I have open, and the lightest plain web pages are around 40MB each.
Let's talk orders of magnitude.
My workstation has 128GB of main RAM. Enough for some very heavy applications.
Shrink by 1000x: 128MB would run Windows XP or Mac OS X of the same era, or early iOS or Android. Rich graphical environment, one application plus accessories.
1000x further: 128kB is the top end of many 8-bit machines. Text-mode single-tasking, simple documents.
Still 1000x: 128B is the Atari 2600 (+4kB of ROM for the code). Working set for simple games.
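To make those factors concrete, here's a quick decimal-arithmetic sketch of the same four tiers (nominal byte counts only, 1000x steps throughout; the K-vs-KiB argument further down is a separate matter):

```python
# Back-of-the-envelope sketch of the 1000x steps above,
# with purely decimal prefixes and nominal byte counts.
tiers = [
    ("128 GB", 128 * 10**9),  # modern workstation, very heavy applications
    ("128 MB", 128 * 10**6),  # Windows XP / Mac OS X era, early iOS/Android
    ("128 kB", 128 * 10**3),  # top end of many 8-bit machines
    ("128 B",  128),          # Atari 2600 working set (code sits in 4kB of ROM)
]

for (big_name, big), (small_name, small) in zip(tiers, tiers[1:]):
    print(f"{big_name} = {big // small:,} x {small_name}")
# 128 GB = 1,000 x 128 MB
# 128 MB = 1,000 x 128 kB
# 128 kB = 1,000 x 128 B
```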
@jbqueru "640K ought to be enough for anybody."
- Bill Gates
;-)
@myfear I wish we still had the kind of foresight that IBM and Microsoft had at the time. Which computers today ship with the ability to expand the RAM to 10x the factory amount, simply by plugging in an expansion card, without any software modifications?
@jbqueru @eugenialoli lazy developers 🤷 There was a time when we had spreadsheet software on 8-bit computers with only 64kB of RAM
@twallutis @eugenialoli I ran Multiplan in 128kB indeed. I'm not sure it's laziness; I've worked on one of the heavier web pages out there (one that can hit 1GB of RAM) and nobody around me was lazy. Wrong priorities, probably, indirectly because capitalism.
@jbqueru there certainly is a lot of inefficiency going on, especially with using web browsers as a universal runtime for everything. But at the same time, let's be fair: Windows XP or OS X Tiger would not be acceptable operating systems today, especially for security reasons. Security costs performance. So does being able to utilize modern SSD bandwidth, arbitrary numbers of CPU cores, and actually being able to handle that much memory (look up how big the page tables in a 32GB desktop can grow).
@jbqueru even if we built a "legacy-free" system for modern PCs, it would necessarily consume a lot more resources than Windows XP did, if it is to do everything we expect from a modern OS.
Both Windows and Linux scale amazingly from the Raspberry Pi to big clusters (yes, there was a Windows for the Raspi), but there are clear limits. A 64-bit address space and multi-core support set a certain reasonable baseline.
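To put a rough number on the page-table remark above: assuming x86-64-style 4 KiB pages and 8-byte page-table entries, and counting only the last level of the hierarchy (no huge pages), a fully mapped 32 GB works out as below. A back-of-the-envelope sketch, not a measurement of any real system:

```python
# Rough size of the last-level page tables needed to map 32 GiB,
# assuming 4 KiB pages and 8-byte page-table entries (x86-64 style);
# upper levels of the hierarchy and huge pages are ignored.
GiB = 1024**3
KiB = 1024

mapped    = 32 * GiB
page_size = 4 * KiB
pte_size  = 8                      # bytes per page-table entry

pages     = mapped // page_size    # 8,388,608 pages
pte_bytes = pages * pte_size       # 67,108,864 bytes

print(f"{pages:,} pages -> {pte_bytes // (1024**2)} MiB of page-table entries")
# 8,388,608 pages -> 64 MiB of page-table entries
```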
@mxk I agree, even though not with all the details. Security does have some costs, and data sets have grown such that resources need to follow accordingly. I still can't quite convince myself that those alone fundamentally justify the very heavy resource usage we see today.
@jbqueru @eugenialoli by "lazy" I mean the lack of optimization. Maybe bloated libraries are the problem.
@twallutis @eugenialoli Heavy frameworks are a problem indeed. ember.js on a 600-screen app just chews through RAM. The easiest fix is to decide to remove features, but that goes against all corporate/capitalist incentives.
@jbqueru @eugenialoli @LaF0rge well… what should the m68k port say? Most machines have a maximum of 128 MB of RAM.
Operating several m68k buildds for Debian was no fun under the pressure of a) staying up to date / keeping up and b) the lack of interest from devs willing to support older archs.
But m68k caught many bugs for i386/amd64 as well…
@ij @eugenialoli @LaF0rge Very valid point, we can go further back. I have a vague desire to get a RISC-V machine to try to hit in that general area as well.
Don't get me wrong, I love 68k, probably wrote 100k lines of assembly for it decades ago, and still write some code under emulation, but it's not very current any more.
@jbqueru @eugenialoli I understand that how resource-hungry a Linux system is depends a lot on the distribution, and sometimes on its flavour; e.g. Ubuntu comes with a full-text indexer of the disk activated. Look for the Xfce or LXDE desktop variants, they are leaner.
Still, resource use is baffling. I fully agree. Firefox has become a memory hog, with what feels like the same usage pattern I had many years ago, yet it is painful even with >8GB of RAM. (Websites have "evolved", though.) @koehntopp
@KarlE @eugenialoli @koehntopp A frustrating aspect for me is that, when we ask why, or what for, resource usage is that high, it's hard to know. We add and add and add, and lose track of what things are for.
@jbqueru e.g. 128 × 1000 bytes is only 125KB, not 128KB, when you're talking about RAM.
@martin_piper @jbqueru K is 1000. The fact that techies misused it is just tough, and is why we now have KiB. The fact that K is always 1000 is also a matter of legal precedent in a bunch of places, after various lawsuits over "undersized" things.
@etchedpixels @jbqueru No, K is 1024 bytes, as in one kilobyte in RAM terms, as per the JEDEC standard.
@etchedpixels @jbqueru the legal standard:
kilo (K) (as a prefix to units of semiconductor storage capacity)
A multiplier equal to ***1024***.
@martin_piper @jbqueru That is a trade body opinion. The standard is IEC 60027-2, Second edition, 2000-11.
KiB is 1024, KB is 1000
@etchedpixels @jbqueru again incorrect. The IEC rubbish is not really used that much; JEDEC standards are used widely. The IEC is a failure.
JEDEC is an engineering standardisation organisation, not just a trade body as you incorrectly claim.
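For what it's worth, here is the arithmetic behind the two readings being argued about, and where the "125" figure above comes from; illustration only, taking no side on which convention should win:

```python
# The same "128K" under the two conventions in dispute above.
decimal_k = 128 * 1000   # 128,000 bytes: SI/IEC reading (128 kB)
binary_k  = 128 * 1024   # 131,072 bytes: JEDEC/RAM reading (128 KB, i.e. 128 KiB)

diff = binary_k - decimal_k
print(f"difference: {diff} bytes ({diff / decimal_k:.1%})")
# difference: 3072 bytes (2.4%)

print(f"128,000 bytes is {decimal_k / 1024:g} units of 1024")
# 128,000 bytes is 125 units of 1024 -- the '125KB' mentioned above
```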