This post is dedicated to the memory of Niklaus Wirth, a computing pioneer who passed away January 1st. In 1995 he wrote an influential article called “A Plea for Lean Software”, and in what follows, I try to make the same case nearly 30 years later, updated for today’s computing horrors.
The really short version: the way we build/ship software these days is mostly ridiculous, leading to 350MB packages that draw graphs, and simple products importing 1600 dependencies of unknown provenance.
See, it’s actually fine, because computing power is just always going to get better and the next gen will handle it all.
Oh, and there is definitely no reason to try and reduce electricity usage, because we’re totally going to run everything on solar panels any day now, and we can just scale that up forever to meet demand without any problems.
Obviously, sarcasm. It’s kind of infuriating how little so many companies care about keeping stuff lightweight.
Personally, I’m very interested in projects that build functional, lightweight systems and architectures, particularly ones that could run on older process-node chips: chips that could be made without colossally complex supply chains.
Join the cult of embedded engineers! My current project at work uses a Cortex-M0, so we have 32kB of code ROM and 4kB of RAM. It’s really satisfying to find little optimizations that save a couple dozen bytes here or there, and there’s never pressure to just slap code together without worrying about size or speed, because the hardware can’t afford it.
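For a flavour of the byte-hunting this involves, here’s a minimal C sketch (all names and values are hypothetical, not taken from the project above): packing boolean state into a bitmask instead of separate bools saves RAM, and marking lookup tables `const` typically lets the toolchain keep them in flash rather than copying them into the 4kB of RAM at startup.

```c
#include <stdbool.h>
#include <stdint.h>

/* Naive: eight separate flags cost eight bytes of a 4kB RAM budget. */
struct motor_state_fat {
    bool enabled, calibrated, faulted, streaming,
         low_power, alarm, logging, dirty;
};                                  /* sizeof == 8 on most ABIs */

/* Packed: the same eight flags fit in a single byte. */
typedef uint8_t motor_flags_t;
enum {
    FLAG_ENABLED    = 1u << 0,
    FLAG_CALIBRATED = 1u << 1,
    FLAG_FAULTED    = 1u << 2,
    /* ... */
};

static inline void flag_set(motor_flags_t *f, uint8_t bit) { *f |= bit; }
static inline void flag_clr(motor_flags_t *f, uint8_t bit) { *f &= (uint8_t)~bit; }
static inline bool flag_get(motor_flags_t f, uint8_t bit)  { return (f & bit) != 0; }

/* 'const' keeps this table in flash (.rodata); without it, startup code
 * would also reserve 16 bytes of RAM for a writable copy. (Values here
 * are just an example table, not a real calibration curve.) */
static const uint8_t gamma_lut[16] = {
    0, 1, 2, 4, 6, 9, 13, 18, 24, 31, 39, 48, 58, 70, 83, 97,
};
```

A couple dozen bytes per trick doesn’t sound like much until the linker map says you have 200 left.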
What might be interesting is to go through some archive of old, say, accounting software and find whatever was really the best. Maybe something that ran on an IBM mainframe in the 70s but got upgraded decades ago. Lo and behold, we discover this software from the past (with some modern tweaks) is the best accounting software ever, and it runs amazingly on older-node hardware while being simple enough to adapt to modern architectures.
Accounting software needs to be updated yearly to match local tax codes, though, or it becomes useless.