@Edent I understand the ‘stranded on another planet’ angle, and that scenario would almost certainly result in my demise, but I do understand down to the component level how a CPU works. I’ve even built them by hand (albeit large, unwieldy, and low-performance). I also know how a CPU interprets microcode, how that translates to binary, and in turn to gate switching. I have no formal training or qualification in this field. Modern CPUs are simply the result of refinement and scale-out/scale-down. Sure, there are modular, proprietary ‘technologies’ baked into different vendors’ chips, but their patent applications combined with a datasheet give most of the game away. FPGA documentation gives so many insights into this, too. I’m a bit of a technological caveman, and I’ve seen your projects, so I know you’re not!

I think a huge part of what you’re touching on here is the inconceivable scale of what we’ve achieved. The components are so infinitesimally small that the technology has become borderline intangible, and with that comes enormous risk. Take the 3nm chip in my phone: a lot can be hidden and obfuscated in a 19-billion-transistor array smaller than a postage stamp, and it’s taken a long time to get here.

Just my thoughts 🙂
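P.S. For anyone reading along who wants a feel for what ‘component level’ means here, this is a minimal sketch of a 1-bit full adder expressed purely in terms of NAND, the kind of primitive a hand-built CPU is stacked out of. It’s only an illustration of the idea in C, not code from any actual build:

```c
#include <stdio.h>

/* One primitive: NAND. Everything else below is derived from it,
   mirroring how a hand-wired CPU reduces to gate switching. */
static int nand(int a, int b) { return !(a && b); }

/* Derived gates, each built only from NAND. */
static int not_(int a)        { return nand(a, a); }
static int and_(int a, int b) { return not_(nand(a, b)); }
static int or_(int a, int b)  { return nand(not_(a), not_(b)); }
static int xor_(int a, int b) { return and_(or_(a, b), nand(a, b)); }

/* 1-bit full adder: sum and carry-out from two inputs plus carry-in. */
static void full_adder(int a, int b, int cin, int *sum, int *cout)
{
    *sum  = xor_(xor_(a, b), cin);
    *cout = or_(and_(a, b), and_(cin, xor_(a, b)));
}

int main(void)
{
    /* Print the full truth table, the behaviour a wired-up adder
       slice would show on its output pins. */
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            for (int cin = 0; cin <= 1; cin++) {
                int sum, cout;
                full_adder(a, b, cin, &sum, &cout);
                printf("a=%d b=%d cin=%d -> sum=%d cout=%d\n",
                       a, b, cin, sum, cout);
            }
    return 0;
}
```

Chain enough of those slices together, add registers and control, and you have the arithmetic core of something recognisable; everything after that really is refinement and scale.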