What would happen if computers never got any faster?
My first computer was a BBC Micro. It could do basic graphics at a resolution of 640×256, or - at a lower resolution - with 8 different colours. Not a typo. Eight! The mono speaker produced bleeps and bloops. It was basic, in all senses of the word.
Eventually, talented hackers found a way for it to do simplistic 3D graphics and even speech synthesis.
Recently, people have worked out a way to perform ray-tracing on it!
The next computer our house got was the Sega Megadrive. The first game that console saw was, I think, Alex Kidd. A basic 2D platformer.
Sure, it was streets ahead of the Beeb, but the graphics weren't amazing.
But, over the years, they got better. By the time the Megadrive stopped getting new games in 1997, the graphics and audio available were utterly transformed. In eight years, we'd gone from a limited-palette 2D screen to this:
Stunning music, and liquid smooth 2D graphics with parallax and complex transformations.
3D! Three-friken-Dee!
Some enterprising hackers managed to get Wolfenstein 3D running on hardware which was originally intended for cheap side-scrollers.
And nothing about the console had changed. The tools used to create games had improved. The maths and algorithms had leapt ahead. And the ingenuity of the designers had increased. But the physical hardware was identical.
Once you understand a system - deeply understand it - you can make it do things that its designers never thought possible. You can push hardware beyond its apparent limits.
We're so spoiled today. Every week a newer, faster processor is released. Hardware gets cheaper and we can just throw more chips at the problem.
What would the world be like if that wasn't the case? What if our progress in computer speed suddenly came to a stop? I think history shows us that we would be able to work around the restrictions to do things which seem impossible.
Even when machines break down, they can still be made to perform unexpected miracles. More than eighteen billion kilometres away, the Voyager 2 probe broke down. Software instructions were sent telling the craft to do something it was never designed to do: cycle its heating system in order to correct the flow of lubrication. All of a sudden its capabilities were upgraded, and it was able to continue its decades-long mission.
Mars is the only planet in the solar system which is entirely populated by robots from Planet Earth. Those robots aren't receiving any hardware upgrades any time soon. But their software gets upgraded to allow their hardware to perform new tasks.
We don't necessarily need faster, better hardware. We need more thoughtful, and more creative humans.
Markela Zeneli said on twitter.com:
Really interesting read. It's important to think about which directions we progress in with tech, and how getting creative with software can push past the limitations of hardware
Charles Arthur said on twitter.com:
Oh, you mean like Intel’s performance over the past five years
Marek said on twitter.com:
In my opinion, software would have to move to a higher level, with more advanced resource use. The question is: if we didn't follow Moore's law, would that be better for us?
Rob Manuel 🤖📒 said on twitter.com:
I sometimes wonder about the amazing game concepts we’re missing. Tetris is viable on a ZX81 and yet compelling to this day, and if computers had stayed at that level we might have been forced to look for a few more Tetrises.
HN Front Page said on twitter.com:
What would happen if computers never got any faster? L: shkspr.mobi/blog/2020/11/w… C: news.ycombinator.com/item?id=252193…
Tom Hartley said on twitter.com:
Really great article. Two thoughts - first is that it’s sad that our current economic models almost depend upon it. A shift towards something more doughnut shaped would be great and ties in well to your points (en.m.wikipedia.org/wiki/Doughnut_… )
nullify says:
This is such an interesting perspective.
I always thought we were constrained by systems; however, that isn't the case. There's a possibility space in our minds which we can tap into to work around those systems.
Mehdi Abbassi says:
Very interesting, thanks. It seems to me that the problem is no longer the number of FLOPS; it is the architecture of the processors. They should adapt themselves to new concepts, like neural networks.
Mike P says:
Interesting article. Isn't the Wolfenstein 3D screenshot above actually Duke Nukem 3D though?
Hacker News 100+ said on twitter.com:
What would happen if computers never got any faster? shkspr.mobi/blog/2020/11/w…
Marcelo Martins said on twitter.com:
It is very interesting, but I still want a gpu to change my bg on Zoom
shkspr.mobi/blog/2020/11/w…
René Stalder said on twitter.com:
That was an interesting read. Technical barriers due to hardware limitations never seemed like much of a problem when the possibilities are carefully pushed towards their constraints. (btw, putting CSS into JavaScript is probably the exception.)
shkspr.mobi/blog/2020/11/w…
Bret Bernhoft says:
It's an interesting question to ponder: what would happen if computers never became any faster? I don't know what would happen, to be honest. Assuming computers continued to exist at all, there would have to be some incredibly transformative factors involved to make and sustain that reality. It sounds like the plot of a science fiction novel.
Some unique thoughts here. Thank you for putting the article together.
Chris Chiesa says:
Funny to happen across this conversation. You got lucky with the BBC Micro. 640×256 with 8 different colours? Sounds like Heaven to me! See, I started with the Atari 800: multiple resolutions, with the highest being 320×192 in monochrome -- that is, black and white. A later model added a register that turned that mode into 16-level grayscale -- but with the pixels four times as wide as they were high, for a resolution of 80×192! And yeah, sound was "bleeps and bloops" -- four channels producing mono square waves at even divisions of the 1.79 MHz (!) system clock, so it couldn't even hit most musical notes accurately.
Still, the very same thing happened that you describe: guys figured out all kinds of ways to push the capabilities of the machine until they could display GIFs and play sampled sounds and synthesized voices, display text in 80 columns (the hardware alone could only do 40), and on and on and on.
I loved the Atari 800 and its descendants, and still play with it today in emulation on Windows. And guess what I've been playing with just recently? Writing my own ray tracer! I've got the standard checkerboard floor working, and am now trying to ripple it; then I'll work on proper illumination and reflections and such. Maybe tile the floor with some 16-level-grayscale photos (there are many).
It doesn't push the hardware, but it's an interesting challenge in software. BASIC doesn't support structured datatypes, so to operate on 3D vectors you have to use three separate numeric variables per vector. It doesn't support callable functions with arguments and local variables, so you have to have a bunch of additional variables to pass arguments into, and return values back from, subroutines. BASIC is also an interpreted language, so it's slow. So when I happened across a Pascal compiler I'd been meaning to play with, I ported the project to that language, which does have structured datatypes, local variables, callable functions with arguments, etc. And now I've got the floor working there, too, and I'll keep adding features until I run out of memory! Then it'll be off to Windows and either C++ or Perl...
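To give a flavour of that workaround, here's a minimal sketch in a generic line-numbered BASIC (the variable names are made up for illustration): each vector lives in three plain variables, and a subroutine's "arguments" are globals you copy into place before the GOSUB.

    10 REM No structured types: a 3D vector is three plain variables
    20 X1=1: Y1=2: Z1=3
    30 X2=4: Y2=5: Z2=6
    40 REM "Pass" arguments by copying them into the subroutine's globals
    50 AX=X1: AY=Y1: AZ=Z1
    60 BX=X2: BY=Y2: BZ=Z2
    70 GOSUB 200
    80 PRINT RX; RY; RZ
    90 END
    200 REM Vector add: (AX,AY,AZ)+(BX,BY,BZ), "returned" in RX,RY,RZ
    210 RX=AX+BX: RY=AY+BY: RZ=AZ+BZ
    220 RETURN

Multiply that bookkeeping across dot products, normalisation, and ray-sphere intersections, and you can see why a ray tracer in an unstructured BASIC is a real exercise in discipline.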
I also did a "semi"-raytracing thing back in the late 80s or early 90s: a short BASIC program that used some of my college vector calculus to generate a 3D-shaded image of a circular ripple defined by a mathematical equation. The user can specify the direction of the light source and thereby get different images. At one point I generated nine of these with appropriate lighting directions, and created a simple program that lets the user "steer the light source around" with the 8-position Atari joystick. I've always wanted to change the specific equations in that program to generate images of other equations, but the trick now is to try to recall college calculus!
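For a flavour of the idea, here's a minimal sketch in generic BASIC, assuming a ripple like Z = SIN(R)/R (the commenter's actual equation isn't given, and the variable names are hypothetical): take the partial derivatives, build the surface normal (-dZ/dX, -dZ/dY, 1), and set the brightness to its dot product with a unit light vector.

    10 REM Lambert-shade the ripple Z = SIN(R)/R
    20 REM Light direction (LX,LY,LZ), chosen by the user
    30 LX=0.5: LY=0.5: LZ=0.7
    40 REM Normalise the light vector
    50 M=SQR(LX*LX+LY*LY+LZ*LZ): LX=LX/M: LY=LY/M: LZ=LZ/M
    60 FOR Y=-10 TO 10
    70 FOR X=-10 TO 10
    80 R=SQR(X*X+Y*Y)+0.001: REM small offset avoids divide-by-zero
    90 REM dZ/dR for SIN(R)/R, then the chain rule for dZ/dX, dZ/dY
    100 D=(COS(R)*R-SIN(R))/(R*R)
    110 ZX=D*X/R: ZY=D*Y/R
    120 REM Surface normal is (-ZX,-ZY,1); divide by its length
    130 N=SQR(ZX*ZX+ZY*ZY+1)
    140 B=(-ZX*LX-ZY*LY+LZ)/N
    150 IF B<0 THEN B=0
    160 REM B is brightness 0..1: print a grey level per point
    170 PRINT INT(B*9);
    180 NEXT X
    190 PRINT
    200 NEXT Y

Steering the light source, as with the joystick version, is then just a matter of changing LX, LY and LZ between frames.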