Oats + Milk = Porridge.
Oats + Plant Milk = Porridge.
Oats + Water + Salt = Porridge.
Oats + Water + Sugar = Porridge.
Oats + Coca Cola = ...
In response to Tom MacWright's "One way to represent things" - which I broadly agree with.
When you are a child, the whole world is complicated. By the time you're a teenager, the whole world is simple. Once you grow up, you realise just how complicated everything is. As you obtain mastery, you find a way to simplify everything.
I expect that by the time I'm old(er) and grey(er) I'll have gone through several more cycles of this process.
When we write code in a traditional programming language, we write strings of characters. In more sophisticated visual programming languages, like Scratch, we manipulate objects.
But, whatever the programming paradigm, all our data are stored and transmitted as bits.
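For instance, a line of source code is itself just a string of characters, and those characters are just bits. A minimal sketch in Python:

```python
# Whatever abstraction we type in, the machine only ever sees bits.
# Here's a tiny line of "code" shown as the bits it is stored and
# transmitted as.
source = "x = 1 + 2"
raw = source.encode("utf-8")
print(" ".join(f"{byte:08b}" for byte in raw))
```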
No one wants to write binary. It is almost impossible for a human to read or write anything of moderate complexity in it. That's why we have debuggers and decompilers for when we're forced to deeply examine a truculent piece of code. And that's why we have moved away from the binary language of moisture vaporators and on to higher-level languages.
Humans are pattern matching machines. We love categorising things. So much so that occasionally wars break out over whether something is Type X or Type Y. Our love of categorising comes with other problems.
Taxonomy is hard, as this video adequately explains:
The philosophy of numbers plagues programming. Integers, Floats, Fractions, Booleans, Strings - all valid ways to represent numbers, all with complex subtleties and peculiar interactions.
But, in the end, they're all just stored as bytes in memory.
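A couple of those peculiar interactions, sketched in Python: floating point arithmetic doesn't always agree with exact fractions, and yet either representation is ultimately just a handful of bytes.

```python
from fractions import Fraction
import struct

# Floats have peculiar interactions: 0.1 + 0.2 isn't exactly 0.3...
print(0.1 + 0.2 == 0.3)                                      # False
# ...whereas exact fractions behave the way school arithmetic says they should.
print(Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10))  # True

# And underneath it all, a float is just eight bytes in memory.
print(struct.pack("<d", 0.3).hex())                          # 333333333333d33f
```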
Ian Davis said on twitter.com:
Also: Every program assembly is a single binary number.
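Taken literally, that's easy to demonstrate: read any compiled program back in as bytes and interpret the whole thing as one (very large) integer. A quick sketch in Python; the path here is just an illustrative example.

```python
# Read a compiled program (the path is only an example) and interpret
# its entire contents as a single enormous number.
with open("/bin/ls", "rb") as f:
    program_bytes = f.read()

number = int.from_bytes(program_bytes, byteorder="big")
print(f"{len(program_bytes)} bytes - a number with {len(str(number))} decimal digits")
```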