On Tuesday, 13 September 2022 at 20:08:02 UTC, Don Allen wrote:
>I'm aware of those arguments. It wasn't at all clear how your terse comment related to them.
Sorry for that. I all too often assume people can read my mind.
>Yes, if you are concerned with individual bits, then binary representation is obviously more natural than hex (or octal or decimal). But depending on your purpose, other representations may be more natural than 0b. I gave an example in a previous post and therefore won't repeat it.
Yes, nobody is arguing otherwise, but let's get back to the point of the discussion. Walter wants to remove binary literals based on these false assumptions:
- Nobody uses binary literals.
- Hex is always more readable than binary.
- Removing binary literals would simplify the language and compiler.
Let's stop him.