September 10, 2022
On Saturday, 10 September 2022 at 08:19:18 UTC, Max Samukha wrote:
> On Saturday, 10 September 2022 at 02:22:53 UTC, Walter Bright wrote:
>
>> Hex values are far easier to read, too. Have you ever put the tip of your pencil on the screen to count the number of 1's? I have. Binary literals are user unfriendly.
>
> Bit flags are easier to read as binary grouped in nibbles. For example:
>
> enum ubyte[16] ledDigits =
>     [
>         0b0011_1111, // 0
>         0b0000_0110, // 1
>         ...

Exactly! It's true that binary literals are user-unfriendly *by themselves*, but *with* separating underscores ("_") they're much easier to use than hex values.
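To illustrate (a small sketch; the entries for 2 and 3 are my own additions, assuming the common gfedcba 7-segment encoding), the same table reads like this in both notations:

    enum ubyte[4] binaryForm = [0b0011_1111, 0b0000_0110, 0b0101_1011, 0b0100_1111];
    enum ubyte[4] hexForm    = [0x3F, 0x06, 0x5B, 0x4F];
    static assert(binaryForm == hexForm); // same values, different readability

With the binary form you can see at a glance which segment each bit drives; the hex form hides that.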

September 10, 2022

On Saturday, 10 September 2022 at 02:22:53 UTC, Walter Bright wrote:

> Hex values are far easier to read, too.

“easier to read” is not “easier to comprehend”.
Comprehensibility really depends on the use-case.

But if we assume one’s working with binary flags or something similar (which probably was the reason to use binary literals in the first place), why would we write them in a different notation?

To give an example:
I can’t translate hex literals to their binary form in my head (in reasonable time).

And I never even had to do so – except for an exam or two at school.
Wanna know how I did it? – I wrote down the 0=0000, 1=0001, …, F=1111 table…

I understand that “I’ve written binary literals in hexadecimal form for 30 years” is a reasonable point of view. But that doesn’t really help anyone who doesn’t have that strong mental connection between hex digits and their binary meaning.

Writing binary literals using “bitshifting” notation in C didn’t provide great readability either. (We did so in the microcontroller programming course at school.)
But I’d consider (1 << 7) | (1 << 1) way more comprehensible than 0x82.
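Side by side (a quick sketch in D; the names are mine), the trade-off looks like this:

    enum ubyte maskBinary = 0b1000_0010;          // bits 7 and 1 visible at a glance
    enum ubyte maskShift  = (1 << 7) | (1 << 1);  // spells out the set bits explicitly
    enum ubyte maskHex    = 0x82;                 // compact, but hides the bit pattern

    static assert(maskBinary == maskShift && maskShift == maskHex);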

Conversely, if our use case is RGB channels (where the individual binary bits don’t matter and we instead think in “whole” numbers), it would of course be inconvenient to use binary literals. (As always: use the right tool for the job!)

> Have you ever put the tip of your pencil on the screen to count the number of 1's? I have.

No, I haven’t.

Thankfully, numeric literals can be grouped using underscores (in D, and at least for binary literals in PHP).
I actually find myself counting digits of decimal number literals instead (when not using D, of course, or in real life).
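In D, the underscores may be placed between any digits, in any base (a tiny illustration):

    enum flags   = 0b1011_0010_0110_1111;  // nibble-grouped binary
    enum million = 1_000_000;              // thousands-grouped decimal
    enum color   = 0xFF_80_40;             // byte-grouped hex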

Well, I have to admit I haven’t had to deal with code “abusing” binary literals so far…

September 10, 2022
On Saturday, 10 September 2022 at 12:02:57 UTC, Adam D Ruppe wrote:
> On Saturday, 10 September 2022 at 07:03:21 UTC, Daniel N wrote:
>> Personally I think anything which WAS a language feature should be in object.d to keep the feature working out of the box. Not sure why people are so afraid to use it; it's not often you have to read object.d's source, and as long as it's just one file, it will be blazingly fast as always.
>
> object isn't actually blazingly fast. Its growth at least appears to be responsible for much of the slowdown of basic builds compared to the older D1 era builds.
>

If it is slow, then it could/should be cached by some mechanism; object.d seldom changes unless you use some advanced tricks, like the ones in your book.

C# realised that imports actually are an issue and added global imports.
https://endjin.com/blog/2021/09/dotnet-csharp-10-implicit-global-using-directives

I would be fine with that as well, but it's easier to use what we already have: object.d.
September 10, 2022

On Saturday, 10 September 2022 at 15:21:50 UTC, 0xEAB wrote:

> I actually find myself counting digits of decimal number literals instead (when not using D, of course, or in real life).

A note on that:
I really like underscore being used as thousands separator.

Way easier to parse than having to decipher whether a number is written in English or German (applies to a lot of other locales, too!).

Example:

1.000 vs 1,000 – which one is “one thousand” and which is “one and zero thousandths”?

We might have different opinions on that, but we all will agree on 1_000 meaning “one thousand” :)
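In D source the two readings can’t even collide (a tiny illustration):

    enum oneThousand  = 1_000;  // integer: one thousand
    enum onePointZero = 1.000;  // double: exactly 1.0
    static assert(oneThousand == 1000 && onePointZero == 1.0);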

September 10, 2022
On Friday, 9 September 2022 at 23:21:38 UTC, H. S. Teoh wrote:
> PHP has all sorts of things I'm not sure it's wise to emulate.

“At least there aren’t two different yet incompatible standard libraries in PHP.”
– assuming we’re throwing around obsolete prejudices.
September 10, 2022
On Saturday, 10 September 2022 at 14:21:02 UTC, Adam D Ruppe wrote:
>
> Anyway, removing the binary literals we've had for decades would *hurt* D. And ignoring D's users over and over and over again IS going to lead to a fork.

It will lead to a fork. If the maintainers want to remove simple literals because they think they aren't used, then this project's aims are very low. Simple literal support is written once, and then it's done and there forever without any maintenance at all. Meanwhile, ImportC is claimed to be "simple" despite over a year of work on it.

I think that both octal and binary literals can be supported because it is trivial. It doesn't matter whether you think they are used or not; somewhere there will be someone who absolutely loves binary or octal literals. I also agree we could have the 0o syntax for octals and fix one blunder from C.

I don't understand it. Is this some kind of new ill will, where programmers are no longer allowed certain literals?
September 10, 2022
On 10.09.22 04:17, Walter Bright wrote:
> On 9/9/2022 4:43 PM, Adam D Ruppe wrote:
>>> If you're using a lot of octal literals such that this is an issue, one wonders, what for? The only use I know of is for Unix file permissions.
>>
>> I keep hitting them in random C code I'm translating. Various unix things beyond file permissions, and a hardware manual for a thing I had to drive (an RFID chip), used them for various bit triplets too.
> 
> octal!433 is really not much different from 0433. It could even be shortened to o!433, exactly the same number of characters as 0o433.
> ...

o!433 is such a hack, and it does not even (always) work. This problem is even worse for binary literals.
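For reference, the Phobos workaround under discussion looks roughly like this (a minimal sketch; one known limitation of the integer-argument form is that the digits must first parse as a decimal literal, which overflows far sooner than the octal value itself, hence the string overload):

    import std.conv : octal;

    enum filePerms = octal!644;   // equivalent of C's 0644
    static assert(filePerms == 420);

    enum sticky = octal!"1755";   // string overload
    static assert(sticky == 0x3ED);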


> The reasons for adding language syntactic sugar:
> 
> 1. its very commonplace
> 
> 2. the workarounds are gross
> 
> Of course it's a judgement call, and I understand you see them randomly in C code, but does it really pay off? The downside is the language gets bigger and more complex, the spec gets longer, and people who don't come from a C background wonder why their 093 integer isn't 93.
> ...

I think basically everyone here agrees that 093 is bad syntax and was a mistake.

>  > the newer imported!"std.conv".octal!433 pattern
> 
> Nobody would ever write that unless they used octal exactly once, which suggests that octal literals aren't common enough to justify special syntax.
> 
> 
>> I often prefer using binary literals anyway, but changing something like 0o50000 to binary is a little obnoxious.
> 
> I first implemented binary literals in the 1980s, thinking they were cool and useful. They were not and not. I haven't found a reasonable use for them, or ever wanted them. (I prefer writing them in hex notation, as binary literals take up way too much horizontal space. After all, C3 is a lot easier than 11000011. The latter makes my eyes bleed a bit, too.)
> ...

Binary literals are, e.g., a GNU C extension, and they are in C++14, so clearly people see a use for them.

> Let's simplify D.

I really don't understand why you seem to think that removing simple and convenient lexer features that behave exactly as expected, in favor of overengineered Phobos templates that have weird corner cases and are orders of magnitude slower to compile, is a meaningful simplification of D. It makes utterly no sense to me.

Let's simplify D in a way that actually positively impacts the user experience, for example by getting rid of weird corner cases and arbitrary limitations. Of course, that requires actual design work and sometimes even nontrivial compiler improvements, which is a bit harder than just deleting a few lines of code in the lexer and then adding ten times that amount to Phobos.
September 10, 2022
On 9/10/2022 7:21 AM, Adam D Ruppe wrote:
> Have you ever stopped to ask WHY there's so little confidence in D's leadership?
> 
> This whole thing started because you made the *patently false* statement at DConf that binary literals were *already* deprecated.

Yes, I made a mistake. There was a decision to remove it, but it just got deferred and then overlooked.

Not making an excuse, but D has grown to the point where I can't keep it all in my head at once. Hence at least one motivation for simplification. I've caught Bjarne in errors with C++, too. So what.

> Yet you presume to lecture us about what things are used and how, then make unilateral decisions, just ignoring community experience.

There is a DIP process, and that's why we have a n.g. where people can discuss changes. A lot of changes I felt were a good idea were dropped because of opposition on the n.g., like dropping the initializer syntax in favor of sticking with expression style.
September 10, 2022
On Saturday, 10 September 2022 at 16:18:53 UTC, Timon Gehr wrote:
> 
> Binary literals are, e.g., a GNU C extension, and they are in C++14, so clearly people see a use for them.
>

Just some more examples of languages that support them:

Zig - https://ziglearn.org/chapter-1/#integer-rules

Julia - https://docs.julialang.org/en/v1/manual/integers-and-floating-point-numbers/

Kotlin - https://kotlinlang.org/docs/numbers.html#literal-constants-for-numbers (but no Octal)
September 10, 2022
On 9/10/2022 9:11 AM, IGotD- wrote:
> Meanwhile, ImportC is claimed to be "simple" despite over a year of work on it.

I have many other things to do besides working on ImportC. And it is simple, as language implementations go. Besides, ImportC is a big win for D. I should have done it from the beginning.