September 09, 2022
On 9/9/2022 7:45 PM, Nicholas Wilson wrote:
> If you think binary literals are user unfriendly you're using them wrong.

Do you use them? :-)

September 09, 2022
On 9/9/2022 7:38 PM, Adam D Ruppe wrote:
> On Saturday, 10 September 2022 at 02:17:30 UTC, Walter Bright wrote:
>> octal!433 is really not much different from 0433. It could even be shortened to o!433, exactly the same number of characters as 0o433.
> 
> You snipped the relevant point about having to change context to add the import.

I normally do not quote everything. I'm not trying to hide the import thing; I just don't attach much importance to it.


> That's objectively not a big deal but subjectively proves to be a barrier to adoption.

If you're using octal a lot, it is worth it.


> (I do think it would be a bit better too if it was `core.octal` instead of `std.conv` so it brings in a bit less baggage too.)

It's not really a core feature, but std.octal would be better.
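
For reference, a minimal sketch of the library route under discussion, using the existing `std.conv.octal` template (the enum name here is just for illustration):

```d
import std.conv : octal;

void main()
{
    // Evaluated entirely at compile time; same value as C's 0433.
    enum filePerms = octal!433;
    static assert(filePerms == 283); // 4*64 + 3*8 + 3

    // A string argument is accepted as well.
    static assert(octal!"433" == filePerms);
}
```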


> The question of bigger languages is with interaction between features. Octal literals are about the most trivial addition you can do since it doesn't interact with anything else.

It does interact: 093 doesn't work the way non-C programmers would expect.


>> Nobody would ever write that unless they used octal exactly once
> This is demonstrably untrue. Local imports are common in D, even when used repeatedly.

While I like that D can do things like that, it's not great style, because scattered local imports are harder to discover with grep (i.e. they obscure what is imported).


>> Let's simplify D.
> This doesn't achieve anything. If you carry on with this path, you're gonna force a fork of the language. Is that what you want?

Do you really want to use the nuclear option over octal literals?

It really bothers me that so many discussions head down this path. Let's please try to keep the voltage down.
September 10, 2022
On Saturday, 10 September 2022 at 05:58:25 UTC, Walter Bright wrote:
> On 9/9/2022 7:38 PM, Adam D Ruppe wrote:
>
>> (I do think it would be a bit better too if it was `core.octal` instead of `std.conv` so it brings in a bit less baggage too.)
>
> It's not really a core feature, but std.octal would be better.
>

Personally I think anything which WAS a language feature should be in object, to keep the feature working out of the box. I'm not sure why people are so afraid to use it; you rarely have to read object.d's source, and as long as it's just one file it will be blazingly fast, as always.

>
> It does, as 093 doesn't work as non-C programmers would expect.
>
>

I don't think anyone is arguing in favour of 093, but rather an explicit 0o prefix.

I seldom use octal, but binary very often; google bitboards.

This has an obvious visual meaning, but in hex it would be hard to read:
0b111111
0b100001
0b100001
0b111111

September 10, 2022
On Saturday, 10 September 2022 at 02:22:53 UTC, Walter Bright wrote:

> Hex values are far easier to read, too. Have you ever put the tip of your pencil on the screen to count the number of 1's? I have. Binary literals are user unfriendly.

Bit flags are easier to read as binary grouped in nibbles. For example:

enum ubyte[16] ledDigits =
    [
        0b0011_1111, // 0
        0b0000_0110, // 1
        0b0101_1011, // 2
        0b0100_1111, // 3
        0b0110_0110, // 4
        0b0110_1101, // 5
        0b0111_1101, // 6
        0b0000_0111, // 7
        0b0111_1111, // 8
        0b0110_1111, // 9
        0b0111_0111, // A
        0b0111_1100, // b
        0b0011_1001, // C
        0b0101_1110, // d
        0b0111_1001, // E
        0b0111_0001, // F
    ];

Those are the bit masks for a 7-segment display. Of course, you could define them by or'ing enum flags, or translate them into hex, or use a template, but that would be annoying.
September 10, 2022
On Saturday, 10 September 2022 at 02:31:05 UTC, Walter Bright wrote:
> I haven't seen CPUs that were documented in octal since the PDP-11, even though it didn't quite work with 16 bits.

You caught me. I've never actually programmed anything in my entire life and just made all this up. It is true, no such thing exists. No programmer anywhere has ever had use for octal nor binary, and if they did, I wouldn't know anyway since I'm a huge fraud.

I only come to these forums because I'm a compulsive liar dedicated to sabotaging the D language. Why? I don't know. I guess I just like to watch the world burn.

September 10, 2022
On Saturday, 10 September 2022 at 07:03:21 UTC, Daniel N wrote:
> Personally I think anything which WAS a language feature should be in object, to keep the feature working out of the box. I'm not sure why people are so afraid to use it; you rarely have to read object.d's source, and as long as it's just one file it will be blazingly fast, as always.

object isn't actually blazingly fast. Its growth, at least, appears to be responsible for much of the slowdown of basic builds compared to the older D1-era builds.

I'm not sure exactly why: whether it is the growth per se, the use of several internal imports causing slowdowns, or specific implementation techniques. But if you zero out object.d, builds do speed up.

Though I will concede it tends to become a smaller percentage as the program grows, it still imposes a baseline cost on each compiler invocation, so we might want to keep an eye on it.

Another aspect is that object is implicitly imported, so its contents land in the global namespace. The module system has ways to disambiguate, but I still generally prefer the explicit import to keep things clear, especially since the error messages can be fairly poor without one to clean them up.

But then you get the ergonomic issue of having to import it.
September 10, 2022
On Saturday, 10 September 2022 at 02:38:33 UTC, Adam D Ruppe wrote:
> The question of bigger languages is with interaction between features. Octal literals are about the most trivial addition you can do since it doesn't interact with anything else.

They add a new token to the grammar, which means that tools like syntax highlighters have to be updated.

Obviously it's not a difficult change to make, but there is non-zero friction here.
September 10, 2022
On Saturday, 10 September 2022 at 12:18:49 UTC, Paul Backus wrote:
> Obviously it's not a difficult change to make, but there is non-zero friction here.

Yeah. And on the other hand, deprecating the currently existing binary literals (which is what this thread is about, remember) also requires those same changes. So it has a cost in updating things, with no benefit in removing actual complexity, since the feature is quite isolated.
September 10, 2022
On Saturday, 10 September 2022 at 05:58:25 UTC, Walter Bright wrote:
> Do you really want to use the nuclear option over octal literals?
>
> It really bothers me that so many discussions head down this path. Let's please try to keep the voltage down.

Have you ever stopped to ask WHY there's so little confidence in D's leadership?

This whole thing started because you made the *patently false* statement at DConf that binary literals were *already* deprecated. This shows you didn't fact-check your talk: you didn't try to compile the code, you didn't check the spec, and you didn't talk to any experienced D users, who would have pointed out the error.

Yet you presume to lecture us about what things are used and how, then make unilateral decisions, just ignoring community experience.

A lot of us were excited about std.conv.octal. I wrote the first draft myself, then even voluntarily rewrote it into a convoluted mess when the Phobos maintainer requested it. That was a lot of work, and VRP (value range propagation) improvements have since rendered most of it moot, but at the time I thought it was worthwhile to get it in.

That was over ten years ago.

Since then, despite hitting uses for octal literals several times, I've only ever translated them to use std.conv.octal a few times. I more often translate to binary or hex, not necessarily because they're the best representation (though like I said, I often do prefer binary literals to octal), but just because they're built in.

Similarly, I have been arguing that `throw new Exception("some string")` is bad form for a long time, but I often do it anyway just because it is the most convenient thing to do.

On the other hand, you've pointed out before that `unittest` is not a fantastic test system, but it gets used because it is convenient and this is a good thing, since some unittest is better than none.

I do think octal in its own, fully independent module would be better than what we have now, since at least then it is easier to pull in without additional baggage. But I've heard a lot of people complain they just won't do the import at all, because you have to move elsewhere in the code to add it and it's just an added hassle. So it wouldn't fix that, but it might be more used than it is now.

But regardless, binary literals are already here and shouldn't go anywhere. (By the way, another reason is that the octal!555 trick of using an int literal won't work for binary, since it would overflow too easily; you'd have to quote the string. Not a big deal, but another little thing.)
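
To make that parenthetical concrete, a sketch (the `binary!` template here is hypothetical; only `octal` exists in std.conv):

```d
import std.conv : octal;

void main()
{
    // The int-literal trick is fine for octal: the digits 0-7,
    // reread as octal, stay comfortably within a long literal's range.
    enum mask = octal!777;
    static assert(mask == 511); // 7*64 + 7*8 + 7

    // For binary the same trick runs out fast: a run of more than
    // ten 1s, read as a decimal literal, already exceeds int.max,
    // so a hypothetical binary! template would need a string:
    //   enum b = binary!"1011_1101";
}
```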

Anyway, removing the binary literals we've had for decades would *hurt* D. And ignoring D's users over and over and over again IS going to lead to a fork.
September 11, 2022
I'm still upset over hex strings.

They were useful for generated files.

https://raw.githubusercontent.com/Project-Sidero/basic_memory/main/database/generated/sidero/base/internal/unicode/unicodedata.d

2.72 MB!

It is an absolute nightmare to debug without hex strings, and you can't tell me my builds are going to be faster and use less memory if I have to call a function at CTFE to do the conversion from a regular string...
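
For anyone following along, a sketch of the two forms (the x"" literal is the deprecated built-in; `std.conv.hexString` is the library replacement being referred to):

```d
import std.conv : hexString;

void main()
{
    // Deprecated built-in hex string literal, handled by the lexer:
    //   immutable data = x"deadbeef";

    // Library replacement, which runs a conversion at CTFE instead:
    enum data = hexString!"deadbeef";
    static assert(data.length == 4);
    static assert(data[0] == 0xde && data[3] == 0xef);
}
```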