September 14, 2022
On Wednesday, 14 September 2022 at 19:39:19 UTC, Walter Bright wrote:
> On 9/13/2022 6:52 AM, Don Allen wrote:
>> So while I don't have a personal use for binary literals, clearly others have. But Walter sees an internal cost to the compiler.
> The internal cost to the compiler is pretty small as these things go. It's more the cognitive cost of a larger language.

This focuses things on what I'm calling user-complexity. What many people are arguing here is that removing binary literals will not reduce the cognitive load meaningfully.
September 14, 2022
On Wednesday, 14 September 2022 at 19:56:13 UTC, jmh530 wrote:
> On Wednesday, 14 September 2022 at 19:34:00 UTC, Walter Bright wrote:
>> On 9/12/2022 7:48 AM, jmh530 wrote:
>>> I don't recall anyone mentioning the removal of complex/imaginary numbers, but the issues are the same.
>>
>> I was surprised at the pretty much non-existent pushback on removing them, even though it did carry with it the loss of the convenient syntax for them.
>
> I had some code that broke. It took maybe 15 minutes or a half an hour to fix. I don't recall there being a preview switch for that, but if not it might have been good to have one.
>
> I agree it was a convenient syntax, but most people probably agreed that it wasn't pulling its weight.

... and it's worse if you remove common C features, as then D is no longer a BetterC.
September 14, 2022
On Wednesday, 14 September 2022 at 19:39:19 UTC, Walter Bright wrote:
> On 9/13/2022 6:52 AM, Don Allen wrote:
>> So while I don't have a personal use for binary literals, clearly others have. But Walter sees an internal cost to the compiler.
> The internal cost to the compiler is pretty small as these things go. It's more the cognitive cost of a larger language.

If that's the case, then I would say that removing binary literals is a really tiny step that is not worth taking, given the push-back from the community. I think there are other much larger ways in which the language is too complex, many inherited from C and C++. For me, that's where the real payoff is. Backwards compatibility, as always, will be the big impediment.
September 14, 2022

On 9/14/22 3:34 PM, Walter Bright wrote:
> On 9/12/2022 7:48 AM, jmh530 wrote:
>> I don't recall anyone mentioning the removal of complex/imaginary numbers, but the issues are the same.

Not even close. Complex numbers were a set of 6 types. Binary literals are not even a type, but just another way to specify an integer. Removing complex numbers means you have to change the type of anything you were using it for to the library Complex type.

Removing binary literals would just mean you had to change literals to either hex, or some std.conv function. So in terms of burden, the complex number removal is far greater (for those who used it) than removing binary literals would be.

But in terms of language complexity, the benefits of removing complex numbers as a builtin are much much greater -- removing a slew of code generation, removing 6 types and the accompanying TypeInfo classes, (possibly) removing keywords, etc. The library also gets more complicated, because now it must replace that functionality. But with that cost, now you have a template that can be used with other things (like half-float).

Removing binary literals means removing 5 lines of code in the lexer. That's it. And you could add a std.conv.binary function (but probably not necessary).
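As a sketch of what that migration would look like (the std.conv.binary name is hypothetical; std.conv.to's existing radix overload already covers the conversion):

```d
import std.conv : to;

void main()
{
    // Today: a binary literal, with underscores for readability.
    enum flags = 0b1010_1010;

    // Without binary literals: the same value as a hex literal...
    enum flagsHex = 0xAA;

    // ...or parsed from a string, even at compile time, via
    // std.conv.to, which already accepts a radix argument.
    enum flagsParsed = to!int("10101010", 2);

    assert(flags == flagsHex);
    assert(flags == flagsParsed);
}
```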

Which is why it's so confusing that we are even having this debate. It's neither a monumental achievement, nor a monumental burden, it's just... annoying.

It would be like removing a dedicated clock in a car dashboard, because you could see the time in the touch-screen system corner. For saving a few pennies you piss off all the customers who liked that clock feature.

> I was surprised at the pretty much non-existent pushback on removing them, even though it did carry with it the loss of the convenient syntax for them.

I've never used complex numbers in code. Not even while playing around.

I've used binary literals, not a ton, but I have used them. And when I do use them, it's not because I like binary literals more than hex literals, or decimals, it's because for that specific case, they were a tad clearer.
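For example (a hypothetical status-register layout, made up for illustration), the binary form shows the bit positions at a glance where the hex form takes a moment of decoding:

```d
// Hypothetical device-status flags: each constant sets a single bit,
// and the binary literal makes the bit position directly visible.
enum ubyte READY  = 0b0000_0001; // bit 0
enum ubyte ERROR  = 0b0000_0010; // bit 1
enum ubyte PARITY = 0b0100_0000; // bit 6

void main()
{
    ubyte status = READY | PARITY;

    // The same values in hex (0x01, 0x02, 0x40) are correct,
    // but the bit layout is less immediately obvious.
    assert(status == 0b0100_0001);
    assert(status == 0x41);
}
```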

I suspect many are in the same boat.

-Steve

September 15, 2022

On Thursday, 15 September 2022 at 00:15:03 UTC, Steven Schveighoffer wrote:
> On 9/14/22 3:34 PM, Walter Bright wrote:
> Removing binary literals means removing 5 lines of code in the lexer. That's it. And you could add a std.conv.binary function (but probably not necessary).

Actually we can't even remove it, we need to keep the support for ImportC and add extra logic to disable it for D!

September 15, 2022

On Thursday, 15 September 2022 at 05:48:31 UTC, Daniel N wrote:
> Actually we can't even remove it, we need to keep the support for ImportC and add extra logic to disable it for D!

C11 doesn't have binary literals, but the idea that support for C11 would be enough to make ImportC widely useful is strange.

September 15, 2022

On Thursday, 15 September 2022 at 00:15:03 UTC, Steven Schveighoffer wrote:
> For saving a few pennies you piss off all the customers who liked that clock feature.

A small feature that solves a real problem, in only a few lines of code!

And the author of D says: I will delete it! Come and stop me!

September 15, 2022

On Thursday, 15 September 2022 at 05:57:17 UTC, Max Samukha wrote:
> On Thursday, 15 September 2022 at 05:48:31 UTC, Daniel N wrote:
>> Actually we can't even remove it, we need to keep the support for ImportC and add extra logic to disable it for D!
>
> C11 doesn't have binary literals, but the idea that support for C11 would be enough to make ImportC widely useful is strange.

Besides C23, it's also a GNU extension. We have added other GNU/Clang extensions to ImportC, such as typed enums (enum : uint32_t), etc.

September 15, 2022

On Thursday, 15 September 2022 at 06:42:34 UTC, Daniel Nielsen wrote:
> Besides C23, it's also a GNU extension. We have added other GNU/Clang extensions to ImportC, such as typed enums (enum : uint32_t), etc.

Yeah, I read this:

"Implementation Defined: Adjustment to the ImportC dialect is made to match the behavior of the C compiler that the D compiler is matched to, i.e. the Associated C Compiler."

as meaning that LDC/GDC will have to support every C extension GNU/Clang supports.

September 15, 2022

On Wednesday, 14 September 2022 at 05:58:53 UTC, Walter Bright wrote:
> On 9/13/2022 7:56 PM, Steven Schveighoffer wrote:
>> But it doesn't disprove the fact that sometimes, hex digits aren't as clear.
>
> Does sometimes justify a language feature, when there are other ways?

People often complain that D has too many features. What features would you say are not worth it?

Started a new thread on that.