September 15
On Friday, 23 August 2024 at 01:47:37 UTC, Manu wrote:
> How can we add an attribute to the branch condition that the backend can take advantage of? I think it needs to be in the language spec...

In case this thread goes nowhere, a workaround is:

    version(LDC)
    {
        import ldc.intrinsics;
        // llvm_expect tells the optimizer which value the expression is
        // expected to have, so it can lay out the hot path as fall-through.
        bool likely(bool b) { return llvm_expect!bool(b, true); }
        bool unlikely(bool b) { return llvm_expect!bool(b, false); }
    }
    else
    {
        // No-op fallback for compilers without the intrinsic.
        bool likely(bool b) { return b; }
        bool unlikely(bool b) { return b; }
    }

Not sure about GDC. This is taken from the lz4 source.
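
For GDC, the `else` branch could perhaps become an `else version(GNU)` branch using `gcc.builtins` (an untested sketch; `__builtin_expect` there takes and returns integers rather than bool, hence the conversions):

    else version(GNU)
    {
        import gcc.builtins : __builtin_expect;
        // Convert bool -> integer for the builtin, and back again.
        bool likely(bool b) { return __builtin_expect(b, true) != 0; }
        bool unlikely(bool b) { return __builtin_expect(b, false) != 0; }
    }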
September 15
On Sat, 14 Sept 2024, 09:36 Johan via Digitalmars-d, <digitalmars-d@puremagic.com> wrote:

> On Thursday, 12 September 2024 at 22:59:32 UTC, Manu wrote:
> >
> > expect() statements are not a good time.
>
> Why not?
> Expect is known from other languages and is more general (can
> work with wider range of values than just true/false, and can
> work with types too), and reduces the problem to the backend
> domain only: no new parsing rules (don't forget editors and other
> tooling), and no AST change, etc.
>

It's still ugly and relatively high impact on the code you mark up, though. You have to insert additional lines into the flow and write out expressions, which is awkward when the expression is long or compound, and it's an undesirable repetition of the expression besides: a tweak to some if(expr) may need a matching tweak to the expect statement, and the two can fall out of sync. And if you instead resolve the expression to a local bool somewhere before the expect() and the if(), that's ANOTHER pointless line.
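
For illustration (the `expect` statement here is the hypothetical statement form under discussion, not an existing API, and `err`, `recovered`, and `handleError` are made-up names):

    // The condition is written twice and the copies can drift apart:
    expect(err != 0 && !recovered, false); // hypothetical expect statement
    if (err != 0 && !recovered)
        handleError();

    // Hoisting it into a local keeps the two in sync, at the cost of
    // yet another line:
    bool failed = err != 0 && !recovered;
    expect(failed, false);
    if (failed)
        handleError();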

It's a bad time all round. We've moved on from expect statements. They have their place, but it's not for this.

> > > I think it would hurt D as a whole to have special stuff for
> > > such a rare thing.
> >
> > How?
> > And it's not 'rare'; it's 'niche'. For a microcontroller with no
> > branch prediction, it's common and essential.
> > It's literally unworkable to write code for the platform without
> > this tool; you can't have branches constantly mispredicting.
>
> It would help if your arguments would not use so much hyperbole. Obviously it is not unworkable, demonstrated by decades of microcontroller programming without it (C does not have it).


Huh? I've been writing expect statements in C for 20 years or maybe more... It's always been a useful tool, but the reason it's actually way more relevant today is the nature of static branch prediction in modern micros: it's much more costly to mispredict on RISC-V or Xtensa than it used to be. Old architectures with weak (or no) branch prediction also had very shallow pipelines, or in the early micros no pipeline at all. More advanced old architectures did have dynamic branch predictors, and you could still gain some optimisation from hinting code layout or signalling an initial prediction, but the value and applicability of that sort of hint was MUCH narrower; it was rare to bother.

Modern archs are pipelined, and a mispredict is relatively more costly. I don't think there has ever been a time when we had popular architectures with meaningfully deep pipelines but without dynamic branch prediction, where a mispredict was anywhere near this costly.
There's a reason C++ recently added `[[likely]]` and `[[unlikely]]` (in C++20). This situation is new.

Also consider the nature of code on these micros today: they run at hundreds of MHz and we do a whole lot more with them, including more complex logic and number crunching. Old micros were conveniently coded in asm, and nobody ever really expected them to do much of anything.

These micros are comparable to a PlayStation 2 or something of that era; there's a really great opportunity to write meaningful software for them. ~2004 is back... and it's actually a pretty interesting new domain.

Most micro software, even on these modern processors, doesn't do anything particularly interesting (turns a light on or off over wifi); but if you want to do something interesting with these chips, there's a lot of careful handling that gives huge advantages. The branch predictor is *by far* the most significant detail to take care of. That's not hyperbole.

The "hurt" I meant is in maintenance of the compiler frontend
> which already is in quite a bad complexity state (subtle bugs existing and introduced upon almost every change). Adding yet another special case is imo a net loss.
>

I reckon we're dealing with one pointer to an attribute expression in the branch node (which will be completely ignored except by the backend), and a line in the parser to allow attaching an attribute expression to that pointer. I'd imagine the patch is less than 10 lines, and not very invasive at all.
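
Something like this, roughly (a sketch of the idea only; this is not the actual dmd AST, and `branchHint` is a made-up name):

    class Expression { /* ... */ }
    class Statement { /* ... */ }

    class IfStatement : Statement
    {
        Expression condition;
        Statement ifbody;
        Statement elsebody;
        // The one addition: set by the parser, ignored by semantic
        // analysis, read only by the backend. Null when no hint given.
        Expression branchHint;
    }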

I don't see this as a special case. The grammar tweak would be no more
of a special case than attaching attributes to other declarations.
I reckon you're exaggerating the complexity.



September 15
On Sunday, 15 September 2024 at 12:06:53 UTC, Manu wrote:
> On Sat, 14 Sept 2024, 09:36 Johan via Digitalmars-d, <digitalmars-d@puremagic.com> wrote:
>
>> On Thursday, 12 September 2024 at 22:59:32 UTC, Manu wrote:
>> >
>> > How?
>> > And it's not 'rare'; it's 'niche'. For a microcontroller with
>> > no branch
>> > prediction, it's common and essential.
>> > It's literally unworkable to write code for the platform
>> > without this tool;
>> > you can't have branches constantly mispredicting.
>>
>> It would help if your arguments would not use so much hyperbole. Obviously it is not unworkable, demonstrated by decades of microcontroller programming without it (C does not have it).
>
>
> Huh? I've been writing expect statements in C for 20 years or maybe more...

I meant - of course - something besides `expect`. I was responding to your "this tool", thinking that it must refer to your new proposal, because otherwise why would there be a discussion? But apparently you meant "expect". Then there is no debate:
when writing D for microcontrollers today (i.e. you need to use GDC or LDC) you have the option to use the equivalent of `expect`. Use it.
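
For example, with the `unlikely` wrapper from the workaround posted earlier in this thread (`fifo`, `refill`, and `process` are placeholder names):

    if (unlikely(fifo.empty))
        refill(fifo);       // cold path, hinted as unlikely
    process(fifo.front);    // hot path stays on the fall-through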

I'm sorry for entering this discussion and will leave. I do not get a sense that the desire is to build a balanced view and come to a shared conclusion.

-Johan

September 15
On Sun, 15 Sept 2024, 14:11 Johan via Digitalmars-d, <digitalmars-d@puremagic.com> wrote:

> On Sunday, 15 September 2024 at 12:06:53 UTC, Manu wrote:
> > On Sat, 14 Sept 2024, 09:36 Johan via Digitalmars-d, <digitalmars-d@puremagic.com> wrote:
> >
> >> On Thursday, 12 September 2024 at 22:59:32 UTC, Manu wrote:
> >> >
> >> > How?
> >> > And it's not 'rare'; it's 'niche'. For a microcontroller with
> >> > no branch
> >> > prediction, it's common and essential.
> >> > It's literally unworkable to write code for the platform
> >> > without this tool;
> >> > you can't have branches constantly mispredicting.
> >>
> >> It would help if your arguments would not use so much hyperbole. Obviously it is not unworkable, demonstrated by decades of microcontroller programming without it (C does not have it).
> >
> >
> > Huh? I've been writing expect statements in C for 20 years or maybe more...
>
> I meant - of course - something besides `expect`. I was
> responding to your "this tool", thinking that it must refer to
> your new proposal, because otherwise why would there be a
> discussion? But apparently you meant "expect". Then there is no
> debate:
> when writing D for microcontrollers today (i.e. you need to use
> GDC or LDC) you have the option to use the equivalent of
> `expect`. Use it.
>
> I'm sorry for entering this discussion and will leave. I do not get a sense that the desire is to build a balanced view and come to a shared conclusion.
>
> -Johan
>

No, I'm not proposing expect; I'm saying that expect has been the
state of the art forever, and we're done with that now.
Using expect has always been an unsatisfying experience, but at least it
was rare. Now that this particular issue is more relevant than ever, and
C++ at least has acknowledged and accepted this, it's time we fix this too.



September 18

On Friday, 13 September 2024 at 11:18:59 UTC, Zoadian wrote:
> On Friday, 13 September 2024 at 10:57:56 UTC, Quirin Schroll wrote:
>> On Friday, 13 September 2024 at 10:26:48 UTC, Richard (Rikki) Andrew Cattermole wrote:
>> Let's say you have a big switch in a hot loop. You profiled, and now the data tells you how likely each branch was. What would you prefer? Reordering the branches by likelihood, leading to a diff that's basically impossible to understand or even vet as a mere reordering, or the pure addition of likelihood annotations, where the diff makes it absolutely clear that nothing else changes? And if there's a fallthrough, you have to jump to the right case now.
>
> I'd prefer handing the compiler a profile log, and the compiler just optimizing based on that file without the need to do any annotations by hand.
Sure, that’s way easier for devs. You just can’t do that if you also want to distribute the code with the branch hints.
