April 12, 2019
On 4/12/19 12:32 AM, Mike Franklin wrote:
> On Friday, 12 April 2019 at 04:19:19 UTC, Mike Franklin wrote:
> 
>> And what do you have to say about this gem? https://github.com/dlang/dmd/blob/4c65b8726a66314e12ad1cad66b099d358e7165d/src/dmd/optimize.d#L62-L65 
>>
> 
> ```
> Expression nullReturn()
> {
>      return null;
> }
> 
> return nullReturn;
> ```
> 
> That leaves me dumbfounded every time I see it.

Well, it's a local function, which drastically restricts its clowniness. In all likelihood it appeared as a result of refactoring gotos away. It could safely go, but either way its removal won't sensibly change the state of affairs.

The other you mentioned seems to be a debugging hook, which seems reasonable.

At any rate, we need to make progress from the pot-calling-the-kettle-black stance to a net positive gradient, e.g. pull requests that improve the codebase.
April 12, 2019
On Friday, 12 April 2019 at 06:14:05 UTC, Nick Sabalausky (Abscissa) wrote:
> Pardon me here, but: Bullshit.
> [snip]

Well said.
April 12, 2019
On Friday, 12 April 2019 at 11:03:21 UTC, Walter Bright wrote:
> On 4/11/2019 11:14 PM, Nick Sabalausky (Abscissa) wrote:
>> But here's the problem: That's your superpower. It's enviable, it's impressive, but it's absolutely NOT shared by the vast majority of GOOD programmers, let alone rank-and-file ones.
>
> Knowing what (x & 1) is is a superpower? What about (x + x) ?

It's not about not knowing; it's that it takes more time to decipher. People do waaaay more math than bit operations (which is why hash functions are still black magic to me).
April 12, 2019
On 4/12/2019 6:31 AM, Timon Gehr wrote:
> "Who's the clown who's put that there? Don't D programmers know about the array index operator?!"

front() exists for arrays as an adapter so one can use range operations on them.
April 13, 2019
On 4/12/19 7:03 AM, Walter Bright wrote:
> On 4/11/2019 11:14 PM, Nick Sabalausky (Abscissa) wrote:
>> But here's the problem: That's your superpower. It's enviable, it's impressive, but it's absolutely NOT shared by the vast majority of GOOD programmers, let alone rank-and-file ones.
> 
> Knowing what (x & 1) is is a superpower? 

No. Like I said, I know what it is too, and so do tons of programmers. But there's a big difference between possessing that knowledge vs having it internalized so deeply that a mere glance mentally registers as "is this an odd/even number". The former is common, and such people have to do a mental translation step (even if they're not always aware they're doing it). The latter, where no mental translation is even needed, does occur, but it's not especially common, even among good programmers.

Naturally I can't say this for sure, but based on your arguments, it sounds like you may very well be in the latter, uncommon category, meaning you would possess the genuine benefit of lacking a mental distinction between "& 1" and "evenness". In effect, it means you see straight through to the core truth (analogous to Rain Man's counting, hence the allusion to "superpowers"). If so, then that's fantastic, and I can certainly understand why you would balk at an isOdd/etc. It's just like how I would balk at anyone wrapping "x++" syntax with an "x.increment()" function. Like most with C-like experience, I look at "x++" and I instinctually "see" an increment, thus abstracting it would be pointless.

But understand that such lack of mental distinction between "& 1" and "evenness" isn't common, even among good programmers with low-level experience. And that distinction is exactly what high-level languages are all about: keeping high-level intent separate from mechanical implementation in order to cope with the human brain's difficulty at handling different levels of abstraction simultaneously.

If for you, they're the same level of abstraction, that's great. For most, it isn't.
April 13, 2019
On 4/12/19 2:33 PM, Nicholas Wilson wrote:
> On Friday, 12 April 2019 at 11:03:21 UTC, Walter Bright wrote:
>>
>> Knowing what (x & 1) is is a superpower? What about (x + x) ?
> 
> It's not about not knowing; it's that it takes more time to decipher. People do waaaay more math than bit operations (which is why hash functions are still black magic to me).

I'd argue it's not really even about the time it takes. It's about the cognitive load. (Though that does admittedly translate into additional time, among other things.)

...

Come to think of it, just a random stray thought here...I think it would benefit the programming world if the relationship between code, languages and cognitive load (and heck, just the psychology of code in general), plus why cognitive load matters, were incorporated directly into programming instruction.

Really, if you think about it, real-world programming is *all about* managing the interplay between logic and psychology. I think we would do well to incorporate the psychological aspects in teachings just like we do the logic side. (Maybe then, things like dynamic languages and silver-bullet "everything-is-a-..." languages (and...PHP...) would gain less traction in the first place.)
April 12, 2019
On 4/12/2019 9:15 PM, Nick Sabalausky (Abscissa) wrote:
> If for you, they're the same level of abstraction, that's great. For most, it isn't.

If you want to write a library of such, and contribute it to Dub, feel free. If you find an audience for it, great.
April 13, 2019
On 4/12/19 7:03 AM, Walter Bright wrote:
> 
> If you want isOdd() for your own personal library, feel free. Putting it into Phobos, however, is not happening.

To be clear, I'm not pushing for that. Granted, I'd have zero objection if it did get in, and I'd even applaud it as a nice benefit for "D as a first language" and for other less-than-expert devs. But I'm certainly not pushing for it. I just disagree on a philosophical level with your objection to it. That's all.
April 12, 2019
On Sat, Apr 13, 2019 at 12:15:02AM -0400, Nick Sabalausky (Abscissa) via Digitalmars-d wrote:
> On 4/12/19 7:03 AM, Walter Bright wrote:
[...]
> > Knowing what (x & 1) is is a superpower?
[...]
> Naturally I can't say this for sure, but based on your arguments, it sounds like you may very well be in the latter, uncommon category, meaning you would possess the genuine benefit of lacking a mental distinction between "& 1" and "evenness". In effect, it means you see straight through to the core truth (analogous to Rain Man's counting, hence the allusion to "superpowers"). If so, then that's fantastic, and I can certainly understand why you would balk at an isOdd/etc. It's just like how I would balk at anyone wrapping "x++" syntax with an "x.increment()" function. Like most with C-like experience, I look at "x++" and I instinctually "see" an increment, thus abstracting it would be pointless.

Personally, even though I understand perfectly well what (x & 1) means,
I much rather write it as (x % 2) instead and let the optimizer
implement it as (x & 1), because that conveys intent better.  I consider
using (x & 1) for testing evenness/oddness akin to writing gotos in
place of loops.  It's semantically equivalent, but IMO at the wrong
level of abstraction.


> But understand that such lack of mental distinction between "& 1" and "evenness" isn't common, even among good programmers with low-level experience. And that distinction is exactly what high-level languages are all about: keeping high-level intent separate from mechanical implementation in order to cope with the human brain's difficulty at handling different levels of abstraction simultaneously.
[...]

I don't find it difficult to parse (x & 1) as testing for odd/even, but
I do find it distasteful the same way most people would find writing
gotos instead of loops or if-statements distasteful. `&` is a bitwise
operator, and as such has different connotations from (x % 2), which is
more obviously equivalent to the mathematical concept of odd/even.
The machine doesn't care about the difference -- and in the old days, &
was preferred because it produced better machine code, though nowadays
any non-joke compiler would optimize away the difference -- but writing
code isn't merely telling the machine what to do; it's also
documenting to the next human reader what you intended the machine to
do.  For the latter purpose, writing (x % 2) conveys intent much better
than (x & 1).
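
To illustrate the point (a minimal sketch; the helper names are mine, not from the thread): as a *test for oddness* on unsigned operands the two spellings agree on every input, so the choice between them is purely one of communicating intent, and any reasonable optimizer emits the same instruction for either.

```c
#include <assert.h>

/* Two spellings of the same predicate. For unsigned operands they are
   interchangeable; a modern compiler reduces the % form to the & form. */
static int is_odd_mod(unsigned x) { return x % 2 != 0; }
static int is_odd_and(unsigned x) { return (x & 1) != 0; }
```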

OTOH, writing an entire function to abstract away even/oddness is going a bit too far. That'd be like writing a function ifThen(condition, trueBranch, falseBranch) that abstracts away if-statements -- it's redundant and needless complexity for something already expressible with built-in constructs. Such would be justifiable only if you're doing something like runtime metaprogramming or writing an interpreter where you need to parcel away built-in constructs into runtime-manipulable objects.


T

-- 
Question authority. Don't ask why, just do it.
April 12, 2019
On 4/12/2019 10:18 PM, H. S. Teoh wrote:
> writing (x % 2) conveys intent much better
> than (x & 1).

They produce different code. Try this with your favorite C compiler:

int foo(int x)
{
    return x % 2;
}

int bar(int x)
{
    return x & 1;
}
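
(A sketch of why the compiler can't treat the two bodies as identical, for anyone who doesn't have a compiler handy: C's `%` truncates toward zero, so a negative odd operand yields -1, while `& 1` on a two's-complement machine yields 1. The extra sign-fixup instructions in `foo` are the difference. The function names below are mine.)

```c
#include <assert.h>

/* For signed int the two expressions disagree on negative odd inputs:
   -3 % 2 == -1 (truncation toward zero), but -3 & 1 == 1 (two's
   complement), so the compiler must emit sign-handling code for %. */
int mod2(int x) { return x % 2; }
int and1(int x) { return x & 1; }
```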