November 27

On Sunday, 27 November 2022 at 03:28:06 UTC, cc wrote:

> Having to copy/paste:
>
>     alias A = AttackType;
>     alias W = WeaponType;
>     alias M = MovementType;
>
> is unideal. And the broader the scope they're declared in, the more likely they are to conflict with other shortened alias names.
>
> And then when you have, say, enum EntityState and enum EnemySprite, what then? alias ETS, alias ESPR... just further increasing symbolic clutter and cognitive load.

Indeed here we hit the whole rationale of the DIP. Current enum syntax in modern languages like D, modern C++, C#, etc. departs from C on purpose. I see how it can be tedious when importing C enums whose members are named in compensation, e.g. IMG_INIT.IMG_INIT_jpg. In these or other exceptional cases aliasing can help.

But as a general practice, shortening names for the sake of shortness is very bad practice. This point is more general than enums, and is well established in the literature. Certainly in the company where I work, if I use non-self-explanatory or abbreviated names (e.g. A, W, E), my pull request won't be approved until I make them readable. I apply the same consideration when reviewing other people's PRs.

As I said in my first reply, the DIP provides no rationale other than that the current syntax "can be tedious", without providing any use case.

It looks to me that the actual rationale of this change (other than dealing with badly named enums imported from C) is to enable unreadable coding styles, and to mortgage the language and the compiler for it.

November 27

On Sunday, 27 November 2022 at 07:48:48 UTC, XavierAP wrote:

> But as a general practice, shortening names for the sake of shortness is very bad practice.

Particularly when you shorten it to an ambiguous $, or just remove the name altogether.
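For context: `$` already has a meaning in D inside index and slice expressions, where it stands for the length of the array being indexed, which is part of why overloading it reads ambiguously. A minimal sketch:

```d
void main()
{
    int[] arr = [10, 20, 30];
    auto last = arr[$ - 1];  // here $ means arr.length
    auto tail = arr[1 .. $]; // slice up to the end of the array
}
```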

November 27

On Sunday, 27 November 2022 at 07:48:48 UTC, XavierAP wrote:


> As I said in my first reply, the DIP provides no rationale other than that the current syntax "can be tedious", without providing any use case.

Wrong. I even personally provided use cases in replies to your threads; you seem to ignore them.

C++/C# are not "modern" languages; they are old C-oriented languages. C# repeats the same mistake as C++: they failed to improve C enums and instead made them worse, more verbose and less flexible. Neither even has pattern matching (C# gained it only recently).

Thankfully, the languages that stray away from C did enums and designated initialization of types right! Swift and Zig, and to a lesser extent Kotlin, for example.

November 27

On Sunday, 27 November 2022 at 03:17:45 UTC, cc wrote:

> $_ is the default variable in perl (or was). I can potentially envision people using _ as an iterator, match result, bitfield padding variable, etc. Whether that's a big enough issue worth worrying about, eh.

Fair point, we would have to see how much code uses it. I don't think it's common in D.

> Though it does seem odd to have to make the exception "_ is a word character, you can use it in any symbol name, except by itself." Suppose that's essentially the same situation for keywords, though.

I wasn't necessarily arguing that underscore should not work as a variable name, just that accessing _.foo should try to match an enum member when an enum member is expected rather than a member of whatever _ would otherwise evaluate to. I doubt that would break code.

November 27

On Saturday, 26 November 2022 at 11:14:05 UTC, XavierAP wrote:

> But... Why then not go one step further, and leave it up to each one's private code to
>
>     alias _ = EType;
>
> when desired, and leave the language and backwards compatibility alone?

Firstly, I don't think it would break backwards compatibility in practice; see my reply to cc.

Secondly, alias or with often does not help at all when you want type inference, and type inference already works with function literals:

enum LongName
{
	a,b,c
}

LongName e;
void function(int) f;

void main()
{
	f = (int i){};
	f = (i){}; // infer parameter type

	e = LongName.a;
	e = $a; // infer parent type
}
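For comparison, the closest tool available today is `with`, which brings the enum's members into scope explicitly but has to wrap the use site rather than being inferred from the assignment target. A sketch reusing `LongName` from above:

```d
enum LongName
{
	a, b, c
}

void main()
{
	LongName e;

	// Today: `with` scopes the member lookup explicitly...
	with (LongName)
	{
		e = a; // resolves to LongName.a
	}

	// ...whereas under the DIP the target type would drive the lookup:
	// e = $a;
}
```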

November 27

On Wednesday, 23 November 2022 at 13:41:12 UTC, ryuukk_ wrote:

> // verbose API, readable
> struct SuperLongStruct
> {
>     struct SuperLongInnerStruct
>     {
>         enum SuperLongEnum
>         {
>             VALUE_A, VALUE_B
>         }
>
>         SuperLongEnum super_long_flags;
>     }
>     SuperLongInnerStruct some_data;
> }

Isn't this design artificially conceived to shoehorn the proposed change in as a solution? Such nesting would probably never be advisable: too convoluted and coupled. And in this example (though I understand it's a toy example) there is no reason for nesting: no hiding, and no access of private fields by the nested types.

Certainly the enum in this design has no reason to be nested other than to namespace it into the other types; and then avoiding having to express this (seemingly absurd but purposeful) namespacing is argued as the reason for a language change. Isn't it more adequate in this example simply to un-nest the declaration?

November 27

On Sunday, 27 November 2022 at 21:58:00 UTC, XavierAP wrote:


Ok so you disagree just to disagree, that's unfortunate

Here is a snippet of code from my project, if you want one: https://forum.dlang.org/post/bapjpjlbljjhhwcworvz@forum.dlang.org

ctx.network_state = State.NetworkState.CONNECTED;

// vs

ctx.network_state = .CONNECTED;

I don't do OOP or any of that encapsulation crap.

If you don't see the code above as an improvement, then you can keep using the old behavior; nothing will change for you. As for me, I will keep advocating for it.

And no, I will not subscribe to a XavierAP_CODE_STYLE_GUIDELINE.md document.

November 27

On Sunday, 27 November 2022 at 23:13:39 UTC, ryuukk_ wrote:

And you cut off the most interesting part of the code example you quoted; I'll post it again for clarity:

from: https://forum.dlang.org/post/wrcrrmjgvrckyyvwfxec@forum.dlang.org

// verbose API, readable
struct SuperLongStruct
{
    struct SuperLongInnerStruct
    {
        enum SuperLongEnum
        {
            VALUE_A, VALUE_B
        }

        SuperLongEnum super_long_flags;
    }
    SuperLongInnerStruct some_data;
}

// oh shoot

SuperLongStruct super_long_struct = {
    some_data: {
        super_long_flags: SuperLongStruct.SuperLongInnerStruct.SuperLongEnum.VALUE_A | SuperLongStruct.SuperLongInnerStruct.SuperLongEnum.VALUE_B
    }
};

// oh nice!

SuperLongStruct super_long_struct = {
    some_data: {
        super_long_flags: .VALUE_A | .VALUE_B
    }
};

November 28

On Sunday, 27 November 2022 at 21:58:00 UTC, XavierAP wrote:

> (seemingly absurd but purposeful)

Why is that absurd?

This:

struct Entity
{
    enum Kind
    {
    }

    Kind entity;
}

instead of this:

enum EntityKind
{
}

struct Entity
{
    EntityKind entity;
}

is a common practice.

If the inner definitions become too long, it does make sense to move them to the outer scope (and then optionally alias inside the inner scope).
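That arrangement can be sketched as follows (member names here are hypothetical): the enum lives at module scope, and an alias inside the struct keeps the short name available:

```d
// Definition at module scope: no deep qualification needed elsewhere.
enum EntityKind
{
    player, enemy
}

struct Entity
{
    alias Kind = EntityKind; // short name preserved inside (and via) Entity
    Kind kind;
}
```

Callers can then write either EntityKind.player or Entity.Kind.player, whichever reads better at the use site.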

November 28

On Sunday, 27 November 2022 at 23:13:39 UTC, ryuukk_ wrote:

> Ok so you disagree just to disagree, that's unfortunate
>
> Here is a snippet of code from my project, if you want one: https://forum.dlang.org/post/bapjpjlbljjhhwcworvz@forum.dlang.org
>
>     ctx.network_state = State.NetworkState.CONNECTED;
>
>     // vs
>
>     ctx.network_state = .CONNECTED;

I didn't mean to be dismissive. And for the record, I'd be OK with adding to D most kinds of inference, as long as it didn't require new syntax (such as $, or overloading the leading .) and was 100% sure and safe. This might be possible for enums, at least in initializations or even assignments (?), but it's not the current DIP (and there are probably other difficulties even with this).

All I'm saying is: nesting your types has no effect other than namespacing them (since there's no private data in between), and so requiring yourself to qualify this namespacing. This is your choice, and purely stylistic as far as I can tell. Then you want to be able to write code without this namespacing, and for this you suggest a lexical change. But this particular use case could be solved with a different stylistic design choice when declaring your types; so I don't see the need for the language change.

This is my main point: the DIP discussion should start on the concrete problem(s) and use cases that it's supposed to solve:

  • enums imported from C
  • nested enums
  • long/bad names
  • ...?

The discussion should not be so subjective or revolve around what styles it enables or what ones different people prefer.