January 11
On Tuesday, 11 January 2022 at 02:22:06 UTC, Timon Gehr wrote:
> On 10.01.22 14:48, Mike Parker wrote:
>> 
>> 
>> This is the discussion thread for the first round of Community Review of DIP 1042, "ProtoObject":
>> 
>> https://github.com/dlang/DIPs/blob/2e6d428f42b879c0220ae6adb675164e3ce3803c/DIPs/DIP1042.md
>
> I like the fact that there will be a way out of the additional Object bloat without using extern(C++). (My preferred way out of this would still be to change Object.)
>
> However, I strongly dislike the new interfaces with their opinions about which qualifiers have to be on which methods. `const` prevents any kind of lazy initialization, and realistically, what's the use case of those interfaces?

I think that most code does not fall into that category. Moreover, the interfaces
are completely optional: you can define whatever interfaces with whatever
qualifiers you like. However, in the standard library we will provide interfaces
for the most common cases. For example, in the general case, comparing two items
should not allocate and should not throw; if you want to implement something more
esoteric, that's fine, you can define your own interface.
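
A minimal sketch of that option follows. The interface names and the `cmp` signature here are illustrative approximations of the discussion, not quoted from the DIP:

```d
// Sketch only: `Ordered` approximates a strictly-qualified standard
// interface; `LooseOrdered` is a hypothetical user-defined alternative.
interface Ordered
{
    // strict flavor: comparison may not allocate, throw, or mutate
    int cmp(scope const Object rhs) const @nogc nothrow @safe;
}

interface LooseOrdered
{
    // relaxed flavor: free to throw, allocate, or cache
    int cmp(Object rhs);
}

class Timestamp : LooseOrdered
{
    long ticks;

    int cmp(Object rhs)
    {
        auto other = cast(Timestamp) rhs;
        if (other is null)
            throw new Exception("cannot compare Timestamp with unrelated type");
        return ticks < other.ticks ? -1 : (ticks > other.ticks ? 1 : 0);
    }
}
```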
January 11

On Monday, 10 January 2022 at 14:27:31 UTC, 12345swordy wrote:
> On Monday, 10 January 2022 at 13:48:14 UTC, Mike Parker wrote:
>> This is the discussion thread for the first round of Community Review of DIP 1042, "ProtoObject":
>>
>> [...]
>
> A small nitpick regarding this DIP. Why the name ProtoObject, shouldn't it be something like TopObject?
> Also shouldn't it have no destructor by definition?
>
> - Alex

I personally don't care about the name; in any case, ProtoObject is pretty descriptive here.

January 11

On Tuesday, 11 January 2022 at 02:22:06 UTC, Timon Gehr wrote:
> On 10.01.22 14:48, Mike Parker wrote:
>> This is the discussion thread for the first round of Community Review of DIP 1042, "ProtoObject":
>>
>> https://github.com/dlang/DIPs/blob/2e6d428f42b879c0220ae6adb675164e3ce3803c/DIPs/DIP1042.md
>
> I like the fact that there will be a way out of the additional Object bloat without using extern(C++). (My preferred way out of this would still be to change Object.)
>
> However, I strongly dislike the new interfaces with their opinions about which qualifiers have to be on which methods. `const` prevents any kind of lazy initialization, and realistically, what's the use case of those interfaces?

I partly agree with this; forcing attributes in the interface is most painful for the programmer.

If every class used the attributes perfectly, this would be no problem, but @safe alone already shows that not all code is written with perfect typing in mind. (Half of all dub packages are not compatible with @safe as a default, and unlike @safe, the attributes here are not even on a path to becoming defaults.)

adr's recent patch for @safe class comparison shows that, given more type information (i.e. not using the topmost class type), it's easy to do this with attributes. However, this alone would fall short if the standard library suddenly started using the interfaces to accept arguments:

```d
void sort(Ordered[] objects);
```

Either this sort function can't be @nogc @safe nothrow pure, or all implementations everywhere in the D ecosystem need to force these attributes, even if they can't be properly used or need hacky workarounds. (I could imagine a custom class wanting to throw and/or allocate an exception, or to keep track of the number of comparisons.) It's pick your poison here.

I think D is missing the pieces required to make this DIP viable right now. For the example above, it would be best if Ordered didn't force any attributes and the sort function could instead infer all of them from the Ordered type used. This is doable with templates today, but then the DIP goes in the wrong direction by forcing all the attributes, and the question arises why you would be using classes for this at all.

If the interfaces and the rest of the DIP didn't force the attributes on the programmer, I would be more supportive of it.
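
To make the objection concrete, here is a hedged sketch (illustrative names, not from the DIP) of a comparator that cannot satisfy a `const pure nothrow` signature, because it tracks statistics and reports type mismatches:

```d
// Illustrative only: a key type whose comparison keeps a counter and
// may throw, so it cannot be const, pure, or nothrow.
class CountingKey
{
    int value;
    int comparisons;  // mutated on every call: rules out const and pure

    int cmp(Object rhs)  // may throw: rules out nothrow
    {
        ++comparisons;
        auto other = cast(CountingKey) rhs;
        if (other is null)
            throw new Exception("type mismatch");
        return value < other.value ? -1 : (value > other.value ? 1 : 0);
    }
}
```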

January 11

On Tuesday, 11 January 2022 at 06:25:34 UTC, Rumbu wrote:
> On Monday, 10 January 2022 at 18:09:21 UTC, russhy wrote:
>> Does the language not have enough to avoid OOP? Let's solve that first!
>
> Here we talk about "Object", which - I fail to understand the surprise - is about OOP.
>
> More than that, since there is a tight dependency between objects and the garbage collector, what improvement will a @nogc thrown here and there render? Currently there is little support in the language for using objects outside the GC; I really don't understand why limit the overrides to patterns that will be difficult to satisfy. In my opinion the only qualifier that must be put on these basic functions is @safe. Hashing can be a costly operation (think of files or streams); const will prevent any caching. Comparing strings, for example, can discover invalid Unicode characters, so why nothrow? Comparing timestamps may need reading the current time zone, so why pure?

But we are not imposing anything by switching to ProtoObject. ProtoObject is an empty class that gives you the ability to implement whatever you want. The utility functions we are providing are optional. If they don't cut it for your special case, you can just implement whatever interface you want.

January 11

On Tuesday, 11 January 2022 at 12:52:42 UTC, RazvanN wrote:
> On Tuesday, 11 January 2022 at 06:25:34 UTC, Rumbu wrote:
>> On Monday, 10 January 2022 at 18:09:21 UTC, russhy wrote:
>>> Does the language not have enough to avoid OOP? Let's solve that first!
>>
>> Here we talk about "Object", which - I fail to understand the surprise - is about OOP.
>>
>> More than that, since there is a tight dependency between objects and the garbage collector, what improvement will a @nogc thrown here and there render? Currently there is little support in the language for using objects outside the GC; I really don't understand why limit the overrides to patterns that will be difficult to satisfy. In my opinion the only qualifier that must be put on these basic functions is @safe. Hashing can be a costly operation (think of files or streams); const will prevent any caching. Comparing strings, for example, can discover invalid Unicode characters, so why nothrow? Comparing timestamps may need reading the current time zone, so why pure?
>
> But we are not imposing anything by switching to ProtoObject. ProtoObject is an empty class that gives you the ability to implement whatever you want. The utility functions we are providing are optional. If they don't cut it for your special case, you can just implement whatever interface you want.

The whole point of standard library vocabulary types is to provide common abstractions that the whole ecosystem can rely on.

If every D shop has to come up with its own interfaces because the ones offered by the DIP don't cut it, then what is the purpose of having them in the first place?

January 11

On Tuesday, 11 January 2022 at 13:45:35 UTC, Paulo Pinto wrote:
> The whole point of the standard library vocabulary types is to provide common abstractions that the whole ecosystem can rely on.
>
> If every D shop has to come up with their own interfaces because the ones offered by the DIP don't cut it, then what is the purpose of having them in the first place?

+1. Unless the runtime and standard library themselves intend to use the interfaces, I cannot see what useful purpose they serve. As-is, they could be cut from this DIP, and nobody would miss them.

January 11
> (let's not discuss here if that is desirable or not, but, such changes come with large overhead in terms of migration).

This is a convenient position for the DIP authors to take. Instead of accepting reality and evaluating alternatives, including the fact that no breaking change is actually necessary, they say let's just "not discuss" the merits.

The reality is that the only thing here that would actually require a breaking change is removing the monitor. The attributes work *today* and require no change at all. You could choose to remove them, but there's no actual need - the DIP's statement that they do more harm than good is false. (The only thing that is kinda bad about them is that `a < b` on Objects could be a compile error instead of a runtime error, so there is room for improvement, but this hasn't proven to be a problem in practice. I suppose the extra slots in the vtable are a potential space optimization, but since that's per class instead of per instance, it doesn't add up.)

Removing the monitor is a breaking change, but you have to get into the details of the migration path to evaluate the weight of that. Just saying "breaking change, no discussion" means nothing can happen - fixing any bug can be a breaking change if someone depended on it. Adding ANY symbol can be a breaking change due to name conflicts.

A breaking change with a clear migration path is a short-term annoyance that can be justified by the long-term benefit.

Adding another subtly different way of doing things is a long term annoyance and this cost needs to be considered.

Recently, in the chat room, someone asked what the difference between "method" and "function" is in __traits(isVirtualMethod) vs. isVirtualFunction. It looks to me like isVirtualFunction was simply buggy, and instead of fixing the bug, they added another trait that's the same except with the bug fixed and a new name.

That was added in February 2012. Now, here, nine years later, people are still wasting their time trying to figure out which one to use and why. (It would at least help a lot if the documentation explained it!)

In fact, here's the exact thing:

commit adb62254d26ab0b29f543f2562a55b331f4ef297
Author: Walter Bright <walter@walterbright.com>
Date:   Sun Jan 22 00:35:46 2012 -0800

    fix Issue 1918 - __traits(getVirtualFunctions) returns final functions


https://issues.dlang.org/show_bug.cgi?id=1918


Bug filed in 2008. In 2012, something was finally done:

"Documenting the agreement reached with Walter:

We'll define __traits(getVirtualMethods) to do the "right" thing, i.e. only include final methods that actually override something, and put __traits(getVirtualFunctions) on the slow deprecation path."


Well, there's been no such slow deprecation path. I had to go hunting quite a bit to find this history to explain to the poor new user why getVirtualFunctions returned non-virtual functions as well as virtual methods and why getVirtualMethods and isVirtualMethod are the only reference to "method" in the traits documentation.

If there was a deprecation path, perhaps this could have been avoided... but then you're deprecating it anyway. Might as well just do it and move on.

With a trait, there's more of an argument to leave things alone, since it's hard for reflection code to catch subtle differences during migration. It can certainly be done - the compiler could detect a case where the answer changed and call it out as it happens. For example, here it could see __traits(getVirtualFunctions) and say "Thing.foo is final, which was returned by getVirtualFunctions until version 2.058, but is excluded now. If you need that, use __traits(allMembers) instead. You can use __traits(isVirtualFunction) || __traits(isFinalFunction) to filter it to the same set the old getVirtualFunctions returned. Or if you are getting the behavior you want now, pass -silence=your.module:1918 or import core.compiler; and add @compatible(2058) to your module definition to suppress this warning."

Yeah, it is a little wordy, but that'd tell the user exactly what to do to make an informed choice. Odds are, they assumed getVirtualFunctions actually, you know, got virtual functions, so this warning is likely bringing a bug to their attention, not breaking code.

And then, when January 2022 comes around and people are on dmd 2.098, there's no more mystery. No more support burden.

Again, this is one of the hardest cases, fixing a user-visible bug in a reflection routine. And it can be done providing a long term benefit.

That's the question we have: short-term minor code adjustments, or long-term job opportunities for Scott Meyers, Gary Bernhardt, and friends?


Similarly, let's talk about the monitor. I personally feel a big "meh" about an extra pointer in class instances. I actually use `synchronized(obj)` a decent amount anyway, so the benefits are real for me and the cost is irrelevant.

But what are the real options we have here?

1) Leave things the way they are.

2) Implicitly add a monitor to classes that use `synchronized(this)` so they keep magically working, but remove it from others. An outside `synchronized(obj)` would be a deprecation warning if it hasn't opted in until the period where it becomes a full error. People can always version-lock their compiler if they can't handle the warning.

3) Deprecate `synchronized(obj)` entirely in favor of `synchronized(explicit_mutex)`, no magic, it tells you to migrate. If someone misses the migration window, it becomes a hard build error with the same fix still spelled out - just use an explicit mutex instead. People can always version-lock their compiler if they can't handle the error.

Note that in both cases, you can add a Mutex *today*, so libraries would be compatible with both old and new compilers if they follow the migration path, with no need for version statements or anything else; it really is a painless process.
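
As a sketch of that migration (assuming `core.sync.mutex.Mutex` from druntime, with a hypothetical class name), the change is mechanical:

```d
import core.sync.mutex : Mutex;

class Cache
{
    private Mutex mtx;

    this()
    {
        mtx = new Mutex;  // explicit mutex can be added today
    }

    void update()
    {
        synchronized (mtx)  // was: synchronized (this)
        {
            // ... protected state changes ...
        }
    }
}
```

This compiles on current compilers, and would keep working unchanged if the implicit monitor were later removed.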

4) Abandon the Object class that everyone actually uses in favor of a new ProtoObject class. Have to explain to people at least nine years later why Object is there but subtly different from ProtoObject. All future code will have to write `class Foo : ProtoObject` instead of `class Foo` in perpetuity to get the benefits. The lazy author, or the new user who doesn't know any better (since the default is "wrong"), will never see the new benefits.

Libraries that do use ProtoObject will require special attention to maintain compatibility with older compilers. At least a `static if(__VERSION__ < 2100) alias ProtoObject = Object;` as a kind of polyfill shim. Since this isn't the default, good chance various libraries just won't do it and the ones that do now risk an incompatibility in the dependency tree. End user applications stop building anyway.

DLF propagandists will have to spend most of their days (unsuccessfully) fighting off comments on Reddit and Hacker News about how D is a verbose joke of a has-been language.



Like I said, I use synchronized(thing) somewhat often. A few of them are actually already mutexes, but I see 42 instances of it in the arsd repo and I have a few in my day job proprietary codebase too. By contrast, I have over 600 class definitions. Which is easier to update, 42 or 600?

Nevertheless, I lean toward option #3.

This would give the most benefit to the most people:

* New users find things just work
* classes that don't need the monitor automatically get the enhancement, no effort required by the author. It continues to just work on old versions with graceful degradation again at no effort.
* classes that DO need the monitor are given a very simple migration path with both backward and forward compatibility. The code requires only a trivial modification and then they actually get a little more clarity and control over the tricky synchronization code thanks to the explicitness.


I'm directly responsible for over 200,000 lines of D code, including one of the most popular library collections and several proprietary applications. I'm also one of the most active front-line support representatives for D and thus regularly field questions from new users.

No matter which choice is made, it is going to impact me. Having to change ~50 of >200,000 lines of code, with the compiler telling me exactly where and what to do (which will take me about an hour), is *significantly less pain* than dealing with the consequences, both long and short term, of this DIP's solution.

And the monitor is the most valuable part of the DIP. The rest of it is even worse.


No wonder the authors say to "not discuss here if that is desirable". They're clearly on the losing side of any rational discussion.
January 11
On Tuesday, 11 January 2022 at 12:48:00 UTC, WebFreak001 wrote:
> I partly agree with this, forcing attributes in the interface is most painfully for the programmer.

Yeah, if you have a flexible interface, implementations can always tighten it, but you can't go the other way around.

The good news is you can use @trusted on an implementation to bridge the gap.

> However this alone would fall short here if standard library stuff would suddenly use interfaces to accept arguments:
>
> ```d
> void sort(Ordered[] objects);
> ```

Yeah, the best you can do is template it or offer overloads. Kinda like the status quo with opApply... but you could do something like:

```d
void sort(Ordered[] objects) { /* impl */ }

void sort(SafeOrdered[] safeObjects) @trusted {
    sort(cast(Ordered[]) safeObjects);
}
```

That's not too terrible but just like with opApply if you wanna do nogc and pure and friends too, the combinations multiply out pretty quickly. Templating it on the most derived common type can solve this... but it is still a little bit tricky to do the proper introspection to cast to the correct attributes when you forward to the un-attributed function (like the `@trusted` in the example). Some reflection code could probably do it....

```d
void sortImpl(Ordered[] objects) {}

void sort(T)(T objects) {
    conditionalForward!sortImpl(objects);
}

auto conditionalForward(alias toWhat, T...)(T args) {
    // only worried about the actual implementation of the things actually in the interface
    static if (allSatisfy!(isNoGc, commonMethods!(Parameters!toWhat, T)))
        return cast(@nogc) toWhat(args); // pseudocode: D has no attribute casts
    else static if (/* pure && nogc */) { /* ... */ }
    else static if (/* pure */) { /* ... */ }
    else static if (/* all the combinations */) { /* ... */ }
}
```

What an ugly af conditionalForward, but such a thing should be possible.

Of course, whether you'd gain much in terms of compile-speed benefits etc. by using dynamic dispatch after doing all that reflection would need tests. I'm not sure. It would at least reduce to a single instance of the actual sort function in the binary, so I actually think it *would* come out ahead. Assuming users are OK with dynamic dispatch anyway.


The fact is there's just some conflict between the static verification attributes provide and the dynamic dispatch interfaces provide. You CAN bridge it over, but there's just some fundamental conflict between them.

The DIP doesn't address this conflict. It just discards the current flexibility entirely in actual practice. Worth noting that if you want to discard the current flexibility entirely, you can already do that today in a child class. Remember, you're always allowed to tighten requirements in subtypes.

January 11
On Tuesday, 11 January 2022 at 06:25:34 UTC, Rumbu wrote:
> More than that, since it is a tight dependency between objects and the garbage collector, what improvement will render a @nogc thrown here and there? Currently there is little support în the language for using objects outside gc

Using objects without the GC is very easy (just put the scope keyword on the variable declaration).
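
For instance (a minimal sketch), a `scope` class variable is placed on the stack instead of the GC heap and is destroyed deterministically at the end of the scope:

```d
class Point
{
    int x, y;
    this(int x, int y) { this.x = x; this.y = y; }
}

void useWithoutGC()
{
    scope p = new Point(1, 2);  // stack-allocated, no GC heap allocation
    assert(p.x == 1);
}   // p is destroyed here, at scope exit
```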
January 11

On Tuesday, 11 January 2022 at 15:18:00 UTC, Adam D Ruppe wrote:

> 1) Leave things the way they are.

It should be possible for an improved compiler backend to remove the mutex if it is never used. In many cases it will be fairly easy to deduce.

Monitors are actually the easiest concurrency primitive for newbies to deal with, so getting rid of the per-object mutex only as an optimization might be a good solution.

But it requires a strategy of making the compiler more modern.
