Adam D Ruppe 
Posted in reply to Mike Parker
> (let's not discuss here if that is desirable or not, but, such changes come with large overhead in terms of migration).
This is a convenient position for the DIP authors to take. Instead of facing reality and evaluating the alternatives - including the fact that no breaking change is actually necessary - they just say let's "not discuss" the merits.
The reality is that the only thing here which would actually require a breaking change is removing the monitor. The attributes work *today* and require no change at all. You could choose to remove them, but there's no actual need - the DIP's statement that they do more harm than good is false. (The only thing that is kinda bad about them is that `a < b` on two Objects could be a compile error instead of a runtime error, so there is room for improvement, but this hasn't proven to be a problem in practice. I suppose the extra slots in the vtable are a potential space optimization, but since that's per class instead of per instance, it doesn't add up to much.)
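To illustrate that runtime-error wart, here's a minimal sketch (the variable names are made up, the behavior is today's):

```d
void main()
{
    Object a = new Object;
    Object b = new Object;
    auto r = a < b; // compiles fine today; the root Object.opCmp
                    // just throws, so this fails at runtime instead
}
```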
Removing the monitor is a breaking change, but you have to get into the details of the migration path to evaluate the weight of that. Just saying "breaking change, no discussion" means nothing can happen - fixing any bug can be a breaking change if someone depended on it. Adding ANY symbol can be a breaking change due to name conflicts.
Breaking changes with a clear migration path are a short-term annoyance that can be justified by the long-term benefit.
Adding another subtly different way of doing things is a long-term annoyance, and that cost needs to be considered too.
Recently, on the chat room, someone asked what the definition of "method" vs "function" is in __traits(isVirtualMethod) vs isVirtualFunction. It looks to me like isVirtualFunction was simply buggy, and instead of fixing the bug, they added another one that's the same except for the fixed bug and a new name.
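For reference, here's my understanding of the difference, boiled down to a compileable sketch (the class and method names are made up):

```d
class C
{
    void virt() {}      // ordinary virtual method
    final void fin() {} // final and overrides nothing, so not actually virtual
}

// the two traits only disagree on final functions that override nothing:
pragma(msg, __traits(isVirtualFunction, C.fin)); // true  - the old, buggy notion
pragma(msg, __traits(isVirtualMethod,  C.fin)); // false - the fixed one

// on everything else they agree:
pragma(msg, __traits(isVirtualFunction, C.virt)); // true
pragma(msg, __traits(isVirtualMethod,  C.virt)); // true
```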
The new trait was added in February 2012. Now, here, nine years later, people are still wasting their time trying to figure out which one to use and why. (It would at least help a lot if the documentation explained it!)
In fact, here's the exact thing:
```
commit adb62254d26ab0b29f543f2562a55b331f4ef297
Author: Walter Bright <walter@walterbright.com>
Date:   Sun Jan 22 00:35:46 2012 -0800

    fix Issue 1918 - __traits(getVirtualFunctions) returns final functions
```

https://issues.dlang.org/show_bug.cgi?id=1918
Bug filed in 2008. In 2012, something was finally done:
"Documenting the agreement reached with Walter:
We'll define __traits(getVirtualMethods) to do the "right" thing, i.e. only include final methods that actually override something, and put __traits(getVirtualFunctions) on the slow deprecation path."
Well, there's been no such slow deprecation path. I had to go hunting quite a bit to find this history, just to explain to the poor new user why getVirtualFunctions returned non-virtual functions as well as virtual methods, and why getVirtualMethods and isVirtualMethod are the only references to "method" in the traits documentation.
If there was a deprecation path, perhaps this could have been avoided... but then you're deprecating it anyway. Might as well just do it and move on.
With a trait, there's more of an argument to leave things alone, since it is hard for reflection code to catch subtle differences during a migration. But it can certainly be done - the compiler could detect a case where the answer changed and call it out as it happens. For this one, it could see __traits(getVirtualFunctions) and say:

> Thing.foo is final, which was returned by getVirtualFunctions until version 2.058, but is excluded now. If you need that, use __traits(allMembers) instead. You can use __traits(isVirtualFunction) || __traits(isFinalFunction) to filter it to the same set the old getVirtualFunctions returned. Or, if you are getting the behavior you want now, pass -silence=your.module:1918 or import core.compiler; and add @compatible(2058) to your module definition to suppress this warning.
Yeah, it is a little wordy, but that'd tell the user exactly what to do to make an informed choice. Odds are, they assumed getVirtualFunctions actually, you know, got virtual functions, so this warning is likely bringing a bug to their attention, not breaking code.
And then, when January 2022 comes around and people are on dmd 2.098, there's no more mystery. No more support burden.
Again, this is one of the hardest cases - fixing a user-visible bug in a reflection routine - and even it can be done in a way that provides the long-term benefit.
That's the question we have: short-term minor code adjustments, or long-term job opportunities for Scott Meyers, Gary Bernhardt, and friends?
Similarly, let's talk about the monitor. I personally feel a big "meh" about an extra pointer in class instances. I actually use `synchronized(obj)` a decent amount anyway, so the benefits are real for me and the cost is irrelevant.
But what are the real options we have here?
1) Leave things the way they are.
2) Implicitly add a monitor to classes that use `synchronized(this)` so they keep magically working, but remove it from everything else. An outside `synchronized(obj)` on a class that hasn't opted in would trigger a deprecation warning, which becomes a full error once the deprecation period ends. People can always version-lock their compiler if they can't handle the warning.
3) Deprecate `synchronized(obj)` entirely in favor of `synchronized(explicit_mutex)` - no magic, the deprecation message tells you to migrate. If someone misses the migration window, it becomes a hard build error with the same fix still spelled out: just use an explicit mutex instead. People can always version-lock their compiler if they can't handle the error.
Note that in both cases 2 and 3, you can add a Mutex *today*, so libraries that follow the migration path would be compatible with both old and new compilers, with no need for version statements or anything else (see the sketch after this list); it really is a painless process.
4) Abandon the Object class that everyone actually uses in favor of a new ProtoObject class. Then we get to explain to people, at least nine years later, why Object is still there but subtly different from ProtoObject. All future code will have to write `class Foo : ProtoObject` instead of `class Foo`, in perpetuity, to get the benefits. The lazy author, or the new user who doesn't know any better (since the default is "wrong"), will never see them.
Libraries that do use ProtoObject will require special attention to stay compatible with older compilers - at least a `static if(__VERSION__ < 2100) alias ProtoObject = Object;` as a kind of polyfill shim. Since this isn't the default, there's a good chance various libraries just won't bother, and the ones that do still risk an incompatibility somewhere in the dependency tree. End-user applications stop building anyway.
DLF propagandists will have to spend most of their days (unsuccessfully) fighting off comments on Reddit and Hacker News about how D is a verbose joke of a has-been language.
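Back to the migration path in options 2 and 3: to make it concrete, here's roughly what the explicit-mutex version of a class looks like. The `Worker` class and its members are made up for illustration, but `core.sync.mutex.Mutex` is the real druntime class:

```d
import core.sync.mutex;

class Worker
{
    private Mutex mtx; // explicit mutex: works on old and new compilers alike
    private int counter;

    this()
    {
        mtx = new Mutex();
    }

    void doWork()
    {
        // was: synchronized(this) { ... }
        synchronized (mtx) // locks the explicit mutex instead of the hidden monitor
        {
            counter++;
        }
    }
}
```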
Like I said, I use synchronized(thing) somewhat often. A few of them are actually already mutexes, but I see 42 instances of it in the arsd repo and I have a few in my day job proprietary codebase too. By contrast, I have over 600 class definitions. Which is easier to update, 42 or 600?
Nevertheless, I lean toward option #3.
This would give the most benefit to the most people:
* New users find things just work
* Classes that don't need the monitor automatically get the enhancement, with no effort required from the author. The code also continues to just work on old compilers - graceful degradation, again at no effort.
* Classes that DO need the monitor get a very simple migration path with both backward and forward compatibility. The code requires only a trivial modification, and then the author actually gets a little more clarity and control over the tricky synchronization code, thanks to the explicitness.
I'm directly responsible for over 200,000 lines of D code, including one of the most popular library collections and several proprietary applications. I'm also one of the most active front-line support representatives for D and thus regularly field questions from new users.
No matter which choice is made, it is going to impact me. Having to change ~50 out of >200,000 lines of code, with the compiler telling me exactly where and what to do (about an hour of work), is *significantly less pain* than dealing with the consequences, both short and long term, of this DIP's solution.
And this monitor business is the most valuable part of the DIP; the rest of it is even worse.
No wonder the authors say to "not discuss here if that is desirable". They're clearly on the losing side of any rational discussion.