June 04, 2013
On 4 June 2013 03:06, David Nadlinger <code@klickverbot.at> wrote:

> On Monday, 3 June 2013 at 16:25:24 UTC, Manu wrote:
>
>> You won't break every single method, they already went through that recently when override was made a requirement. […] A much
>>
>> smaller number than the breakage
>> which was gladly accepted recently. […] how did the override
>> change ever get accepted […]
>>
>
> It appears as if either you have an interesting definition of "recently", or you are deliberately misleading people by bringing up that point over and over again.
>
> According to http://dlang.org/changelog.html,
> omitting "override" produced a warning since D 2.004, which was released
> back in September 2007! Granted, it was only actually turned from a
> deprecation warning into an actual deprecation in 2.061 (if my memory
> serves me right), but it's mostly a flaw in the handling of that particular
> deprecation that it stayed at the first level for so long. The actual
> language change was made – and user-visible – almost six (!) years ago,
> which is a lot on the D time scale.
>

Ah, sorry, I didn't realise that. I only recall conversations arising when it actually became deprecated, and people had to change their code.

> You are also ignoring the fact that in contrast to requiring "override",
> there is no clean deprecation path for your proposal, at least as far as I can see: Omitting the keyword started out as a warning, and IIRC still is allowed when you enable deprecated features via the compiler switch. How would a similar process look for virtual-by-default? As far as an isolated module with only a base class is concerned, this is not a question of valid vs. invalid code, but a silent change in language semantics.
>

The exact same path is available if you want to apply it:
 1. Introduce 'virtual'; create a warning when override is used on a
function not marked 'virtual'. Functions remain virtual-by-default for the
time being.
 2. At some later time, deprecate it (collecting the stragglers who haven't
updated their code yet). Functions remain virtual-by-default.
 3. Pull the pin. Functions become final by default, and virtuals should
have already been marked during the previous periods. (A sketch of this
staging follows below.)
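
To make the staging concrete, here is a minimal sketch. Note that the
'virtual' keyword is hypothetical; it does not exist in D today:

    class Base
    {
        virtual void polymorphic() {}  // explicitly opted in; fine at every stage
        void accessor() {}             // unmarked; stays virtual until stage 3
    }

    class Derived : Base
    {
        override void polymorphic() {} // always fine
        override void accessor() {}    // stage 1: warning; stage 2: deprecation;
                                       // stage 3: error, until Base marks
                                       // accessor() as 'virtual'
    }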

I'm not sure I follow your point about an isolated base class.
In the event you have an isolated base class, where no classes are derived
from it, then it's not actually polymorphic, so why should its methods be
virtual?
In the event that it's a base-class-in-waiting, then it's true that you'll
notice a compile error at a later time when you do eventually derive from
it, but that's not really a breaking change, and it's almost in line with
my points about explicit consideration; each method that receives the
virtual keyword would take a moment's consideration from the author as to
whether it's actually correct/safe to be overriding that method or not. I.e.,
the momentary compile error where you write 'virtual' gives you the
opportunity for that thought.
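
A sketch of that scenario under final-by-default (again assuming the
hypothetical 'virtual' keyword):

    // Widget was written with no derived classes in mind.
    class Widget
    {
        void draw() {}           // final by default; statically bound
    }

    // Later, someone tries to turn Widget into a base class:
    class FancyWidget : Widget
    {
        override void draw() {}  // error: cannot override final method
    }

    // The fix is a one-word, deliberate decision by Widget's author:
    //     virtual void draw() {}
    // That momentary error is the "explicit consideration" step.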

> From DConf I know that you actually are a friendly, reasonable person,
> but in this discussion, you really come across as a narrow-minded zealot to me. So, please, let's focus on finding an actually practical solution!
>

But this is a practical solution. The only counter-proposal I've heard is Andrei's 'all methods use ufcs' idea, but I think that one would be a much harder sell to the community. I'm certainly not convinced.

It's really quite trivial, it's orthogonal to the usage of override, it's more efficient, less error prone (and I've demonstrated that these errors happen all the time), enhances interoperation with C++, and it should even reduce code fragility (ie, people overriding methods that are unsafe to override, where the author never considered the possibility).

Is there a reason this change offends you enough to call me names? Or can you at least tell me how I'm being narrow-minded?

> For example, if we had !pure/!nothrow/!final or something along those lines,
> just mandate that "final:" is put at the top of everything in your style guide (easily machine-enforceable too) – problem solved?


That's not quite the case though. Even if I could retrain internal staff to start doing that everywhere, you've potentially blocked access to a whole bunch of libraries, because library authors don't follow our style guide. We suffer this in C++ all the time (see my many rants about unnecessarily spending my life re-inventing wheels). Anything that gives a subtle push toward improving the usability/portability of libraries can only be a good thing, especially since library authors generally don't specifically consider all usage environments. The language can assist to some extent.

It's also precisely the same amount of work to type 'virtual:' (and it's
the less common case to want to), except taking that angle enables the
advantages I mention above, and also tends to force people to give a
moment's consideration to their API design/intent wrt virtual.
Remember going virtual is a one-way trip. It can never be undone, which
makes it a terribly dangerous default state.
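
For comparison, the style-guide approach is already expressible in today's
D, but it is one-directional; the proposal would make the same one-line
effort buy the reverse default ('virtual:' below is hypothetical):

    class Service
    {
        private int _id;

    final:                       // style-guide rule: everything below is final
        int id() { return _id; }
        void process() {}
        // nothing past this label can be made virtual again
    }

    // Under the proposal, the symmetric spelling would be:
    //     class Service { virtual: /* opt back in to dynamic dispatch */ }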

And maybe it would even catch on in the whole D community and lead to a
> language change in D3 or a future iteration of the language.
>
> David


June 04, 2013
On Monday, 3 June 2013 at 23:47:33 UTC, Jonathan M Davis wrote:
> 1. That'll only affect Windows unless we change the linking model on *nix
> systems.
>

It is evolving on the C/C++ side, so I see no point in being more conservative than they are.

> 2. That'll only affect stuff that isn't exported from a shared library. There
> are plenty of cases where a class is exported from a shared library, and it
> has lots of functions on it which are supposed to be non-virtual.
>

Calling into/from a shared lib is doomed to be a performance hit, as the called code is opaque to the compiler anyway. Which means assuming the worst on the caller side and disabling most optimizations.

> 3. Doesn't doing this require that the _linker_ optimize out the virtuality of
> the functions for you? If that's the case, it won't work any time soon (if
> ever), because we use the C linker, not our own.
>

I'm not sure what it implies for GCC, but this is actually not hard to implement in LLVM.
June 04, 2013
On Tuesday, 4 June 2013 at 00:19:39 UTC, Manu wrote:
> [... full quote of Manu's reply to David Nadlinger snipped; see the post above ...]

+1
June 04, 2013
On 6/3/13 8:18 PM, Jonathan M Davis wrote:
> So, the breakage is minimal and noisy, but it _is_ still breakage. If we were
> starting from scratch, I really don't think that there would be much excuse
> for not making functions non-virtual by default given that we require
> overriding functions to be marked with override, but it is more of an open
> question when we're this far into the game. I still think that it's worth
> making the change though.

Absolutely it is breakage. The entire discussion is aimless, though, because it poorly compares the costs and benefits of very different design choices, made, as you mention, at completely different historical times.

It's useless to focus on the breakage override has caused. Yes it did cause breakage. There is no conclusion to draw from that without considering the considerably complex dynamics surrounding the whole matter (experience, benefits, number of users affected positively and negatively).

To use that breakage as an argument linked to absorbing breakage caused by switching to final-by-default does not make sense. I'll try to abstain from replying to this particular point in the future; it just instantly lowers the quality of the dialog.


Andrei

June 04, 2013
On 6/3/13 8:19 PM, Manu wrote:
> But this is a practical solution.

It's not a solution because there's no problem. We're talking about a default, not about the ability or lack thereof to do something. So the "solution" does not _solve_ anything.

Andrei
June 04, 2013
On Mon, 03 Jun 2013 12:25:11 -0400, Manu <turkeyman@gmail.com> wrote:

> You won't break every single method, they already went through that
> recently when override was made a requirement.
> It will only break the base declarations, which are far less numerous.

Coming off the sidelines:

1. I think in the general case, virtual by default is fine.  In code that is not performance-critical, it's not a big deal to have virtual functions, and it's usually more useful to have them virtual.  I've experienced plenty of times with C++ where I had to go back and 'virtualize' a function.  Any time you change that, you must recompile everything; it's not a simple change.  It's painful either way.  To me, this is simply a matter of preference.  I understand that it's difficult to go from virtual to final, but in practice, breakage happens rarely, and will be loud with the new override requirements.
2. I think your background may bias your opinions :)  We aren't all working on making lightning fast bare-metal game code.
3. It sucks to have to finalize all but N methods.  In other words, we need a virtual *keyword* to go back to virtual-land.  Then, one can put final: at the top of the class declaration, and virtualize a few methods.  This shouldn't be allowed for final classes though.

My one real experience on this was with dcollections.  I had not declared anything final, and I realized I was paying a performance penalty for it.  I then made all the classes final, and nobody complained.

-Steve
June 04, 2013
On Monday, June 03, 2013 22:25:13 Andrei Alexandrescu wrote:
> It's useless to focus on the breakage override has caused. Yes it did cause breakage. There is no conclusion to draw from that without considering the considerably complex dynamics surrounding the whole matter (experience, benefits, number of users affected positively and negatively).
> 
> To use that breakage as an argument linked to absorbing breakage caused by switching to final-by-default does not make sense. I'll try to abstain from replying to this particular point in the future; it just instantly lowers the quality of the dialog.

The comparison is made because we're talking about a related change, and the actual breakage caused by override was fairly recent, so while the decision to cause that breakage was made quite some time ago, we were still willing to cause that breakage fairly recently, so the implication then is that it would be acceptable to do something related which causes less breakage.

Now, that being said, I do think that we need to look at this change in its own right, and it needs to justify itself, but clearly folks like Manu think that it does justify itself and feel that there's something off if we're willing to make the change with override and not this one, when they're related, and this one causes even less breakage.

I do think that virtual-by-default was a mistake and would like to see it fixed, but I'm also not as passionate about it as Manu or Don. Manu in particular seems to be sick of having to fix performance bugs at Remedy Games caused by this issue and so would really like to see non-virtual be the default. The folks using D in companies in real-world code seem to think that the ROI on this change is well worth it.

And a technical issue which affects us all is how this interacts with extern(C++). Daniel Murphy is having to improve extern(C++) in order to be able to port the dmd frontend to D (so that it can properly interact with the backends), and the fact that member functions are virtual by default definitely causes problems there. He would know the details better than I would, but IIRC, it had to do with the fact that we needed to be able to interface with non-virtual C++ member functions. So, depending on the details there, that alone could make it worth switching to non-virtual by default, particularly when the breakage is actually quite loud and easy to fix.

- Jonathan M Davis
June 04, 2013
On 4 June 2013 12:50, Steven Schveighoffer <schveiguy@yahoo.com> wrote:

> On Mon, 03 Jun 2013 12:25:11 -0400, Manu <turkeyman@gmail.com> wrote:
>
>> You won't break every single method, they already went through that
>> recently when override was made a requirement.
>> It will only break the base declarations, which are far less numerous.
>>
>
> Coming off the sidelines:
>
> 1. I think in the general case, virtual by default is fine.  In code that is not performance-critical, it's not a big deal to have virtual functions, and it's usually more useful to have them virtual.  I've experienced plenty of times with C++ where I had to go back and 'virtualize' a function.  Any time you change that, you must recompile everything; it's not a simple change.  It's painful either way.  To me, this is simply a matter of preference.  I understand that it's difficult to go from virtual to final, but in practice, breakage happens rarely, and will be loud with the new override requirements.
>

I agree that in the general case, it's 'fine', but I still don't see how
it's a significant advantage. I'm not sure what the loss is, but I can see
clear benefits to being explicit from an API point of view about what is
safe to override, and implicitly, how the API is intended to be used.
Can you see my point about general correctness? How can a class be correct
if everything can be overridden, but it wasn't designed for it, and was
certainly never tested for it?


> 2. I think your background may bias your opinions :)  We aren't all working on making lightning fast bare-metal game code.
>

Of course it does. But what I'm trying to do is show the relative merits of one default vs the other. I may be biased, but I feel I've presented a fair few advantages to final-by-default, and I still don't know what the advantages of virtual-by-default are, other than that people who don't care about the matter feel it's an inconvenience to type 'virtual:'. But that inconvenience is going to be forced upon one party either way, so the choice needs to be based on relative merits.


> 3. It sucks to have to finalize all but N methods.  In other words, we
> need a virtual *keyword* to go back to virtual-land.  Then, one can put
> final: at the top of the class declaration, and virtualize a few methods.
>  This shouldn't be allowed for final classes though.
>

The thing that irks me about that is that most classes aren't base classes,
and most methods are trivial accessors and properties... why cater to the
minority case?
It also doesn't really address the problem where programmers just won't do
that. Libraries suffer, I'm still reinventing wheels 10 years from now, and
I'm wasting time tracking down slip-ups.
What are the relative losses if it were geared the other way?

> My one real experience on this was with dcollections.  I had not declared
> anything final, and I realized I was paying a performance penalty for it.
>  I then made all the classes final, and nobody complained.
>

The userbase of a library will grow with time. Andrei wants a million D
users; that's a lot more opportunities to break people's code and gather
complaints.
Surely it's best to consider these sorts of changes sooner than later?

And where is the most likely source of those 1 million new users to migrate from? Java?


June 04, 2013
On 6/3/13 10:51 PM, Jonathan M Davis wrote:
> On Monday, June 03, 2013 22:25:13 Andrei Alexandrescu wrote:
>> It's useless to focus on the breakage override has caused. Yes it did
>> cause breakage. There is no conclusion to draw from that without
>> considering the considerably complex dynamics surrounding the whole
>> matter (experience, benefits, number of users affected positively and
>> negatively).
>>
>> To use that breakage as an argument linked to absorbing breakage caused
>> by switching to final-by-default does not make sense. I'll try to
>> abstain from replying to this particular point in the future; it just
>> instantly lowers the quality of the dialog.
>
> The comparison is made because we're talking about a related change, and the
> actual breakage caused by override was fairly recent, so while the decision to
> cause that breakage was made quite some time ago, we were still willing to
> cause that breakage fairly recently, so the implication then is that it would
> be acceptable to do something related which causes less breakage.

This nice one-long-sentence paragraph does little in the way of helping because it just restates the same well-understood matter that I disagree with without adding information.

My argument is that the relatedness of the two changes is tenuous, and it explains why I think so. The paragraph above again presupposes relatedness, and proceeds with a tedious re-explanation of the consequences of that assumption.

We don't make progress like this:

Speaker A: "XYZ, therefore ABC".

Speaker B: "I disagree with XYZ because TUV."

Speaker A: "But since XYZ then ABC."

> Now, that being said, I do think that we need to look at this change in its
> own right, and it needs to justify itself, but clearly folks like Manu think
> that it does justify itself and feel that there's something off if we're
> willing to make the change with override and not this one, when they're
> related, and this one causes even less breakage.

The matters of override vs. virtual/final are not related, so analyzing the consequences of said relatedness is not very productive.

That doesn't make one necessarily more important than the other, but it must be understood that they are distinct matters, and we can't compare one comma in one with one comma in the other.

Requiring "override" in overriding methods:

1. Protects against categories of bugs that otherwise would be impossible to protect against: accidental overriding and accidental non-overriding.

2. Provides an important maintenance tool for code evolution, statically breaking code that would otherwise change semantics silently.

It's important to note that without "override" there is virtually no protection against these issues. We're talking about an "all goodness" feature, and this kind of stuff is in very low supply in this world.
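
Concretely, both categories, in today's D:

    class Base
    {
        void foo() {}
        void bar() {}
    }

    class Derived : Base
    {
        override void fou() {}  // error: claims to override but overrides
                                // nothing -- the accidental non-overriding
                                // (typo) case is caught statically
        void bar() {}           // accidental overriding -- rejected (or
                                // deprecated, depending on compiler version)
                                // because 'override' is missing; also breaks
                                // loudly if Base.bar is later removed or renamed
    }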

Choosing "virtual" by default:

1. Fosters flexibility by allowing derived classes to override unannotated methods in base classes.

2. Is suboptimal in speed because users pay for the potential flexibility, even when that flexibility is not actually realized (barring a static analysis called class hierarchy analysis).

3. Ultimately lets the programmer choose the right design by using annotations appropriately.

Choosing "final" by default:

1. Fosters speed by statically binding calls to unannotated methods.

2. Is suboptimal in flexibility because users pay for the speed with loss of flexibility, even when speed is not a concern but flexibility is.

3. Ultimately lets the programmer choose the right design by using annotations appropriately.

The introduction of "override" allows a language to choose either final or virtual by default, without being exposed to potential bugs. This is pretty much the entire extent to which "override" is related to the choice of virtual vs. final by default.

Today, D makes it remarkably easy to choose the right design without significant boilerplate, regardless of the default choice:

- "struct" introduces a monomorphic type with a limited form of subtyping (via alias this) and no dynamic binding of methods.

- "final class" introduces a leaf class that statically disallows inheritance and consequently forces static calls to all methods. (BTW I recall there were some unnecessary virtual calls for final classes, has that been fixed?)

- "final { ... }" introduces a pseudo-scope in which all declared methods are final

- "final:" introduces a pseudo-label after which all declared methods are final

(Granted, there's an asymmetry - there's no "~final:" label to end final, which makes it marginally more tedious to arrange final and non-final methods in the class.)
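
All four building blocks in one snippet (valid D today):

    struct Point                // monomorphic; no vtable at all
    {
        int x, y;
    }

    struct Labeled
    {
        Point p;
        alias p this;           // limited subtyping; calls remain static
        string label;
    }

    final class Leaf            // inheritance statically disallowed;
    {                           // all method calls bind statically
        void run() {}
    }

    class Mixed
    {
        void overridable() {}   // virtual (today's default)

        final                   // pseudo-scope: everything inside is final
        {
            void fastPath() {}
        }

    final:                      // pseudo-label: everything below is final
        void anotherFastPath() {}
    }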

This leaves an arguably small subset of designs, scenarios, projects, and teams that would be affected by the choice of default. When that point has been made, it has been glibly neglected with an argument along the lines of "yeah, well programmers will take the path of least resistance, not really think things through, come from C++ and assume the wrong default", which may well be true for a subset of situations, but further narrows the persona typically affected by the choice of default.

I personally have a hard time picturing someone who is at the same time obsessed with performance, disinclined to assess it, unwilling to learn how to improve it, and incapable of using simple tools to control it. Yet this persona is put at the center of the argument that we must change the default right now, seeing as it is a huge problem. To top it off, the entire fallacy about override causing more breakage is brought up again. Yes, smoking kills, and the fact that cars kill more people doesn't quite have a bearing on that.

> I do think that virtual-by-default was a mistake and would like to see it
> fixed, but I'm also not as passionate about it as Manu or Don.

Choosing virtual (or not) by default may be dubbed a mistake only in a context. With the notable exception of C#, modern languages aim for flexibility and then do their best to obtain performance. In the context of D in particular, there are arguments for the default going either way. If I were designing D from scratch it may even make sense to e.g. force a choice while offering no default whatsoever.

But bottom line is, choosing the default is not a big deal for D because this wonderful language offers so many great building blocks for any design one might imagine.

> Manu in
> particular seems to be sick of having to fix performance bugs at Remedy Games
> caused by this issue and so would really like to see non-virtual be the
> default. The folks using D in companies in real-world code seem to think that
> the ROI on this change is well worth it.

I'm wary/weary of polls with a small number of participants. Let's also not forget that these people do use D successfully, and if sticking "final" here and there is the most difficult endeavor that has helped performance of their programs, I'd say both them and D are in great shape.

> And a technical issue which affects us all is how this interacts with
> extern(C++). Daniel Murphy is having to improve extern(C++) in order to be
> able to port the dmd frontend to D (so that it can properly interact with the
> backends), and the fact that member functions are virtual by default definitely
> causes problems there. He would know the details about that better than I
> would, but IIRC, it had to do with the fact that we needed to be able to
> interface with non-virtual member C++ functions. So, depending on the details
> with that, that alone could make it worth switching to non-virtual by default,
> particularly when the breakage is actually quite loud and easy to fix.

I don't know much about that matter, as I don't know about the argument related to mock injection and such, so I won't comment on this.

Finally, I'll note that I'd started a reply to this remark by Manu (who in turn replied to David):

> Is there a reason this change offends you enough to call me names? Or
> can you at least tell how I'm being narrow-minded?

I deleted that reply, but let me say this. In a good argument:

1. Participants have opinions and beliefs derived from evidence they have accumulated.

2. The very ongoing discourse offers additional evidence to all participants by means of exchange of information. This is to be expected because participants have varied backgrounds and often it's possible to assess how competent they are.

3. The merits of various arguments are discussed, appreciated, and integrated within the opinions and beliefs of the participants.

4. A conclusion is reached in light of everything discussed and everybody is richer that way.

In a not-so good argument:

1. Participants start each from an immutable belief.

2. Their preoccupation is to amass, bend, or fabricate any argument that would make that belief prevail, and to neglect any argument to the contrary.

3. The entire discussion has a foregone conclusion for everyone involved, i.e. nobody changes opinions and nothing is gained.

The attitude "I know what's right, the only problem is to make you understand" doesn't serve anyone, because it locks "me" in a trench with no horizon and no mobility, and elicits an emotional response in "you".

Here we don't want to keep "virtual" default and we don't want to make "final" default. We want to do what's right. So the discussion should progress toward finding what's right, not starting from knowing what's right and working arguments from there.


Andrei
June 04, 2013
On 4 June 2013 12:26, Andrei Alexandrescu <SeeWebsiteForEmail@erdani.org> wrote:

> On 6/3/13 8:19 PM, Manu wrote:
>
>> But this is a practical solution.
>>
>
> It's not a solution because there's no problem. We're talking about a default, not about the ability or lack thereof to do something. So the "solution" does not _solve_ anything.


Virtual is a significant performance problem, and x86 is BY FAR the most
tolerant architecture wrt virtual.
The fact that virtual is a one-way trip, which cannot safely be revoked
later and is therefore a very dangerous choice as the default, is a
maintenance problem.
The fact that I've yet to witness a single programmer declare their final
methods at the time of authoring is a problem.
The fact that many useful libraries might become inaccessible to what I'm
sure is not an insignificant niche of potential D users is a problem.
And I argue the subjective opinion that code can't possibly be correct if
the author never considered how the API might be used outside his design
premise, and can never have tested it.
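
For what it's worth, a quick-and-dirty sketch of the kind of measurement
behind the performance claim, using std.datetime's StopWatch (the class,
method names and loop count are mine, and an optimizer that sees c's exact
type could devirtualize the first loop, so treat it as illustrative only):

    import std.datetime : StopWatch;
    import std.stdio : writefln;

    class C
    {
        int virt() { return 1; }       // virtual under today's default
        final int fin() { return 1; }  // statically bound, inlinable
    }

    void main()
    {
        auto c = new C;
        enum N = 100_000_000;
        StopWatch sw;
        int a, b;

        sw.start();
        foreach (i; 0 .. N) a += c.virt();  // indirect call through the vtable
        sw.stop();
        writefln("virtual: %s ms (%s)", sw.peek().msecs, a);

        sw.reset();
        sw.start();
        foreach (i; 0 .. N) b += c.fin();   // direct call
        sw.stop();
        writefln("final:   %s ms (%s)", sw.peek().msecs, b);
    }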