June 03, 2013
On 6/3/13 3:05 AM, Manu wrote:
> On 3 June 2013 02:37, Andrei Alexandrescu <SeeWebsiteForEmail@erdani.org
> <mailto:SeeWebsiteForEmail@erdani.org>> wrote:
>
>     On 6/2/13 9:59 AM, Manu wrote:
>
>         I've never said that virtuals are bad. The key function of a
>         class is
>         polymorphism.
>         But the reality is that in non-tool or container/foundational
>         classes
>         (which are typically write-once, use-lots; you don't tend to
>         write these
>         daily), a typical class will have a couple of virtuals, and a whole
>         bunch of properties.
>
>
>     I've argued if no dispatch is needed just make those free functions.
>
>
> You're not going to win many friends, and probably not many potential D
> users, by insisting people completely change coding patterns that
> they've probably held for decades over a trivial matter like this.

This is actually part of the point. You keep on discussing as if we were designing the language now, when in fact there's a lot of code out there that relies on the current behavior. We won't win many friends if we break every single method that has ever been overridden in D, over a trivial matter.


Andrei
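To illustrate the free-function suggestion quoted above, here is a minimal D sketch (the Texture type and pixelCount function are invented for the example); with UFCS, a free function still reads like a method at the call site but takes no vtable slot:

import std.stdio;

class Texture
{
    int width, height;

    this(int w, int h) { width = w; height = h; }

    // Genuinely polymorphic behaviour stays a method (virtual by default today).
    void bind() { writeln("binding ", width, "x", height, " texture"); }
}

// No dispatch needed, so this is a free function. Thanks to UFCS it still
// reads like a method at the call site, but there is no vtable entry for it.
int pixelCount(const Texture t)
{
    return t.width * t.height;
}

void main()
{
    auto t = new Texture(256, 128);
    t.bind();
    writeln(t.pixelCount()); // prints 32768
}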
June 03, 2013
On Monday, 3 June 2013 at 15:27:58 UTC, Andrei Alexandrescu wrote:
> This is actually part of the point. You keep on discussing as if we were designing the language now, when in fact there's a lot of code out there that relies on the current behavior. We won't win many friends if we break every single method that has ever been overridden in D, over a trivial matter.
>

Agreed. But can you please consider the export proposal? It would allow for finalization using LTO, so that we can avoid virtual dispatch when it is unneeded.
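For reference, a rough sketch of the idea behind that proposal (purely illustrative: the finalization described in the comments would be done by the compiler and linker, not expressed in source, and the class names are made up):

// Exported: overrides may live in other binaries that link against this one,
// so calls through the base class must keep going through the vtable.
export class PublicWidget
{
    void draw() { }
}

// Not exported: every override that can ever exist is visible to the linker.
// If LTO finds none, calls to update() could be finalized or inlined even
// though the method is virtual by default in the source.
class InternalWidget
{
    void update() { }
}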
June 03, 2013
On 3 June 2013 18:11, deadalnix <deadalnix@gmail.com> wrote:

> On Monday, 3 June 2013 at 07:30:56 UTC, Kapps wrote:
>
>> On Monday, 3 June 2013 at 07:06:05 UTC, Manu wrote:
>>
>>> There are functions that
>>> the author intended to be overridden, and functions that have no business
>>> being overridden, that the author probably never imagined anyone would
>>> override.
>>> What if someone does come along and override one of these, and it was
>>> never
>>> designed to work under that circumstance in the first place?
>>> At the very least, it will never have been tested. That's not a very robust
>>> API
>>> offering if you ask me.
>>>
>>
>> This is something just as important as the performance issues. Most of
>> the time people will leave functions to simply use whatever the default is
>> for virtual/final. If it's final, this works fairly well. The author hasn't
>> put in the effort to decide how to handle people overriding their function.
>> But with virtual by default, you don't know if the author actually
>> considered that people will be overriding the function or if it's simply
>> that they didn't bother specifying. I know the vast majority of my code is
>> virtual, simply because I didn't specify the 'final' keyword 500 times, and
>> didn't think that I'd need to do it. The resulting code is unsafe
>> because I didn't consider those functions actually being overridden.
>>
>>
> The whole concept of OOP revolves around the fact that a given class and users of the given class don't need to know about its subclasses (Liskov's substitution principle). It is the subclass's responsibility to decide what it overrides or not, not the base class's to decide what is overridden by subclasses.
>

Then OOP is fundamentally unsafe, because the author will never consider all the possibilities!

> If you want to create a class with customizable parts, pass parameters to
> the constructor. This isn't what OOP is about.
>

Eh?

> The performance concern is only here because things have been smashed
> together in an inconsistent way (as is often done in D). In Java, for instance, only overridden functions are actually virtual. Everything else is finalized at link time.


Java is not statically compiled; that finalization happens in the JIT. If you
compile Java code ahead of time, all functions are virtual, always.
That kind of finalization is impossible in D with separate compilation, and
dynamic libraries seal the deal.


> Which is great, because you are able to override everything when testing, to create mocks for instance, while keeping good performance when actually running the application.
>

I'm not taking away your ability to make everything virtual; you can type 'virtual:' as much as you like.
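To make the risk described in Kapps' message above concrete, here is a minimal D sketch (Account and LoggingAccount are invented names): a method the base-class author never meant as an extension point gets overridden, and a check that was never designed to be bypassed silently stops running.

import std.stdio;

class Account
{
    private int balance;

    // Never intended to be overridden, but the author forgot 'final',
    // so under virtual-by-default it is a silent extension point.
    void deposit(int amount)
    {
        assert(amount > 0, "deposits must be positive");
        balance += amount;
    }

    int total() const { return balance; }
}

class LoggingAccount : Account
{
    // Compiles without complaint, but bypasses the base-class check entirely.
    override void deposit(int amount)
    {
        writeln("deposit: ", amount);
        // forgot super.deposit(amount), so the invariant is never enforced
    }
}

void main()
{
    Account a = new LoggingAccount;
    a.deposit(-50);     // no assertion fires
    writeln(a.total()); // prints 0: the object is silently out of sync
}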


June 03, 2013
On 3 June 2013 18:20, Jacob Carlborg <doob@me.com> wrote:

> On 2013-06-03 10:11, deadalnix wrote:
>
>> The whole concept of OOP revolves around the fact that a given class and
>> users of the given class don't need to know about its subclasses (Liskov's substitution principle). It is the subclass's responsibility to decide what it overrides or not, not the base class's to decide what is overridden by subclasses.
>>
>> If you want to create a class with customizable parts, pass parameters to the constructor. This isn't what OOP is about.
>>
>> The performance concern is only here because things have been smashed together in an inconsistent way (as is often done in D). In Java, for instance, only overridden functions are actually virtual. Everything else is finalized at link time. Which is great, because you are able to override everything when testing, to create mocks for instance, while keeping good performance when actually running the application.
>>
>
> I've read a book, Effective Java, where it says something like:
>
> If you don't intend your class to be subclassed, make it final; otherwise document how to subclass it and which methods to override.


Sounds like even they know the truth I speak, but they must enforce this by
convention/documentation rather than offering strict guarantees ;)
It's interesting (but not at all surprising) that C#, which is much more
modern, decided to go the C++ way rather than the Java way.
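For reference, the Effective Java advice quoted above maps directly onto today's D (class names below are invented for illustration): either prohibit subclassing outright, or name the intended extension points and make everything else final.

import std.stdio;

// Not designed for inheritance: prohibit it outright.
final class Money
{
    long cents;
    this(long cents) { this.cents = cents; }
}

// Designed for inheritance: the documented hook stays virtual, the rest is final.
class Report
{
    // Extension point: subclasses are expected to override this.
    string title() { return "Report"; }

final:
    // Not extension points; overriding these is a compile-time error.
    void render() { writeln("== ", title(), " =="); }
}

class SalesReport : Report
{
    override string title() { return "Sales"; }
}

void main()
{
    new SalesReport().render(); // prints "== Sales =="
}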


June 03, 2013
On 4 June 2013 01:28, Andrei Alexandrescu <SeeWebsiteForEmail@erdani.org> wrote:

> On 6/3/13 3:05 AM, Manu wrote:
>
>> On 3 June 2013 02:37, Andrei Alexandrescu <SeeWebsiteForEmail@erdani.org <mailto:SeeWebsiteForEmail@erdani.org>> wrote:
>>
>>     On 6/2/13 9:59 AM, Manu wrote:
>>
>>         I've never said that virtuals are bad. The key function of a
>>         class is
>>         polymorphism.
>>         But the reality is that in non-tool or container/foundational
>>         classes
>>         (which are typically write-once, use-lots; you don't tend to
>>         write these
>>         daily), a typical class will have a couple of virtuals, and a whole
>>         bunch of properties.
>>
>>
>>     I've argued if no dispatch is needed just make those free functions.
>>
>>
>> You're not going to win many friends, and probably not many potential D users, by insisting people completely change coding patterns that they've probably held for decades over a trivial matter like this.
>>
>
> This is actually part of the point. You keep on discussing as if we were designing the language now, when in fact there's a lot of code out there that relies on the current behavior. We won't win many friends if we break every single method that has ever been overridden in D, over a trivial matter.


You won't break every single method; they already went through that
recently when override was made a requirement.
It will only break the base declarations, which are far less numerous.

How can you justify the change to 'override' with a position like that? We
have already discussed that we know PRECISELY the magnitude of breakage
that will occur.
It is: magnitude_of_breakage_from_override /
total_number_of_derived_classes, a much smaller number than the breakage
which was gladly accepted recently.

And the matter is far from trivial. In fact, if you think this is trivial, then how did the override change ever get accepted? That is most certainly trivial by contrast, and far more catastrophic in terms of breakage.


June 03, 2013
On 6/3/13 12:25 PM, Manu wrote:
> You won't break every single method; they already went through that
> recently when override was made a requirement.
> It will only break the base declarations, which are far less numerous.

That's what I meant.

> How can you justify the change to 'override' with a position like that?
> We have already discussed that we know PRECISELY the magnitude of
> breakage that will occur.
> It is: magnitude_of_breakage_from_override /
> total_number_of_derived_classes, a much smaller number than the breakage
> which was gladly accepted recently.

Well, it's kinda too much relativism that the number of breakages is considered small just because it's smaller than another number.

> And the matter is far from trivial.

It is trivial. To paraphrase a classic: "I'm not taking away your ability to make everything final; you can type 'final:' as much as you like."

> In fact, if you think this is
> trivial, then how did the override change ever get accepted? That is
> most certainly trivial by contrast, and far more catastrophic in terms
> of breakage.

That's a completely different issue, so this part of the argument can be considered destroyed.


Andrei
June 03, 2013
On Monday, 3 June 2013 at 16:25:24 UTC, Manu wrote:
> You won't break every single method, they already went through that recently when override was made a requirement. […] A much
> smaller number than the breakage
> which was gladly accepted recently. […] how did the override
> change ever get accepted […]

It appears as if either you have an interesting definition of "recently", or you are deliberately misleading people by bringing up that point over and over again.

According to http://dlang.org/changelog.html, omitting "override" has produced a warning since D 2.004, which was released back in September 2007! Granted, it was only turned from a deprecation warning into an actual deprecation in 2.061 (if my memory serves me right), but it's mostly a flaw in the handling of that particular deprecation that it stayed at the first level for so long. The actual language change was made – and user-visible – almost six (!) years ago, which is a lot on the D time scale.

You are also ignoring the fact that in contrast to requiring "override", there is no clean deprecation path for your proposal, at least as far as I can see: Omitting the keyword started out as a warning, and IIRC still is allowed when you enable deprecated features via the compiler switch. How would a similar process look for virtual-by-default? As far as an isolated module with only a base class is concerned, this is not a question of valid vs. invalid code, but a silent change in language semantics.

From DConf I know that you actually are a friendly, reasonable person, but in this discussion, you really come across as a narrow-minded zealot to me. So, please, let's focus on finding an actually practical solution!

For example, if we had !pure/!nothrow/!final or something along those lines, just mandate that "final:" is put at the top of everything in your style guide (easily machine-enforceable too). Problem solved? And maybe it would even catch on in the whole D community and lead to a language change in D3 or a future iteration of the language.
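As a rough sketch of what that style-guide rule could look like in current D (note that '!final' is only a proposed syntax and does not compile today; right now the only way to opt out of a leading 'final:' is to declare the intended-virtual methods before it):

class Widget
{
    // Intended extension points go first, while the default is still virtual.
    void onClick() { }
    void onResize() { }

final:
    // Style-guide rule: everything from here down is non-virtual. A future
    // '!final' (or 'virtual') attribute would let individual methods opt out.
    void show() { }
    void hide() { }
}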

David
June 03, 2013
On 6/3/13 1:06 PM, David Nadlinger wrote:
> On Monday, 3 June 2013 at 16:25:24 UTC, Manu wrote:
>> You won't break every single method, they already went through that
>> recently when override was made a requirement. […] A much
>> smaller number than the breakage
>> which was gladly accepted recently. […] how did the override
>> change ever get accepted […]
>
> It appears as if either you have an interesting definition of "recently",
> or you are deliberately misleading people by bringing up that point over
> and over again.
>
> According to http://dlang.org/changelog.html, omitting "override"
> has produced a warning since D 2.004, which was released back in September
> 2007! Granted, it was only actually turned from a deprecation warning
> into an actual deprecation in 2.061 (if my memory serves me right), but
> it's mostly a flaw in the handling of that particular deprecation that
> it stayed at the first level for so long. The actual language change was
> made – and user-visible – almost six (!) years ago, which is a lot on
> the D time scale.
>
> You are also ignoring the fact that in contrast to requiring "override",
> there is no clean deprecation path for your proposal, at least as far as
> I can see: Omitting the keyword started out as a warning, and IIRC still
> is allowed when you enable deprecated features via the compiler switch.
> How would a similar process look for virtual-by-default? As far as am
> isolated module with only a base class is concerned, this is not
> question of valid vs. invalid code, but a silent change in language
> semantics.
[snip]

There's one more issue with the comparison that must be clarified (I thought it was fairly obvious so I didn't make it explicit in my previous note): override is not comparable because it improves code correctness and maintainability, for which there is ample prior evidence. It's also a matter for which, unlike virtual/final, there is no reasonable recourse.

So invoking the cost of imposing explicit override vs imposing virtual as an argument for the latter is fallacious.


Andrei
June 03, 2013
On 03.06.2013 18:19, Manu wrote:
> On 3 June 2013 18:20, Jacob Carlborg <doob@me.com <mailto:doob@me.com>>
> wrote:
>
>     On 2013-06-03 10:11, deadalnix wrote:
>
>         The whole concept of OOP revolves around the fact that a given
>         class and users of the given class don't need to know about its
>         subclasses (Liskov's substitution principle). It is the subclass's
>         responsibility to decide what it overrides or not, not the base
>         class's to decide what is overridden by subclasses.
>
>         If you want to create a class with customizable parts, pass
>         parameters to the constructor. This isn't what OOP is about.
>
>         The performance concern is only here because things have been
>         smashed together in an inconsistent way (as is often done in D).
>         In Java, for instance, only overridden functions are actually
>         virtual. Everything else is finalized at link time. Which is
>         great, because you are able to override everything when testing,
>         to create mocks for instance, while keeping good performance when
>         actually running the application.
>
>
>     I've read a book, Effective Java, where it says something like:
>
>     If you don't intend your class to be subclassed, make it final;
>     otherwise document how to subclass it and which methods to override.
>
>
> Sounds like even they know the truth I speak, but they must enforce this
> by convention/documentation rather than offering strict guarantees ;)
> It's interesting (but not at all surprising) that C#, which is much more
> modern, decided to go the C++ way rather than the Java way.

C# just followed the Object Pascal/Delphi model, which is based on C++. That's why.

You have to thank Anders for it.

--
Paulo
June 03, 2013
On 03.06.2013 10:11, deadalnix wrote:
> On Monday, 3 June 2013 at 07:30:56 UTC, Kapps wrote:
>> On Monday, 3 June 2013 at 07:06:05 UTC, Manu wrote:
>>> There are functions that
>>> the author intended to be overridden, and functions that have no
>>> business
>>> being overridden, that the author probably never imagined anyone would
>>> override.
>>> What if someone does come along and override one of these, and it was
>>> never
>>> designed to work under that circumstance in the first place?
>>> At the very least, it will never have been tested. That's not a very
>>> robust API
>>> offering if you ask me.
>>
>> This is something just as important as the performance issues. Most of
>> the time people will leave functions to simply use whatever the
>> default is for virtual/final. If it's final, this works fairly well.
>> The author hasn't put in the effort to decide how to handle people
>> overriding their function. But with virtual by default, you don't know
>> if the author actually considered that people will be overriding the
>> function or if it's simply that they didn't bother specifying. I know
>> the vast majority of my code is virtual, simply because I didn't
>> specify the 'final' keyword 500 times, and didn't think that I'd
>> need to do it. The resulting code is unsafe because I didn't consider
>> those functions actually being overridden.
>>
>
> The whole concept of OOP revolves around the fact that a given class and
> users of the given class don't need to know about its subclasses
> (Liskov's substitution principle). It is the subclass's responsibility to
> decide what it overrides or not, not the base class's to decide what is
> overridden by subclasses.
>
> If you want to create a class with customizable parts, pass parameters
> to the constructor. This isn't what OOP is about.
>
> The performance concern is only here because things have been smashed
> together in an inconsistent way (as is often done in D). In Java, for
> instance, only overridden functions are actually virtual. Everything else
> is finalized at link time. Which is great, because you are able to
> override everything when testing, to create mocks for instance, while
> keeping good performance when actually running the application.

While this is true for most OO languages, you should not forget the "fragile base class" problem.

But this applies to both cases, regardless of what the default is.
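For illustration, a minimal D sketch of the fragile base class problem (Bag and CountingBag are invented names): the subclass only works as long as the base keeps an undocumented implementation detail, so an apparently harmless change to the base breaks it silently.

import std.stdio;

class Bag
{
    private int[] items;

    void add(int x) { items ~= x; }

    // Implementation detail: addAll happens to call add() once per element.
    // If a later version appends the slice directly instead, every subclass
    // that relied on this call pattern breaks without a compile error.
    void addAll(int[] xs)
    {
        foreach (x; xs)
            add(x);
    }
}

class CountingBag : Bag
{
    int added;

    override void add(int x)
    {
        ++added;
        super.add(x);
    }
}

void main()
{
    auto b = new CountingBag;
    b.addAll([1, 2, 3]);
    writeln(b.added); // 3 today; 0 if Bag.addAll ever stops calling add()
}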

--
Paulo