March 12, 2014
On Tuesday, 11 March 2014 at 18:33:16 UTC, Daniel Kozák wrote:
> Steven Schveighoffer píše v Út 11. 03. 2014 v 14:14 -0400:
>
>> Consider how long Walter resisted the call to make functions final by  default, until he suddenly switched sides. I think of it like a  dictatorship with 1000 advisors. And no death squads :)

If he resisted for a long time, it makes little sense to say he "suddenly switched sides". Reviewing the old thread, it looks like Manu's arguments, among others, were what convinced him the change was worth it.

I have mixed feelings on this topic. I think methods should be final by default, but it is late in the day for this kind of change, especially given D's unpleasant history with respect to changes.

-- Brian
March 12, 2014
On 12 March 2014 07:28, Dicebot <public@dicebot.lv> wrote:

> On Tuesday, 11 March 2014 at 21:24:22 UTC, bearophile wrote:
>
>> Dicebot:
>>
>>> Not in 2.065
>>> 2.066 will introduce the "virtual" keyword
>>> 2.067+ will change the defaults if it is still considered a good idea
>>>
>>
>> What's the point of having "virtual" if the default doesn't change?
>>
>> Bye,
>> bearophile
>>
>
> A frequently mentioned example:
>
> class Something
> {
>     final: // want all to be final by default
>
>     // ...
>
>     virtual void foo() {} // but this one function
> }
>
> I think just keeping "virtual" but not changing defaults is a good practical compromise.
>

I'm really trying to keep my lid on here...

I'll just remind that in regard to this particular point, which sounds reasonable, it's easy to forget that *all library code where the author didn't care* is now unusable by anybody who does. The converse would not be true if the situation was reversed.

virtual-by-default is incompatible with optimisation, and it's safe to
assume that anybody who doesn't explicitly care about this will stick with
the default, which means many potentially useful libraries may be
eliminated for use by many customers.
Also, as discussed at length, revoking virtual from a function is a
breaking change, adding virtual is not. Which means that, instead of making
a controlled breaking change with a clear migration path here and now, we
are committing every single instance of any user's intent to 'optimise'
their libraries (by finalising unnecessary virtuals) to breaking changes
in their ABI - which *will* occur, since virtual is the default.
According to semantic versioning, this requires bumping the major version
number... that's horrible!

What's better: implementing a controlled deprecation path now, or leaving it up to any project that ever uses the 'class' keyword to eventually confront breaking changes in their API when they encounter a performance-oriented customer?


March 12, 2014
On Tuesday, 11 March 2014 at 18:56:15 UTC, Indica wrote:
> I'd like to point out that Walter and Andrei can't do it all themselves. It takes a team and part of pulling it off is well defined goals and job descriptions with devoted people.

This is one of the motivations for my remark. They both probably have more than enough to do without having to wade through the tremendous volume of responses.

Steve

March 12, 2014
On Tuesday, 11 March 2014 at 22:24:15 UTC, Nick Sabalausky wrote:
> On 3/11/2014 2:42 PM, Steve Teale wrote:
>>
>> Well if we're going there, we should go the whole hog and have final,
>> direct, and virtual.
>
> Pardon my ignorance: What's 'direct'?
>
>> It's a system programming language, so you should
>> be able to walk down the street naked as long as you are prepared to put
>> up with the consequences.
>
> There has been much debate in the programming community over what exactly "system programming language" means. I think you, sir, have found the winner! Gets my vote, anyway! :)

What I meant by 'direct' is simply the third leg of a tuffet. As I understand it: final means the method is called directly and you can't override it; virtual means it is called through the vtable and you can override it; direct means it is called directly, but you can still override (hide) it in a derived class - as when you define a method with the same signature in a derived class in C++ and the base class method is not marked virtual.

I tried that with G++ the other day, and it still seems to compile.

There may be other possibilities; I have not attempted to draw the matrix table.

Steve

March 12, 2014
On Tuesday, 11 March 2014 at 20:43:07 UTC, Walter Bright wrote:
> On 3/11/2014 10:47 AM, Steve Teale wrote:
>> What D needs at this point is a dictator.
>
>
> http://www.youtube.com/watch?v=poDaTeyqIm4

Ace Walter - how do you find the time? I believe that you are becoming truly benevolent as you grow older ;=)

March 12, 2014
On Wednesday, 12 March 2014 at 03:05:00 UTC, Manu wrote:
> I'm really trying to keep my lid on here...
>
> I'll just remind that in regard to this particular point which sounds
> reasonable, it's easy to forget that *all library code where the author
> didn't care* is now unusable by anybody who does. The converse would not be
> true if the situation was reversed.
>
> virtual-by-default is incompatible with optimisation, and it's safe to
> assume that anybody who doesn't explicitly care about this will stick with
> the default, which means many potentially useful libraries may be
> eliminated for use by many customers.
> Also, as discussed at length, revoking virtual from a function is a
> breaking change, adding virtual is not. Which means that, instead of making
> a controlled breaking change with a clear migration path here and now, we
> are committing every single instance of any user's intent to 'optimise'
> their libraries (by finalising unnecessary virtuals) to breaking changes
> in their ABI - which *will* occur, since virtual is the default.
> According to semantic versioning, this requires bumping the major version
> number... that's horrible!
>
> What's better: implementing a controlled deprecation path now, or leaving
> it up to any project that ever uses the 'class' keyword to eventually
> confront breaking changes in their API when they encounter a performance
> oriented customer?

Case in point:
https://github.com/D-Programming-Language/phobos/pull/1771
"mark std.zip classes as final"

Long story short: MartinNowak decided to make the Zip classes final, since it made no sense to have any of the functions virtual, or to have anybody derive from them anyways.

https://github.com/D-Programming-Language/phobos/pull/1771#issuecomment-36524041
Comment from Dav1dde:
"Just to let you know, it broke my code"

March 12, 2014
On 12 March 2014 20:40, monarch_dodra <monarchdodra@gmail.com> wrote:

> On Wednesday, 12 March 2014 at 03:05:00 UTC, Manu wrote:
>
>> I'm really trying to keep my lid on here...
>>
>> I'll just remind that in regard to this particular point which sounds
>> reasonable, it's easy to forget that *all library code where the author
>> didn't care* is now unusable by anybody who does. The converse would not
>> be
>> true if the situation was reversed.
>>
>> virtual-by-default is incompatible with optimisation, and it's safe to
>> assume that anybody who doesn't explicitly care about this will stick with
>> the default, which means many potentially useful libraries may be
>> eliminated for use by many customers.
>> Also, as discussed at length, revoking virtual from a function is a
>> breaking change, adding virtual is not. Which means that, instead of
>> making
>> a controlled breaking change with a clear migration path here and now, we
>> are committing every single instance of any user's intent to 'optimise'
>> their libraries (by finalising unnecessary virtuals) to breaking changes
>> in their ABI - which *will* occur, since virtual is the default.
>> According to semantic versioning, this requires bumping the major version
>> number... that's horrible!
>>
>> What's better: implementing a controlled deprecation path now, or leaving it up to any project that ever uses the 'class' keyword to eventually confront breaking changes in their API when they encounter a performance-oriented customer?
>>
>
> Case in point: https://github.com/D-Programming-Language/phobos/pull/1771 "mark std.zip classes as final"
>
> Long story short: MartinNowak decided to make the Zip classes final, since it made no sense to have any of the functions virtual, or to have anybody derive from them anyways.
>
> https://github.com/D-Programming-Language/phobos/pull/1771#issuecomment-36524041
> Comment from Dav1dde:
> "Just to let you know, it broke my code"
>

Thank you.
There you go, it's not even hypothetical.


March 12, 2014
On Tue, 2014-03-11 at 18:01 +0000, John Colvin wrote:
[…]

> Also, of course Walter can decide not to do something due to community pressure. He has the ultimate say, it's his language, but that doesn't mean he shouldn't listen.

Python used to be Guido's language, but he and the community evolved out of that phase. It has not diminished Guido's status or standing.
-- 
Russel.
=============================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder@ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel@winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder


March 12, 2014
On Wednesday, 12 March 2014 at 11:40:39 UTC, Manu wrote:
> On 12 March 2014 20:40, monarch_dodra <monarchdodra@gmail.com> wrote:
>
>> On Wednesday, 12 March 2014 at 03:05:00 UTC, Manu wrote:
>>
>>> I'm really trying to keep my lid on here...
>>>
>>> I'll just remind that in regard to this particular point which sounds
>>> reasonable, it's easy to forget that *all library code where the author
>>> didn't care* is now unusable by anybody who does. The converse would not
>>> be
>>> true if the situation was reversed.
>>>
>>> virtual-by-default is incompatible with optimisation, and it's safe to
>>> assume that anybody who doesn't explicitly care about this will stick with
>>> the default, which means many potentially useful libraries may be
>>> eliminated for use by many customers.
>>> Also, as discussed at length, revoking virtual from a function is a
>>> breaking change, adding virtual is not. Which means that, instead of
>>> making
>>> a controlled breaking change with a clear migration path here and now, we
>>> are committing every single instance of any user's intent to 'optimise'
>>> their libraries (by finalising unnecessary virtuals) to breaking changes
>>> in their ABI - which *will* occur, since virtual is the default.
>>> According to semantic versioning, this requires bumping the major version
>>> number... that's horrible!
>>>
>>> What's better: implementing a controlled deprecation path now, or leaving
>>> it up to any project that ever uses the 'class' keyword to eventually
>>> confront breaking changes in their API when they encounter a performance
>>> oriented customer?
>>>
>>
>> Case in point:
>> https://github.com/D-Programming-Language/phobos/pull/1771
>> "mark std.zip classes as final"
>>
>> Long story short: MartinNowak decided to make the Zip classes final, since
>> it made no sense to have any of the functions virtual, or to have anybody
>> derive from them anyways.
>>
>> https://github.com/D-Programming-Language/phobos/pull/1771#issuecomment-36524041
>> Comment from Dav1dde:
>> "Just to let you know, it broke my code"
>>
>
> Thank you.
> There you go, it's not even hypothetical.

It can also happen the other way if I mark as virtual a method that used to be final and was hidden (redefined) in a subclass, right?

--
Paulo
March 12, 2014
"Andrei Alexandrescu"  wrote in message news:531F70ED.3040304@erdani.org...

On 3/11/14, 1:18 PM, Vladimir Panteleev wrote:

> A combination of both. The change would break a lot of code and it seems to me final vs. virtual by default is a judgment call more than an obvious decision in favor of final. That said, if I'd do things over again I'd advocate final by default. But we're not doing things over again.
>
> Andrei

FWIW this is exactly where I was back before dconf13.

I was convinced when I realized that:
- It is impossible for the optimizer to achieve the same performance in all cases thanks to dynamic linking
- Linking with C++ usually requires marking _almost_ every method with 'final'
- Only the introducing virtual methods need to be changed, so the breakage is actually very small and trivially handled

So after 5 minutes of adding 'virtual' where the compiler tells me to, I can delete a whole bunch of cruft from my code and keep the same performance.

(Note that those 5 minutes can be done at your leisure over the years it takes for this to progress through the deprecation stages)