March 12, 2014
On Wednesday, 12 March 2014 at 12:32:14 UTC, Michel Fortin wrote:
> I'll add another argument to the mix.
>
> Currently, you can't have private/package functions that are virtual, contrary to what TDPL says.
>
> To make things coherent we could change private functions so they become virtual by default. You might be able to see the problem with this option: making private functions virtual by default is likely to add many performance regressions in existing code, silently.
>

No.

In the case of private functions, by definition, the compiler
has all the overrides in the same module and can finalize
whatever needs to be finalized.
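
A sketch of what that would mean (hypothetical, since current D rejects
private virtual functions outright):

module widgets;

class Base
{
    private void update() { }           // suppose this were virtual
}

class Derived : Base
{
    override private void update() { }  // must live in this module
}

// Every override of a private method is confined to this one module,
// so the compiler sees them all and can finalize the calls itself.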

> We could always keep things as they currently are: private/package is not virtual, everything else is virtual unless marked final. Honestly I'd like to see that go. The protection attribute should have nothing to do with whether a function is virtual or not.

That is horrible. Bad separation of concerns.
March 12, 2014
On 13 March 2014 02:46, Andrei Alexandrescu <SeeWebsiteForEmail@erdani.org> wrote:

> On 3/12/14, 4:40 AM, Manu wrote:
>
>> Thank you.
>> There you go, it's not even hypothetical.
>>
>
> I think the example given works against your argument.


How so?
Virtual was removed, and code broke. That is the consequence of D's current default.
The other direction has no such problem.
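
To make the asymmetry concrete (names invented):

// Library v1: virtual by default.
class Widget
{
    void draw() { }
}

// User code, written against v1:
class MyWidget : Widget
{
    override void draw() { }  // compiles fine today
}

// Library v2 finalizes draw() for speed:
//     final void draw() { }
// MyWidget now fails with "cannot override final function".
// Making a final function virtual instead breaks nothing, since no
// override of it could have existed.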


March 12, 2014
On Wednesday, 12 March 2014 at 13:22:34 UTC, Steven Schveighoffer wrote:
> when you do it in a separate module, an error? What if you want to define that function name, but it's taken by the base class, what happens?
>
> -Steve

That's when you need 'direct' ;=)

Steve

March 12, 2014
On 13 March 2014 03:47, Andrei Alexandrescu <SeeWebsiteForEmail@erdani.org> wrote:

> On 3/11/14, 8:04 PM, Manu wrote:
>
>> I'm really trying to keep my lid on here...
>>
>
> Yep, here we go again :o).


*sigh*

>> I'll just remind that, in regard to this particular point, which sounds
>> reasonable, it's easy to forget that *all library code where the author
>> didn't care* is now unusable by anybody who does. The converse would not
>> be true if the situation were reversed.
>>
>
> There's an asymmetry introduced by the fact that there's code in use today.


Do you think the deprecation path is particularly disruptive? It can be implemented over a reasonably long time.

>> Virtual-by-default is incompatible with optimisation, and it's safe to
>> assume that anybody who doesn't explicitly care about this will stick
>> with the default, which means many potentially useful libraries may be
>> ruled out for use by many customers.
>>
>
> Virtual by default is, however, compatible with customization and flexibility.
>

I completely disagree with the everything-should-be-virtual idea in
principle; I think it's dangerous and irresponsible API design, but that's
my opinion.
Whether you are into that or not, I see it as a design decision, and I
think it's reasonable to make that decision explicit by typing 'virtual:'.

What's not subjective is that the optimiser can't optimise virtual-by-default code. That's just a fact, and one I care about deeply. I think it's also statistically reliable that people will stick with the default in almost all cases.
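
A minimal sketch of what the optimiser is up against (invented names;
assumes no whole-program analysis):

class Sprite
{
    float x = 0;

    // Virtual: dispatched through the vtable. The compiler can't
    // inline it, because any other module may subclass and override.
    void advance() { x += 1; }

    // Final: the target is known statically, so the call can be
    // inlined and the loop below optimised.
    final void advanceFast() { x += 1; }
}

void simulate(Sprite s)
{
    foreach (i; 0 .. 1_000_000) s.advance();     // indirect calls
    foreach (i; 0 .. 1_000_000) s.advanceFast(); // inlinable
}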

Unstated assumption: "many potential useful libraries" assumes many
> libraries use traditional OO design in their core components.
>

In my experience, physics, sound, scene graphs... these sorts of things are
common libraries, and also heavy users of OO. Each of them is broken into
small pieces implemented as many objects.
If people then make use of properties, we're in a situation much worse than
what we already struggle with in C++.

Unstated assumption: "many customers".


Do I need to quantify?

I work in a gigantic industry. You might call it niche, but it's like,
really, really big.
I often attend an annual conference called GDC, which attracts multiple tens
of thousands of developers each year. It's probably the biggest software
developer conference in the world.
A constantly recurring theme at those conferences is low-level performance
on embedded hardware, and specifically, the mistakes that PC developers
make when first moving to embedded architectures.
There's a massive audience for these topics, because everyone is suffering
the same problems. Virtual is one of the most expensive hazards, just not
on x86.
Most computers in the world today don't run x86 processors.

>> Also, as discussed at length, revoking virtual from a function is a
>> breaking change; adding virtual is not.
>>
>
> Changing the default is a breaking change.


Yes, but there is an opportunity for a smooth transition that eliminates the problem once, rather than committing to a steady recurrence of broken libraries in the future whenever anyone wants to optimise in this way.
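
To sketch one possible staging (not an agreed plan; 'Camera' is
invented):

// Stage 1: 'virtual' becomes a valid, initially redundant keyword.
class Camera
{
    virtual void update() { }  // explicitly an override point
    void position() { }        // still virtual, but un-annotated
}

// Stage 2: un-annotated overridable methods trigger a deprecation
// message, giving authors a full release cycle to annotate.

// Stage 3: the default flips to final; position() above stops being
// overridable unless marked 'virtual'.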

>> Which means that, instead of making a controlled breaking change with a
>> clear migration path here and now, we are committing every single
>> instance of a user's intent to 'optimise' their library (by finalising
>> unnecessarily-virtual functions) to breaking changes in their ABI,
>> which *will* occur, since virtual is the default.
>>
>
> Unstated assumption: "every single instance" assumes again that people interested in writing fast libraries have virtual calls as a major bottleneck, and furthermore they didn't care about speed to start with, to wake up later. This pictures library designers as quite incompetent people.


YES! This is absolutely my professional experience! I've repeated this many
times.
Many (if not most) libraries I've wanted to use in the past were written for
a PC, with rarely any real consideration for low-level performance.
Even those that are tested for cross-compiling were often _originally_
written for a PC; the API is architecturally predisposed to poor performance.

This is precisely the sort of thing that library authors don't care about
until some subset of customers comes along that does. At that point, they are
faced with a conundrum: break the API, or ignore the minority. Resolving it
can often take years, meanwhile buggering up our schedule or wasting our
time re-inventing some wheel.
PC programmers are careless programmers on average. Because x86 is by far
the most tolerant architecture WRT low-level performance, unless library
authors actively test their software on a wide variety of machines, they
have no real basis on which to judge their code.

>> According to semantic versioning, this requires bumping the major
>> version number... that's horrible!
>>
>
> Appeal to emotion.


Bumping a major version number is not an emotional expression; it's what semantic versioning demands for any breaking change, and people take semantic versioning very seriously.

>> What's better: implementing a controlled deprecation path now, or
>> leaving it up to every project that ever uses the 'class' keyword to
>> eventually confront breaking changes in its API when it encounters a
>> performance-oriented customer?
>>
>
> It's better to leave things be. All I see is the same anecdote getting vividly retold whenever the topic comes up.


Whatever.


March 12, 2014
On Wednesday, 12 March 2014 at 17:08:59 UTC, Andrei Alexandrescu wrote:
> On 3/12/14, 10:05 AM, monarch_dodra wrote:
>> On Wednesday, 12 March 2014 at 16:46:26 UTC, Andrei Alexandrescu wrote:
>>> On 3/12/14, 4:40 AM, Manu wrote:
>>>> Thank you.
>>>> There you go, it's not even hypothetical.
>>>
>>> I think the example given works against your argument.
>>>
>>> Andrei
>>
>> How so? The example was his argument verbatim.
>
> His argument assumed at core that the library designer knows better than the library user what the customization points are, and that most functions are virtual by mistake.

And this argument is absolutely correct, in my experience.  By
making virtuality an explicit choice, the library designer is
specifying that a given function is part of the published
interface for a class, and that overriding it serves some explicit
purpose that will be maintained over time.

In a "virtual by default" world, there is no explicit contract
regarding the virtuality of functions, and so the library writer
may not feel any obligation to retain those functions at all,
their signatures, or their application to the operation of the
class.  Unless a particular function is documented as
overridable, you can't assume that the library writer didn't
simply forget to mark it final.  Doing otherwise puts the library
writer in an awkward position by imposing an obligation to
preserve behavior that was never intended to be public in the
first place.  The example provided demonstrated this exact
problem.
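
The example isn't quoted here, but the shape of the problem is roughly
this (names invented):

// Library: normalize() was never meant as a hook; it simply wasn't
// marked final.
class Path
{
    void normalize() { /* internal cleanup */ }
}

// User: overrides it anyway; nothing in the language says no.
class WindowsPath : Path
{
    override void normalize() { /* custom behaviour */ }
}

// The author is now stuck: marking normalize() final, or changing
// when it is called internally, breaks WindowsPath. The obligation
// was never meant to exist.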
March 12, 2014
On 2014-03-12 18:07, Andrei Alexandrescu wrote:

> That's why I said "most" and not "all".
>
> I think we must take final-by-default off the table. Doing so would also
> make the addition of "virtual" too much a price to pay for the negation
> of "final".

There have previously been suggestions to add syntax for turning off/negating attributes. Something like:

class Foo
{
    final:

    !final void foo () {}  // overridable again, despite the final: label
}

-- 
/Jacob Carlborg
March 12, 2014
And for the record, I'm absolutely pedantic about adding "final"
qualifiers to functions that aren't intended to be overridable,
and yet it recently came to light that the methods of a class in
Druntime that I created (Condition) are not labeled final.  I
can't imagine any reason why I'd have made that decision
explicitly, since there's no real purpose in having them be
overridable.  It was obviously a mistake on my part, and yet
there are two instances where Condition has been subclassed and
these methods overridden (once by me, to be honest), to
questionable ends.  And so at this point I can't add the "final"
qualifier without breaking user code, even though it was always
meant to be there.

In short, saying that this is often overlooked isn't necessarily
an assertion that library writers are sloppy or don't care about
performance.  Virtual being the default behavior, with no
detectable issues at design time, means that these problems will
slip in from time to time.  It's hard to grep for the lack of a
keyword.
March 12, 2014
On 2014-03-12 20:51, Sean Kelly wrote:

> And this argument is absolutely correct, in my experience.  By
> making virtuality an explicit choice, the library designer is
> specifying that a given function is a part of the published
> interface for a class and overriding it has some explicit purpose
> that will be maintained over time.

There's a book, Effective Java, that recommends all methods should be marked as final unless explicitly intended to be overridden. The argument is that a class needs to be explicitly designed for subclassing.
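
Transposed to D, that advice looks roughly like this (a sketch):

class Report
{
    // The algorithm itself is fixed: final, so subclasses can't
    // subvert it.
    final void run()
    {
        gather();
        render();
    }

    // The documented, intentional extension points:
    void gather() { }
    void render() { }

    // Everything else: final by choice.
    final void log() { }
}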

-- 
/Jacob Carlborg
March 12, 2014
12-Mar-2014 23:51, Sean Kelly wrote:
> On Wednesday, 12 March 2014 at 17:08:59 UTC, Andrei Alexandrescu wrote:
>> On 3/12/14, 10:05 AM, monarch_dodra wrote:
>>> On Wednesday, 12 March 2014 at 16:46:26 UTC, Andrei Alexandrescu wrote:
>>>> On 3/12/14, 4:40 AM, Manu wrote:
>>>>> Thank you.
>>>>> There you go, it's not even hypothetical.
>>>>
>>>> I think the example given works against your argument.
>>>>
>>>> Andrei
>>>
>>> How so? The example was his argument verbatim.
>>
>> His argument assumed at core that the library designer knows better
>> than the library user what the customization points are, and that most
>> functions are virtual by mistake.
>
> And this argument is absolutely correct, in my experience.  By
> making virtuality an explicit choice, the library designer is
> specifying that a given function is a part of the published
> interface for a class and overriding it has some explicit purpose
> that will be maintained over time.

Seconded.


-- 
Dmitry Olshansky
March 12, 2014
On 3/11/2014 2:28 PM, Michel Fortin wrote:
> class Foo
> {
> final:
>      void bar();
>      void baz();
>
> virtual:
>      void crack();
>      void crunch();
>
> final:
>      void dodge();
>      void damp();
>      virtual void divert();
>      void doh();
> }

class Foo
{
  final
  {
     void bar();
     void baz();
  }

     void crack();    // virtual by default; no label needed
     void crunch();

  final
  {
     void dodge();
     void damp();
     void divert();   // 'virtual' in the original; there is no way to
                      // opt back out inside a final block, hence the
                      // remark about !final below
     void doh();
  }
}

That said, there's still a case for !final.