March 12, 2014
On Wednesday, 12 March 2014 at 13:53:28 UTC, Steven Schveighoffer wrote:
> On Wed, 12 Mar 2014 09:45:22 -0400, monarch_dodra <monarchdodra@gmail.com> wrote:
>
>> On Wednesday, 12 March 2014 at 13:22:34 UTC, Steven Schveighoffer wrote:
>>> OK, I can see that being useful. You are right, I was thinking C++ private.
>>>
>>> -Steve
>>
>> Even in C++, private virtual is a key part of the "non-virtual interface" idiom.
>>
>> E.g.: You define your base class as having only non-virtual public functions, and private virtual functions. They are private, so the derived classes can't *call* them, but they can still override them.
>
> Nonsense. If I'm writing a function, I can call it. There is no way to prevent it. e.g.:
>

The point is that you can't call the parent's virtual (often abstract) method. Of course you can call a method you define yourself, but that is not the point of NVI.
March 12, 2014
On Wednesday, 12 March 2014 at 13:53:28 UTC, Steven Schveighoffer wrote:
> Nonsense. If I'm writing a function, I can call it. There is no way to prevent it.

Yes, you are right. You can always call the function directly if you so wish, since you *are* defining it.

But the idea is that it should just be a "piece of implementation" that should only be used in a controlled environment, as defined by the base class.

> OK, so your idea is that I can't call my copy of _foo, which if that's how it works, is dumb in my opinion. But if that's the case, I'm in control of its implementation, I can just forward to another function that I can call.

Absolutely. And in NVI, arguably, that's a good idea.

By doing this, you are basically creating a second-level NVI for a potential third-level implementation.

*If* your derived class made a direct call to the virtual (protected) function, you'd be voiding the base class's guarantees for any further derived class.

> The idea is fine, but protected serves this purpose just as well.

The basic idea of NVI is encapsulation of behavior. Derived classes, while they are the ones defining the implementation, are considered no different from any other client, and are not given call rights.

If you make the functions protected, then you are granting them access to implementation internals, which means weaker encapsulation. For a 1-level hierarchy it doesn't make much difference, but the deeper the hierarchy goes, the less robust the design gets.
March 12, 2014
On 03/12/2014 02:22 PM, Steven Schveighoffer wrote:
> OK, I can see that being useful. You are right, I was thinking C++ private.
>
> So essentially, a virtual private function can only be overridden in
> classes defined in the same module. What happens when you do it in a
> separate module, an error?

In a different module, attempting an override should behave as if the base class didn't define the member function.

> What if you want to define that function
> name, but it's taken by the base class, what happens?

It's not taken.

There are corner cases, e.g. when the module with the private member function imports a module with a public member function hiding the private one (i.e., circular imports) and then uses this newly defined member function, possibly expecting to call its own private member. Then it should maybe be an error. :o)

Anyway, I think the main issue with virtual private and virtual package is that they'd influence the class ABI.
March 12, 2014
On 3/12/14, 4:40 AM, Manu wrote:
> Thank you.
> There you go, it's not even hypothetical.

I think the example given works against your argument.

Andrei
March 12, 2014
On 3/12/14, 5:20 AM, Daniel Murphy wrote:
> - Only the introducing virtual methods need to be changed, so the
> breakage is actually very small and trivially handled

Most breakages are very small and trivially handled. That's not the point.

Andrei


March 12, 2014
Andrei Alexandrescu:

> Most breakages are very small and trivially handled. That's not the point.

Unexpected regressions, silent dangerous breakage, safe but hard-to-fix breakage, and carefully planned breakage that can be trivially handled: these are very different cases.

Bye,
bearophile
March 12, 2014
On Wednesday, 12 March 2014 at 16:46:26 UTC, Andrei Alexandrescu wrote:
> On 3/12/14, 4:40 AM, Manu wrote:
>> Thank you.
>> There you go, it's not even hypothetical.
>
> I think the example given works against your argument.
>
> Andrei

How so? The example was his argument verbatim.
March 12, 2014
On 3/12/14, 10:04 AM, bearophile wrote:
> Andrei Alexandrescu:
>
>> Most breakages are very small and trivially handled. That's not the
>> point.
>
> Unexpected regressions, silent dangerous breakage, safe but hard-to-fix
> breakage, and carefully planned breakage that can be trivially handled:
> these are very different cases.

That's why I said "most" and not "all".

I think we must take final-by-default off the table. Doing so would also make the addition of "virtual" too high a price to pay for the negation of "final".


Andrei

March 12, 2014
On 3/12/14, 10:05 AM, monarch_dodra wrote:
> On Wednesday, 12 March 2014 at 16:46:26 UTC, Andrei Alexandrescu wrote:
>> On 3/12/14, 4:40 AM, Manu wrote:
>>> Thank you.
>>> There you go, it's not even hypothetical.
>>
>> I think the example given works against your argument.
>>
>> Andrei
>
> How so? The example was his argument verbatim.

His argument assumed at core that the library designer knows better than the library user what the customization points are, and that most functions are virtual by mistake.

Andrei

March 12, 2014
On 3/11/14, 8:04 PM, Manu wrote:
> I'm really trying to keep my lid on here...

Yep, here we go again :o).

> I'll just remind that in regard to this particular point which sounds
> reasonable, it's easy to forget that *all library code where the author
> didn't care* is now unusable by anybody who does. The converse would not
> be true if the situation was reversed.

There's an asymmetry introduced by the fact that there is code in use today.

> virtual-by-default is incompatible with optimisation, and it's safe
> to assume that anybody who doesn't explicitly care about this will stick
> with the default, which means many potentially useful libraries may be
> eliminated for use by many customers.

Virtual by default is, however, compatible with customization and flexibility.

Unstated assumption: "many potential useful libraries" assumes many libraries use traditional OO design in their core components.

Unstated assumption: "many customers".

> Also, as discussed at length, revoking virtual from a function is a
> breaking change, adding virtual is not.

Changing the default is a breaking change.

> Which means that, instead of
> making a controlled breaking change with a clear migration path here and
> now, we are committing every single instance of any user's intent to
> 'optimise' their libraries (by finalising unnecessarily virtuals) to
> breaking changes in their ABI - which *will* occur, since virtual is the
> default.

Unstated assumption: "every single instance" assumes again that people interested in writing fast libraries have virtual calls as a major bottleneck, and furthermore they didn't care about speed to start with, to wake up later. This pictures library designers as quite incompetent people.

> According to semantic versioning, this requires bumping the major
> version number... that's horrible!

Appeal to emotion.

> What's better; implementing a controlled deprecation path now, or
> leaving it up to any project that ever uses the 'class' keyword to
> eventually confront breaking changes in their API when they encounter a
> performance oriented customer?

It's better to leave things be. All I see is the same anecdote gets vividly told again whenever the topic comes up.


Andrei