December 04, 2013
On 4 December 2013 12:58, Kapps <opantm2+spam@gmail.com> wrote:

> On Tuesday, 3 December 2013 at 18:10:43 UTC, Dejan Lekic wrote:
>
>> This is *a radical change*, and instead of "(unknown) people agreed" the community deserves a better explanation of why we are switching to final as the default storage class...
>>
>
> Let me state in advance that I believe that in most situations people will leave the default option unless they have a reason to change it (that is, stick with virtual unless they have reason to mark something final, or stick with final unless they have reason to mark something virtual). With virtual, there is often less prompting to realize something should be final, since things appear to work until much later; forgetting to stick a 'final' in front of your function will never give you a compiler error. That being said, here are some pros/cons of virtual vs. final by default. Obviously heavily biased.
>
> 1) The most obvious reason is that final has better performance. Some effort can be made to reduce the impact of virtual methods, but ultimately there will always be a performance hit. In a large program this may not be too substantial a hit. I consider this the weakest argument for final-by-default, but others who try to deliver the most efficient code possible consider it very important.
>

I think this tends to be underestimated until it becomes a project requirement. Consider that there may be any number of useful libraries available - many of which didn't explicitly try to deliver the most efficient code possible, but are still useful libraries nonetheless - yet pervasive use of virtual may unintentionally inhibit their use in performance-critical software.

This is not conjecture; I've run into this professionally on many
occasions. And it's worse than in C++: D has first-class properties, which
makes this a much more important consideration than it is in C++.
What I'm proposing is simply that, with this change, virtuality will no
longer be a hidden/unintentional cost; having to place the keyword one way
or the other will make library authors consider, even for just a moment,
the reality of the _choice_ they are making. It can't be the default, or the
choice will be overlooked and the damage done, since it can't be revoked.

Library authors are very apprehensive about making breaking changes to their API and bumping the major version of their software just to fix an otherwise trivial performance issue.


2) A function that is virtual cannot be changed to be final without
> breaking existing code. If you are writing a library and have a virtual method which you wish to change to final, anyone using that library that overrides your method will have their code break and require substantial design changes if they were relying on simply being able to override your method. If a method was final and is changed to be virtual later on, no code will break unless the developer then decides to change it back to final afterwards.
>
> 3) Related to the above, if a method is not virtual there is nothing you can do about it. A method might be perfectly capable of being virtual without issues, yet if the developer forgot to mark it as 'virtual' you cannot override it even though it may be safe to do so.
>

You can change it easily (open source), or request the change (which
usually wouldn't be objected to, since it's not a breaking change). And if
neither of those is to your taste, you can very easily wrap it as a last
resort; in D, 'alias this' makes wrapping things easier than ever. Those
aren't options when going the other direction; rather, there is no
option... you're simply screwed. I have been on the receiving end of
this on many occasions in C++, and it sucks!
It will be much worse in D, since properties are more pervasive.
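
For the record, a minimal sketch of what that wrapping looks like in D (the class and method names here are hypothetical, not from any real library):

    // Hypothetical library class whose author never left cost() overridable.
    class LibThing
    {
        final int cost() { return 10; }
    }

    // Wrapper: provide your own cost(), forward everything else via alias this.
    class MyThing
    {
        LibThing inner;
        alias inner this;                        // unknown members forward to 'inner'
        this(LibThing inner) { this.inner = inner; }
        int cost() { return inner.cost() * 2; }  // shadows the wrapped method
    }

It's boilerplate you shouldn't need, but at least it's possible; there is no equivalent escape hatch in the other direction, when a method you need to be final is stuck virtual.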


4) If we assume that the statement about people sticking with the default
> is true (which I strongly believe it is, yet lack actual data for), final by default means more 'virtual' keywords being stuck everywhere. For a language that may already be considered to have too many modifiers to stick in front of your methods, this would add yet another one for many class methods.
>

I have hard experience showing that even when people know virtual calls are a critical performance penalty and avoiding them is a project requirement, they STILL stick with the default, whether from habit, ignorance/inexperience, or simply forgetting.

WRT adding additional modifiers, I'd make the argument that, looking from
the other (current) perspective where virtual is the default and you need to
explicitly mark functions final everywhere, you will end up with far more
'final' spam than you would with the 'virtual' spam you suggest.
If that's not the case, then the truth is likely that D libraries are
over-virtualised, and will suffer performance penalties as a result.

The majority of functions in most OOP classes are properties and trivial
accessors, which should almost never be virtual. Imagine making virtual
function calls just to access trivial properties: they can't be inlined any
more, and inlining trivial accessors is one of the most important
optimisations for OOP that there is.
If people stick with the default, then D suffers severe performance
penalties against C++ and even Java/C#/etc, since an ahead-of-time compiler
can't devirtualise and optimise those calls the way a JIT can.
And as you say, you (and I) agree that most people will stick with the
default in most cases unless they have good reason not to (i.e., a compile
error).
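
To make that concrete, a tiny hypothetical example (not from any real codebase) of what's at stake for trivial accessors:

    class Vec2
    {
        private float _x, _y;

        // final: the compiler knows the exact call target and can inline it away
        final @property float x() const { return _x; }

        // virtual (today's default): an indirect call through the vtable,
        // which the compiler can't inline without whole-program knowledge
        @property float y() const { return _y; }
    }

Under final-by-default, both accessors would get the first behaviour unless the author deliberately wrote 'virtual'.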


5) Marking something virtual is a huge commitment and makes substantial
> guarantees about implementation details. A simple example is a property. Making a property virtual is a statement that you will never access the field behind the property (assuming there is one), as someone may have overridden the property to use a different backing field or a constant value. This is something that requires actual acknowledgement, yet virtual by default does not require you to think about this as it's automatically virtual. This is actually a mistake I catch myself making not infrequently.
>

Yes, another very important point that I have tried to make many times. I agree that it is important to acknowledge the commitment you are making. It can't reasonably be a default commitment, since it has severe performance and maintainability implications.
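
A hypothetical sketch (my own names) of the kind of slip Kapps describes, where the author of a virtual property forgets the commitment it implies:

    import std.stdio;

    class Account
    {
        protected int _balance;

        // virtual by default today; a subclass may back it with something else entirely
        @property int balance() { return _balance; }

        void report()
        {
            // Bug: reading _balance directly silently bypasses any override of
            // 'balance'. Once the property is virtual, you're committed to always
            // going through it.
            writeln("balance: ", _balance);
        }
    }

With final-by-default, writing 'virtual' on the property is the moment you're prompted to notice that commitment.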


A less obvious example is add vs addRange. If you implement a class to
> extend an existing collection yet add capabilities for storing length by overriding add/remove to increase/decrease length, should you override addRange and add range.length? What if something then overrides addRange for more efficient adding of multiple elements rather than calling add N times, or something that only sometimes calls add depending on the element type or state? These are considerations that require actual thought, and while simply adding a "virtual" keyword to them doesn't mean you thought about the implications, the very action of having to add this keyword makes it a conscious effort and so helps you consider the side-effects of adding it and what it means.
>

+1 hundred :)


I don't understand how virtual-by-default is even considered for new
> languages. It provides slight convenience while being error prone. Yet breaking almost every single D program is a huge issue, and stability is something that D is desperately trying to achieve. I'd like virtual by default, but it's a hard sell now. I wouldn't mind updating my code for this, and I feel it's better to do this sort of change now rather than later when more/larger code uses D, but it is still a substantial code break. If it was actually done it would need a transition flag or such to help the process (final by default normally, virtual by default with the transition flag), and likely a virtual keyword implemented well ahead of time so people could start making their changes early and gradually.
>

Do you mean you'd like virtual-by-default as you say (as it is now), or that you'd like final-by-default at the cost of the transition?

There is a proposed transition process across a few releases which should
make the transition relatively painless; at the very least it wouldn't be
error prone, since warnings/deprecation messages would assist the process.
1. 'virtual' keyword is introduced, 'override'-ing unattributed methods is
a warning.
2. it becomes deprecated, but you can still compile with -d.
3. it becomes an error, but surely you've already taken the opportunity to
update your software, right?


December 04, 2013
On Wednesday, 4 December 2013 at 07:00:58 UTC, Manu wrote:
> You mean you'd like virtual-by-default as you say (as is now), or you'd
> like final-by-default at the cost of the transition?

Sorry, I meant to say I'd like final by default, but I can understand people being hesitant about the breakage required.

> There is a proposed transition process across a few releases which should
> make the transition relatively painless, at least, it wouldn't be error
> prone, since warnings/deprecation messages would assist the process.
> 1. 'virtual' keyword is introduced, 'override'-ing unattributed methods is
> a warning.
> 2. it becomes deprecated, but you can still compile with -d.
> 3. it becomes an error, but surely you've already taken the opportunity to
> update your software, right?

This transition also works well. It does mean that the performance benefits would not arrive until step 3, but while giving time to transition it also immediately provides the safety benefits. Ideally it would even catch bugs in people's code as they make the required changes to add virtual where needed. The only issue is that it pretty much guarantees that most libraries that make use of classes and are no longer maintained will fail to compile, but honestly, I don't think many projects will compile anyway when using a compiler/Phobos version that's more than a year newer than when the library was last modified.
December 04, 2013
On 4 December 2013 17:24, Kapps <opantm2+spam@gmail.com> wrote:

> On Wednesday, 4 December 2013 at 07:00:58 UTC, Manu wrote:
>
>> There is a proposed transition process across a few releases which should
>>
> make the transition relatively painless, at least, it wouldn't be error
>> prone, since warnings/deprecation messages would assist the process.
>> 1. 'virtual' keyword is introduced, 'override'-ing unattributed methods is
>> a warning.
>> 2. it becomes deprecated, but you can still compile with -d.
>> 3. it becomes an error, but surely you've already taken the opportunity to
>> update your software, right?
>>
>
> This transition also works well. It does mean that the performance benefits would not arrive until step 3, but while giving time to transition it also immediately provides the safety benefits. Ideally it would even catch bugs in people's code as they make the required changes to add virtual where needed. The only issue is that it pretty much guarantees that most libraries that make use of classes and are no longer maintained will fail to compile, but honestly, I don't think many projects will compile anyway when using a compiler/Phobos version that's more than a year newer than when the library was last modified.
>

Indeed. I think it's very safe to say that any unmaintained code already won't compile. This isn't the first breaking change there's been recently, nor even the greatest in magnitude.


December 04, 2013
On 2013-12-03 17:55, Manu wrote:

> C++ support in D is 'not complete', but what's there now is still very
> important and not considered a bad partial feature to have.
> I'd imagine that likewise, if the 'core' of Obj-C support is solid, then
> maybe it's worth it in the same sense?
> OSX and iOS are 2 of the most popular platforms on earth. The mobile
> space is probably the most important prospective target for D.
> I can't comment on the state it's in. But if what's there is solid and
> useful like the current C++ support, then it's an important start...?

I agree, but Michel has a point as well. Walter, I think it was, has also said that D will never have complete support for C++, because that would require building a C++ compiler into the D compiler. That's not something we would like to do.

Objective-C, on the other hand, would be much more realistic to add complete support for.

You can always help if you like :)

-- 
/Jacob Carlborg
December 04, 2013
On 2013-12-04 04:47, Michel Fortin wrote:

> - it's 32-bit-OS-X-only right now, iOS and 64-bit OS X both use a
> different runtime which requires different codegen, and Apple has been
> phasing out 32-bit for some time already

BTW, is there much difference between the modern runtime for 32-bit and 64-bit? I'm thinking, are things like the optimization of storing the data in the pointer itself (tagged pointers) for types like NSNumber used on 32-bit with the modern runtime?

-- 
/Jacob Carlborg
December 04, 2013
On Tuesday, 3 December 2013 at 09:51:17 UTC, Iain Buclaw wrote:
> On 3 December 2013 00:01, nazriel <spam@dzfl.pl> wrote:
>> On Thursday, 28 November 2013 at 21:01:39 UTC, Fra wrote:
>>>
>>> Personally I would love to see this old issue finally implemented/fixed:
>>> There can be only one alias this.
>>> https://d.puremagic.com/issues/show_bug.cgi?id=6083
>>>
>>> What would your choice be?
>>
>>
>> I would support LDC project - so maybe more people would work on it.
>>
>> //*me* wishes he could use LDC with Vibe.d
>
> Similarly, I won't be around forever, and having a bus factor of 1 is
> becoming a burden as time drags on...

Exactly my point.
The other two compilers seem to go unnoticed as the ones that need help.

I believe that for some people ldc & gdc are the only reason to even consider D as an alternative to C or C++, given how much slower the binaries generated by DMD are.

I am very worried to see commits in gdc/ldc being made by one man for the past few weeks.

Just my 2 cents :)
December 04, 2013
On 4 December 2013 08:28, Jacob Carlborg <doob@me.com> wrote:
> On 2013-12-03 17:55, Manu wrote:
>
>> C++ support in D is 'not complete', but what's there now is still very
>> important and not considered a bad partial feature to have.
>> I'd imagine that likewise, if the 'core' of Obj-C support is solid, then
>> maybe it's worth it in the same sense?
>> OSX and iOS are 2 of the most popular platforms on earth. The mobile
>> space is probably the most important prospective target for D.
>> I can't comment on the state it's in. But if what's there is solid and
>> useful like the current C++ support, then it's an important start...?
>
>
> I agree, but Michel has a point as well. Walter, I think it was, has also said that D will never have complete support for C++, because that would require building a C++ compiler into the D compiler. That's not something we would like to do.
>

What complete support?  All you really need is:
1) ABI compatibility when passing around basic types.... check [1]
2) ABI compatibility when passing around structs.... check [1]
3) C++ mangling when declaring functions/decls as extern(C++).... check
4) ABI compatibility with C++ classes.... check

[1] Oh wait... dmd has a problem... https://d.puremagic.com/issues/show_bug.cgi?id=5570

Forget templates and everything else...
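
For reference, a minimal sketch of the kind of declarations Iain is talking about (the C++-side names here are made up for illustration):

    // 1) & 3): a C++ free function, declared with C++ mangling and ABI
    extern (C++) int cppSum(int a, int b);

    // 4): an extern(C++) interface maps to a C++ class with virtual methods;
    // D calls go through the C++ vtable
    extern (C++) interface Shape
    {
        double area();
    }

    // 2): a plain struct with matching layout, passed by value across the boundary
    struct Point
    {
        double x, y;
    }

Templates aside, that covers the bulk of what linking against an existing C++ library actually requires.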
December 04, 2013
On Wednesday, 4 December 2013 at 07:00:58 UTC, Manu wrote:
> The majority of functions in most OOP classes are properties and trivial
> accessors which should almost never be virtual. Imagine, making virtual
> function calls to access trivial properties? They can't be inlined anymore;
> inlining trivial accessors is one of the most important optimisations for
> OOP that there is.

This is an extremely important point and should be considered the foundation of the argument for final-by-default. Your average programmer is NOT going to tag everything as final, and the cost of virtual accessors could be tremendous. There's no argument for virtual-by-default that is nearly as powerful as this.
December 04, 2013
On 2013-12-04 10:12, Iain Buclaw wrote:

> Forget templates and everything else...

Then it's not complete, just as I said.

-- 
/Jacob Carlborg
December 04, 2013
On 4 December 2013 19:24, Jacob Carlborg <doob@me.com> wrote:

> On 2013-12-04 10:12, Iain Buclaw wrote:
>
>  Forget templates and everything else...
>>
>
> Then it's not complete, just as I said.


In a sense. But I don't think the goal is to be able to write C++ code in
D. It's just to achieve binary compatibility.
Templates in C++ must be defined in header files, and their instantiations
are rarely present in libs. You almost never link against a template instance.
In the extremely rare event that you want to (I've never wanted to), it's
not inconceivable that an extern(C++) template with super-restrictive
template parameter rules could be made to mangle the same as C++, and then
it truly would be complete :)
I personally have no use for this though... but it's probably technically
possible.