June 06, 2013
On 6/6/13 5:45 PM, Jonathan M Davis wrote:
> On Thursday, June 06, 2013 17:23:03 Andrei Alexandrescu wrote:
>>> LOL. Yes, well. That's certainly much closer, but what if the shared
>>> library changes and later derives more types than it did before? And since
>>> shared libraries can be swapped out, that could break the optimizations
>>> that the tool did, so you'd have to run the tool over the whole thing
>>> again. So, it's definitely an idea with potential, but I don't think that
>>> it could be guaranteed to work in all cases, and the programmer would
>>> likely have to be aware of when it didn't work in order to avoid some
>>> nasty bugs.
>>
>> There would be no bugs, worst case compilation errors. (The tool I'm
>> envisioning would add final annotations or prompt the user to add them.)
>
> That would definitely work, but it would probably work better if the user
> were told to add them rather than them being added automatically, or you risk
> some functions being devirtualized when it's known a shared library may need
> them to be virtual in a later version. Regardless, at minimum, it would
> provide a way to track down all of the virtual functions which may not need
> to be virtual, which could be quite valuable regardless of whether the
> programmer decides to make them non-virtual or not.
>
> - Jonathan M Davis

--in-place
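
For concreteness, a rough sketch of what such an in-place rewrite could produce. The tool and the class are made up; the point is simply that methods which whole-program analysis proves are never overridden get 'final' added, and everything else is left alone:

    // Output of the hypothetical --in-place run (illustrative names only).
    class Renderer
    {
        void draw() { }                    // left virtual: an override exists below
        final int width() { return 800; }  // 'final' added: never overridden anywhere
    }

    class SvgRenderer : Renderer
    {
        override void draw() { }           // the override that keeps draw() virtual
    }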

Andrei
June 06, 2013
On 6/6/2013 3:12 PM, Jonathan M Davis wrote:
> On Thursday, June 06, 2013 14:57:00 Walter Bright wrote:
>> On 6/6/2013 2:23 PM, Andrei Alexandrescu wrote:
>>> (The tool I'm envisioning
>>> would add final annotations or prompt the user to add them.)
>>
>> Sorry, that's never going to fly.
>
> It could tell the programmer which functions it _thinks_ don't need to be
> virtual, but it can't be 100% correct. So, it would effectively be a lint-like
> tool targeting possible devirtualization opportunities. It would actually be
> potentially useful regardless of whether virtual or non-virtual is the
> default, since programmers may have needlessly marked functions as virtual.
> But if it's a question of whether it's a good solution for optimizing away
> virtuality instead of making functions non-virtual, then I don't think that it
> would fly - not if optimization is a prime concern. It would just be a nice
> helper tool for static analysis which could give you suggestions on things you
> might be able to improve in your program.

I know. But people are never going to use that tool.


> But as it sounds like the primary argument which has swayed you towards making
> non-virtual the default is tied to cleaner code evolution and maintenance
> rather than performance, the suggestion obviously wouldn't be a viable
> counterargument for going with virtual-by-default.

The thing is, when code 'works' there is rarely sufficient motivation to go back and annotate things for safety and performance (which is why the tool above will be a failure). Code that works is left alone, and we see the situation Manu is talking about.

But if it's final by default and the user needs it to be virtual, then he has to go back and add the annotation - without it, the code won't work, and the compiler will tell him it doesn't work.
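
For illustration, that forcing function already exists today with an explicit 'final'; the proposal would simply make it the default (names below are invented):

    class Widget
    {
        final void update() { }   // under final-by-default, a plain void update() would behave like this
    }

    class Slider : Widget
    {
        // override void update() { }  // compile error: cannot override final function
        // The compiler makes the author go back and make Widget.update overridable
        // (today: remove 'final'; under the proposal: add 'virtual').
    }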

I wouldn't have changed my mind if it were possible for the compiler to auto-finalize methods.

BTW, this is also why D hasn't opted for the pointer tagging system Rust has. It all looks great on paper, but I suspect that in practice not much use will be made of it - people will just default to using the most widely usable pointer type and their code will work and they'll forget about the rest of the annotations.

I have a lot of experience with this from DOS 16-bit code. There we had all kinds of pointer types - near, far, SS-relative, CS-relative, etc. You know what people did? They used the default pointer type. Almost nobody used the optimized pointer types, even though they got big speed boosts when used appropriately.

What does work is throwing -O at it and having the compiler go to town and optimize the hell out of it. As much as possible, we should be selecting default semantics that enable -O to kick ass.

June 06, 2013
On 6/6/2013 12:55 PM, Andrei Alexandrescu wrote:
> On 6/6/13 2:31 PM, Walter Bright wrote:
>> Ok, Manu, you win, I'm pretty much convinced.
>
> In my heart of hearts I somehow hope this will blow over and we'll get to some
> actually interesting stuff...

You could review my proposal on inferring immutability and uniqueness. Getting that working right will be kick ass!

June 06, 2013
On 7 June 2013 05:37, Walter Bright <newshound2@digitalmars.com> wrote:

> On 6/6/2013 12:31 PM, Andrei Alexandrescu wrote:
>
>> I think class hierarchy analysis is very doable for whole D projects. You
>> just
>> pass the tool all files in the project and it does its thing.
>>
>> http://www.cs.ucla.edu/~palsberg/tba/papers/dean-grove-chambers-ecoop95.pdf
>>
>> This would actually be a great GSoC-style project, distributable via tools/.
>>
>
> The trouble, as has been pointed out before, is shared libraries.
>

And there's the reliance on a 'sufficiently smart linker', and the fact that
the platforms that suffer from this stuff far harder than x86 almost always
have less mature compilers/optimisers/linkers.
I just wouldn't ever place my faith in the future arrival of some
sufficiently-smart-[tool]. You couldn't make a business investment on that
elusive possibility.
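
For reference, the core of the class-hierarchy-analysis idea from the paper cited above is straightforward once the whole program is visible: a method can be treated as final when no subclass anywhere overrides it. Below is a toy D sketch over a hand-built description of a hierarchy; all names are invented, and a real tool would of course get this data from the compiler rather than from literals:

    import std.algorithm : canFind;
    import std.stdio : writefln;

    // Toy whole-program model: each entry is a class, its base class, and the
    // methods it overrides.
    struct Node { string name; string base; string[] overrides; }

    // A method declared in `owner` can be finalized if no transitive subclass
    // of `owner` overrides it anywhere in the program.
    bool devirtualizable(Node[] program, string owner, string method)
    {
        bool descendsFrom(string cls)
        {
            for (auto cur = cls; cur.length; )
            {
                if (cur == owner) return true;
                string next;
                foreach (n; program) if (n.name == cur) next = n.base;
                cur = next;
            }
            return false;
        }

        foreach (n; program)
            if (n.name != owner && descendsFrom(n.name) && n.overrides.canFind(method))
                return false;
        return true;
    }

    void main()
    {
        auto program = [
            Node("Widget", "", ["draw", "size"]),
            Node("Button", "Widget", ["draw"]),  // overrides draw, not size
        ];
        writefln("draw finalizable: %s", devirtualizable(program, "Widget", "draw")); // false
        writefln("size finalizable: %s", devirtualizable(program, "Widget", "size")); // true
    }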


June 06, 2013
On Thursday, 6 June 2013 at 23:48:33 UTC, Walter Bright wrote:
> On 6/6/2013 3:12 PM, Jonathan M Davis wrote:
>> On Thursday, June 06, 2013 14:57:00 Walter Bright wrote:
>>> On 6/6/2013 2:23 PM, Andrei Alexandrescu wrote:
>>>> (The tool I'm envisioning
>>>> would add final annotations or prompt the user to add them.)
>>>
>>> Sorry, that's never going to fly.
>>
>> It could tell the programmer which functions it _thinks_ don't need to be
>> virtual, but it can't be 100% correct. So, it would effectively be a lint-like
>> tool targeting possible devirtualization opportunities. It would actually be
>> potentially useful regardless of whether virtual or non-virtual is the
>> default, since programmers may have needlessly marked functions as virtual.
>> But if it's a question of whether it's a good solution for optimizing away
>> virtuality instead of making functions non-virtual, then I don't think that it
>> would fly - not if optimization is a prime concern. It would just be a nice
>> helper tool for static analysis which could give you suggestions on things you
>> might be able to improve in your program.
>
> I know. But people are never going to use that tool.
>

I think it depends on its simplicity and how well it integrates into the common D development process. Maybe, because D builds fast, we can add some extra steps when building a release?
And developers at companies that develop the biggest applications will be aware of this tool, and they certainly have scripts or advanced tooling for building their software releases; adding a line to the build process seems acceptable.
June 07, 2013
On 6/6/13 5:57 PM, Walter Bright wrote:
> On 6/6/2013 2:23 PM, Andrei Alexandrescu wrote:
>> (The tool I'm envisioning
>> would add final annotations or prompt the user to add them.)
>
> Sorry, that's never going to fly.

Like a dove.

Andrei

June 07, 2013
On 6/6/13 6:12 PM, Jonathan M Davis wrote:
> On Thursday, June 06, 2013 14:57:00 Walter Bright wrote:
>> On 6/6/2013 2:23 PM, Andrei Alexandrescu wrote:
>>> (The tool I'm envisioning
>>> would add final annotations or prompt the user to add them.)
>>
>> Sorry, that's never going to fly.
>
> It could tell the programmer which functions it _thinks_ don't need to be
> virtual, but it can't be 100% correct.

It would be 100% correct if given the entire application. You may want to take a look at the CHA paper.

Andrei


June 07, 2013
On 6/6/13, Andrei Alexandrescu <SeeWebsiteForEmail@erdani.org> wrote:
> On 6/6/13 2:31 PM, Walter Bright wrote:
>> Ok, Manu, you win, I'm pretty much convinced.
>
> In my heart of hearts I somehow hope this will blow over and we'll get to some actually interesting stuff...

Which stuff? :)
June 07, 2013
On 6/6/13 8:01 PM, Andrej Mitrovic wrote:
> On 6/6/13, Andrei Alexandrescu<SeeWebsiteForEmail@erdani.org>  wrote:
>> On 6/6/13 2:31 PM, Walter Bright wrote:
>>> Ok, Manu, you win, I'm pretty much convinced.
>>
>> In my heart of hearts I somehow hope this will blow over and we'll get
>> to some actually interesting stuff...
>
> Which stuff? :)

E.g. finalizing shared and threading.

Andrei
June 07, 2013
On Thursday, June 06, 2013 16:48:32 Walter Bright wrote:
> On 6/6/2013 3:12 PM, Jonathan M Davis wrote:
> > On Thursday, June 06, 2013 14:57:00 Walter Bright wrote:
> >> On 6/6/2013 2:23 PM, Andrei Alexandrescu wrote:
> >>> (The tool I'm envisioning
> >>> would add final annotations or prompt the user to add them.)
> >> 
> >> Sorry, that's never going to fly.
> > 
> > It could tell the programmer which functions it _thinks_ don't need to be virtual, but it can't be 100% correct. So, it would effectively be a lint-like tool targeting possible devirtualization opportunities. It would actually be potentially useful regardless of whether virtual or non-virtual is the default, since programmers may have needlessly marked functions as virtual. But if it's a question of whether it's a good solution for optimizing away virtuality instead of making functions non-virtual, then I don't think that it would fly - not if optimization is a prime concern. It would just be a nice helper tool for static analysis which could give you suggestions on things you might be able to improve in your program.
> 
> I know. But people are never going to use that tool.

Some would, but I completely agree that it wouldn't be used enough to be viable as a real solution for making functions non-virtual.

> I wouldn't have changed my mind if it were possible for the compiler to auto-finalize methods.

Yeah, if the compiler could figure it all out, that would be great, and this might not be necessary, but unfortunately, it clearly can't - though even if it could, there's definitely some value in the programmer being explicit about what is and isn't overridden. Taken to the extreme, that would probably require that every single member function be marked with either virtual, override, or final, so that the programmer has to explicitly choose in each case what a function is supposed to be doing in terms of virtuality, but that's likely overkill and would break too much code at this point even if it were ultimately a good idea. As it stands, we can simply say that introducing functions must be marked with virtual and that functions marked with override must override a function which is marked with either virtual or override, and all people will have to do is mark the introducing functions with virtual. That'll certainly break some code, but considering how big a change it is, it affects a surprisingly small portion of the code.
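
A compact sketch of the rule set described above may help; the 'virtual' keyword is part of the proposal rather than current D, so it appears only in comments, and the uncommented code is valid under today's rules (names invented):

    class Base
    {
        void hook() { }   // proposal: this introducing function would have to be written as virtual void hook()
        void util() { }   // proposal: final by default, no annotation needed
    }

    class Derived : Base
    {
        override void hook() { }     // override must name something overridable
        // override void util() { }  // proposal: error, since util would be final by default
    }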

- Jonathan M Davis