June 06, 2013
Another issue: It does not play well with DIP26 :-)


June 06, 2013
On 6/6/13 3:37 PM, Walter Bright wrote:
> On 6/6/2013 12:31 PM, Andrei Alexandrescu wrote:
>> I think class hierarchy analysis is very doable for whole D projects.
>> You just
>> pass the tool all files in the project and it does its thing.
>>
>> http://www.cs.ucla.edu/~palsberg/tba/papers/dean-grove-chambers-ecoop95.pdf
>>
>>
>> This would actually be a great GSoC-style project, distributable via
>> tools/.
>
> The trouble, as has been pointed out before, is shared libraries.

I wrote:

"I think class hierarchy analysis is very doable for whole D projects."

Whole. Projects.

WHOLE PROJECTS.

WHOLE. PROJECTS.

Worked on by Top. Men.


Andrei
June 06, 2013
On Mon, 2013-06-03 at 17:05 +1000, Manu wrote:
> Interestingly, you didn't actually disagree with my point about the common case here, and I don't buy the Java doctrine.

Also, Java is JIT-compiled by HotSpot, so methods can be de-virtualized at runtime. Java is therefore on a different playing field than D.

June 06, 2013
On Thursday, June 06, 2013 15:56:24 Andrei Alexandrescu wrote:
> On 6/6/13 3:37 PM, Walter Bright wrote:
> > On 6/6/2013 12:31 PM, Andrei Alexandrescu wrote:
> >> I think class hierarchy analysis is very doable for whole D projects.
> >> You just
> >> pass the tool all files in the project and it does its thing.
> >> 
> >> http://www.cs.ucla.edu/~palsberg/tba/papers/dean-grove-chambers-ecoop95.pdf
> >> 
> >> 
> >> This would actually be a great GSoC-style project, distributable via tools/.
> > 
> > The trouble, as has been pointed out before, is shared libraries.
> 
> I wrote:
> 
> "I think class hierarchy analysis is very doable for whole D projects."
> 
> Whole. Projects.
> 
> WHOLE PROJECTS.
> 
> WHOLE. PROJECTS.
> 
> Worked on by Top. Men.

LOL. Yes, well. That's certainly much closer, but a shared library can change and later derive more types than it did before. And since shared libraries can be swapped out, that could break the optimizations that the tool did, so you'd have to run the tool over the whole thing again. So, it's definitely an idea with potential, but I don't think that it could be guaranteed to work in all cases, and the programmer would likely have to be aware of when it didn't work in order to avoid some nasty bugs.

- Jonathan M Davis
June 06, 2013
06-Jun-2013 21:47, deadalnix writes:
> On Thursday, 6 June 2013 at 15:33:19 UTC, David Nadlinger wrote:
>> In fact, from what I remember from the various discussions at DConf, I
>> think we have pretty much the same opinion regarding how hidden costs
>> are a bit too pervasive in present-day D, respectively how it
>> encourages an inherently wasteful style of coding in more places than
>> necessary. I also agree that in the current class design,
>> virtual-by-default is dangerous. If we were to go back to the drawing
>> board, though, I'd be interested in exploring alternative directions
>> in the design space, away from the Java-style OOP model altogether.
>>
>
+1. Never liked the current OOP scheme.

> scala's is pretty much the definition of awesome on that one. If it had
> to be redone, I'd push in that direction.

And another plus one, though I've only skimmed Scala by reading Martin's book "Programming in Scala" and trying simple "scripts". It felt very nice, though.

-- 
Dmitry Olshansky
June 06, 2013
On 6/6/13 4:30 PM, Jonathan M Davis wrote:
> On Thursday, June 06, 2013 15:56:24 Andrei Alexandrescu wrote:
>> On 6/6/13 3:37 PM, Walter Bright wrote:
>>> On 6/6/2013 12:31 PM, Andrei Alexandrescu wrote:
>>>> I think class hierarchy analysis is very doable for whole D projects.
>>>> You just
>>>> pass the tool all files in the project and it does its thing.
>>>>
> >>>> http://www.cs.ucla.edu/~palsberg/tba/papers/dean-grove-chambers-ecoop95.pdf
>>>>
>>>>
>>>> This would actually be a great GSoC-style project, distributable via
>>>> tools/.
>>>
>>> The trouble, as has been pointed out before, is shared libraries.
>>
>> I wrote:
>>
>> "I think class hierarchy analysis is very doable for whole D projects."
>>
>> Whole. Projects.
>>
>> WHOLE PROJECTS.
>>
>> WHOLE. PROJECTS.
>>
>> Worked on by Top. Men.
>
> LOL. Yes, well. That's certainly much closer, but a shared library can
> change and later derive more types than it did before. And since shared
> libraries can be swapped out, that could break the optimizations that the tool
> did, so you'd have to run the tool over the whole thing again. So, it's
> definitely an idea with potential, but I don't think that it could be
> guaranteed to work in all cases, and the programmer would likely have to be
> aware of when it didn't work in order to avoid some nasty bugs.

There would be no bugs; worst case, compilation errors. (The tool I'm envisioning would add final annotations or prompt the user to add them.)

Andrei
June 06, 2013
On 06/06/2013 07:55 PM, Steven Schveighoffer wrote:
> On Thu, 06 Jun 2013 13:50:11 -0400, deadalnix <deadalnix@gmail.com> wrote:
>
>> On Thursday, 6 June 2013 at 15:06:38 UTC, Kapps wrote:
>>> On Thursday, 6 June 2013 at 01:08:36 UTC, deadalnix wrote:
>>>> This is why I wrote that this may have been true in the past.
>>>> Nevertheless, it is completely false today.
>>>
>>> C# often does not inline virtual methods, and even if it can inline
>>> them there's still an overhead. This (2008) article goes into depth
>>> about how it handles it:
>>> www.codeproject.com/Articles/25801/JIT-Optimizations - Essentially
>>> uses frequency analysis to determine if the virtual method call is
>>> still going to call the same method as it would previously.
>>> Regardless, we can not perform such optimizations, so whether or not
>>> it applies to C#, it does apply to D.
>>>
>>
>> Quite frankly, I don't care what C# does. Java does it at link time,
>> and we can do it at link time the same way, that is all that matter
>> for this discussion.
>
> How do you finalize a method with the possibility that a dynamic library
> will come along and extend that type?  Not a rhetorical question, I
> really want to know if there is a way.  Java and C# clearly have more
> flexibility there, since they are run on a VM.
>
> -Steve

The more advanced JVMs assume a closed world for JIT optimization and then deoptimize whenever a dynamic library is loaded that invalidates some of the assumptions made during optimization.
June 06, 2013
On Thursday, June 06, 2013 17:23:03 Andrei Alexandrescu wrote:
> > LOL. Yes, well. That's certainly much closer, but a shared library can change and later derive more types than it did before. And since shared libraries can be swapped out, that could break the optimizations that the tool did, so you'd have to run the tool over the whole thing again. So, it's definitely an idea with potential, but I don't think that it could be guaranteed to work in all cases, and the programmer would likely have to be aware of when it didn't work in order to avoid some nasty bugs.
> 
> There would be no bugs, worst case compilation errors. (The tool I'm envisioning would add final annotations or prompt the user to add them.)

That would definitely work, but it would probably work better if the user were told to add them rather than their being added automatically, or you risk some functions being devirtualized when it's known that a shared library may need them to be virtual in a later version. Regardless, at minimum, it would provide a way to track down all of the virtual functions which may not need to be virtual, which could be quite valuable regardless of whether the programmer decides to make them non-virtual or not.

- Jonathan M Davis
June 06, 2013
On 6/6/2013 2:23 PM, Andrei Alexandrescu wrote:
> (The tool I'm envisioning
> would add final annotations or prompt the user to add them.)

Sorry, that's never going to fly.

June 06, 2013
On Thursday, June 06, 2013 14:57:00 Walter Bright wrote:
> On 6/6/2013 2:23 PM, Andrei Alexandrescu wrote:
> > (The tool I'm envisioning
> > would add final annotations or prompt the user to add them.)
> 
> Sorry, that's never going to fly.

It could tell the programmer which functions it _thinks_ don't need to be virtual, but it can't be 100% correct. So, it would effectively be a lint-like tool targeting possible devirtualization opportunities. It would actually be potentially useful regardless of whether virtual or non-virtual is the default, since programmers may have needlessly marked functions as virtual. But if it's a question of whether it's a good solution for optimizing away virtuality instead of making functions non-virtual, then I don't think that it would fly - not if optimization is a prime concern. It would just be a nice helper tool for static analysis which could give you suggestions on things you might be able to improve in your program.

But since it sounds like the primary argument which has swayed you towards making non-virtual the default is tied to cleaner code evolution and maintenance rather than performance, the suggestion obviously wouldn't be a viable counterargument in favor of virtual-by-default.

- Jonathan M Davis