April 10, 2013
On Wednesday, 10 April 2013 at 15:38:49 UTC, Andrei Alexandrescu wrote:
> On 4/10/13 7:30 AM, Manu wrote:
>> The _problem_ is that functions are virtual by
>> default. It's a trivial problem to solve, however it's a major breaking
>> change, so it will never happen.
>
> I agree. We may as well save our breath on this one.
>
>> Hence my passing comment that spawned this whole thread, I see it as the
>> single biggest critical mistake in D, and I'm certain it will never be
>> changed. I've made my peace, however disappointing it is to me.
>
> I disagree with the importance assessment, but am soothed by your being at peace.
>
>
> Andrei

Why is virtual by default a problem?

You could have non-virtual by default and would live happily until the day you forget to declare the base class destructor virtual. Then you spend a lot of time trying to find out why you are leaking memory.

In C++ you have to stay alert all the time so you don't forget something and break everything. D is more forgiving, at a small cost in performance.

So I don't buy the non-virtual by default argument. If your profiler tells you that a particular virtual function is the bottleneck, go on and make it final. That's why profilers exist.
April 10, 2013
On 11 April 2013 01:38, Andrei Alexandrescu <SeeWebsiteForEmail@erdani.org> wrote:

> On 4/10/13 7:30 AM, Manu wrote:
>
>> Hence my passing comment that spawned this whole thread, I see it as the
>>
> single biggest critical mistake in D, and I'm certain it will never be
>> changed. I've made my peace, however disappointing it is to me.
>>
>
> I disagree with the importance assessment, but am soothed by your being at peace.


Well, I personally have no other issues with D that I would call 'critical
mistakes', this is it... and it is a pretty big one. From a performance
point of view, it's very dangerous, and I've also illustrated a whole bunch
of other reasons why I think it's a mistake irrespective of performance.
Also Andrej's recent point was interesting.
Any other gripes I have are really just incomplete features, like rvalue ->
ref (or scope's incomplete implementation as I've always imagined it),
allocations where they don't need to be, better gc control, and general
lack of consideration for other architectures. I can see movement on all
those issues, they'll come around when they've finished baking.
Meanwhile, lots of things that were missing/buggy have been fixed in the
last year, D is feeling loads more solid recently, at least to me. I've
been able to do most things I need to do with relatively little friction.


April 10, 2013
On 4/10/13, Manu <turkeyman@gmail.com> wrote:
> On 10 April 2013 23:15, Andrej Mitrovic <andrej.mitrovich@gmail.com> wrote:
>> On 4/10/13, Manu <turkeyman@gmail.com> wrote:
>> > It's a trivial problem to solve, however it's a major breaking change,
>> > so
>> > it will never happen.
>>
>> I wouldn't say never.
>>
> ... don't get my hopes up!

Just take a look at the upcoming changelog:

https://github.com/D-Programming-Language/d-programming-language.org/pull/303

(You can clone the repo and run
git fetch upstream pull/303/head:pull303 && git checkout pull303)

There are a ton of breaking language changes. Pretty much every release is a breaking one, in one way or another.
April 10, 2013
On 11 April 2013 02:29, Minas Mina <minas_mina1990@hotmail.co.uk> wrote:

> On Wednesday, 10 April 2013 at 15:38:49 UTC, Andrei Alexandrescu wrote:
>
>  On 4/10/13 7:30 AM, Manu wrote:
>>
>>> The _problem_ is that functions are virtual by
>>> default. It's a trivial problem to solve, however it's a major breaking
>>> change, so it will never happen.
>>>
>>
>> I agree. We may as well save our breath on this one.
>>
>>  Hence my passing comment that spawned this whole thread, I see it as the
>>> single biggest critical mistake in D, and I'm certain it will never be changed. I've made my peace, however disappointing it is to me.
>>>
>>
>> I disagree with the importance assessment, but am soothed by your being at peace.
>>
>>
>> Andrei
>>
>
> Why is virtual by default a problem?
>

Seriously? There's like 100 posts in this thread.

> You could have non-virtual by default and would live happily
> until a day where you forget to declare the base class destructor virtual. Then you spent a lot of time trying to find why you are leaking memory.
>

That's never happened to me. On the contrary, I'm yet to see another programmer properly apply final throughout a class...

> In C++ you have to be aware all the time not to forget something
> and screw everything. D is more forgiving, at a small cost of performance.
>

'Small cost'? D is a compiled systems language; performance is not
unimportant.
And how do you quantify 'small'? Scattered dcache/icache misses are the
worst possible hazard.

> So I don't buy the non-virtual by default argument. If your
> profiler tells you that a particular virtual function is the bottleneck, go on and make it final. That's why profilers exist.
>

Thanks for wasting my time! I already spend countless working hours looking
at a profiler. I'd like to think I might waste less time doing that in the
future.
Additionally, the profiler won't tell you the virtual function is the
bottleneck; it'll be the calling function that shows the damage. And in
the event the function is called from many places (as trivial accessors
are), it won't show up as a significant cost in any one place at all;
the cost is evenly spread, which is the worst possible sort of
performance hazard. Coincidentally, this called-from-many-locations
situation is exactly when it's most likely to cause icache/dcache misses!
It's all bad, and very hard to find/diagnose.

In C++, when I treat all the obvious profiler hot-spots, I start manually trawling through source files at random, looking for superfluous virtuals and if()'s. In D, I'm now encumbered with an additional task: since it's not MARKED virtual, I can't instantly recognise it, or reason about whether it actually should be (or was intended to be) virtual, so now I have to perform an additional tedious process of diagnosing, for each suspicious function, whether it actually IS or should be virtual before I can mark it final. And this doesn't just apply to the few functions explicitly marked virtual as in C++; this new process applies to EVERY function. *sigh*

The language shouldn't allow programmers to make critical performance
blunders of that sort. I maintain that virtual-by-default is a critical
error.
In time, programmers will learn to be cautious/paranoid, and 'final' will
dominate your code window.


April 10, 2013
On 11 April 2013 02:59, Manu <turkeyman@gmail.com> wrote:

> In time, programmers will learn to be cautious/paranoid, and 'final' will dominate your code window.
>

Or more realistically, most programmers will continue to be oblivious, and we'll enjoy another eternity of the same old problem where many 3rd party libraries written on a PC are unusable on resource-limited machines, and people like me will waste innumerable more hours re-inventing wheels in-house, because the programmer of a closed-source library either didn't know, or didn't give a shit.

None of it would be a problem if he just had to type virtual when he meant it... the action would even assist in invoking conscious thought about whether that's actually what he wants to do, or if there's a better design. </okay, really end rant>


April 10, 2013
On Wed, 10 Apr 2013 08:38:26 -0400
Jeff Nowakowski <jeff@dilacero.org> wrote:

> On 04/09/2013 04:43 PM, Nick Sabalausky wrote:
> >
> > - Starcraft?: Starcraft is 15 years old, so it isn't an example of a
> >    modern AAA title in the first place.
> 
> StarCraft II came out a few years ago and sold very well. They also just released the second installment of it within the past month or so, and considering it is essentially an over-priced expansion pack, it also sold very well.
> 
> > In the ones I identified as "interactive movie", cinematic presentation deeply permeates the entire experience, gameplay and all.
> 
> Translation: Wearing your grumpy-old-man goggles, you dismiss games that feature lots of cinematics as "interactive movies", even though there is plenty of core gameplay to be had.
> 

"Dismissing" isn't the right word for it (Although I have gone
straight from "teenage angst" to what can be interpreted as "old dude
crotchetyness"). Like I said, I do like CoD: Modern Warfare (at least
what I've played). I'd also confidently toss the Splinter Cell series in
the "interactive movie" boat, and yet that's actually one of my all-time
favorite series (at least 1-3, and to a slightly lesser extent 4,
wouldn't know about 5). Same goes for the Portal games (although I
would have *very much* preferred they had actually included a
"fast-forward / skip-ahead" feature for all the scripted sequences.
Every movie in existence can handle that no problem, it's a *basic*
*expected* feature, why can't a videogame with a whole team of
programmers actually manage it? A true FF/Rewind obviously has
technical hurdles for real-time cinematics, but a "skip" sure as fuck
doesn't).

I guess I haven't been entirely clear, but the complaints I do have about what I've been calling "interactive movies" are that:

A. I think the non-indie industry has been focusing way too much on
them, to the detriment of the medium itself, and possibly the
health of the industry (and yes, to the detriment of my own opinion on
modern videogaming as well).

It's strongly analogous to the irrationally high obsession with "3D" in
the mid 90's: 3D isn't bad, but it was WAAAAAY over-obsessed, and it
certainly isn't the *only* good way to go. A *good* 2D game would
have sold well: Rayman and Castlevania: SoTN proved that. The
problem was, publishers and developers pulled this notion that "Gamers
will only buy 3D" *completely* out of their asses, with absolutely zero
meaningful data to back it up, and instead shoveled out load after load
of mostly-bad, and mostly-ugly 3D games. I still consider that easily
the worst console generation. "Cinematic" is very much the new "3D".
Everything still applies, and history is repeating itself.

B. Most of them (from what I've seen) are very poorly done. Just to
rehash some examples:

- Skyward Sword is easily one of the worst Zeldas ever made. Same with
  the last Metroid (the one from the Ninja Gaiden reboot developers).
  Personally I thought Metroid Prime 3 had taken the series straight
  downhill too, but I guess I'm alone in that.

- Assassin's Creed (at least if AC2 is any indication) is one of the
  absolute worst multimedia productions ever created, period. It's just
  inane BS after inane BS after more inane BS. You may as well watch a
  soap.

- And the first 45 minutes of Bulletstorm is wretched as well. The
  "walking on the skyscraper's wall" *could* have been
  absolutely fantastic - *if* there had actually been anything to *do*
  besides listen to horrible dialog while walking to the next cutscene.

Portal and Splinter Cell did their story/dialog/presentation very well (despite Portal's asinine *forcing* of it, which is a royal PITA when you just want to play the puzzles), but quality in "interactive movie" storytelling is extremely rare in general. And the thing is, if you can't do a feature *well*, then it doesn't belong in the finished game, period.

I guess I've rambled again clouding my main points but basically:

Cinematic/Story/etc is the new 3D: It's not inherently bad, but it's usually done bad, and even if it weren't done badly it's way too heavily focused/obsessed on and over-used, to the detriment of the medium and possibly the industry.


> There *are* games that are essentially interactive movies, like Heavy Rain for example, or LA Noire, but putting shooters like BioShock Infinite or GTA (when doing the missions) in this category is ridiculous.

Well yea, Quantic Dream goes WAAAAAY off into the "interactive movie" realm. (Ex: Indigo Prophecy started out looking promising but quickly devolved into one long quicktime event). Quantic Dream is basically the new Digital Pictures or...whoever made Dragon's Lair.

Keep in mind, I'm using "interactive movie" largely for lack of a
better term. "Videogame" definitely isn't the right word for them. But
at the same time, these "interactive movie" things tend to swing back
and forth (within the very same game) between "more of a game than a
*true* interactive movie" and "literally *less* interactive than
a Hollywood movie, because you can't interact with a cutscene *and* you
can rarely fast-forward past it". (And then there's...dumb...shits like
Nintendo that *do* put in a skip feature, for *some* cutscenes, and
then deliberately *disable* it on any save-game that hasn't gotten at
least that far. Seriously, they could write a book on how to be an
asshole developer.) And for the record, in case anyone at Valve,
Irrational, or Human Head ever reads this: A cutscene that you can walk
around in while you wait is still a f&*@#*$ cutscene.

April 10, 2013
On 11 April 2013 02:08, Andrei Alexandrescu <SeeWebsiteForEmail@erdani.org> wrote:

> On 4/10/13 8:44 AM, Manu wrote:
>
>> On 10 April 2013 22:37, Andrei Alexandrescu
>> <SeeWebsiteForEmail@erdani.org> wrote:
>>
>>     On 4/10/13 2:02 AM, Manu wrote:
>>
>>         I do use virtual functions, that's the point of classes. But most
>>         functions are not virtual. More-so, most functions are trivial
>>         accessors, which really shouldn't be virtual.
>>
>>
>>     I'd say a valid style is to use free functions for non-virtual
>>     methods. UFCS will take care of caller syntax.
>>
>>
>> Valid, perhaps. But would you really recommend that design pattern?
>> It seems a little obscure for no real reason. Breaks the feeling of the
>> OO encapsulation principle somewhat.
>>
>
> It may as well be a mistake that nonvirtual functions are at all part of a class' methods. This has been quite painfully seen in C++ leading to surprising conclusions: http://goo.gl/dqZrr.


Hmm, some interesting points, although I don't think I buy what he's
selling.
It looks like over-complexity for the sake of it to me. I don't buy the
real-world benefit. At least not more so than the obscurity it introduces
(breaking the location of function definitions apart), and of course, C++
doesn't actually support this syntactically, it needs UFCS.
Granted, the principle applies far better to D, i.e., it actually works...


> If I designed D's classes today, I'd only allow overridable methods and
> leave everything else to free functions.


Why? Sorry, that article didn't sell me. Maybe I need to sit and simmer on
it for a bit longer though. I like static methods (I prefer them to
virtuals!) ;)
If I had the methods split, some inside the class at one indentation level,
and most outside at a different level, it would annoy me, for OCD reasons.
But I see no real advantage one way or the other in D, other than a
cosmetic one.