You just missed a big discussion on IRC about this, where I think I made some fair points that people actually agreed with.

On 1/4/2012 10:53 AM, Manu wrote:
Oh, and virtual-by-default... completely unacceptable for a systems language.
Most functions are NOT virtual, and finding the false-virtuals while
optimising will be extremely tedious and time consuming.

The only reason to use classes in D is for polymorphic behavior - and that means
virtual functions. Even so, a class member function will be called directly if
it is private or marked as 'final'.

Is this true? Surely the REAL reason to use classes is to allocate using the GC?
Aren't structs allocated on the stack and passed to functions by value? Do I need to start using the ref keyword to use GC-allocated structs?
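To make concrete what I'm asking about, here's a minimal sketch of my understanding of the class/struct split (names are made up): classes are reference types allocated on the GC heap with `new`, while structs are value types that get copied when passed to a function:

```d
class C { int v; }
struct S { int v; }

void bumpClass(C c)  { c.v++; }  // c is a reference; the caller sees the change
void bumpStruct(S s) { s.v++; }  // s is a copy; the caller's value is untouched

void main()
{
    C c = new C();     // GC heap allocation
    bumpClass(c);
    assert(c.v == 1);

    S s;               // lives on the stack
    bumpStruct(s);
    assert(s.v == 0);  // passed by value: only the copy was modified
}
```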

 
An easy way to find functions that are not overridden (what you called false virtuals) is to add:

  final:

at the top of your class definition. The compiler will give you errors for any functions that need to be virtual.
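As a sketch of that trick (class and method names are hypothetical), `final:` makes every method that follows non-virtual, and the compiler then errors at any override site, pointing out exactly which methods still need to be virtual:

```d
class Widget
{
final:  // every method below this label is non-virtual (called directly)
    int width()  { return w; }
    int height() { return h; }
    void draw()  { }   // if this is meant to be overridable, it must move
                       // above the final: label -- it's a true virtual

private:
    int w, h;
}

class Button : Widget
{
    // Compile error: cannot override final function Widget.draw
    override void draw() { }
}
```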

If you don't want polymorphic behavior, use structs instead. Struct member
functions are never virtual.
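For illustration (the type is made up), a struct method is always a direct call; there is no vtable for the call to go through:

```d
struct Point
{
    int x, y;

    // Direct call, never virtual: structs have no vtable and no inheritance,
    // so the compiler can resolve (and inline) this statically.
    int lengthSq() { return x * x + y * y; }
}

void main()
{
    auto p = Point(3, 4);
    assert(p.lengthSq() == 25);
}
```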

I have never written a class in any language where the ratio of virtual to non-virtual functions is more than 1:10 or so... requiring that one explicitly declare the vastly more common case seems crazy.
The thing I'm most worried about is people forgetting to declare 'final:' on a class, or junior programmers who DON'T declare final, perhaps because they don't understand it, or perhaps because they have 1-2 true-virtuals and just define the rest in the same place... This is DANGEROUS. The junior programmer problem is one that can NOT be overstated, and doesn't seem to have been considered in a few of these design choices.
I'll bet MOST classes end up with an abundance of false-virtuals, and this is extremely detrimental to performance on modern hardware (and getting worse, not better, as hardware progresses).

 
Worse, if libraries contain false virtuals, there's good chance I may not be
able to use said library on certain architectures (PPC, ARM in particular).

??

If a library makes liberal (and completely unnecessary) virtual calls to the point where it performs too poorly on some architecture, let's say ARM or PPC (architectures that suffer far more than x86 from virtual calls), I can no longer use that library in my project... What a stupid position to be in. The main strength of any language is the wealth of libraries available to it, and a bad language decision that prohibits the use of libraries for no practical reason is just broken by my measure.