Scala future, Sing#
August 23, 2009
Like Sing# (http://en.wikipedia.org/wiki/Sing_sharp ) and Chapel, Scala is one of the languages worth following closely, because they share some of the future goals of D2/D3.

A short presentation about the near future of Scala (it's not a general introduction to Scala): "Scala -- The Next 5 Years" by Martin Odersky: http://www.scala-lang.org/sites/default/files/odersky/scalaliftoff2009.pdf

Sing# has a complex and advanced core, and I haven't found good tutorials about it. I have found a confusing document about Spec#, which is a limited predecessor of Sing#. I think the only significant improvement of Spec# over D2 is that Spec# contains a kind of inference engine that probably handles contracts and class invariants rather better than D2 does (probably in a way similar to Eiffel):
http://channel9.msdn.com/wiki/specsharp/specsharpobjectprimer

Bye,
bearophile
August 24, 2009
bearophile wrote:

> Like Sing# (http://en.wikipedia.org/wiki/Sing_sharp ) and Chapel, Scala is one of the languages worth following closely, because they share some of the future goals of D2/D3.
> 
> A short presentation about the near future of Scala (it's not a general introduction to Scala): "Scala -- The Next 5 Years" by Martin Odersky: http://www.scala-lang.org/sites/default/files/odersky/scalaliftoff2009.pdf

Scala is an impressive language and overall well designed. There are certainly truckloads of features that could be taken from Scala to D. But I'm afraid that the desire to have familiarity and compatibility with the C/C++ family is more important in this community than cool new functional features. Here's a quick comparison of some factors:

- community: From what I've gathered, the Scala community mostly consists of much more experienced computer scientists and programmers (I don't mean industrial boiler-plate experience but experience with different kinds of languages, PL concepts and e.g. sound argumentation). These programmers aren't afraid of radical new ideas if they help everyday coding. These guys hate non-orthogonality and love rigorous definitions of semantics. They also want to discuss language issues, and unlike Walter, Odersky doesn't lurk silently when important things are being discussed. This is a huge ++ to the PR. He also welcomes academics and doesn't ask them to go back to their ivory tower like most in D's community do. I understand embracing the industry, too, but it hasn't brought much money to D's development yet.

- bloat: Scala is more lightweight. I've heard Walter say that he doesn't like e.g. library defined control structures - it's a double-edged sword, and D and Scala have taken different paths here (in D if something is commonly used and it can be made built-in, it will be added to the compiler, in Scala it's the opposite). Scala has a very lightweight language core, and many additional features are defined in libraries. Several optimizations that improve the performance of HOFs are already known, but the compiler and virtual machine are not yet as good as they can be. In theory a good supercompiler can make Scala as fast as D. I personally find it funny that the meta-programming features in D are perfect for shrinking the language core, but every year new features still keep creeping in.

- dynamics: Scala is more dynamic (reflection, class loaders, etc.), thanks to the JVM.

- OOP: Scala supports dynamic OOP optimizations unlike D (unless a VM is used). Thanks to the JIT compiler. The new 2.8 supports new static optimizations similar to what C++ & D have had.

- syntax: Scala has a consistent and orthogonal syntax. Syntactic sugar is used sparingly, and when it is used it shaves off quite a bit of boilerplate (a short Scala sketch follows the list):
  * e.g. (_._2 * _._1) is something like (tuple a, tuple b) { return a(1) * b(0); } in D. I leave the definition of the tuple type as an exercise to the reader.
  * (A => B) => (C => D) vs (B function(A)) function (D function(C))
  * case class foo(val a: Int, var b: String) is somewhere between 10-30 LOC in D
  * In D syntactic sugar often saves only a couple of characters (like the new template T!X syntax)
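To make the first three bullets concrete, here is a small Scala sketch (runnable as a script; f, lift and Foo are just illustrative names):

// Placeholder syntax: each _ stands for the next parameter, so once the
// expected type is known, (_._2 * _._1) means ((x, y) => x._2 * y._1).
val f: ((Int, Int), (Int, Int)) => Int = (_._2 * _._1)
println(f((3, 4), (5, 6)))  // 4 * 5 = 20

// A value of type (A => B) => (C => D): it turns a function on elements
// into a function on lists.
val lift: (Int => Int) => (List[Int] => List[Int]) = g => _.map(g)
println(lift(_ + 1)(List(1, 2, 3)))  // List(2, 3, 4)

// One case class line buys a constructor, accessors, equals, hashCode,
// toString, copy and pattern-matching support.
case class Foo(a: Int, var b: String)
println(Foo(1, "x") == Foo(1, "x"))  // true: structural equality for free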

- modularity & types: Scala supports modularity much better IMO (pure OOP, self types etc.). The abstractions are well suited for most tasks. But this is a bit hard to compare objectively.

- high level features: Scala unifies OOP and FP. It also has novel OOP concepts.

- low level features: D wins here, OTOH a low level language isn't good for e.g. sandboxed environments

- memory management: the JVM's GC is pretty mature, but of course manual memory management isn't as easy as in D

- compatibility: D wins (?) here if C/C++ compatibility is important, but Scala is compatible with the large existing code base of Java, though

- bugs: IMHO the unspecified parts of D and the huge amount of bugs made it unusable for me. Luckily I found Scala and have been really happy with it. I've only found maybe 1-2 bugs in it during the last 1.5 years. I usually find about 5-10 bugs in DMD in 15 minutes after coming back to D. And I'm also so happy to find that thanks to authors' knowledge of type systems (dependent types, HM, System F, etc.) Scala is a genius at inferring types. D doesn't really have a clue. Especially the array literal type inference is really naive.

- to summarize: I use Scala for high level tasks, and came back to D when I need to see the actual machine code and optimize some tight inner loop. D is sometimes more suitable for this than C/C++ since it has a bit saner syntax and high level abstractions. But in general I nowadays write 90% of my code in Scala. I'm much happier and more productive writing Scala. YMMV
August 24, 2009
Jari-Matti M.:

>There are certainly truckloads of features that could be taken from Scala to D.<

But D2 is already quite complex, so it's better to add things carefully. For example pattern matching is useful, but it adds a lot of complexity too.


>But I'm afraid that the desire to have familiarity and compatibility with the C/C++ family is more important in this community than cool new functional features.<

And this can be a good thing, because C and C++ are commonly used languages.


>- community: From what I've gathered, the Scala community mostly consists of much more experienced computer scientists and programmers<

This is an advantage for D: it can be used by more ignorant people too (or people more ignorant of functional languages).


>- bloat: Scala is more lightweight.<

This is a matter of balance, and there are no 'best' solutions. Moving things from the library to the language has some advantages.


>Several optimizations that improve the performance of HOFs are already known, but the compiler and virtual machine are not yet as good as they can be. In theory a good supercompiler can make Scala as fast as D.<

The HotSpot GC is much more efficient than the current D GC, and HotSpot is often able to inline virtual methods. D has the advantage of having a simpler core. Creating a Scala compiler on LLVM may be hard, while porting D1 to LLVM was easy enough. Simpler systems have some advantages. Scala type inference is much more powerful, but it's also harder to use (if you want to do complex things), and requires a more complex compiler.
In practice supercompilers are very hard to create, while D1 code running on LLVM is already very efficient.
Simpler systems also have the advantage of being more transparent: understanding why some D1 code is fast or slow is probably simpler than doing the same thing with a piece of Scala code.


>I personally find it funny that the meta-programming features in D are perfect for shrinking the language core, but every year new features still keep creeping in.<

They are not perfect, they have limits, and the results aren't always nice; see the struct bitfields.


>- dynamics: Scala is more dynamic (reflection, class loaders, etc.), thanks to the JVM.<

Some of such things can be added/improved in D too.


>- syntax: Scala has a consistent and orthogonal syntax.<

Too much orthogonality is bad: it produces the LEGO disease. A compromise is better.


>Syntactic sugar is used sparingly, and when it is used it shaves off quite a bit of boilerplate.   * e.g. (_._2 * _._1)<

There are risks here too. I have seen a syntax for a fold (reduce) in Scala that's horribly unreadable. Python3 has even removed reduce() from the core language, because folds aren't easy to understand, and I agree with their decision. Keeping the language easy is more important. Too much boilerplate is boring, but boilerplate is better than hard-to-understand code.
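The fold syntax I mean is probably the operator form of foldLeft; a sketch, where both lines compute the same sum:

val xs = List(1, 2, 3, 4)
val sum1 = (0 /: xs)(_ + _)                     // the terse /: operator form
val sum2 = xs.foldLeft(0)((acc, x) => acc + x)  // the same fold, spelled out

The second spelling is longer, but at least you can guess what it does.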


>* In D syntactic sugar often saves only a couple of characters (like the new template T!X syntax)<

I agree that was a silly idea, one I might even like to remove from D2. I want eager/lazy sequence comprehensions :-)


>- memory management: the JVM's GC is pretty mature, but of course manual memory management isn't as easy as in D<

Some forms of memory usage that I use in D are impossible on the JavaVM.


>- compatibility: D wins (?) here if C/C++ compatibility is important, but Scala is compatible with the large existing code base of Java, though<

D refuses to be compatible with C++. There's just partial compatibility (and this is probably good).


>the huge amount of bugs made it unusable for me.<

There are many bugs in D, but D is slowly opening up more and more toward the community; for example, see the recent batch of bug fixes by Don. If more people work like Don, lots of bugs will be removed. Walter is slowly understanding what open source development means. So I still have hope.


>And I'm also so happy to find that thanks to authors' knowledge of type systems (dependent types, HM, System F, etc.) Scala is a genius at inferring types. D doesn't really have a clue.<

Scala uses a totally different type system. I think it uses a Hindley–Milner type inference algorithm. Walter is probably not an expert on such things. A single person can't be an expert on everything. Today designing concurrency, type inference or garbage collectors requires a lot of specialized and even some academic knowledge. Scala's author has used the JavaVM to avoid doing a lot of low-level work. D's type system isn't so bad. It has limits, and some of those limits may be lifted a little, but you can do a lot of things with the D language anyway; see some of the things I've done in my dlibs. Keeping the type system simpler has some advantages too, for example compilation speed.


>Especially the array literal type inference is really naive.<

I'm sure it's not hard to fix the array literal type inference, which currently is not good; it's just that Walter isn't interested in doing it, or he thinks things are fine as they are, as with the half-finished module system.


>- to summarize: I use Scala for high level tasks, and came back to D when I need to see the actual machine code and optimize some tight inner loop. D is sometimes more suitable for this than C/C++ since it has a bit saner syntax and high level abstractions.<

Today when you need high performance you need the GPU, or to use the SSE registers very well.
A usage example of the GPU:
http://www.smartwikisearch.com/algorithm.html

In the end, if you need really fast programs that have to perform heavy numerical computations, you need languages like Python (plus the right libs, like CorePy); D (and Scala) isn't up to the task yet:
http://www.corepy.org/
http://mathema.tician.de/software/pycuda
http://python.sys-con.com/node/926439
http://pypi.python.org/pypi/python-opencl/0.2

Bye,
bearophile
August 25, 2009
Jari-Matti Mäkelä wrote:
> bearophile wrote:
> 
>> Like Sing# (http://en.wikipedia.org/wiki/Sing_sharp ) and Chapel, Scala is one of the languages worth following closely, because they share some of the future goals of D2/D3.
>>
>> A short presentation about the near future of Scala (it's not a general introduction to Scala): "Scala -- The Next 5 Years" by Martin Odersky: http://www.scala-lang.org/sites/default/files/odersky/scalaliftoff2009.pdf
> 
> Scala is an impressive language and overall well designed. There are certainly truckloads of features that could be taken from Scala to D. But I'm afraid that the desire to have familiarity and compatibility with the C/C++ family is more important in this community than cool new functional features. Here's a quick comparison of some factors:
> 
> - community: From what I've gathered, the Scala community mostly consists of much more experienced computer scientists and programmers (I don't mean industrial boiler-plate experience but experience with different kinds of languages, PL concepts and e.g. sound argumentation). These programmers aren't afraid of radical new ideas if they help everyday coding. These guys hate non-orthogonality and love rigorous definitions of semantics. They also want to discuss language issues, and unlike Walter, Odersky doesn't lurk silently when important things are being discussed. This is a huge ++ to the PR. He also welcomes academics and doesn't ask them to go back to their ivory tower like most in D's community do. I understand embracing the industry, too, but it hasn't brought much money to D's development yet.
> 
> - bloat: Scala is more lightweight. I've heard Walter say that he doesn't like e.g. library defined control structures -

Actually, you can do them with "lazy" function arguments. There was an example somewhere of doing control structures with it.

> it's a double-edged sword, and D and Scala have taken different paths here (in D if something is commonly used and it can be made built-in, it will be added to the compiler, in Scala it's the opposite).

That's not quite right. I'll add things to the core if there is a good reason to - the compiler can do things a library cannot. For example, string literals.

> Scala has a very lightweight language core, and many additional features are defined in libraries. Several optimizations that improve the performance of HOFs are already known, but the compiler and virtual machine are not yet as good as they can be. In theory a good supercompiler can make Scala as fast as D.

I've been hearing that (about Java, same problem) for as long as Java has been around. It might get there yet, but that won't be in the near future.

> I personally find it funny that the meta-programming features in D are perfect for shrinking the language core, but every year new features still keep creeping in.

Actually, some features are being removed. Imaginary and complex variables, for one. There's some work being done to rewrite D forms into simpler D forms, saving hundreds of lines of code in the compiler.


> - dynamics: Scala is more dynamic (reflection, class loaders, etc.), thanks to the JVM.

Yes, but an interpreter or JIT is required to make that work. That makes the language binary not lightweight.


> - OOP: Scala supports dynamic OOP optimizations unlike D (unless a VM is used).

Do you mean knowing a class or virtual method has no descendants? Sure, you need to know the whole program to do that, or just declare it as final.

> Thanks to the JIT compiler. The new 2.8 supports new static optimizations similar to what C++ & D have had.
> 
> - syntax: Scala has a consistent and orthogonal syntax. Syntactic sugar is used sparingly, and when it is used it shaves off quite a bit of boilerplate:
>   * e.g. (_._2 * _._1) is something like (tuple a, tuple b) { return a(1) * b(0); } in D. I leave the definition of the tuple type as an exercise to the reader.
>   * (A => B) => (C => D) vs (B function(A)) function (D function(C))
>   * case class foo(val a: Int, var b: String) is somewhere between 10-30 LOC in D
>   * In D syntactic sugar often saves only a couple of characters (like the new template T!X syntax)
> 
> - modularity & types: Scala supports modularity much better IMO (pure OOP, self types etc.). The abstractions are well suited for most tasks. But this is a bit hard to compare objectively.
> 
> - high level features: Scala unifies OOP and FP. It also has novel OOP concepts.
> 
> - low level features: D wins here, OTOH a low level language isn't good for e.g. sandboxed environments

Sure, but there's the Safe D subset, and also D isn't intended for non-programmers to download untrusted source code from the internet and run.

> - memory management: the JVM's GC is pretty mature, but of course manual memory management isn't as easy as in D
> 
> - compatibility: D wins (?) here if C/C++ compatibility is important, but Scala is compatible with the large existing code base of Java, though

You can mechanically translate Java to D, but it still requires some manual touch-up.

> - bugs: IMHO the unspecified parts of D and the huge amount of bugs made it unusable for me. Luckily I found Scala and have been really happy with it. I've only found maybe 1-2 bugs in it during the last 1.5 years. I usually find about 5-10 bugs in DMD in 15 minutes after coming back to D.

I couldn't find any bugs you've submitted to the D bugzilla. If you don't submit them, they won't get fixed <g>.

> And I'm also so happy to find that thanks to authors' knowledge of type systems (dependent types, HM, System F, etc.) Scala is a genius at inferring types. D doesn't really have a clue.

Can you give an example?

> Especially the array literal type inference is really naive.

How should it be done?

> - to summarize: I use Scala for high level tasks, and came back to D when I need to see the actual machine code and optimize some tight inner loop. D is sometimes more suitable for this than C/C++ since it has a bit saner syntax and high level abstractions. But in general I nowadays write 90% of my code in Scala. I'm much happier and more productive writing Scala. YMMV

I appreciate you taking the time to tell us your impressions on this.
August 25, 2009
Walter Bright wrote:
> Jari-Matti Mäkelä wrote:
>> bearophile wrote:
>> - OOP: Scala supports dynamic OOP optimizations unlike D (unless a VM is used).
> 
> Do you mean knowing a class or virtual method has no descendants? Sure, you need to know the whole program to do that, or just declare it as final.

I think the standard name is "adaptive optimization":

http://en.wikipedia.org/wiki/Adaptive_optimization

"Adaptive optimization is a technique in computer science that performs dynamic recompilation of portions of a program based on the current execution profile."

"Consider a hypothetical banking application that handles transactions one after another. These transactions may be checks, deposits, and a large number of more obscure transactions. When the program executes, the actual data may consist of clearing tens of thousands of checks without processing a single deposit and without processing a single check with a fraudulent account number. An adaptive optimizer would compile assembly code to optimize for this common case. If the system then started processing tens of thousands of deposits instead, the adaptive optimizer would recompile the assembly code to optimize the new common case. This optimization may include inlining code or moving error processing code to secondary cache."
August 25, 2009
Ary Borenszweig wrote:
> Walter Bright wrote:
>> Jari-Matti Mäkelä wrote:
>>> bearophile wrote:
>>> - OOP: Scala supports dynamic OOP optimizations unlike D (unless a VM is used).
>>
>> Do you mean knowing a class or virtual method has no descendants? Sure, you need to know the whole program to do that, or just declare it as final.
> 
> I think the standard name is "adaptive optimization":
> 
> http://en.wikipedia.org/wiki/Adaptive_optimization
> 
> "Adaptive optimization is a technique in computer science that performs dynamic recompilation of portions of a program based on the current execution profile."
> 
> "Consider a hypothetical banking application that handles transactions one after another. These transactions may be checks, deposits, and a large number of more obscure transactions. When the program executes, the actual data may consist of clearing tens of thousands of checks without processing a single deposit and without processing a single check with a fraudulent account number. An adaptive optimizer would compile assembly code to optimize for this common case. If the system then started processing tens of thousands of deposits instead, the adaptive optimizer would recompile the assembly code to optimize the new common case. This optimization may include inlining code or moving error processing code to secondary cache."

It's also called profile guided optimization, but Jari-Matti said it was "OOP" related, so I wondered how that fit in.
August 25, 2009
Walter Bright wrote:

> Ary Borenszweig wrote:
>> Walter Bright wrote:
>>> Jari-Matti Mäkelä wrote:
>>>> bearophile wrote:
>>>> - OOP: Scala supports dynamic OOP optimizations unlike D (unless a VM
>>>> is used).
>>>
>>> Do you mean knowing a class or virtual method has no descendants? Sure, you need to know the whole program to do that, or just declare it as final.
>> 
>> I think the standard name is "adaptive optimization":
>> 
>> http://en.wikipedia.org/wiki/Adaptive_optimization
>> 
>> "Adaptive optimization is a technique in computer science that performs dynamic recompilation of portions of a program based on the current execution profile."
>> 
>> "Consider a hypothetical banking application that handles transactions one after another. These transactions may be checks, deposits, and a large number of more obscure transactions. When the program executes, the actual data may consist of clearing tens of thousands of checks without processing a single deposit and without processing a single check with a fraudulent account number. An adaptive optimizer would compile assembly code to optimize for this common case. If the system then started processing tens of thousands of deposits instead, the adaptive optimizer would recompile the assembly code to optimize the new common case. This optimization may include inlining code or moving error processing code to secondary cache."
> 
> It's also called profile guided optimization, but Jari-Matti said it was "OOP" related, so I wondered how that fit in.

I meant this

"Another important example of this kind of optimization is class-hierarchy- based optimization. A virtual method invocation, for example, involves looking at the class of the receiver object for the call to discover which actual target implements the virtual method for the receiver object. Research has shown that most virtual invocations have only a single target for all receiver objects, and JIT compilers can generate more-efficient code for a direct call than for a virtual invocation. By analyzing the class hierarchy's state when the code is compiled, the JIT compiler can find the single target method for a virtual invocation and generate code that directly calls the target method rather than performing the slower virtual invocation. Of course, if the class hierarchy changes and a second target method becomes possible, then the JIT compiler can correct the originally generated code so that the virtual invocation is performed. In practice, these corrections are rarely required. Again, the potential need to make such corrections makes performing this optimization statically troublesome."

http://www.ibm.com/developerworks/java/library/j-rtj2/index.html
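In code terms, a hedged sketch of what that means (Transaction/Check/Deposit are made-up names mirroring the article's example):

trait Transaction { def process(): Unit }
class Check(amount: Long) extends Transaction {
  def process(): Unit = println("clearing check for " + amount)
}
class Deposit(amount: Long) extends Transaction {
  def process(): Unit = println("depositing " + amount)
}

// If at run time every Transaction reaching this loop is a Check, the JIT
// can devirtualize the process() call into a direct (even inlined) call,
// and undo that decision if a Deposit ever shows up later.
def run(ts: List[Transaction]): Unit = ts.foreach(_.process())
run(List(new Check(100), new Check(250)))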
August 25, 2009
Jari-Matti Mäkelä wrote:
> I meant this
> 
> "Another important example of this kind of optimization is class-hierarchy-
> based optimization. A virtual method invocation, for example, involves looking at the class of the receiver object for the call to discover which actual target implements the virtual method for the receiver object. Research has shown that most virtual invocations have only a single target for all receiver objects, and JIT compilers can generate more-efficient code for a direct call than for a virtual invocation. By analyzing the class hierarchy's state when the code is compiled, the JIT compiler can find the single target method for a virtual invocation and generate code that directly calls the target method rather than performing the slower virtual invocation. Of course, if the class hierarchy changes and a second target method becomes possible, then the JIT compiler can correct the originally generated code so that the virtual invocation is performed. In practice, these corrections are rarely required. Again, the potential need to make such corrections makes performing this optimization statically troublesome."
> 
> http://www.ibm.com/developerworks/java/library/j-rtj2/index.html

I'm not quite sure what that means, but I think it means nothing more than noting that a method is not overridden, and so can be called directly.

Currently, this optimization happens in D if a method or class is annotated with 'final'. It is possible for the compiler to do this if flow analysis can prove the direct type of a class reference, but the optimizer currently does not do that. It is also possible for the compiler to determine that methods are final automatically if it knows about all the modules that import a particular class.
August 25, 2009
Walter Bright wrote:

> Jari-Matti Mäkelä wrote:

>> - bloat: Scala is more lightweight. I've heard Walter say that he doesn't like e.g. library defined control structures -
> 
> Actually, you can do them with "lazy" function arguments. There was an example somewhere of doing control structures with it.

Agreed, you /can/ do something similar. But in Scala it's the standard way of doing things. If you compare the grammars of both languages, you'll see that Scala is a bit lighter than D (see http://www.scala-lang.org/sites/default/files/linuxsoft_archives/docu/files/ScalaReference.pdf).
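For instance, a library control structure in Scala is just an ordinary method with by-name parameters. A minimal sketch (repeatUntil is a made-up name):

// body and cond are by-name parameters: they are evaluated each time they
// are used, so the call below reads like a built-in loop.
def repeatUntil(body: => Unit)(cond: => Boolean): Unit = {
  body
  if (!cond) repeatUntil(body)(cond)
}

var i = 0
repeatUntil { i += 1; println(i) } (i >= 3)  // prints 1, 2, 3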

> 
>> it's a double-edged sword, and
>> D and Scala have taken different paths here (in D if something is
>> commonly used and it can be made built-in, it will be added to the
>> compiler, in Scala it's the opposite).
> 
> That's not quite right. I'll add things to the core if there is a good reason to - the compiler can do things a library cannot. For example, string literals.

I exaggerated a bit. But there are some constructs that some think should not be there, e.g. foreach_reverse.

>> - bugs: IMHO the unspecified parts of D and the huge amount of bugs made it unusable for me. Luckily I found Scala and have been really happy with it. I've only found maybe 1-2 bugs in it during the last 1.5 years. I usually find about 5-10 bugs in DMD in 15 minutes after coming back to D.
> 
> I couldn't find any bugs you've submitted to the D bugzilla. If you don't submit them, they won't get fixed <g>.

I've submitted a couple of reports years ago, and luckily some of them have already been fixed. Maybe the search is broken.

>> And I'm
>> also so happy to find that thanks to authors' knowledge of type systems
>> (dependent types, HM, System F, etc.) Scala is a genius at inferring
>> types. D doesn't really have a clue.
> 
> Can you give an example?

http://d.puremagic.com/issues/show_bug.cgi?id=3042

auto foo = [ 1, 2L ]; // typeof == int[2u]
auto foo = [ 2L, 1 ]; // typeof == long[2u]
auto foo = [ "a", "abcdefgh" ]; // typeof == char[1u][2u] in D1
auto foo = [ [], [1,2,3] ]; // doesn't even compile
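For contrast, the corresponding Scala literals all get a sensible least upper bound (List plays the role of the array here; the comments show what scalac infers):

val a = List("a", "abcdefgh")        // List[String]: nothing is truncated
val b = List(List(), List(1, 2, 3))  // List[List[Int]]: the empty List() is
                                     // List[Nothing], and Nothing unifies with Int
val c = if (true) 1 else "one"       // Any: the least upper bound of Int and String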

> 
>> Especially the array literal type inference is really naive.
> 
> How should it be done?

You shouldn't use the type of the first given element when constructing the type of the array. If you have [ e_1, ..., e_n ], the type of the literal is unify(type_of_e_1, ..., type_of_e_n) + "[]". For instance:

=> typeof([ [], [1,2,3] ])
=> unify( typeof([]), typeof([1,2,3]) ) + "[]"
=> unify( "a[]", unify(typeof(1),typeof(2),typeof(3)) + "[]" ) + "[]"
=> unify( "a[]", unify("int","int","int") + "[]" ) + "[]"
=> unify( "a[]", "int" + "[]" ) + "[]"
=> unify( "a[]", "int[]" ) + "[]"   // a is a local type var, subst = { a ->
int }
=> "int[]" + "[]"
=> "int[][]"
August 25, 2009
Walter Bright:

>Actually, you can do them with "lazy" function arguments. There was an example somewhere of doing control structures with it.<

There are some problems with this:
- Are current compilers (especially LDC) able to inline those lazy delegates? The Scala compiler contains some machinery to do that at compile time (so it's not done at runtime by the JavaVM).
- I have put inside my dlibs a select() (adapting code written by another person) that uses lazy arguments to implement an eager (it can't be lazy, unfortunately) array comprehension. I've tried to see how LDC compiles it, and people there have shown distaste for that code of mine, even just for a benchmark. So it seems the D community doesn't like to use lazy arguments to create control structures.
- Andrei has shown so much distaste for such things that Phobos2 doesn't usually even use normal delegates, and you have even added a "typeless" way to give a delegate to a template in D2. This shows there's little interest among D developers in going the way of Scala. Scala uses delegates for those purposes, and then inlines them.


>I've been hearing that (about Java, same problem) for as long as Java has been around. It might get there yet, but that won't be in the near future.<

Today Java is very fast, especially for very OOP-style code. Sometimes programs in C++ can be a little faster, but generally no more than 2 times. C# on .NET is fast too; for example, its GC and associative arrays are much faster.


>Yes, but an interpreter or JIT is required to make that work. That makes the language binary not lightweight.<

D can be improved/debugged in several ways in this regard, even if it keeps not using a VM.


>Do you mean knowing a class or virtual method has no descendants? Sure, you need to know the whole program to do that, or just declare it as final.<

I can see there's a lot of confusion about such matters.
Experience has shown that class-hierarchy-based optimization isn't very effective, because in most practical programs a lot of virtual calls are bi- or multi-morphic. Other strategies like "type feedback" work better.
I have already discussed this topic a little, but it was on the D.learn newsgroup, so you have missed it.

A good old paper about this topic:
"Eliminating Virtual Function Calls in C++ Programs" (1996), by Gerald Aigner, Urs Hölzle:
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.7.7766

There are other papers written on this topic, and some of them are more modern/updated too (this one is about 13 years old), but it's a good starting point. If you want more papers, please ask.

Note that the LDC compiler has Link-Time Optimization too, and LTO can also be done when you "know the whole program". If the front-end gives some semantic annotations to LDC, it can do powerful things during LTO.


>Can you give an example?<

I don't know Scala well enough to give you examples, so I leave this to Jari-Matti. But I think Scala uses a Hindley-Milner type inference algorithm; it's another class of type inference. I am not asking you to put Hindley-Milner inside D.


>[array literal type inference] How should it be done?<

Silently dropping information is bad, so cutting strings according to the length of the first one, as in D1, is bad.
The type of an array literal has to be determined by the type the programmer specified in the declaration. If such an annotation is absent (because there's an auto, or because the array is inside an expression), the type has to be the tightest type able to represent all the types contained in the array literal (or an error has to be raised if none can be found).
By default array literals should produce dynamic arrays, unless the programmer specifies that he/she wants a fixed-size one.
(People have been asking for similar things for years; it's not a new topic invented by Jari-Matti M. or by me.)

Bye,
bearophile