April 06, 2013
On Saturday, 6 April 2013 at 21:29:20 UTC, Peter Alexander wrote:
> On Saturday, 6 April 2013 at 11:01:09 UTC, bearophile wrote:
>> Peter Alexander:
>>
>>> I also use a modified druntime that prints callstacks when a GC allocation occurs, so I know if it happens by accident.
>>
>> Is it possible to write a patch to activate those prints with a compiler switch?
>
> Yes, but I don't have time to do that right now.

We lack decent tools to even understand what the GC is doing, so that's the sort of thing that should be built directly into D rather than as a patch.

--rt
April 07, 2013
On Saturday, 6 April 2013 at 22:29:42 UTC, Rob T wrote:
> We lack decent tools to even understand what the GC is doing,

https://github.com/CyberShadow/Diamond

D1-only due to lack of interest.
April 07, 2013
Thanks for your response
> 
> In my case I have been able to mostly get around the problem by strategically disabling the GC during active memory allocations, and then re-enabling when all or most of the allocations are completed. In effect I'm doing manual memory management all over again because the automated version fails to do a good enough job. Using this technique I've been able to get near C++ performance speeds.
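A minimal sketch of the disable/re-enable technique described in the quote above; the function name and allocation pattern are invented for illustration:

```d
import core.memory : GC;

void loadData()
{
    GC.disable();             // suspend automatic collections during the allocation burst
    scope (exit) GC.enable(); // re-enable even if an exception is thrown

    int[][] chunks;
    foreach (i; 0 .. 100_000)
        chunks ~= new int[](64); // many small allocations, no collection pauses in between

    GC.collect(); // optionally collect once, at a point we choose
}
```

`GC.disable` only blocks automatic collection cycles; allocation still goes through the GC heap, which is why an explicit `GC.collect` at a convenient point can still be useful.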

Incidentally, when you got this speed, what compiler were you using? dmd?
> 
> Part of the problem is that the GC implementation is simply not suitable for performance code and lacks fine tuning abilities (that I'm aware of). Without a decent brain, it does stupid things, so when you have a lot of allocations going on but no deallocations, the GC seems to be running needlessly slowing down the application by as much as 3x.

Maybe it's time the GC implementation was addressed - or otherwise, the whole concept of a GC in D should be dropped. To be honest, I'm perfectly happy with RAII in C++, and since D supports that fully (even better, IMHO), I don't really see much point in a GC for a language that is vying to become a top systems language.

D without a GC and as fast as C++ ............... that would be it - the ultimate IMHO.

April 07, 2013
Hi
> 
> D's GC is not as good as some other system programming languages like Oberon or Active Oberon, just to cite two examples from many.

As I said, maybe it's time (IMHO) for D's GC to be addressed - or otherwise dropped.
> 
> However, does the current performance really impact the type of applications you are writing?

Yes it does; and to be honest, I don't buy into the argument that for certain apps I don't need the speed. Why would I ever want a slower app? If performance were not an issue, then frankly Java would more than suffice and I would not be looking at D in the first place. D is supposed to be a better C++ (or at least that's what I have been led to believe - or like to believe), so it's got to be an improvement all round. It is a better structured and neater language, but if that comes at the price of being slower than C++, then at the end of the day it is not an improvement at all.
> 
> I'm asking because I always found the C and C++ communities always care too much about micro optimizations in cases it does not matter. Coming from a polyglot background I never managed to grok that.
> 
> However there are cases where every byte and every ms matter, in those cases you are still better with C, C++ and Fortran.

But why are you so quick to give up on D being as fast as C++ ?
Wouldn't it be just awesome if D - with its better constructs and all that
- was just as fast as C++ ?
Can't it just be that someone does achieve the best of both worlds?
I feel that D is very close to that: a great, versatile and powerful
language... if only the performance was as good as C++'s then it'll be
what I have always dreamt of.

Just my 2p worth...

April 07, 2013
On Sunday, 7 April 2013 at 09:10:14 UTC, Adrian Mercieca wrote:
>> However, does the current performance really impact the type of
>> applications you are writing?
>
> Yes it does; and to be honest, I don't buy into this argument that for
> certain apps I don't need the speed and all that... why should I ever want
> a slower app? And if performance was not such an issue, to be perfectly
> frank, then Java would more than suffice and I would not be looking at D
> in the first place.

The point here is that applications that care about performance don't do dynamic allocations at all. Both GC and malloc are slow, so pools of pre-allocated memory are used instead. Standard library helpers for such pools would be useful, but in any case they are GC-agnostic and hardly done any differently than in C++. So it should be possible to achieve performance similar to C/C++ even with the current bad GC, if the application's memory architecture is done right.
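The pre-allocated pool idea can be sketched as a simple bump-pointer allocator; the `Pool` type and its sizing are invented for illustration, not a standard-library facility:

```d
struct Pool
{
    ubyte[] buffer;
    size_t used;

    this(size_t bytes) { buffer = new ubyte[](bytes); } // one up-front allocation

    T* make(T)()
    {
        // bump-pointer allocation: no GC or malloc call per object
        size_t aligned = (used + T.alignof - 1) & ~(T.alignof - 1);
        assert(aligned + T.sizeof <= buffer.length, "pool exhausted");
        used = aligned + T.sizeof;
        return cast(T*) &buffer[aligned];
    }

    void reset() { used = 0; } // release everything at once
}
```

Per-object cost is a couple of integer operations, and the whole pool is released in one `reset`, which is the property that makes this pattern as fast in D as the equivalent C++ arena.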

It is not a panacea, and sometimes the very existence of a GC harms performance requirements (when not only speed but also latency matters). That is true. But for performance-hungry user applications the situation is pretty acceptable right now. Well, it will be, once an easy way to track accidental gc_malloc calls is added.
April 07, 2013
Am 07.04.2013 11:10, schrieb Adrian Mercieca:
> Hi
>>
>> D's GC is not as good as some other system programming languages like
>> Oberon or Active Oberon, just to cite two examples from many.
>
> As I said, maybe it's time (IMHO) for D's GC to be addressed - or
> otherwise dropped.
>>
>> However, does the current performance really impact the type of
>> applications you are writing?
>
> Yes it does; and to be honest, I don't buy into this argument that for
> certain apps I don't need the speed and all that... why should I ever want
> a slower app? And if performance was not such an issue, to be perfectly
> frank, then Java would more than suffice and I would not be looking at D
> in the first place. D is supposed to be a better C++ (or at least that's
> what I have been led to believe - or like to believe)...... so it's got to
> be an improvement all round. It is a better structured and neater
> language, but if it's going to come at the price of being slower than
> C++, then at the end of the day it is not an improvement at all.

The current D compilers just don't have the more than 20 years of investment in code optimization that C++ compilers have. You cannot expect to close that gap overnight.

>>
>> I'm asking because I always found the C and C++ communities always care
>> too much about micro optimizations in cases it does not matter. Coming
>> from a polyglot background I never managed to grok that.
>>
>> However there are cases where every byte and every ms matter, in those
>> cases you are still better with C, C++ and Fortran.
>
> But why are you so quick to give up on D being as fast as C++ ?
> Wouldn't it be just awesome if D - with its better constructs and all that
> - was just as fast as C++ ?
> Can't it just be that someone does achieve the best of both worlds?
> I feel that D is very close to that: a great, versatile and powerful
> language... if only the performance was as good as C++'s then it'll be
> what I have always dreamt of.
>
> Just my 2p worth...
>

I am not giving up on speed. It just happens that I have been coding since 1986 and I am a polyglot programmer who started doing systems programming in the Pascal family of languages, before moving into C and C++ land.

Except in some cases, it does not matter whether you get an answer in 1 s or 2 ms; yet most single-language C and C++ developers care about the 2 ms case even before starting to code, and this is what I don't approve of.

Of course I think that, given time, D compilers will be able to achieve C++-like performance, even with a GC or, who knows, a reference-counted version.

Nowadays the only place I do manual memory management is when writing Assembly code.

--
Paulo
April 07, 2013
On Sunday, 7 April 2013 at 09:02:25 UTC, Adrian Mercieca wrote:
>
> Incidentally, when you got this speed, what compiler were you using? dmd?

I was (and still am) using the latest released DMD compiler.

Here's the original thread where I presented the problem and the solution. You probably should read through it to understand what needs to be done.

http://forum.dlang.org/thread/waarzqtfcxuzhzdelhtt@forum.dlang.org

>
> Maybe it's time the GC implementation is addressed - or otherwise, the
> whole concept of GC in D should be dropped. To be honest, I'm perfectly
> happy with RAII in C++ and since D supports that fully (even better IMHO),
> I don't really see that much point for GC in a language that is vying to
> become a top systems language.
>
> D without a GC and as fast as C++ ............... that would be it - the
> ultimate IMHO.

Ideally, I think what we need is 1) a better GC, since the pros of using one are very significant, and 2) the ability to selectively mark sections of code as "off limits" to all GC-dependent code. What I mean by this is that the compiler would refuse to compile any code that makes use of automated memory allocations inside a @noheap-marked section of code.
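Something like this hypothetical usage, assuming the @noheap attribute from the proposal existed (it does not compile with today's compilers; the function and its body are invented for illustration):

```d
// Hypothetical: @noheap would make any GC allocation a compile error.
@noheap void mixAudio(float[] left, float[] right)
{
    foreach (i, ref sample; left)
        sample += right[i];      // fine: no allocation

    // auto tmp = left ~ right; // would be rejected: ~ allocates on the GC heap
}
```

The attraction is that the guarantee becomes static: today an accidental gc_malloc in a hot path is only discoverable at runtime, while a checked attribute would catch it at compile time.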

There's been a proposal to do this that really ought to be taken seriously
http://d.puremagic.com/issues/show_bug.cgi?id=5219

You'll see there are also related proposals for better fine-tuning through attribute-marked sections of code in general, which is another item that I would like to see implemented one day.

Please vote it up if you agree.

--rt
April 07, 2013
On Sunday, 7 April 2013 at 09:41:21 UTC, Dicebot wrote:
[...]
> applications situation is pretty acceptable right now. Well, it will be, once easy way to track accidental gc_malloc calls is added.

That's the critical missing piece of the puzzle. In effect we need to be able to use a subset of D that is 100% GC-free. Writing GC-free applications in D may be theoretically possible today, but it is simply not a practical option in most situations for most people; it's far too error prone and fragile.

--rt
April 07, 2013
> We lack decent tools to even understand what the GC is doing, so that's the sort of thing that should be built directly into D rather than as a patch.
>
> --rt

GCStats isn't yet done, but would be a good start.

April 07, 2013
> I also use a modified druntime that prints callstacks when a GC allocation occurs, so I know if it happens by accident.

I'd happily welcome any patches that get rid of GC usage in druntime.