March 06, 2017
On Monday, 6 March 2017 at 15:40:54 UTC, Rico Decho wrote:
>> If this isn't a perfect example of D's marketing problem I don't know what is. Someone who likes D and takes the time to write on the forum yet thinks the GC will randomly run no matter what.
>>
>> To make it abundantly clear: I'm not bashing on you in the slightest, Rico Decho. I'm just pointing out that there's a clear problem here in that we can't expect to convert e.g. C++ game developers who have never written a line of D before if we haven't even managed to educate the community yet.
>>
>> Unfortunately, I have no ideas on how to remedy the situation. I also don't know how to get people to stop believing that C is magically fast either, which I think is a similar perception problem.
>>
>> Atila
>
> Actually it's written in the documentation.

That's true, and it was pointed out to me on Twitter by a C++ dev. I don't know what's up with that, but I'm _pretty_ sure it doesn't happen in practice; only somebody who knows the GC implementation well can really comment, I guess.


> If I remember correctly, garbage collection could be triggered during any allocation, for instance when concatenating some displayed text, and it would freeze all threads until the collection is done.

Right. So slap `@nogc` on whatever is in the game loop and that's guaranteed not to happen.
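
Something like this (a minimal sketch; the function names are made up) will refuse to compile if anything inside it allocates through the GC:

	// Minimal sketch: `@nogc` turns GC allocation inside the function into
	// a compile-time error. renderFrame/updateEntities are made-up names.
	@nogc void updateEntities(float dt)
	{
	    // purely stack-based work here, no GC allocations allowed
	}

	@nogc void renderFrame(float dt)
	{
	    // string concatenation allocates via the GC, so this line
	    // would not compile inside @nogc:
	    // string label = "fps: " ~ fpsText;

	    updateEntities(dt); // callees must themselves be @nogc
	}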

>
> In my opinion, the problem is D's GC implementation.

This is also a problem, yes.


> For instance, Nim uses a soft (realtime) GC, which is why Nim's author himself has made the Urho3D wrapper :)
>
> With this approach, there's no need to disable the GC or resort to manual allocations to keep the GC from freezing all threads.
>
> Instead you simply use all the standard libraries as normal, while still trying to avoid allocating too much during rendering, of course.
>
> During the render loop, in Nim you occasionally call the GC with a numeric argument telling it how many milliseconds it is allowed to use in the worst case.

That's pretty cool.
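
The closest thing in D right now, as far as I know, is to disable automatic collections during the loop and trigger a collection yourself at a point of your choosing. A rough sketch with core.memory's GC and a made-up loop; note that D's collect() runs to completion rather than taking a time budget:

	import core.memory : GC;

	void gameLoop()
	{
	    GC.disable();             // no automatic collections from here on
	    scope (exit) GC.enable();

	    foreach (frame; 0 .. 10_000)   // placeholder loop condition
	    {
	        // ... update and render ...

	        // Unlike Nim's time-budgeted step, GC.collect() runs a full
	        // collection: you choose *when* it happens, not how long it takes.
	        if (frame % 600 == 0)
	            GC.collect();
	    }
	}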

Atila

March 06, 2017
On Sun, Mar 05, 2017 at 05:26:08PM +0000, Russel Winder via Digitalmars-d wrote:
> On Fri, 2017-03-03 at 09:33 -0800, H. S. Teoh via Digitalmars-d wrote:
> > On Thu, Mar 02, 2017 at 07:12:07PM -0500, Nick Sabalausky (Abscissa) via Digitalmars-d wrote:
> > > […]
> > Ahh, the memories! (And how I am dating myself... but who cares.)  Such fond memories of evenings spent poring over AppleSoft code trying for the first time in my life to write programs. And those lovely error messages with backwards punctuation:
> > 
> > 	?SYNTAX ERROR
> > 
> > :-)
> 
> Youngster. :-)
> 
> Oh for the days when the only error message you ever got was 0c4.

I bow before your venerable age! :-P


> > […]
> > I was skeptical of OO, and especially of Java, at the time.  It's
> > odd, given that I had just been learning C++ in college and was
> > familiar with OO concepts, but when I saw the way Java pushed for OO
> > to the exclusion of all else, I balked.  Call me a non-conformist or
> > whatever, but every time I see too much hype surrounding something,
> > my kneejerk reaction is to be skeptical of it.  I eschew all
> > bandwagons.
> 
> So how come you are on the D bandwagon? ;-)
[...]

Haha, you got me there.

Though truth be told, I only chose D after a long search for a better language, finding nothing else that more closely matches my ideal of what a programming language should be.  And at the time, there wasn't much hype surrounding D at all (in fact, I would never have found it had I not been actively searching for new programming languages).


[...]
> > To be fair, though, Java as a language in and of itself is not bad at all. [...]  The mentality and hype of the community surrounding it, though, seem to me to have gone off the deep end, and have bred rabid zealots, sad to say, to this very day, of the kind of calibre you described above.
> 
> Whilst I can see that of the 1994 to 2014 period, I am not sure I see it so much that way now. There are developers in Java shops who are a bit "jobsworth" and care little for personal development, and they are the people who refuse to accept the existence of languages other than Java. However most of the Java folk at the main conferences are actually JVM folk and they know languages such as Kotlin, Scala, Clojure, Groovy, Ceylon, Frege, etc. as well as Java. The zealotry, when present, is more about the JVM than Java per se.

Perhaps my perception is colored by a close acquaintance who happens to be a Java zealot to this very day. :-P  JVM zealotry, OTOH, I don't see very much at all. In fact, I'd never even heard the term until you said it.


> > (I also TA'd a Java course back in the day, and was quite appalled to observe the number of thoroughly-confused students who couldn't tell control flow from OO, because "classes" had been hammered into their heads long before they even understood what a statement was. Apparently, imperative statements are non-OO and therefore evil, so one was supposed to wrap literally everything in classes. Nobody ever explained how one would implement class methods without using statements, though.  I suppose calling other class methods was excepted from the "evil" label, but it seemed to escape people's minds that eventually nothing would actually get accomplished if all you had was an infinite regress of calling class methods with no imperative statements in between. But such was the rabid OO-fanaticism in those days.)
> 
> There were, and are, a lot of bad teachers, overzealous as it seems in this episode. This does not make "objects first" a bad idea per se; it just has to be done properly. Just as teaching bottom-up from statements does. A bad teacher can teach any curriculum badly; that should not reflect on the curriculum.
[...]

The thing that gets to me is that these teachers, good or bad, committed the fallacy of embracing a single paradigm to the exclusion of everything else, even in the face of obvious cases where said paradigm didn't fit very well with the problem domain.  Some aspects of Java also reflect this same fallacy -- such as those ubiquitous singleton static classes in the OS-wrapping modules, or the impossibility of declaring a function outside of a class -- which to me are indications that it wasn't just the teachers, but a more pervasive trend in the Java ecosystem of putting on OO-centric blinders.


T

-- 
In a world without fences, who needs Windows and Gates? -- Christian Surchi
March 06, 2017
On 03/06/2017 07:47 PM, H. S. Teoh via Digitalmars-d wrote:
> On Sun, Mar 05, 2017 at 05:26:08PM +0000, Russel Winder via Digitalmars-d wrote:
>>
>> Oh for the days when the only error message you ever got was 0c4.

You can get similar experiences even in modern times in the embedded area (at least at the hobbyist level anyway; I guess there is all that JTAG stuff). I remember doing some demos on a late prototype Propeller MC, and there were times all I had for debugging was a single solitary LED. To this day, I still can't decide whether that was fun or horrible. I must have a bit of the masochist in me :P

>> The zealotry, when present, is more about the JVM than Java per se.
>
> Perhaps my perception is colored by a close acquaintance who happens to be a Java zealot to this very day. :-P  JVM zealotry, OTOH, I don't see very much at all. In fact, I'd never even heard the term until you said it.

I learned the true meaning of Java zealotry ten or so years ago, talking to a co-worker (our resident Java fan - 'course, this was a VB6 house, so I can't entirely blame him for Java fandom) when I made some remark involving checked exceptions (which, at the time, were already widely considered problematic, or even a mistake, even within the Java world). I was stunned to see a quizzical expression on his face and then learn he was genuinely puzzled by the mere suggestion that Java's checked exceptions had any downside.

Luckily, this does seem much less common than it was at the time.

> The thing that gets to me is that these teachers, good or bad, committed the fallacy of embracing a single paradigm to the exclusion of everything else, even in the face of obvious cases where said paradigm didn't fit very well with the problem domain.  Some aspects of Java also reflect this same fallacy -- such as those ubiquitous singleton static classes in the OS-wrapping modules, or the impossibility of declaring a function outside of a class -- which to me are indications that it wasn't just the teachers, but a more pervasive trend in the Java ecosystem of putting on OO-centric blinders.

Yes, this. Although, granted, the OO-koolaid *was* quite strong indeed in those days.

It really is strange to look back on all that, when I was fairly sold on OO too (just not quite as fanatically so), and compare it to now:

At this point I feel that class-based polymorphism mostly just turned out to be an awkward workaround for the lack of first-class functions and closures in mainstream languages. What convinced me: after years of using D, I find myself using OO less and less (OO polymorphism nearly never, aside from exception hierarchies), and instead of feeling hamstrung I feel liberated - and I'm normally a kitchen-sink kinda guy!
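
To illustrate with a toy sketch (all the names here are made up): what classic OO would model as an interface plus a class per behaviour, a delegate handles directly:

	// Toy sketch: a delegate standing in for what OO would model as an
	// interface with one method plus a subclass per implementation.
	import std.stdio : writeln;

	alias ClickHandler = void delegate(int x, int y);

	struct Button
	{
	    string label;
	    ClickHandler onClick; // first-class function instead of a subclass

	    void click(int x, int y)
	    {
	        if (onClick !is null)
	            onClick(x, y);
	    }
	}

	void main()
	{
	    int clicks; // captured by the closure below

	    auto quit = Button("Quit", delegate(int x, int y) {
	        ++clicks;
	        writeln("quit clicked at ", x, ",", y, " (", clicks, " total)");
	    });

	    quit.click(10, 20);
	}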

March 07, 2017
On Mon, 2017-03-06 at 10:22 -0800, H. S. Teoh via Digitalmars-d wrote:
> 
[…]
> Nevertheless, it's certainly true that D's GC could use a major upgrade at some point.  While it's not horrible, the present implementation does leave something to be desired.  Hopefully the various efforts at GC by forum members will at some point turn into some major improvements to D's GC.  There was talk a year or two ago about a precise GC for D (with fallback to conservative GC for cases where that wouldn't work), but I'm not sure what has come of it.
[…]

Learn the lesson from Java. It started with a truly crap GC and everyone said Java was crap because the GC was garbage. D has seemingly progressed beyond this stage technically, but not marketing-wise. The Java folk worked on the GC and kept replacing it over and over again. The GC got better and better. Now with the G1 GC almost all the problems have gone away – as has most of the moaning about Java having a crap GC. Most people never notice the GC, and those that do engineer around it rather than moan. The Java GC situation is now a sophisticated one where those who don't really care do not have a problem and those who do care have the tools to deal with it.

D seems to be in a situation where those who don't care have a crap GC which needs to be improved, and those who do care have the tools to deal with it. So there needs to be ongoing replacement of the D GC until there is something good; this is a technical problem. That people who care about the effect of GC still think D is a crap GC-based language implies there is a marketing problem, not a technical one.

We all know that many, many people see the words garbage collector and run a mile in an uneducated, prejudiced way. Who cares about them? We care about the people who are willing to try stuff out and have a problem.

</rant>

-- 
Russel.
=============================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder@ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel@winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder

March 07, 2017
> D seems to be in a situation where those who don't care have a crap GC which needs to be improved, and those who do care have the tools to deal with it. So there needs to be ongoing replacement of the D GC until there is something good; this is a technical problem. That people who care about the effect of GC still think D is a crap GC-based language implies there is a marketing problem, not a technical one.

But I don't think that D's GC is fine for people who care about it.

If it is, why are people on this forum giving advice on how to disable and/or avoid it for soft real-time applications where a GC freeze can't be tolerated?

D's GC isn't crap at all, but better designs and implementations exist, and Nim's GC is one of them.

We can either learn from it, or ignore it... But the second solution won't make D more appropriate for soft real-time scenarios...


March 07, 2017
On Tue, Mar 07, 2017 at 06:45:55PM +0000, Rico Decho via Digitalmars-d wrote: [...]
> But I don't think that D's GC is fine for people who care about it.
> 
> If it is, why are people on this forum giving advice on how to disable and/or avoid it for soft real-time applications where a GC freeze can't be tolerated?
> 
> D's GC isn't crap at all, but better designs and implementations exist, and Nim's GC is one of them.
> 
> We can either learn from it, or ignore it... But the second solution won't make D more appropriate for soft real-time scenarios...

What the D GC needs is somebody willing to sit down and actually spend the time to improve/rewrite the code.  Over the years there has been an endless stream of fancy ideas, feature requests, and wishlists for the GC, but without anybody actually doing the work, nothing will actually happen.  We have all been well aware of the current GC's limitations and flaws for years now, and some improvements have gone into it over the years.  But again, talking about it won't magically make it better.  *Somebody* has to write the code, after all.

If anyone is interested in helping, take a look at:

	https://github.com/dlang/druntime/pull/1603

and review the code, give some feedback, run the benchmarks yourself, etc., to prod this PR along.

If you have other ideas for improving the GC (e.g., adapting ideas from Nim's GC), submitting PRs to that effect would be much more effective than merely talking about it.


T

-- 
If you want to solve a problem, you need to address its root cause, not just its symptoms. Otherwise it's like treating cancer with Tylenol...
March 07, 2017
On Mon, Mar 06, 2017 at 10:41:06PM -0500, Nick Sabalausky (Abscissa) via Digitalmars-d wrote:
[...]
> Yes, this. Although, granted, the OO-koolaid *was* quite strong indeed in those days.
> 
> It really is strange to look back on all that, when I was fairly sold on OO too (just not quite as fanatically so), and compare to now:
> 
> At this point I feel that class-based polymorphism mostly just turned out to be an awkward work-around for the lack of first-class functions and closures in mainstream languages. What convinced me: After years of using D, I find myself using OO less and less (OO polymorphism nearly never, aside from exception hierarchies), and instead of feeling hamstringed I feel liberated - and I'm normally a kitchen-sink kinda guy!

I was never fully "sold" to the OO bandwagon, though I did appreciate the different way of looking at a programming problem.  While I found OO to be a nice way of structuring a program that deals with highly-structured data (it was like abstract data types on steroids), I never really understood the folks who see it as the be-all and end-all and want to essentially recast all of computer science in OO terms.

Like you, after coming to terms with D's duck-typing range idioms I've started moving away from OO and leaning more in the direction of generic programming via templates. These days I even prefer static polymorphism via structs and alias this over full-out classes.  Of course, classes still do have their place when runtime polymorphism is needed, and I do use them at times. But they occupy a far smaller percentage of my code than the OO advocates would rally for.
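
As a small made-up sketch of what I mean (no vtables, everything resolved at compile time):

	// Small sketch of static polymorphism: a template duck-types whatever is
	// passed in, and alias this lets a wrapper struct be used wherever the
	// wrapped type is expected. All names here are made up for illustration.
	import std.stdio : writeln;

	struct Circle
	{
	    double r;
	    double area() const { return 3.141592653589793 * r * r; }
	}

	// Wrapper adding a field; alias this forwards everything else to shape.
	struct Labeled(Shape)
	{
	    string label;
	    Shape shape;
	    alias shape this;
	}

	// Duck-typed: accepts anything with an area() method.
	void report(S)(S s)
	{
	    writeln("area = ", s.area());
	}

	void main()
	{
	    auto c = Labeled!Circle("some circle", Circle(1.0));
	    report(c);          // uses Circle.area via alias this
	    writeln(c.label);   // wrapper's own field still accessible
	}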


T

-- 
Do not reason with the unreasonable; you lose by definition.
March 14, 2017
On Tue, 2017-03-07 at 18:45 +0000, Rico Decho via Digitalmars-d wrote:
> > D seems to be in a situation where those who don't care have a crap GC which needs to be improved, and those who do care have the tools to deal with it. So there needs to be ongoing replacement of the D GC until there is something good; this is a technical problem. That people who care about the effect of GC still think D is a crap GC-based language implies there is a marketing problem, not a technical one.
> 
> But I don't think that D's GC is fine for people who care about it.
> 
> If it is, why are people on this forum giving advice on how to disable and/or avoid it for soft real-time applications where a GC freeze can't be tolerated?

Because one option that may be sensibly available to those who cannot cope with a GC language is for D to have a GC-less mode – at the expense of not using Phobos. Of course, soft real-time and GC are not incompatible except in some people's minds: it is entirely possible to have GC in a soft real-time system, if the programming language supports it. The question here is only whether the current GC allows D to be used for soft real-time.

> D's GC isn't crap at all, but better designs and implementations exist, and Nim's GC is one of them.
> 
> We can either learn from it, or ignore it... But the second solution won't make D more appropriate for soft real-time scenarios...

The question is who the "we" is here. A lot of people have a lot of opinions on D and its GC, including me. However, it seems that none of the people expressing opinions are willing to do anything other than express opinions on the email list.

My gut feeling is that the D language execution and data model is not compatible with a "do not stop the world" GC. However this is opinion not really backed with evidence.

What needs to happen is for a group of people who like complaining about the GC to get together and gather evidence as to what needs to change in the D language to support a soft real-time compatible GC such as Go, Nim, Java G1, etc. You can't just transplant an algorithm since the GC has to fit with the language data and execution model and D is more like C than like Java or Go.

If the result is that a change to the D execution or data model is needed then this has to be proposed and debated. If this is not something open to change, then there is no point in going any further.

I cannot commit to being involved in anything such as this until 2017-06-30T17:01+01:00, but from then there is a good possibility of getting me on board an effort to create a new GC for D (but note I really abhor the Phobos coding style with its wrongly placed opening braces).

-- 
Russel.
=============================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder@ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel@winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder

March 14, 2017
On Tue, 2017-03-07 at 11:06 -0800, H. S. Teoh via Digitalmars-d wrote:
> On Tue, Mar 07, 2017 at 06:45:55PM +0000, Rico Decho via Digitalmars-d wrote:
> [...]
> > But I don't think that D's GC is fine for people who care about it.
> > 
> > If it is, why are people on this forum giving advice on how to disable and/or avoid it for soft real-time applications where a GC freeze can't be tolerated?
> > 
> > D's GC isn't crap at all, but better designs and implementations exist, and Nim's GC is one of them.
> > 
> > We can either learn from it, or ignore it... But the second solution won't make D more appropriate for soft real-time scenarios...
> 
> What the D GC needs is somebody willing to sit down and actually spend the time to improve/rewrite the code.  Over the years there has been an endless stream of fancy ideas, feature requests, and wishlists for the GC, but without anybody actually doing the work, nothing will actually happen.  We have all been well aware of the current GC's limitations and flaws for years now, and some improvements have gone into it over the years.  But again, talking about it won't magically make it better.  *Somebody* has to write the code, after all.
> 
> If anyone is interested in helping, take a look at:
> 
> 	https://github.com/dlang/druntime/pull/1603

As mentioned previously I can schedule taking a look at this, but only from 2017-06-30T17:01+01:00 onwards.

> and review the code, give some feedback, run the benchmarks yourself, etc., to prod this PR along.
> 
> If you have other ideas for improving the GC (e.g., adapting ideas from Nim's GC), submitting PRs to that effect would be much more effective than merely talking about it.
> 
> 
> T
> 
-- 
Russel.
=============================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder@ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel@winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder

March 14, 2017
On Tuesday, 14 March 2017 at 10:05:54 UTC, Russel Winder wrote:
> [...]
>
> My gut feeling is that the D language execution and data model is not compatible with a "do not stop the world" GC. However this is opinion not really backed with evidence.

I've recently been made aware of [1] and [2]. GC always seems to be a question of what you're willing to sacrifice, and if you want low pause times, AFAIK the only known way to get that is to sacrifice overall throughput. To me, this seems to contradict D's goals of efficiency and control.

>
> What needs to happen is for a group of people who like complaining about the GC to get together and gather evidence as to what needs to change in the D language to support a soft real-time compatible GC such as Go, Nim, Java G1, etc. You can't just transplant an algorithm since the GC has to fit with the language data and execution model and D is more like C than like Java or Go.
>
> If the result is that a change to the D execution or data model is needed then this has to be proposed and debated. If this is not something open to change, then there is no point in going any further.

The problem with changing D's execution and/or data model is that, AFAIK, the sacrifices necessary to make a better GC viable would ensure that D can no longer compete with C in terms of performance. I'm not sure how the majority of the D community would feel about that, but I don't think I, at least, could still advocate D as a better drop-in replacement for C.

>
> I cannot commit to being involved in anything such as this until 2017-06-30T17:01+01:00, but from then there is a good possibility of getting me on board an effort to create a new GC for D (but note I really abhor the Phobos coding style with its wrongly placed opening braces).

OT: That's what git commit hooks are for. Write how you want, and automatically commit as whatever non-Allman, wrong style the project uses.

[1] https://blog.plan99.net/modern-garbage-collection-911ef4f8bd8e?gi=78635e05a6ac#.6zz5an77a
[2] http://www.infognition.com/blog/2014/the_real_problem_with_gc_in_d.html