January 08, 2013
On 1/7/2013 6:23 PM, Brad Roberts wrote:
> My primary point being, blaming the GC when it's the application style
> that generates enough garbage to result in wanting to blame the GC for the
> performance cost is misplaced blame.

True dat. There is no such thing as a memory allocation technology that will enable users to code without thought of it and yet get optimal performance.

January 08, 2013
On 1/7/2013 3:11 PM, H. S. Teoh wrote:
> I think much of the aversion to GCs is misplaced.  I used to be very
> aversive of GCs as well, so I totally understand where you're coming
> from. I used to believe that GCs are for lazy programmers who can't be
> bothered to think through their code and how to manage memory properly,
> and that therefore GCs encourage sloppy coding. But then, after having
> used D extensively for my personal projects, I discovered to my surprise
> that having a GC actually *improved* the quality of my code -- it's much
> more readable because I don't have to keep fiddling with pointers and
> ownership (or worse, reference counts), and I can actually focus on how
> to make the algorithms better. Not to mention the countless frustrating
> hours spent chasing pointer bugs and memory leaks are all gone -- 'cos I
> don't have to use pointers directly anymore.


I had the same experience. For the first half of my programming career, I regarded GC as a crutch for loser programmers. After working on Symantec's Java compiler, and later implementing Javascript, I discovered that I was quite wrong about that, pretty much the same as you.

One thing I'd add is that a GC is *required* if you want to have a language that guarantees memory safety, which D aims to do. There's an inescapable reason why manual memory management in D is only possible in @system code.

Interestingly, carefully written code using a GC can be *faster* than manual memory management, for a number of rather subtle reasons.
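One of those subtle reasons: allocation in a compacting GC is typically just a pointer bump into a contiguous region, whereas malloc-style allocators must search free lists and maintain per-block bookkeeping. A toy C++ sketch of the bump-allocation fast path (illustrative only; the class and its names are hypothetical, and a real GC adds object headers, thread-local buffers, and collection triggers):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Toy bump allocator: an allocation is just an aligned pointer increment,
// which is the fast path a compacting GC gives you for free.
class BumpArena {
    std::vector<std::uint8_t> buf_;
    std::size_t off_ = 0;
public:
    explicit BumpArena(std::size_t bytes) : buf_(bytes) {}

    void* allocate(std::size_t n, std::size_t align = alignof(std::max_align_t)) {
        std::size_t p = (off_ + align - 1) & ~(align - 1);  // round up to alignment
        if (p + n > buf_.size()) return nullptr;            // a real GC would collect here
        off_ = p + n;
        return buf_.data() + p;
    }

    void reset() { off_ = 0; }  // "collection" for a toy arena: free everything at once
};
```

Freeing is also cheap in bulk: the arena forgets everything in one step, much as a copying collector reclaims an entire region at once rather than freeing objects one by one.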
January 08, 2013
On 08.01.2013 16:25, H. S. Teoh wrote:
> On Tue, Jan 08, 2013 at 10:29:26AM +0100, Paulo Pinto wrote:
>> On Monday, 7 January 2013 at 23:13:13 UTC, H. S. Teoh wrote:
>>> ...
>>>
>>> Crippling the language to cater to the 10% crowd who want to squeeze
>>> every last drop of performance from the hardware is the wrong
>>> approach IMO.
> [...]
>> Agreed.
>>
>> Having used GC languages for the last decade, I think the cases
>> where manual memory management is really required are very few.
>>
>> Even if one is forced to do manual memory management over GC, it is
>> still better to have the GC around than do everything manually.
>
> Yes, hence my idea of splitting up the performance-critical core of a
> game engine vs. the higher-level application stuff (like scripting,
> etc.) that aren't as performance-critical. The latter would be greatly
> helped by a GC -- it makes it easier for scripting people to use,
> whereas writing GC-less code demands a certain level of rigor and
> certainly requires more effort and care than is necessary for the most
> part.
>
>
>> But this is based on my experience doing business applications,
>> desktop and server side or services/daemons.
> [...]
>
> Well, business applications and server-side stuff (I assume it's
> web-based stuff) are exactly the kind of applications that benefit the
> most from a GC. In my mind, they are just modern incarnations of batch
> processing applications, where instant response isn't critical, and so
> the occasional GC pause is acceptable and, indeed, mostly unnoticeable.

Besides Web applications, I also took part in projects that ported
high-performance C++ daemons to Java.

These were servers performing millions of data-processing operations per
second on telecommunications data used in mobile networks.

At a well-known Finnish/German telecommunications company, a lot of server code has been migrated from C++ to Java in recent years.

--
Paulo
January 08, 2013
On Tuesday, 8 January 2013 at 15:27:21 UTC, H. S. Teoh wrote:
> But then again, considering the bulk of all software being written
> today, how much code is actually mission-critical real-time apps or game
> engine cores? I suspect real-time apps are <5% of all software, and
> while games are a rapidly growing market, I daresay less than 30-40% of
> game code actually needs to be pauseless (mainly just video-rendering
> code -- code that handles monster AI, for example, wouldn't fail
> horribly if it had to take a few extra frames to decide what to do next
> -- in fact, it may even be more realistic that way). Which, in my
> estimation, probably doesn't account for more than 10% of all software
> out there. The bulk of software being written today don't really need to
> be GC-less.
>

This is a horrible idea from a multiplayer and replay standpoint.
January 08, 2013
On 08.01.2013 17:12, Benjamin Thaut wrote:
> On 08.01.2013 16:46, H. S. Teoh wrote:
>>> So how much experience do you have with game engine programming to
>>> make such statements?
>> [...]
>>
>> Not much, I'll admit. So maybe I'm just totally off here. But the last
>> two sentences weren't specific to game code, I was just making a
>> statement about software in general. (It would be a gross
>> misrepresentation to claim that only 10% of a game is performance
>> critical!)
>>
>>
>> T
>>
>
> So to give a little background about me: I'm currently doing my master's
> degree in informatics, focused on media-related programming
> (e.g. games, applications with other visual output, mobile apps, etc.).
>
> Besides my studies I'm working at Havok, the biggest middleware company
> in the gaming industry. I've been working there for about a year. I also
> have some contacts with people working at Crytek.
>
> My impression so far: No one who is writing a triple-A gaming title or
> engine is even remotely interested in using a GC. Game engine programmers
> will do almost anything to get better performance on a given platform.
> Really elaborate tasks are undertaken just to get 1% more
> performance. And because of that, a GC is the very first thing every
> serious game engine programmer will kick out. You have to keep in mind that
> most games run at 30 FPS. That means you only have 33 ms to do
> everything: rendering, simulating physics, doing the game logic,
> handling network input, playing sounds, streaming data, and so on.
> Some games even try to hit 60 FPS, which makes it even harder, as you only
> have 16 ms to compute everything. Everything is performance-critical if
> you try to achieve that.
>
> I also know that Crytek used Lua for game scripting in Crysis 1. It was
> one of the reasons they never managed to get it onto the consoles (PS3,
> Xbox 360). In Crysis 2 they removed all the Lua game logic and wrote
> everything in C++ to get better performance.
>
> Doing pooling with a GC enabled still wastes a lot of time, because
> when pooling is used, almost all data will survive a collection anyway
> (most of it lives in pools). So when the GC runs, most of the work
> it does is wasted: it is scanning instances that are going to
> survive anyway. Pooling is just another form of manual memory management,
> and I don't find it a valid argument for using a GC.
>
> Also, my own little test case (a game I wrote for university) has shown
> that I get a 300% improvement by not using a GC. At the beginning, when I
> wrote the game, I was convinced that one could make a game work using
> a GC with only a small performance impact (5%-10%). I had already
> heavily optimized the game with some background knowledge about how the
> GC works. I even did some manual memory management for memory blocks that
> were guaranteed not to contain any pointers to GC data.
> Despite all this, I got a 300% performance improvement after switching to
> pure manual memory management and removing the GC from druntime.
>
> If D wants to get into the gaming space, there has to be a GC-free
> option. Otherwise D will not even be considered when programming
> languages are evaluated.
>
> Kind Regards
> Benjamin Thaut
>

Without dismissing your experience in game development, I think your results were spoiled by the quality of D's GC.

After all, there are Java VMs driving missiles and ship battle systems, which have even tighter timing requirements.

--
Paulo
January 08, 2013
On Tuesday, 8 January 2013 at 18:35:19 UTC, Peter Alexander wrote:
> You also need to consider the market for D. Performance is one of D's key selling points. If it had the performance of Python then D would be a much less interesting language, and I honestly doubt anyone would even look at it.
>
> Whether or not the bulk of software written is critically real-time is irrelevant. The question is whether the bulk of software written *in D* is critically real-time. I don't know what the % is, but I'd assume it is much larger than the average piece of software.

Well, I for one looked at D *only* because the specification claimed you'd get performance comparable with C/C++, which is about as good as it gets, and also because I could compile standalone executables not dependent on a separate runtime environment. The fact that I can integrate systems-level code with high-level code in a seamless and safe way sealed the deal.

The only major thing that concerns me is the lack of proper shared library support. I hope this omission is resolved soon.

--rt
January 08, 2013
On Tuesday, 8 January 2013 at 16:12:41 UTC, Benjamin Thaut wrote:
> My impression so far: No one who is writing a triple-A gaming title or engine is even remotely interested in using a GC. Game engine programmers will do almost anything to get better performance on a given platform. Really elaborate tasks are undertaken just to get 1% more performance. And because of that, a GC is the very first thing every serious game engine programmer will kick out. You have to keep in mind that most games run at 30 FPS. That means you only have 33 ms to do everything: rendering, simulating physics, doing the game logic, handling network input, playing sounds, streaming data, and so on.
> Some games even try to hit 60 FPS, which makes it even harder, as you only have 16 ms to compute everything. Everything is performance-critical if you try to achieve that.
>

That is a real misrepresentation of reality. Such people do avoid the GC, but simply because they avoid almost all runtime allocation altogether, preferring to allocate up-front.
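For the curious, the allocate-up-front style looks roughly like this (an illustrative C++ sketch, not taken from any particular engine; all names are hypothetical): a subsystem reserves all the storage it may ever need once at startup, and during the frame loop slots are recycled rather than allocated or freed.

```cpp
#include <array>
#include <cstddef>

// Illustrative sketch: a fixed-capacity particle pool, sized once up-front.
// Inside the frame loop nothing is heap-allocated or freed; slots are reused.
struct Particle { float x, y, vx, vy; bool alive; };

template <std::size_t N>
class ParticlePool {
    std::array<Particle, N> items_{};   // all storage reserved at startup
public:
    Particle* spawn() {
        // Linear scan for a free slot; real engines often keep a free list.
        for (auto& p : items_)
            if (!p.alive) { p.alive = true; return &p; }
        return nullptr;                 // pool exhausted: no fallback allocation
    }
    void kill(Particle& p) { p.alive = false; }

    std::size_t liveCount() const {
        std::size_t n = 0;
        for (const auto& p : items_) n += p.alive ? 1 : 0;
        return n;
    }
};
```

With this pattern, the allocator (GC or malloc) is simply never on the hot path, which is why the "GC vs. manual" question barely arises for such code.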
January 08, 2013
On Tuesday, 8 January 2013 at 23:12:43 UTC, Rob T wrote:
> The only major thing that concerns me is the lack of proper shared library support. I hope this omission is resolved soon.

What do you need it for? Runtime loading of D shared objects? Or just linking to them (i.e. binding by ld/dyld at load time)? I'm trying to collect data on real-world use cases and expectations right now.

David
January 08, 2013
On 01/08/2013 10:43 PM, Jonathan M Davis wrote:
> std.container.Array and built-in arrays are _very_ different. Array is a
> container, not a range. You can slice it to get a range and operate on that,
> but it's not a range itself.

Is there a particular reason why Array can't have a range interface itself?

> On the other hand, built-in arrays aren't true containers. They don't own or
> manage their own memory in any way, shape, or form, and they're ranges.

Forgive the naive question, but what _is_ the definition of a 'true container'?  Is managing its own memory a necessary component?  Or just for D's concept of a container?
January 08, 2013
On 1/8/13 3:30 PM, David Nadlinger wrote:
> On Tuesday, 8 January 2013 at 23:12:43 UTC, Rob T wrote:
>> The only major thing that concerns me is the lack of proper shared
>> library support. I hope this omission is resolved soon.
>
> What do you need it for? Runtime loading of D shared objects? Or just
> linking to them (i.e. binding by ld/dyld at load time)? I'm trying to
> collect data on real-world use cases resp. expectations right now.
>
> David

I really need real runtime loading of D code (i.e. dlopen and friends) for a project I can't share much about for the time being.

Andrei