February 04, 2014
On 4 February 2014 17:31, Paulo Pinto <pjmlp@progtools.org> wrote:

> On Tuesday, 4 February 2014 at 02:05:07 UTC, Nick Sabalausky wrote:
>
>> On 2/3/2014 4:13 PM, H. S. Teoh wrote:
>>
>>> I've seen real-life
>>> examples of ARCs gone horribly, horribly wrong, whereas had a GC been
>>> used in the first place things wouldn't have gone down that route.
>>>
>>>
>> I'm curious to hear more about this.
>>
>
> An example is when you have a huge graph and the root's reference count reaches 0.
>
> The time taken by the cascading delete of the whole structure is similar to a stop-the-world GC pause.
>

Only if it's not deferred, and even then, if you're freeing a huge structure like that, it happens at a particular time when you planned to do so. No realtime app I know of goes and frees a huge runtime graph mid-frame at some random moment. That's just silly.

It's still easy to defer deletion under ARC if that's what you want to do... it's all about _choice_ :)
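
A minimal sketch of what deferred destruction under reference counting can look like. The RC wrapper and g_deadList are invented for this example, and the payload is assumed to be plain data; the point is only that a count hitting zero queues the object, and the actual frees happen at a moment you choose (say, end of frame):

    import core.stdc.stdlib : malloc, free;

    // Invented for this sketch: payloads whose refcount hits zero are
    // queued here instead of being freed on the spot. (A real engine
    // would preallocate this list rather than let `~=` use the GC.)
    __gshared void*[] g_deadList;

    struct RC(T)
    {
        static struct Payload { T value; size_t refs; }
        Payload* p;

        this(T value)
        {
            p = cast(Payload*) malloc(Payload.sizeof);
            p.value = value;   // assumes a plain-data T
            p.refs = 1;
        }

        this(this) { if (p) ++p.refs; }   // copying bumps the count

        ~this()
        {
            if (p && --p.refs == 0)
                g_deadList ~= cast(void*) p;   // defer; don't free mid-frame
        }
    }

    // Called at a planned point, e.g. once per frame or on a level change.
    void flushDeadList()
    {
        foreach (q; g_deadList) free(q);
        g_deadList.length = 0;
    }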


February 04, 2014
On Tuesday, 4 February 2014 at 07:06:07 UTC, Eric Suen wrote:
> As long as the other code is managed code, there is a GC running in the
> background; no matter what high-performance language your code is written
> in, it will be affected by the GC anyway. So "So, Microsoft does not
> think that GC is suitable for real time interactive graphics. And they
> are right." is only your opinion. You don't know the real reason behind
> MS's decision. Some resources need to be released ASAP, so you can't
> rely on the GC. Or maybe it was because of the overhead when calling
> low-level APIs from managed code.

Anyone who has tried to use Direct3D from managed code knows why. They had the foundation to do everything from managed code, but chose not to.

You can get smooth animations in GC'ed JavaScript in browsers too, but only by relying on the C++ CSS-transition engine, which does the frame-by-frame interpolation for you. But I expect JavaScript to get better at this if/when it gets true isolates. If the memory pool is small enough, a GC can work out OK, but for very low latency/overhead you use the stack or pools; not even malloc is good enough then.
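
To make the "pools, not malloc" point concrete, here is a minimal fixed-size free-list pool in D (the Pool type is invented for this sketch): allocation and deallocation are each a couple of pointer moves, with no syscalls and no GC involvement.

    struct Pool(T, size_t N)
    {
        union Slot { T value; Slot* next; }

        Slot[N] slots;
        Slot* freeList;

        void initialize()
        {
            // Thread every slot onto the free list.
            foreach (i; 0 .. N - 1)
                slots[i].next = &slots[i + 1];
            slots[N - 1].next = null;
            freeList = &slots[0];
        }

        T* allocate()
        {
            if (freeList is null) return null;   // pool exhausted
            auto s = freeList;
            freeList = s.next;
            return &s.value;
        }

        void deallocate(T* t)
        {
            auto s = cast(Slot*) t;   // value sits at offset 0 of the union
            s.next = freeList;
            freeList = s;
        }
    }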



February 04, 2014
On Tuesday, 4 February 2014 at 03:43:53 UTC, ed wrote:
> On Tuesday, 4 February 2014 at 01:36:09 UTC, Adam Wilson wrote:
>> On Mon, 03 Feb 2014 17:04:08 -0800, Manu <turkeyman@gmail.com> wrote:
>>
>>> On 4 February 2014 06:21, Adam Wilson <flyboynw@gmail.com> wrote:
>>>
>>>> On Mon, 03 Feb 2014 12:02:29 -0800, Andrei Alexandrescu <
>>>> SeeWebsiteForEmail@erdani.org> wrote:
>>>>
>>>> On 2/3/14, 6:57 AM, Frank Bauer wrote:
>>>>>
>>>>>> Anyone asking for the addition of ARC or owning pointers to D, gets
>>>>>> pretty much ignored. The topic is "Smart pointers instead of GC?",
>>>>>> remember? People here seem to be more interested in diverting to
>>>>>> nullable, scope and GC optimization. Telling, indeed.
>>>>>>
>>>>>
>>>>> I thought I made it clear that GC avoidance (which includes considering
>>>>> built-in reference counting) is a major focus of 2014.
>>>>>
>>>>> Andrei
>>>>>
>>>>>
>>>> Andrei, I am sorry to report that anything other than complete removal of
>>>> the GC and replacement with compiler generated ARC will be unacceptable to
>>>> a certain, highly vocal, subset of D users. No arguments can be made to
>>>> otherwise, regardless of validity. As far as they are concerned the
>>>> discussion of ARC vs. GC is closed and decided. ARC is the only path
>>>> forward to the bright and glorious future of D. ARC most efficiently solves
>>>> all memory management problems ever encountered. Peer-Reviewed Research and
>>>> the Scientific Method be damned! ALL HAIL ARC!

>
> Most of us know and understand the issues with ARC and those with a GC. Many of us have seen how they play out in systems level development. There is a good reason all serious driver and embedded development is done in C/C++.
>
> A language is the compiler+std as one unit. If Phobos depends on the GC, D depends on the GC. If Phobos isn't systems level ready, D isn't systems level ready. I've heard arguments here that you can turn off the GC, but that equates to rewriting functions that already exist in Phobos and not using any third-party libraries.

At Sociomantic, that is exactly what we have done. Phobos is almost completely unusable at the present time.

I personally don't think that ARC would make much difference. The problem is that *far* too much garbage is being created. And it's completely unnecessary in most cases.

To take an extreme example, even a pauseless, perfect GC wouldn't make std.json acceptable.


> Why would anyone seriously consider that an option? Embedded C++ has std:: and third-party libraries where memory is under control.
>
> Realistically D as a systems language isn't even at the hobby stage.

We're using D as a systems language on a global commercial scale. And no, we don't use malloc/free. We just don't use Phobos.
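
As a sketch of what "creating far less garbage" means in practice (the function and names are illustrative, not Sociomantic's actual code): reuse one growable buffer across calls instead of allocating a fresh string per record.

    void emitRecord(int id, double price)
    {
        import std.array : Appender;
        import std.format : formattedWrite;

        static Appender!(char[]) buf;   // reused across calls: no new garbage
        buf.clear();                    // keeps capacity, drops contents
        buf.formattedWrite("id=%s price=%s\n", id, price);
        // ... hand buf.data to the socket/file writer here ...
    }
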
February 04, 2014
On Tuesday, 4 February 2014 at 09:59:07 UTC, Don wrote:
> On Tuesday, 4 February 2014 at 03:43:53 UTC, ed wrote:
>> [...]
>> Realistically D as a systems language isn't even at the hobby stage.
>
> [...]
>
> We're using D as a systems language on a global commercial scale. And no, we don't use malloc/free. We just don't use Phobos.

OK, I stand corrected! :D

Thanks,
ed
February 04, 2014
On Tuesday, 4 February 2014 at 09:59:07 UTC, Don wrote:
> On Tuesday, 4 February 2014 at 03:43:53 UTC, ed wrote:
>> [...]
>
> [...]
>
> We're using D as a systems language on a global commercial scale. And no, we don't use malloc/free. We just don't use Phobos.

Maybe I'm thinking too much about embedded, which I admit isn't fair on D and at this stage maybe not a realistic comparison.

Our projects are Siemens medical devices so it is 90% embedded, a different level of system perhaps. They too would be on a global scale and I'd love to get D on them :-)

Cheers,
ed

February 04, 2014
On Tuesday, 4 February 2014 at 10:32:26 UTC, ed wrote:
>> On Tuesday, 4 February 2014 at 09:59:07 UTC, Don wrote:
>>> [...]
>>
>> We're using D as a systems language on a global commercial scale. And no, we don't use malloc/free. We just don't use Phobos.
>
> Maybe I'm thinking too much about embedded, which I admit isn't fair on D and at this stage maybe not a realistic comparison.

Yeah, I dunno what "systems language" means really. In practice it seems to mean "competes with C++" and that's how I use it. And C++ had some problems getting into the embedded market.

Though even D as "a better C" is a surprisingly nice language.


> Our projects are Siemens medical devices so it is 90% embedded, a different level of system perhaps. They too would be on a global scale and I'd love to get D on them :-)

Yeah. I can imagine that's a tough assignment.
February 04, 2014
On Tuesday, 4 February 2014 at 07:15:17 UTC, Adam Wilson wrote:
> On Mon, 03 Feb 2014 23:05:35 -0800, Eric Suen <eric.suen.tech@gmail.com> wrote:
>
>>
>> "Ola Fosheim Gr?stad" <ola.fosheim.grostad+dlang@gmail.com>">

>>> So, Microsoft does not think that GC is suitable for real time interactive graphics. And they are right.

>> And they are right." is only your opinion. You don't know the real reason
>> behind MS's decision. Some resources need to be released ASAP, so you

> Actually, the reason is that DirectX is a specialized native COM-like API that is NOT compatible with normal COM, and therefore not compatible with .NET COM Interop. Milcore is a

Quoting http://msdn.microsoft.com/en-us/library/ms750441(v=vs.110).aspx :

«Milcore is written in unmanaged code in order to enable tight integration with DirectX. All display in WPF is done through the DirectX engine, allowing for efficient hardware and software rendering. WPF also required fine control over memory and execution. The composition engine in milcore is extremely performance sensitive, and required giving up many advantages of the CLR to gain performance.»

Please note that Microsoft knows their stuff: «WPF also required fine control over memory and execution.» and «required giving up many advantages of the CLR to gain performance».

WPF maintains a retained-mode shadow tree of the composition elements of the scene graph to make applications more responsive, i.e. to avoid having client code block rendering:

«There is a very important architectural detail to notice here – the entire tree of visuals and drawing instructions is cached. In graphics terms, WPF uses a retained rendering system. This enables the system to repaint at high refresh rates without the composition system blocking on callbacks to user code. This helps prevent the appearance of an unresponsive application.»

But… I don't think it was a good idea to go for full back-to-front rendering.
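
To illustrate the retained-mode idea in the quotes above, a toy sketch (all types invented for this example): each visual caches its drawing instructions, so the compositor can replay the tree every frame without calling back into user code.

    // Record once, replay every frame.
    struct DrawCmd { float x, y, w, h; uint rgba; }   // e.g. a colored rect

    class Visual
    {
        DrawCmd[] cached;     // drawing instructions recorded by user code
        Visual[] children;

        void record(const DrawCmd[] cmds) { cached = cmds.dup; }
    }

    // The compositor only replays cached commands, so slow user code
    // cannot stall rendering.
    void composeFrame(Visual root)
    {
        foreach (cmd; root.cached)
        {
            // issue cmd to the GPU / rasterizer here
        }
        foreach (child; root.children)
            composeFrame(child);
    }
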
February 04, 2014
On Tuesday, 4 February 2014 at 11:13:03 UTC, Don wrote:
> On Tuesday, 4 February 2014 at 10:32:26 UTC, ed wrote:
>> [...]
>
> Yeah, I dunno what "systems language" means really.

For me it means you can write an OS with it, even if some tiny parts require the use of Assembly glue.

--
Paulo
February 04, 2014
On Monday, 3 February 2014 at 20:02:30 UTC, Andrei Alexandrescu wrote:
> On 2/3/14, 6:57 AM, Frank Bauer wrote:
>> Anyone asking for the addition of ARC or owning pointers to D, gets
>> pretty much ignored. The topic is "Smart pointers instead of GC?",
>> remember? People here seem to be more interested in diverting to
>> nullable, scope and GC optimization. Telling, indeed.
>
> I thought I made it clear that GC avoidance (which includes considering built-in reference counting) is a major focus of 2014.
>
> Andrei

I hope the mistakes of the past are not repeated: whatever is
chosen, a hard dependence on any particular automatic memory
management scheme should be avoided.

It would be nice to be able to have D work in all areas of
programming. I think allowing one to turn off all automatic
memory management, or even to choose which allocation method to
use per function/class/module, would cover about 99.99% of the
use cases. You let people who want the freedom of not worrying
about deallocation use the AGC, and those that need every drop
of performance can go the manual route.

Maybe such a general approach is difficult to implement? But
surely it would be worth it. From this thread, it's obvious that
there are plenty of people interested in its success.

Getting the D runtime off any specific memory allocation scheme
would seem to be the priority? Then Phobos? Then we party like
it's 1999?

It would be nice if one could simply write some allocator, drop
it into D, and everything works out dandy. E.g., I want to try
out a new super-fast AGC like the metronome GC: I write the code
for it, tell D to use it, and then reap the benefits.
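
A minimal sketch of what such a drop-in allocator interface might look like. The names here are invented (no such interface existed in druntime at the time); it only shows the shape of the idea:

    // Invented interface: the runtime would route allocations through this.
    interface Allocator
    {
        void[] allocate(size_t bytes);
        void deallocate(void[] block);
    }

    // One possible plug-in: straight malloc/free, no GC at all.
    final class Mallocator : Allocator
    {
        import core.stdc.stdlib : malloc, free;

        void[] allocate(size_t bytes)
        {
            auto p = malloc(bytes);
            return p is null ? null : p[0 .. bytes];
        }

        void deallocate(void[] block)
        {
            free(block.ptr);
        }
    }

    // "Tell D to use it": a single process-wide switch, in this sketch.
    __gshared Allocator theAllocator;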



February 04, 2014
On 2014-02-03 23:00:22 +0000, woh said:
> 
> On Monday, 3 February 2014 at 21:42:59 UTC, Shammah Chancellor wrote:
>> On 2014-02-01 07:35:44 +0000, Manu said:
>> 
>>> On 1 February 2014 16:26, Adam Wilson <flyboynw@gmail.com> wrote:
>>> On Fri, 31 Jan 2014 21:29:04 -0800, Manu <turkeyman@gmail.com> wrote:
>>> 
>>> On 26 December 2012 00:48, Sven Over <dlang@svenover.de> wrote:
>>> 
>>>  std.typecons.RefCounted!T
>>> 
>>> core.memory.GC.disable();
>>> 
>>> 
>>> Wow. That was easy.
>>> 
>>> I see, D's claim of being a multi-paradigm language is not false.
>>> 
>>> 
>>> It's not a realistic suggestion. Everything you want to link uses the GC,
>>> and the language itself also uses the GC. Unless you write software in
>>> complete isolation and forgo many valuable features, it's not a solution.
>>> 
>>> 
>>>  Phobos does rely on the GC to some extent. Most algorithms and ranges do
>>> not though.
>>> 
>>> 
>>> Running (library) code that was written with GC in mind and turning GC off
>>> doesn't sound ideal.
>>> 
>>> But maybe this allows me to familiarise myself more with D. Who knows,
>>> maybe I can learn to stop worrying and love garbage collection.
>>> 
>>> Thanks for your help!
>>> 
>>> 
>>> I've been trying to learn to love the GC for as long as I've been around
>>> here. I really wanted to break that mental barrier, but it hasn't happened.
>>> In fact, I am more than ever convinced that the GC won't do. My current #1
>>> wishlist item for D is the ability to use a reference counted collector in
>>> place of the built-in GC.
>>> You're not alone :)
>>> 
>>> I write realtime and memory-constrained software (console games), and for
>>> me, I think the biggest issue that can never be solved is the
>>> non-deterministic nature of the collect cycles, and the unknowable memory
>>> footprint of the application. You can't make any guarantees or predictions
>>> about the GC, which is fundamentally incompatible with realtime software.
>>> Language-level ARC would probably do quite nicely for the miscellaneous
>>> allocations. Obviously, bulk allocations are still usually best handled in
>>> a context sensitive manner; ie, regions/pools/freelists/whatever, but the
>>> convenience of the GC paradigm does offer some interesting and massively
>>> time-saving features to D.
>>> Everyone will always refer you to RefCounted, which mangles your types and
>>> pollutes your code, but aside from that, for ARC to be useful, it needs to
>>> be supported at the language-level, such that the language/optimiser is
>>> able to optimise out redundant incref/decref calls, and also that it is
>>> compatible with immutable (you can't manage a refcount if the object is
>>> immutable).
>>> 
>>> The problem isn't GCs per se, but D's horribly naive implementation; games are written in GC'ed languages all the time now (Unity/.NET). And let's be honest, games are kind of a speciality; games do things most programs will never do.
>>> 
>>> You might want to read the GC Handbook. GCs aren't bad, but most, like the D GC, are just too simplistic for common usage today.
>>> 
>>> Maybe a sufficiently advanced GC could address the performance non-determinism to an acceptable level, but you're still left with the memory non-determinism, and the conundrum that when your heap approaches full (which is _always_ on a games console), the GC has to work harder and harder, and more often, to try to keep that tiny little bit of headroom available.
>>> A GC heap by nature expects you to have lots of memory, and also lots of FREE memory.
>>> 
>>> No serious console game I'm aware of has ever been written in a language with a GC. Casual games, or games that don't attempt to raise the bar may get away with it, but that's not the industry I work in.
>> 
>> You can always force the GC to run between cycles in your game, and
>> turn off automatic sweeps.  This is how most games operate nowadays.
>> It's also probably possible to create a drop-in replacement for the GC
>> to do something else. I could see it being *VERY* useful to make the
>> GC take a compile-time parameter to select which GC engine is used.

>   ur right I never thought of that, I bet all them game devs never thought of it either, they so dumb.  I bet they never tried to use a GC, what fools!  Endless graphs of traced objects, oh yes oh yes!  It only runs when I allocate, oh what a fool I've been, please castigate me harder!
> 
>   Adam pls tell me more of this c# and that amazing gc it sounds so good


First of all, bottom quoting is evil.  Second of all, your response is immature.  Thirdly, I am not Adam.  And fourthly, I specifically mentioned that many games currently use garbage collection.

-S.
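
For reference, the "collect between cycles" pattern quoted above is expressible with druntime's actual core.memory.GC API; only the game-loop wrapper is invented for this sketch:

    import core.memory : GC;

    // Run collections only at points the game chooses, never mid-frame.
    void gameLoop(bool delegate() runFrame)
    {
        // disable() is advisory: the GC may still collect if it would
        // otherwise run out of memory.
        GC.disable();
        scope (exit) GC.enable();

        while (runFrame())
        {
            // The frame is finished: this is our planned pause point.
            GC.collect();
            GC.minimize();   // hand unused pages back to the OS
        }
    }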