December 30, 2013
On Monday, 30 December 2013 at 02:18:34 UTC, develop32 wrote:
> In the end, AI/game logic uses the same mechanic as texture streaming - reuse of the previously allocated memory.

That is a possibility of course, but in a heterogeneous environment you risk running out of slots, e.g. in online games, virtual worlds, sandbox games where users build, etc.

December 30, 2013
> That is a possibility of course, but in a heterogeneous environment you risk running out of slots, e.g. in online games, virtual worlds, sandbox games where users build, etc.

No, not really: just allocate more. The memory is managed by a single closed class, so I can do whatever I want with it.

> online games

MMO games are the source of this idea; components are easy to store in DB tables.

> virtual worlds

Remove unneeded render/physics components when an entity is out of range, etc.

> sandbox games where users build

No idea how to fix the problem of not having enough RAM.


The thing is, all of that game logic data takes a surprisingly small amount of memory.
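The thread never shows develop32's allocator, but the "reuse slots, and just allocate more when you run out" idea can be sketched roughly like this (in Java for illustration; all names are hypothetical):

```java
// Hypothetical component pool: slots are recycled instead of freed,
// and the backing array grows when the free list runs dry -- the
// "just allocate more" answer to running out of slots.
import java.util.ArrayDeque;

final class ComponentPool {
    static final class Component { float x, y; boolean alive; }

    private Component[] slots;
    private final ArrayDeque<Integer> free = new ArrayDeque<>();

    ComponentPool(int capacity) {
        slots = new Component[capacity];
        for (int i = 0; i < capacity; i++) {
            slots[i] = new Component();   // allocated once, up front
            free.push(i);
        }
    }

    int acquire() {
        if (free.isEmpty()) grow();       // out of slots: allocate more
        int i = free.pop();
        slots[i].alive = true;
        return i;
    }

    void release(int i) {
        slots[i].alive = false;           // slot goes back to the free
        free.push(i);                     // list; nothing for a GC to do
    }

    private void grow() {
        Component[] bigger = new Component[slots.length * 2];
        System.arraycopy(slots, 0, bigger, 0, slots.length);
        for (int i = slots.length; i < bigger.length; i++) {
            bigger[i] = new Component();
            free.push(i);
        }
        slots = bigger;
    }

    int capacity() { return slots.length; }
}
```

Since the pool is the single owner of its memory, growth and reuse policy can be changed freely without touching the rest of the engine.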
December 30, 2013
On Monday, 30 December 2013 at 02:44:27 UTC, develop32 wrote:
> The thing is, all of that game logic data takes a surprisingly small amount of memory.

In that case you could probably use the GC, so why don't you?

December 30, 2013
On Monday, 30 December 2013 at 02:48:30 UTC, Ola Fosheim Grøstad wrote:
> On Monday, 30 December 2013 at 02:44:27 UTC, develop32 wrote:
>> The thing is, all of that game logic data takes a surprisingly small amount of memory.
>
> In that case you could probably use the GC, so why don't you?

Because there is nothing for the GC to free in my engine.

In other engine architectures it could certainly be an option. In my experiments, running a collection at the end of each game loop took 3 ms.
December 30, 2013
Barry L.:

> Just saw this: http://joeduffyblog.com/2013/12/27/csharp-for-systems-programming/

A little more info:
https://plus.google.com/+AleksBromfield/posts/SnwtcXUdoyZ

http://www.reddit.com/r/programming/comments/1tzk5j/the_m_error_model/

From the article:

> our language has excellent support for understanding side effects at compile time. Most contract systems demand that contract predicates be free of side effects, but have no way of enforcing this property. We do. If a programmer tries to write a mutating predicate, they will get a compile-time error. When we first enabled this feature, we were shocked at the number of places where people wrote the equivalent of “Debug.Assert(WriteToDisk())”. So, practically speaking, this checking has been very valuable.
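The trap the quote describes can be sketched concretely (in Java for illustration; the names and the CHECKS_ENABLED flag are hypothetical stand-ins for a release build that strips contract checks):

```java
// Why "Debug.Assert(WriteToDisk())" is dangerous: when assertion
// checks are compiled out of a release build (modelled here by
// CHECKS_ENABLED = false), a side effect hidden inside the predicate
// is silently dropped -- the write never happens at all.
final class AssertTrap {
    static final boolean CHECKS_ENABLED = false; // "release build"

    static boolean wrote = false;

    // A "predicate" that is really a side-effecting operation.
    static boolean writeToDisk() {
        wrote = true;      // the side effect the caller actually wanted
        return true;       // success flag, misused as a contract check
    }

    static void save() {
        // Equivalent of Debug.Assert(WriteToDisk()):
        if (CHECKS_ENABLED) {
            if (!writeToDisk()) throw new AssertionError("write failed");
        }
        // With checks stripped, writeToDisk() was never even called.
    }
}
```

A compiler that rejects mutating predicates outright, as the article describes, makes this class of bug impossible rather than merely unlikely.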

Bye,
bearophile
December 30, 2013
I'm kind of an outsider to this discussion, but take a look
at how many games are written in GC languages: Minecraft is
written in Java, Terraria in C#, and all Unity3D games use
Mono underneath (usually C#). And these languages don't let
you use malloc even if you wanted to (you can do some of
that with NIO buffers in Java, but it's a PITA). The best
you can do in those languages is usually to just not
allocate during the game. So arguing that GC is useless for
games is an overstatement. Sure, a game engine of the
magnitude of Unreal Engine 3 might have problems with a GC,
but for most other projects it will be OK.
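"Just not allocate during the game" usually means allocating everything at load time and overwriting it each frame. A minimal sketch of that pattern (in Java; names are hypothetical):

```java
// Frame-scoped scratch storage, allocated once at load time and
// overwritten every frame -- so nothing is created mid-game and the
// GC stays quiet during play.
final class FrameScratch {
    private final float[] positions;  // preallocated x,y pairs
    private int count;

    FrameScratch(int maxEntities) {
        positions = new float[maxEntities * 2];
    }

    // Called at the top of each frame: resets instead of reallocating.
    void beginFrame() { count = 0; }

    void push(float x, float y) {
        positions[count * 2]     = x;
        positions[count * 2 + 1] = y;
        count++;
    }

    int count()       { return count; }
    float x(int i)    { return positions[i * 2]; }
    float y(int i)    { return positions[i * 2 + 1]; }
}
```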
December 30, 2013
JN:

> take a look at how many games are written in GC languages:
> Minecraft is written in Java, Terraria in C#

But the Oracle JVM has GCs (more than one) way better than the current D one :-)

Bye,
bearophile
December 30, 2013
On Monday, 30 December 2013 at 11:23:22 UTC, JN wrote:
> I'm kind of an outsider to this discussion, but take a look
> at how many games are written in GC languages: Minecraft is
> written in Java, Terraria in C#, and all Unity3D games use
> Mono underneath (usually C#). And these languages don't let
> you use malloc even if you wanted to (you can do some of
> that with NIO buffers in Java, but it's a PITA). The best
> you can do in those languages is usually to just not
> allocate during the game. So arguing that GC is useless for
> games is an overstatement. Sure, a game engine of the
> magnitude of Unreal Engine 3 might have problems with a GC,
> but for most other projects it will be OK.

As far as I know, Unreal Engine 3 has its own GC implementation for its scripting system.
December 30, 2013
On 2013-12-30 12:23, JN wrote:

> I'm kind of an outsider to this discussion, but take a look
> at how many games are written in GC languages: Minecraft is
> written in Java, Terraria in C#, and all Unity3D games use
> Mono underneath (usually C#). And these languages don't let
> you use malloc even if you wanted to (you can do some of
> that with NIO buffers in Java, but it's a PITA).

You can use malloc and friends via JNI in Java. The SWT source code is full of malloc calls, although it's a bit more verbose than using it from C or D, since Java doesn't have pointers.
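For the pure-Java route mentioned earlier in the thread, java.nio direct buffers are the closest analogue to malloc without JNI: the backing memory lives outside the GC heap. A small sketch (helper names are hypothetical):

```java
// Off-heap allocation in plain Java: a direct ByteBuffer's payload is
// allocated outside the garbage-collected heap, so large blocks don't
// add GC pressure -- the awkward-but-portable alternative to JNI malloc.
import java.nio.ByteBuffer;

final class OffHeap {
    static ByteBuffer newBlock(int bytes) {
        // Direct buffer: backing memory is outside the Java heap.
        return ByteBuffer.allocateDirect(bytes);
    }

    static void writeFloat(ByteBuffer b, int offset, float v) {
        b.putFloat(offset, v);    // absolute put: no position bookkeeping
    }

    static float readFloat(ByteBuffer b, int offset) {
        return b.getFloat(offset);
    }
}
```

The "PITA" part is real: every access goes through get/put calls with manual offsets, which is why SWT and similar libraries often drop to JNI instead.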

-- 
/Jacob Carlborg
December 30, 2013
On Monday, 30 December 2013 at 10:27:43 UTC, bearophile wrote:
> http://www.reddit.com/r/programming/comments/1tzk5j/the_m_error_model/

As compared with D:

- unrecoverable errors crash immediately, as they should. I like it, since the most sensible reason to catch Error in D is to crash anyway (e.g. in C callbacks).

- hence, the unrecoverable-error exception hierarchy is not represented.

- a throws keyword instead of nothrow. I expect it will lessen M#'s suitability for code written in a hurry (something at which D particularly shines).

- the same remark applies to the "try" keyword at the call site when calling a function which throws recoverable errors.