Thread overview: Nobody is going to complain about that...
  welkam (Apr 22, 2020)
  Paulo Pinto (Apr 22, 2020)
  Arine (Apr 23, 2020)
  Paulo Pinto (Apr 23, 2020)
  Arine (Apr 23, 2020)
  Paulo Pinto (Apr 24, 2020)
  solidstate1991 (Apr 23, 2020)
  bauss (Apr 25, 2020)
  rikki cattermole (Apr 25, 2020)
April 22, 2020
https://www.youtube.com/watch?v=tK50z_gUpZI&t=1321

I think people in this forum will find these 11 seconds amusing.
April 22, 2020
On Wednesday, 22 April 2020 at 14:54:58 UTC, welkam wrote:
> https://www.youtube.com/watch?v=tK50z_gUpZI&t=1321
>
> I think people in this forum will find these 11 seconds amusing.

Meanwhile, in the real world:

https://docs.unrealengine.com/en-US/Programming/UnrealArchitecture/Objects/Optimizations/index.html

https://unity3d.com/partners/microsoft/mixed-reality

https://stadia.dev/intl/de_de/blog/unity-production-ready-support-for-stadia-now-available/

https://developer.nintendo.com/tools

https://developers.google.com/ar/develop/unity

https://www.cryengine.com/tutorials/view/programming-and-project-guides/c-programming#

https://gvisor.dev/

https://gapid.dev/about/

So yeah, those guys in the videos are entitled to their opinion on how GCs are bad, yet Google, Nintendo, Microsoft, Epic, and Crytek seem to be doing quite fine with them.


April 23, 2020
On Wednesday, 22 April 2020 at 18:21:13 UTC, Paulo Pinto wrote:
> On Wednesday, 22 April 2020 at 14:54:58 UTC, welkam wrote:
>> https://www.youtube.com/watch?v=tK50z_gUpZI&t=1321
>>
>> I think people in this forum will find these 11 seconds amusing.
>
> Meanwhile, in the real world:
>
> https://docs.unrealengine.com/en-US/Programming/UnrealArchitecture/Objects/Optimizations/index.html


They aren't using a generic garbage collector. They built a system specifically designed for their purposes, and without a doubt they spent a lot of time optimizing it. It also doesn't have to do as much as a generic GC that handles all allocations: there are only N kinds of objects, and they can only interact in specific ways. You don't have to scan memory and inaccurately accept a long value as a reference to a section of memory.
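A rough D sketch of the conservative-scanning hazard being described here (the class name, field size, and values are purely illustrative; D's default GC does scan the stack conservatively):

```d
import core.memory : GC;

__gshared bool finalized = false;

class Big
{
    ubyte[4096] data;
    ~this() { finalized = true; }
}

void main()
{
    auto obj = new Big;
    // To a conservative scanner this integer is indistinguishable from a
    // pointer: its bit pattern equals obj's address, so the collector has to
    // assume it might be a reference and keep the object alive.
    long disguised = cast(long) cast(void*) obj;
    obj = null;
    GC.collect();
    assert(!finalized); // Big survived because of the fake "pointer"
    assert(disguised != 0);
}
```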

> https://unity3d.com/partners/microsoft/mixed-reality
>
> https://stadia.dev/intl/de_de/blog/unity-production-ready-support-for-stadia-now-available/
>
> https://developer.nintendo.com/tools
>
> https://developers.google.com/ar/develop/unity

Unity uses C#, but at its core it still uses C++. Sadly, that portion is closed source. They also use only a subset of C#, which is severely crippled and effectively disables almost every GC-reliant feature.

> https://www.cryengine.com/tutorials/view/programming-and-project-guides/c-programming#

I'm not that familiar with CryEngine, but I imagine they probably copied Unity here, as they didn't originally use C#.


> https://gapid.dev/about/

I've used gapid in the past; it's a piece of garbage that doesn't work right. I've had more success with RenderDoc: https://renderdoc.org/.


> So yeah, those guys in the videos are entitled to their opinion on how GCs are bad, yet Google, Nintendo, Microsoft, Epic, and Crytek seem to be doing quite fine with them.

Yeah, posting a bunch of links to things you don't understand or know how they work. You've just given good examples against using a generic GC for memory management.

April 23, 2020
On Thursday, 23 April 2020 at 02:05:00 UTC, Arine wrote:
> [...]

Quite on the contrary: while deconstructing every single example, you miss the point. Yes, some of them do use C++ underneath, yet what the large majority of developers are writing makes use of some form of GC: reference counted, tracing, simple, advanced tech, whatever.

The world moves forward, with all major graphics engine and OS vendors shipping tooling that, to a certain extent, makes use of GC-based technologies.

All major desktop and mobile OSes make use of it in some form, and then there is the Web, the biggest OS out there, using a GC-enabled language.

It is the anti-GC crowd that doesn't understand that they eventually will be sitting in a city full of tumbleweeds.


April 23, 2020
On Thursday, 23 April 2020 at 05:49:58 UTC, Paulo Pinto wrote:
> On Thursday, 23 April 2020 at 02:05:00 UTC, Arine wrote:
>> [...]
>
> Quite on the contrary: while deconstructing every single example, you miss the point. Yes, some of them do use C++ underneath, yet what the large majority of developers are writing makes use of some form of GC: reference counted, tracing, simple, advanced tech, whatever.

For your UE4 example, that wouldn't be possible if they had used a GC-only language like Java. I don't think you realize what you are arguing here. You are just relying on a false equivalency, as you were with your examples. People aren't against "simple, advanced tech, or whatever". UE4 is the perfect example of what can be achieved when control isn't taken away from you. You're conflating GC with a multitude of ideas and then saying: because a GC is "simple, advanced tech, or whatever", the people against GC must be against every conceivable idea that could be categorized that way.

> The world moves forward, with all major graphics engine and OS vendors shipping tooling that, to a certain extent, makes use of GC-based technologies.

You linked one engine that uses C#, and you just provided multiple links to the same engine. Somehow you translate that into "all major graphics engines"?

> All major desktop and mobile OSes make use of it in some form, and then there is the Web as the biggest OS out there using a GC enabled language.
>
> It is the anti-GC crowd that doesn't understand that they eventually will be sitting in a city full of tumbleweeds.

The crowd of programmers that still writes code in any language doesn't understand that eventually they'll be sitting in a city full of tumbleweeds. There will always be a need for non-GC code, as long as there is a need for programmers. Hell, there are already computers doing programming that programmers would never be able to do.

April 23, 2020
On Wednesday, 22 April 2020 at 18:21:13 UTC, Paulo Pinto wrote:
>
> [...]
>
> So yeah, those guys in the videos are entitled to their opinion on how GCs are bad, yet Google, Nintendo, Microsoft, Epic, and Crytek seem to be doing quite fine with them.

And I have experimented with the GC in D for time-critical applications like games. Here are my current recommendations:

* Label everything that doesn't allocate on the heap with `@nogc`. Unlabelled code can sometimes make the runtime check those parts for whether they have allocated or not.
* Use structs whenever you can.
* GC allocation with `new` has little to no performance penalty compared to using `malloc` or the like; the former is just less of a hassle and has less of a chance for something to go wrong.
* Other forms of optimization (Data-Oriented Design, etc.) are more important than fearing the GC and writing unsafe code using `malloc` and `free`.
* Currently I'm using an external API (SDL2_Sound) for audio; the workaround I've come up with for CPU rendering might not work there once I start working on some advanced audio features.
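A minimal sketch of the first two recommendations above (the function and type names are illustrative, not from the post): `@nogc` makes the compiler reject any hidden GC allocation, and structs live on the stack with no GC involvement at all.

```d
// @nogc guarantees at compile time that this function performs no GC
// allocation; nothrow additionally rules out GC-allocated exceptions.
@nogc nothrow void updatePosition(ref float x, float dx)
{
    x += dx; // plain arithmetic, no heap traffic
}

// A struct is a value type: it lives on the stack or inline in arrays,
// so the GC never sees it.
struct Vec2
{
    float x = 0, y = 0;
    Vec2 opBinary(string op : "+")(Vec2 rhs) const @nogc nothrow
    {
        return Vec2(x + rhs.x, y + rhs.y);
    }
}

void main() @nogc nothrow
{
    float px = 0;
    updatePosition(px, 1.5f);
    assert(px == 1.5f);

    auto v = Vec2(1, 2) + Vec2(3, 4);
    assert(v.x == 4 && v.y == 6);
}
```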

A missing and very useful feature in D would be emulating `alloca`, i.e. stack allocation. There's already something like that for `@nogc` exceptions; an equivalent for other classes would be useful from time to time.
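For what it's worth, D already offers two partial workarounds for stack-allocating class instances: `std.typecons.scoped` and the `scope` storage class. A small sketch (the `Enemy` class is just an example):

```d
import std.typecons : scoped;

class Enemy
{
    int hp;
    this(int hp) { this.hp = hp; }
}

void main()
{
    // std.typecons.scoped constructs the object in a stack buffer and
    // destroys it deterministically when the scope ends -- no GC heap use.
    auto a = scoped!Enemy(42);
    assert(a.hp == 42);

    // The `scope` storage class lets the compiler allocate a plain `new`
    // expression on the stack instead of the GC heap.
    scope b = new Enemy(7);
    assert(b.hp == 7);
}
```

Neither instance may escape its scope, which is exactly the `alloca`-style restriction.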
April 24, 2020
On Thursday, 23 April 2020 at 16:01:32 UTC, Arine wrote:
> On Thursday, 23 April 2020 at 05:49:58 UTC, Paulo Pinto wrote:
>> [...]
>
> You linked one engine that uses C#, and you just provided multiple links to the same engine. Somehow you translate that to "all major graphics engines"?
>
>> All major desktop and mobile OSes make use of it in some form, and then there is the Web as the biggest OS out there using a GC enabled language.
>>
>> It is the anti-GC crowd that doesn't understand that they eventually will be sitting in a city full of tumbleweeds.
>
> The crowd of programmers that still writes code in any language doesn't understand that eventually they'll be sitting in a city full of tumbleweeds. There will always be a need for non-GC code, as long as there is a need for programmers. Hell, there are already computers doing programming that programmers would never be able to do.

The multiple links to Unity were to prove the point that Unity is the go-to engine, sponsored by Microsoft, Google, and Nintendo, for indie games on their platforms and for official AR/VR tooling.

Here I am happily combining the tracing GC of .NET Core/.NET Native with the reference counting GC from C++/CX used to write a couple of UWP components.

Sure there will be code written without any kind of GC, just like there are scenarios where people still write Assembly by hand.

GC-enabled systems languages like D allow you to be productive taking advantage of a tracing GC, using reference-counting GC algorithms, or just doing plain old C-style memory management when needed.
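A small sketch of those three styles side by side in one D program (the types and values are illustrative):

```d
import core.stdc.stdlib : free, malloc;
import std.typecons : RefCounted;

struct Payload { int value; }

void main()
{
    // 1. Tracing GC: allocate with `new`, let the collector reclaim it.
    auto traced = new int[](3);
    traced[0] = 1;

    // 2. Reference counting: the payload is freed deterministically when
    //    the last RefCounted copy goes out of scope; no tracing involved.
    auto counted = RefCounted!Payload(2);
    assert(counted.value == 2);

    // 3. Plain old C-style management: fully manual malloc/free.
    int* manual = cast(int*) malloc(int.sizeof);
    *manual = 3;
    assert(*manual == 3);
    free(manual);

    assert(traced[0] == 1);
}
```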

While some complain about GCs, the large majority just ships software.
April 25, 2020
On Thursday, 23 April 2020 at 19:48:33 UTC, solidstate1991 wrote:
> * GC allocation with `new` has little to no performance penalty compared to using `malloc` or the like; the former is just less of a hassle and has less of a chance for something to go wrong.

The problems with GC and performance are usually not in allocations; in fact, a GC can sometimes allocate faster because it doesn't always need to ask the system for a new memory block. I'm not sure how the GC in D works in its entirety, but I can imagine that it pre-allocates __some__ memory blocks.

The problem with a GC is when memory needs to be freed. Hence you will probably __never__ notice any bottlenecks in allocating.

I'm neither against nor for GC; using a GC has its pros and cons, just like not using one does.

I like that D lets you do both.
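One way D lets you do both, sketched with `core.memory.GC` (the loop and buffer sizes are arbitrary): defer collection pauses during a hot stretch, then pay the reclamation cost at a moment of your choosing.

```d
import core.memory : GC;

void main()
{
    GC.disable();  // no collection pauses during the latency-sensitive part
    foreach (i; 0 .. 1_000)
    {
        auto scratch = new ubyte[](256);  // allocations still succeed
        scratch[0] = cast(ubyte) i;
        assert(scratch[0] == cast(ubyte) i);
    }
    GC.enable();
    GC.collect();  // reclaim the garbage now, outside the hot path
}
```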
April 26, 2020
On 26/04/2020 4:05 AM, bauss wrote:
> 
> The problems with GC and performance are usually not in allocations; in fact, a GC can sometimes allocate faster because it doesn't always need to ask the system for a new memory block. I'm not sure how the GC in D works in its entirety, but I can imagine that it pre-allocates __some__ memory blocks.

D's GC only collects upon allocation.

In this context, there is no difference.
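That behaviour can be observed with `core.memory.GC.stats` (a sketch; the exact heap numbers depend on the runtime): a loop that performs no GC allocation never gives the collector a chance to run.

```d
import core.memory : GC;

void main()
{
    const before = GC.stats();
    // Pure computation, no GC allocation: since D's collector is only ever
    // triggered from the allocation path, no collection can happen here.
    long acc = 0;
    foreach (i; 0 .. 1_000_000)
        acc += i;
    const after = GC.stats();
    assert(acc == 499_999_500_000);
    assert(after.usedSize == before.usedSize); // heap untouched
}
```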