October 17, 2012
I recently revived an LDC optimization pass (originally written by Frits van Bommel) that can remove dead allocations and promote small GC allocations to stack memory by recognizing the related druntime calls.

However, I didn't commit it to the main branch yet because it causes a test case in the DMD test suite to fail: https://github.com/D-Programming-Language/dmd/blob/eaa03fefeb1a698f586d5f5a09068f3433bf4b29/test/runnable/testgc2.d

The test is intended to make sure that the garbage collector properly handles out-of-memory situations, but in doing so it assumes that the compiler can't touch the array allocations. However, when building with optimizations enabled, LDC recognizes that the allocations are unused and removes them, causing the "assert(0)"s to be hit:

———
try {
  long[] l = new long[ptrdiff_t.max];
  assert(0);
}
catch (OutOfMemoryError o){}
———

Thus, my question: Is it legal for a D compiler to eliminate dead GC allocations (given that there are no side effects due to constructors, …)?

I'd strongly argue in favor of such optimizations being legal, because disallowing them would rule out quite a few optimization possibilities. There is also precedent for such behavior: many C compilers treat malloc() in a similar fashion.

David
October 17, 2012
On 10/17/12 20:10, David Nadlinger wrote:
> I recently revived an LDC optimization pass (originally written by Frits van Bommel) that can remove dead allocations and promote small GC allocations to stack memory by recognizing the related druntime calls.
> 
> However, I didn't commit it to the main branch yet because it causes a test case in the DMD test suite to fail: https://github.com/D-Programming-Language/dmd/blob/eaa03fefeb1a698f586d5f5a09068f3433bf4b29/test/runnable/testgc2.d
> 
> The test is intended to make sure that the garbage collector properly handles out-of-memory situations, but in doing so it assumes that the compiler can't touch the array allocations. However, when building with optimizations enabled, LDC recognizes that the allocations are unused and removes them, causing the "assert(0)"s to be hit:
> 
> ———
> try {
>   long[] l = new long[ptrdiff_t.max];
>   assert(0);
> }
> catch (OutOfMemoryError o){}
> ———
> 
> Thus, my question: Is it legal for a D compiler to eliminate dead GC allocations (given that there are no side effects due to constructors, …)?
> 
> I'd strongly argue in favor of such optimizations being legal, because disallowing them would rule out quite a few optimization possibilities. There is also precedent for such behavior: many C compilers treat malloc() in a similar fashion.


Well, I think such optimizations are fine (as long as they are documented and
alternatives exist), but note that this test case checks for the case where
the object size calculation overflows, i.e. it must not succeed.
Would ignoring the error when nothing accesses the object make sense?

artur
October 17, 2012
On Wednesday, 17 October 2012 at 20:37:53 UTC, Artur Skawina wrote:
> Well, I think such optimizations are fine (as long as they are documented and
> alternatives exist), but note that this test case checks for the case where
> the object size calculation overflows, i.e. it must not succeed.

Could you elaborate on that? It strikes me that this is either a GC implementation detail, or invalid D code in the first place (i.e. code that should either be rejected at compile time or have undefined behavior).

David
October 17, 2012
On 10/17/12 23:00, David Nadlinger wrote:
> On Wednesday, 17 October 2012 at 20:37:53 UTC, Artur Skawina wrote:
>> Well, I think such optimizations are fine (as long as they are documented and alternatives exist), but note that this test case checks for the case where the object size calculation overflows, i.e. it must not succeed.
> 
> Could you elaborate on that? It strikes me that this is either a GC implementation detail, or invalid D code in the first place (i.e. code that should either be rejected at compile time or have undefined behavior).

Well, e.g. on a 32-bit platform the newly allocated memory object would need to have a size of 8 * 2G == 16G bytes. I guess you could see it as a GC implementation detail, but that allocation can never succeed, simply because such an object would be larger than the available address space and hence can't be mapped directly. The 'new long[ptrdiff_t.max]' case can be caught at compile time, but a different 'new long[runtime_variable_which_happens_to_be_2G]' cannot, and then the GC MUST catch the overflow instead of allocating a ((size_t)long.sizeof * 2G)-sized object. That is what I assume the test was meant to check.

But even in the constant, statically checkable case, would it make sense to ignore the error if the allocation is "dead"? If nothing ever accesses the new object, ignoring the error seems harmless. But is it OK to let faulty code silently run as long as the compiler can prove that the bug won't be triggered? Will every compiler make the same decision? Would a different optimization level cause the error to be thrown? For these reasons, silently optimizing away "harmless" but buggy code is not a good idea.

artur
October 17, 2012
On Wednesday, 17 October 2012 at 18:10:26 UTC, David Nadlinger wrote:
> Thus, my question: Is it legal for a D compiler to eliminate dead GC allocations (given that there are no side effects due to constructors, …)?

Walter confirmed that this should be allowed. Pull request fixing the test: https://github.com/D-Programming-Language/dmd/pull/1191

David
October 17, 2012
On 10/17/2012 11:10 AM, David Nadlinger wrote:
> Thus, my question: Is it legal for a D compiler to eliminate dead GC allocations
> (given that there are no side effects due to constructors, …)?

Yes.

The test case can be fixed by attempting to use the result of new.