Good to the last drop.
March 31, 2022

I run the program (at the bottom) and get, as expected, the run-time out of memory error:

PS C:\D\sandbox> .\Reserve.exe
  newCapacity = [              1]
  newCapacity = [              3]
  newCapacity = [              5]
                o    o    o
  newCapacity = [    905,207,293]
  newCapacity = [    905,207,805]
  newCapacity = [    905,208,317]

core.exception.OutOfMemoryError@src\core\lifetime.d(126): Memory allocation failed

Is there a way to programmatically determine the exact maximum memory size available to the DRuntime’s array implementation?

import std.stdio;
import std.format;

void main()
{
    ulong[] big;
    size_t newCapacity;
    size_t oldCapacity;

    foreach (i; 0 .. ulong.max)
    {
        // reserve returns the array's actual new capacity, in elements,
        // which may be larger than the amount requested.
        newCapacity = big.reserve(i);
        if (oldCapacity != newCapacity)
        {
            writeln("  newCapacity = ", format("[%15,3d]", newCapacity));
            oldCapacity = newCapacity;
        }
    }
}

March 31, 2022

On Thursday, 31 March 2022 at 19:44:23 UTC, WhatMeWorry wrote:

> Is there a way to programmatically determine the exact maximum memory size available to the DRuntime’s array implementation?

Closest you can get is probably GC.stats, which will give you the total amount of free memory available to the GC.
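For example, something along these lines (a minimal sketch; usedSize and freeSize are the fields of core.memory.GC.Stats):

import core.memory : GC;
import std.stdio : writefln;

void main()
{
    // Query the GC's current statistics: bytes in use vs. bytes sitting
    // free in the GC's pools.
    auto stats = GC.stats();
    writefln("used: %s bytes", stats.usedSize);
    writefln("free: %s bytes", stats.freeSize);
}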

If what you actually want to do is attempt to grow an array without crashing if there isn't enough memory, you can use GC.extend. There's an example in the linked documentation that shows how to do it.
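Roughly along the lines of the example in the core.memory documentation (a sketch, not production code): allocate a block, then ask the GC to grow it in place, and fall back gracefully if it can't.

import core.memory : GC;
import std.stdio : writeln;

void main()
{
    // Allocate room for 1000 ints; NO_SCAN because the block holds no pointers.
    size_t capacity = 1000;
    int* p = cast(int*) GC.malloc(capacity * int.sizeof, GC.BlkAttr.NO_SCAN);

    // Try to extend in place by at least 1000 more ints, preferably 2000 more.
    size_t newSize = GC.extend(p, 1000 * int.sizeof, 2000 * int.sizeof);
    if (newSize != 0)
        capacity = newSize / int.sizeof;   // extended in place; no copy, no crash
    else
        writeln("could not extend in place; a reallocation would be needed");
}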

March 31, 2022

On 3/31/22 3:44 PM, WhatMeWorry wrote:

> I run the program (at the bottom) and get, as expected, the run-time out of memory error:
>
> PS C:\D\sandbox> .\Reserve.exe
>   newCapacity = [              1]
>   newCapacity = [              3]
>   newCapacity = [              5]
>                 o    o    o
>   newCapacity = [    905,207,293]
>   newCapacity = [    905,207,805]
>   newCapacity = [    905,208,317]
>
> core.exception.OutOfMemoryError@src\core\lifetime.d(126): Memory allocation failed
>
> Is there a way to programmatically determine the exact maximum memory size available to the DRuntime’s array implementation?

Note that your code example is not hitting the limit of the largest contiguous block that could be allocated, but rather the sum of that block and all the smaller blocks allocated before it, since those are still held by the GC.

At any of those extensions, the runtime might allocate a new block and copy the data there. In that case, the original block is abandoned (and might be freed back to the GC), but it isn't reused for subsequent allocations, because your new, larger requests won't fit in it! So the GC's memory usage might be much higher than you think.
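You can watch this happening: whenever the array's .ptr changes after an append, the runtime has moved the data to a new block and abandoned the old one. A rough sketch:

import std.stdio : writeln;

void main()
{
    int[] a;
    auto prev = a.ptr;
    foreach (i; 0 .. 1_000_000)
    {
        a ~= i;
        if (a.ptr !is prev)   // the data was copied to a new, larger block
        {
            writeln("reallocated at length ", a.length, ", capacity ", a.capacity);
            prev = a.ptr;
        }
    }
}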

What I'd do is a binary search via new processes to see where it breaks.
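Sketching that idea, assuming a small helper executable (called try_reserve here, purely hypothetical) that takes a capacity on the command line, calls reserve with it, and exits non-zero on OutOfMemoryError:

import std.conv : to;
import std.process : execute;
import std.stdio : writeln;

void main()
{
    size_t lo = 0;                    // known to succeed
    size_t hi = 1_000_000_000;        // some capacity known to fail

    while (lo + 1 < hi)
    {
        auto mid = lo + (hi - lo) / 2;
        // Run the probe in a fresh process so each attempt starts with a clean heap.
        auto r = execute(["./try_reserve", mid.to!string]);
        if (r.status == 0)
            lo = mid;                 // this capacity still fits
        else
            hi = mid;                 // this capacity already fails
    }
    writeln("largest capacity that succeeded: ", lo);
}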

-Steve