Thread overview
Out of memory error (even when using destroy())
May 26, 2017
realhet
May 26, 2017
Jonathan M Davis
May 26, 2017
realhet
May 26, 2017
rikki cattermole
May 26, 2017
Guillaume Piolat
May 26, 2017
ag0aep6g
May 26, 2017
Mike B Johnson
May 26, 2017
Stanislav Blinov
May 26, 2017
H. S. Teoh
May 27, 2017
Mike B Johnson
May 27, 2017
Stanislav Blinov
May 27, 2017
nkm1
May 26, 2017
Jordan Wilson
May 26, 2017
realhet
May 26, 2017
Hi,

I'm kinda new to the D language and I love it already. :D So far I haven't had any serious problems, but this one seems to be beyond me.

import std.stdio;
void main(){
    foreach(i; 0..2000){
        writeln(i);
        auto st = new ubyte[500_000_000];
        destroy(st); // <- this doesn't matter
    }
}

Compiled with DMD 2.074.0 (Win32), it produces the following output:
0
1
2
core.exception.OutOfMemoryError@src\core\exception.d(696): Memory allocation failed

It doesn't matter whether I call destroy() or not. This is OK, because as I learned, destroy only calls the destructor and marks the memory block as unused.

But I also learned that the GC will start to collect when it runs out of memory, yet this is what happens instead:
three half-GB allocations and deallocations go through, and on the 4th the process hits the 2 GB limit, which is expected. At that point the GC should already have 1.5 GB of free memory, but instead of reusing it, it throws a memory error. Why?

Note: this is not a problem when I use smaller blocks (like 50 MB).
But I want to use large blocks, without writing a slow wrapper that emulates a large block on top of smaller GC-allocated blocks.

Is there a solution to this?

Thank You!
May 26, 2017
On Friday, May 26, 2017 06:31:49 realhet via Digitalmars-d-learn wrote:
> Hi,
>
> I'm kinda new to the D language and I love it already. :D So far I haven't got any serious problems but this one seems like beyond me.
>
> import std.stdio;
> void main(){
>      foreach(i; 0..2000){
>          writeln(i);
>          auto st = new ubyte[500_000_000];
>          destroy(st); //<-this doesnt matter
>      }
> }
>
> Compiled with DMD 2.074.0 Win32 it produces the following output:
> 0
> 1
> 2
> core.exception.OutOfMemoryError@src\core\exception.d(696): Memory
> allocation failed
>
> It doesn't matter that I call destroy() or not. This is ok because as I learned: destroy only calls the destructor and marks the memory block as unused.
>
> But I also learned that GC will start to collect when it run out
> of memory but in this time the following happens:
> 3x half GB of allocations and deallocations, and on the 4th the
> system runs out of the 2GB
>   limit which is ok. At this point the GC already has 1.5GB of
> free memory but instead of using that, it returns a Memory Error.
> Why?
>
> Note: This is not a problem when I use smaller blocks (like 50MB). But I want to use large blocks, without making a slow wrapper that emulates a large block by using smaller GC allocated blocks.

It's likely an issue with false pointers. The GC thinks that the memory is still referenced when it isn't, because some value it scans happens to look like a pointer into the block that would need to be freed.

> Is there a solution to this?

Use 64-bit. False pointers don't tend to be a problem with 64-bit, whereas they can be with 32-bit - especially when you're allocating large blocks of memory like that.

- Jonathan M Davis
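
Not something suggested in the thread, but for completeness: on 32-bit one can also sidestep false-pointer retention by releasing the block explicitly through core.memory. GC.addrOf and GC.free are existing druntime calls; the snippet is only a rough sketch, since explicit freeing is manual memory management and unsafe if any other reference to the data remains.

import std.stdio;
import core.memory : GC;

void main(){
    foreach(i; 0..2000){
        writeln(i);
        auto st = new ubyte[500_000_000];
        // ... use st ...
        GC.free(GC.addrOf(st.ptr)); // release the block now, don't wait for a collection
        st = null;                  // the slice is dangling after the free
    }
}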

May 26, 2017
Thanks for the answer!

But hey, the GC knows that it should not search for any pointers inside those large blocks.
And the buffer is full of zeros at the start, so there can't be any 'false pointers' in it. And I think the GC will not scan it either.

The only reference to the buffer is 'st' which will die shortly after it has been allocated.

64-bit is not a solution because I need to produce a 32-bit DLL, and I also want to use 32-bit asm objects.
The 2 GB total address space is more than enough for the problem.
My program has to produce 300..500 MB of contiguous data frequently. This works with MSVC 32-bit, but with D's GC it starts to eat memory and fails on the 4th iteration. It never actually releases the previous blocks, even when I say so with destroy().

At this point I can only think of:
a) Working with the D allocator but emulating large blocks by virtually stitching small blocks together (unnecessary complexity).
b) Allocating memory via the Win32 API and not using the D goodies anymore (also unnecessary complexity).

But these are ugly workarounds. :S

I also tried to allocate blocks smaller than the previous one, so each would easily fit into the previously released space, and yet it keeps eating memory:

void alloc_dealloc(size_t siz){
    auto st = new ubyte[siz];
}

void main(){
    foreach(i; 0..4) alloc_dealloc(500_000_000 - 50_000_000*i);
}
May 26, 2017
On 26/05/2017 9:15 AM, realhet wrote:
> Thanks for the answer!
> 
> But hey, the GC knows that is should not search for any pointers in those large blocks.
> And the buffer is full of 0-s at the start, so there can't be any 'false pointers' in it. And I think the GC will not search in it either.
> 
> The only reference to the buffer is 'st' which will die shortly after it has been allocated.
> 
> 64bit is not a solution because I need to produce a 32bit dll, and I also wanna use 32bit asm objs.
> The total 2GB amount of memory is more than enough for the problem.
> My program have to produce 300..500 MB of continuous data frequently. This works in MSVC32, but with D's GC it starts to eat memory and fails at the 4th iteration. Actually it never releases the previous blocks even I say so with destroy().
> 
> At this point I only can think of:
> a) Work with the D allocator but emulate large blocks by virtually stitching small blocks together. (this is unnecessary complexity)
> b) Allocating memory by Win32 api and not using D goodies anymore (also unnecessary complexity)
> 
> But these are ugly workarounds. :S
> 
> I also tried to allocate smaller blocks than the previous one, so it would easily fit to the prevouisly released space, and yet it keeps eating memory:
> 
> void alloc_dealloc(size_t siz){
>      auto st = new ubyte[siz];
> }
> 
> void main(){
>      foreach(i; 0..4) alloc_dealloc(500_000_000 - 50_000_000*i);
> }

If you have to use such large amounts of memory frequently, you really have to go with buffers of memory that you control, not the GC. Memory allocation is always expensive; if you can prevent it, all the better.
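
For illustration only (not code from this post): one block you control, grabbed up front via C malloc so the GC never sees it, then reused on every iteration instead of allocating each time. The 500 MB figure is just the size from the original program.

import std.stdio;
import core.stdc.stdlib : malloc, free;

void main(){
    enum size = 500_000_000;
    auto p = cast(ubyte*) malloc(size);  // one allocation, owned by us, invisible to the GC
    if (p is null) assert(0, "malloc failed");
    scope(exit) free(p);                 // released exactly once, when we're done
    ubyte[] st = p[0 .. size];           // slice over the C memory, usable like a D array

    foreach(i; 0..2000){
        writeln(i);
        st[] = 0;                        // fill/overwrite the same buffer every iteration
    }
}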
May 26, 2017
On Friday, 26 May 2017 at 06:31:49 UTC, realhet wrote:
> Hi,
>
> I'm kinda new to the D language and I love it already. :D So far I haven't got any serious problems but this one seems like beyond me.
>
> import std.stdio;
> void main(){
>     foreach(i; 0..2000){
>         writeln(i);
>         auto st = new ubyte[500_000_000];
>         destroy(st); //<-this doesnt matter
>     }
> }
>
> Compiled with DMD 2.074.0 Win32 it produces the following output:
> 0
> 1
> 2
> core.exception.OutOfMemoryError@src\core\exception.d(696): Memory allocation failed
>
> It doesn't matter that I call destroy() or not. This is ok because as I learned: destroy only calls the destructor and marks the memory block as unused.
>
> But I also learned that GC will start to collect when it run out of memory but in this time the following happens:
> 3x half GB of allocations and deallocations, and on the 4th the system runs out of the 2GB
>  limit which is ok. At this point the GC already has 1.5GB of free memory but instead of using that, it returns a Memory Error. Why?
>
> Note: This is not a problem when I use smaller blocks (like 50MB).
> But I want to use large blocks, without making a slow wrapper that emulates a large block by using smaller GC allocated blocks.
>
> Is there a solution to this?
>
> Thank You!

I believe the general solution is to limit allocation within loops (given the false-pointer issue Jonathan mentioned).

This, I think, achieves the spirit of your code, but without the memory exception:

import std.stdio;

void main(){
    ubyte[] st;
    foreach(i; 0..2000){
        writeln(i);
        st.length = 500_000_000; // instead of: auto st = new ubyte[500_000_000];
        st.length = 0;           // instead of: destroy(st)
        // tell the runtime it may overwrite what's currently in the block,
        // so the next length increase reuses it instead of allocating anew
        st.assumeSafeAppend;
    }
}
May 26, 2017
On Friday, 26 May 2017 at 08:15:49 UTC, realhet wrote:
> 64bit is not a solution because I need to produce a 32bit dll, and I also wanna use 32bit asm objs.
> The total 2GB amount of memory is more than enough for the problem.
> My program have to produce 300..500 MB of continuous data frequently. This works in MSVC32, but with D's GC it starts to eat memory and fails at the 4th iteration. Actually it never releases the previous blocks even I say so with destroy().
>

If you have issues with false pointers, you can use malloc instead of the GC to use much less memory.
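
A sketch of that approach using Phobos's std.experimental.allocator wrapper around malloc (Mallocator.allocate/deallocate are real Phobos calls; the loop is just the original example adapted, not code from this post):

import std.stdio;
import std.experimental.allocator.mallocator : Mallocator;

void main(){
    foreach(i; 0..2000){
        writeln(i);
        void[] mem = Mallocator.instance.allocate(500_000_000);
        if (mem is null) assert(0, "allocation failed");
        scope(exit) Mallocator.instance.deallocate(mem); // freed deterministically each iteration
        auto st = cast(ubyte[]) mem;                     // never scanned or retained by the GC
        st[] = 0; // malloc'd memory is uninitialized, so clear it like new ubyte[] would
    }
}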

May 26, 2017
> Jordan Wilson wrote:
> This I think achieves the spirit of your code, but without the memory exception:
>     ubyte[] st;
>     foreach(i; 0..2000){
>         writeln(i);
>         st.length=500_000_000; // auto = new ubyte[500_000_000];
>         st.length=0; // destory(st)
>         st.assumeSafeAppend;
> // prevent allocation by assuming it's ok to overrwrite what's currently in st
>     }

Yeah, that's the perfect solution. It uses exactly the amount of memory that is required, and I'm still using only D facilities.
The only difference is that I need one variable outside of the loop, but that's well worth it because I only need one large buffer at a time.
It also refreshed my knowledge of assumeSafeAppend(), which is now clear to me, thanks to you.

Using this information I'll be able to make a BigArray class that holds a large amount of data without worrying that the program uses 3x more memory than needed. :D
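
For illustration, a minimal sketch of what such a helper might look like, built on the same length/assumeSafeAppend trick (the BigArray name and its interface are hypothetical, not code from this thread):

struct BigArray
{
    private ubyte[] buf;

    // Resize to the requested size, reusing the existing GC block whenever
    // its capacity suffices instead of allocating a fresh one.
    void resize(size_t size)
    {
        buf.length = 0;
        buf.assumeSafeAppend(); // the old contents may be overwritten
        buf.length = size;      // grows in place if the capacity allows it
    }

    ubyte[] data() { return buf; }
}

Each iteration would call resize() and then fill data(); after the first allocation the same block keeps being reused as long as the requested size fits its capacity.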

Thanks to everyone,
such a helpful community you have here!
May 26, 2017
On 05/26/2017 10:15 AM, realhet wrote:
> But hey, the GC knows that is should not search for any pointers in those large blocks.
> And the buffer is full of 0-s at the start, so there can't be any 'false pointers' in it. And I think the GC will not search in it either.

The issue is not that the block contains a false pointer, but that there's a false pointer elsewhere that points into the block. The bigger the block, the more likely it is that something (e.g. an int on the stack) is mistaken for a pointer into it.
May 26, 2017
On Friday, 26 May 2017 at 14:05:34 UTC, ag0aep6g wrote:
> On 05/26/2017 10:15 AM, realhet wrote:
>> But hey, the GC knows that is should not search for any pointers in those large blocks.
>> And the buffer is full of 0-s at the start, so there can't be any 'false pointers' in it. And I think the GC will not search in it either.
>
> The issue is not that the block contains a false pointer, but that there's a false pointer elsewhere that points into the block. The bigger the block, the more likely it is that something (e.g. an int on the stack) is mistaken for a pointer into it.

Wow, if that is the case, then the GC has some real issues. The GC should be informed about all pointers, and an int is not a pointer.
May 26, 2017
On Friday, 26 May 2017 at 18:06:42 UTC, Mike B Johnson wrote:
> On Friday, 26 May 2017 at 14:05:34 UTC, ag0aep6g wrote:
>> On 05/26/2017 10:15 AM, realhet wrote:
>>> But hey, the GC knows that is should not search for any pointers in those large blocks.
>>> And the buffer is full of 0-s at the start, so there can't be any 'false pointers' in it. And I think the GC will not search in it either.
>>
>> The issue is not that the block contains a false pointer, but that there's a false pointer elsewhere that points into the block. The bigger the block, the more likely it is that something (e.g. an int on the stack) is mistaken for a pointer into it.
>
> Wow, if that is the case then the GC has some real issues. The GC should be informed about all pointers and an int is not a pointer.

What is a pointer if not an int? :)

That is not an issue. The GC holds off releasing memory if there's even a suspicion that someone might be holding on to it. In most programs, int values are small and pointers are big, so there's not much overlap. Accidents do happen occasionally, but it's better to have a system that is too cautious than one that ruins your data.

Working with huge memory chunks isn't really the GC's domain, though.