December 10, 2012
On Friday, 7 December 2012 at 18:44:08 UTC, Jonathan M Davis wrote:
> On Friday, December 07, 2012 18:18:43 monarch_dodra wrote:
>> I had actually been through this before, and someone told me
>> about that. The problem at this point is that this isn't even an
>> option anymore, since std/algorithm.d is in a group *alone*.
>
> Then I'd have two suggestions then:
>
> [SNIP]
>
> - Jonathan M Davis

I just tried splitting the unittests (for win32) into two different object files, by wrapping each unittest in a version block. This worked relatively well, up until the final link step, where I was greeted with:

//----
unittest5b.obj(unittest5b)  Offset 00D68H Record Type 0091
 Error 1: Previous Definition Different : _D3std9algorithm6EditOp6__initZ
unittest5b.obj(unittest5b)  Offset 685C4H Record Type 0091
 Error 1: Previous Definition Different : _D3std9algorithm12__ModuleInfoZ
unittest5b.obj(unittest5b)  Offset 7ACEBH Record Type 00C3
 Error 1: Previous Definition Different : __D3std9algorithm9__modtestFZv
unittest5b.obj(unittest5b)  Offset 7AFB4H Record Type 00C3
 Error 1: Previous Definition Different : _D3std9algorithm7__arrayZ
unittest5b.obj(unittest5b)  Offset 7AFE0H Record Type 00C3
 Error 1: Previous Definition Different : _D3std9algorithm8__assertFiZv
unittest5b.obj(unittest5b)  Offset 7B00CH Record Type 00C3
 Error 1: Previous Definition Different : _D3std9algorithm15__unittest_failFiZv
//----

The one I'm *really* concerned about is "ModuleInfo": My guess is that I'll never get rid of this error :/
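
For reference, the split looks roughly like this; the version identifiers and output names below are placeholders, not the exact ones I used. Each unittest in std/algorithm.d gets wrapped in one of two version blocks, and the module is compiled twice, once per identifier, to produce the two object files:

//----
// Compiled twice with different -version flags, e.g.:
//   dmd -c -unittest -version=AlgoTestsA std\algorithm.d -ofunittest5a.obj
//   dmd -c -unittest -version=AlgoTestsB std\algorithm.d -ofunittest5b.obj

version (AlgoTestsA) unittest
{
    // first half of the tests
}

version (AlgoTestsB) unittest
{
    // second half of the tests
}
//----

Presumably the duplicate ModuleInfo (and friends) comes from the module being compiled into both object files.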

I figure the "easy workaround", it to create a new dedicated executable, which tests just algorithm...?

I don't think deactivating unit tests is a great move anyways...
December 10, 2012
On 12/7/2012 2:51 PM, deadalnix wrote:
> I'm working on a program that now requires more than 2.5Gb of RAM to compile,
> where separate compilation is not possible due to bug 8997 and that randomly
> fails to compile due to bug 8596. It is NOT fast and that insane memory
> consumption is a major cause of slowness.

I'm pretty sure the memory consumption happens with CTFE and Don is working on it.

December 10, 2012
On Monday, December 10, 2012 13:34:19 monarch_dodra wrote:
> On Friday, 7 December 2012 at 18:44:08 UTC, Jonathan M Davis
> 
> wrote:
> > On Friday, December 07, 2012 18:18:43 monarch_dodra wrote:
> >> I had actually been through this before, and someone told me
> >> about that. The problem at this point is that this isn't even
> >> an
> >> option anymore, since std/algorithm.d is in a group *alone*.
> > 
> > Then I'd have two suggestions then:
> > 
> > [SNIP]
> > 
> > - Jonathan M Davis
> 
> I just tried to version each unittest (for win32) into two
> different objects (using version blocks). This worked relatively
> well, up until the final link step, where I was greeted with:
[snip]

Different versions of the same module have to be built separately; they can't all be in the same build. The Windows build is one executable, and changing that has been rejected (for some good reasons, though it does cause problems here), so versioning the tests means that some of them won't be run as part of the normal unittest build. They'll have to be run manually, or will only be run on other OSes which can handle the memory consumption. For quite a while, a lot of std.datetime's unit tests were just outright disabled on Windows, because the Windows version of dmd couldn't handle them. It sounds like we're going to have to do the same with some of std.algorithm's unit tests until dmd's issues can be sorted out.
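
Concretely, a version-gated test looks something like this (the identifier name here is just an example, not an established convention); it only gets compiled when the corresponding flag is passed, e.g. dmd -unittest -version=ExtraAlgorithmTests:

//----
version (ExtraAlgorithmTests) unittest
{
    // skipped by the normal -unittest build; only compiled when
    // -version=ExtraAlgorithmTests is passed explicitly
    import std.algorithm;
    assert(canFind([1, 2, 3], 2));
}
//----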

- Jonathan M Davis
December 10, 2012
On Monday, 10 December 2012 at 12:46:19 UTC, Walter Bright wrote:
> On 12/7/2012 2:51 PM, deadalnix wrote:
>> I'm working on a program that now requires more than 2.5Gb of RAM to compile,
>> where separate compilation is not possible due to bug 8997 and that randomly
>> fails to compile due to bug 8596. It is NOT fast and that insane memory
>> consumption is a major cause of slowness.
>
> I'm pretty sure the memory consumption happens with CTFE and Don is working on it.

I don't have a lot of CTFE in there, but surely some.
December 11, 2012
On 12/10/12 06:45, Walter Bright wrote:
> On 12/7/2012 2:51 PM, deadalnix wrote:
>> I'm working on a program that now requires more than 2.5Gb of RAM to
>> compile,
>> where separate compilation is not possible due to bug 8997 and that
>> randomly
>> fails to compile due to bug 8596. It is NOT fast and that insane memory
>> consumption is a major cause of slowness.
> 
> I'm pretty sure the memory consumption happens with CTFE and Don is working on it.
> 
The following quote:

  it also gives very detailed information that indicates which parts of
  your program are responsible for allocating the heap memory.


from here:

  http://valgrind.org/docs/manual/ms-manual.html

suggests massif might be of some help in narrowing down the cause.
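
For example, something along these lines (the dmd invocation is just illustrative):

  valgrind --tool=massif dmd -c -unittest std/algorithm.d
  ms_print massif.out.<pid>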


December 11, 2012
On Friday, December 07, 2012 15:23:36 monarch_dodra wrote:
> In particular, when compiling "-unittest std\algorithm.d", dmd uses *nearly* 1 GB (it uses about 1,051,176K on my machine).
> 
> Problem is that when it reaches 1GB, it crashes. I have a pull request which adds a few unittests to algorithm, and it is consistently crashing on win32 with an out of memory error.
> 
> In layman's terms: std\algorithm.d is full. You literally can't add any more unittests to it, without crashing dmd on win32.
> 
> I'd have recommended splitting the unittests into sub-modules or whatnot, but come to think of it, I'm actually more concerned that a module could *singlehandedly* make the compiler crash with an out-of-memory error...
> 
> Also, I'm no expert, but why is my dmd limited to 1 GB memory on my 64 bit machine...?

Don just made some changes to Tuple so that it no longer uses std.metastrings.Format (which is _insanely_ inefficient), which may fix the memory problem that you were running into. Obviously, dmd still has issues, but a major one in the library was fixed. We should really look at killing std.metastrings entirely, because it's too inefficient to use reasonably, and the new format should now work with CTFE, unlike the old one (which is why Format exists in the first place).
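
If the new format does work in CTFE as described, the kind of compile-time string building that std.metastrings.Format was written for can be done directly with it. A minimal sketch, assuming std.string picks up the new, CTFE-able format:

//----
import std.string : format;

// evaluated at compile time because of the enum initializer
enum decl = format("int _field_%s;", 0);
static assert(decl == "int _field_0;");
//----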

- Jonathan M Davis
December 11, 2012
On Tuesday, 11 December 2012 at 04:30:38 UTC, evansl wrote:
> On 12/10/12 06:45, Walter Bright wrote:
>> On 12/7/2012 2:51 PM, deadalnix wrote:
>>> I'm working on a program that now requires more than 2.5Gb of RAM to
>>> compile,
>>> where separate compilation is not possible due to bug 8997 and that
>>> randomly
>>> fails to compile due to bug 8596. It is NOT fast and that insane memory
>>> consumption is a major cause of slowness.
>> 
>> I'm pretty sure the memory consumption happens with CTFE and Don is
>> working on it.
>> 
> The following quote:
>
>   it also gives very detailed information that indicates which parts of
>   your program are responsible for allocating the heap memory.
>
>
> from here:
>
>   http://valgrind.org/docs/manual/ms-manual.html
>
> suggests massif might be some help in narrowing down the cause.

The problem with valgrind is that it increases the memory consumption of the program quite a lot.
December 11, 2012
On 12/7/2012 6:23 AM, monarch_dodra wrote:
> Also, I'm no expert, but why is my dmd limited to 1 GB memory on my 64 bit
> machine...?

The latest beta I uploaded increases the limit to 2 GB (thanks to a patch by Rainer Schuetze).
January 14, 2013
On Tuesday, 11 December 2012 at 22:11:55 UTC, Walter Bright wrote:
> On 12/7/2012 6:23 AM, monarch_dodra wrote:
>> Also, I'm no expert, but why is my dmd limited to 1 GB memory on my 64 bit
>> machine...?
>
> The latest beta I uploaded increases the limit to 2 GB (thanks to a patch by Rainer Schuetze).

Is this patch in the main github release, or is there something special to change in the DMD makefile?

I'm still having trouble, and am now having to deactivate some of algorithm's unittests just to compile it, even without any changes.

Any idea what I'm doing wrong?
January 14, 2013
On 1/14/2013 1:35 PM, monarch_dodra wrote:
> On Tuesday, 11 December 2012 at 22:11:55 UTC, Walter Bright wrote:
>> On 12/7/2012 6:23 AM, monarch_dodra wrote:
>>> Also, I'm no expert, but why is my dmd limited to 1 GB memory on my 64 bit
>>> machine...?
>>
>> The latest beta I uploaded increases the limit to 2 GB (thanks to a patch by
>> Rainer Schuetze).
>
> Is this patch in the main github release, or is there something special to
> change in the DMD makefile?
>
> I'm still having trouble, and am now having to deactivate some of algorithm's
> unittest to compile it, even without any changes.
>
> Any idea what I'm doing wrong?

Durn, I don't remember what the patch was.