> Well, impossible is a rather strong word, considering that the win32 auto-tester seems to be doing it's job successfully.

It is impossible on my system. It will soon be impossible on all systems.
From what I've seen, the autotester is not "doing its job successfully". It's been failing intermittently over the past week. I believe it is right on the edge, right now. It only takes a very small change to the compiler to push it over.
Depending on the system you're on, it can go off the cliff a bit earlier, but it's still inevitable.
There's nothing new here, in the sense that it's been getting steadily worse for years. We've had to split the Win32 build multiple times already. That workaround no longer works.

> There's approximately 10M of source for druntime+phobos and some how it can't fit in a couple _gigs_ of ram?  And looking at std.algorithm, that's a mere 342k of source code.  Yes, it grows, but 4 orders of magnitude?


Yeah. 4 orders of magnitude. Our codebase at Sociomantic is a bit larger than Phobos + druntime, but it compiles in just a few seconds.
The problem is that, because of templates, memory consumption isn't linear in source size.

dmd -unittest -o- std/algorithm

instantiates 344150 templates. Yes, 344K: more than a third of a million, and more than the number of lines of source in the module.
And yet there are only 1305 asserts in that module -- the tests are not particularly comprehensive.

Improved memory management would not change the number of template instantiations, or even how much memory gets allocated; it would only change how well the compiler collects and reclaims that memory.

The irony is that I ran into this problem while trying to reduce the memory usage of CTFE.

And I'm stuck. The autotester reports passes on all systems except win32. On Win32, after five minutes, it reports:
unittest
std.exception.ErrnoException@std\stdio.d(1390):  (No error)
----------------
0x004AA84F in pure @safe int std.exception.errnoEnforce!(int, "std\stdio.d", 1390u).errnoEnforce(int, lazy immutable(char)[])
0x00A9E8DE in void std.stdio.File.writeln!(immutable(char)[], immutable(uint)).writeln(immutable(char)[], immutable(uint))
0x00AB8918 in void std.parallelism.__modtest()
0x00B064BB in int rt.minfo.moduleinfos_apply(scope int delegate(ref object.ModuleInfo*)).int __foreachbody555(ref rt.sections_win32.SectionGroup)
0x00B03665 in _d_run_main
0x00AF8AB8 in main
0x76BFD2E9 in BaseThreadInitThunk
0x76E91603 in RtlInitializeExceptionChain
0x76E915D6 in RtlInitializeExceptionChain
'qwertyuiop09813478' is not recognized as an internal or external command,
operable program or batch file.
and I can't even compile Phobos on my win32 system, so I can't reduce this.
For me, this is a blocker. I really don't know what to do.


@David:
> […] Every import has a cost, and that cost is far from zero.
> Are imports really the primary cause for this? Might want to (double)check this before jumping to conclusions.

No. The imports are the cause of the "big ball of mud" design of Phobos, but they're not the cause of the memory usage. It's the sheer number of template instantiations that seems to be the primary problem.



On 8 June 2013 20:32, Brad Roberts <braddr@puremagic.com> wrote:
Well, impossible is a rather strong word, considering that the win32 auto-tester seems to be doing it's job successfully.

I _do_ consider it a compiler memory management issue.  There's no reason that the entirety of phobos (or pretty much any app) ought to be compilable in one shot.  There's approximately 10M of source for druntime+phobos and some how it can't fit in a couple _gigs_ of ram?  And looking at std.algorithm, that's a mere 342k of source code.  Yes, it grows, but 4 orders of magnitude?

As to what to do, how much memory do you have and are you using the snn.lib update that Walter released a few years ago that fixed the memory allocator in it to not suck so bad?

  md5: 9357508e541067ea34056dade4381dd8 dmc/dm/lib/snn.lib


On 6/8/13 11:12 AM, Don Clugston wrote:
With win32, I can no longer run unittests:
------------------
dmd -O -w -d -property -L/co -c -unittest -ofunittest5.obj std\algorithm.d
Error: out of memory
------------------
This has happened many times before, and we dealt with it by reducing the number of modules we
compiled into each object file.  We once had 30 modules per obj file. Then fifteen. Then five. But
now we're at one, that workaround can no longer be used.
The idea that we can continue to throw billions of templates and imports into every Phobos module,
is leading us towards catastrophe.

Sure, you can say, the compiler should improve its memory management. But I don't think that's
really the problem. If it was better, the compiler might not run out of memory, but it would run
unusably slowly. I think most people have no idea of just how many templates the compiler is being
asked to instantiate.
Every import has a cost, and that cost is far from zero.

I'm stuck, and I don't know what to do.


_______________________________________________
dmd-internals mailing list
dmd-internals@puremagic.com
http://lists.puremagic.com/mailman/listinfo/dmd-internals

