November 20, 2012
On Tuesday, 20 November 2012 at 09:53:31 UTC, Walter Bright wrote:
> On 11/19/2012 11:17 PM, Jacob Carlborg wrote:
>> On 2012-11-20 04:01, Walter Bright wrote:
>>
>>> I know. I just pointed this out as I suspect this will not improve
>>> compile times more than precompiled headers do.
>>
>> The compiler will compile the header and create some kind of map file from it.
>> This map file will be cached and later used during the compilation process. I
>> don't know how this compares to precompiled headers.
>
> It's really what precompiled headers are.

Except there is no standard way of doing it.

The groundwork being laid by Clang as the basis for C++ modules is a way to use map files as a transition into a full module system.

It is to be expected that if C++17 gets modules, and C++ still matters by that time, header and map files could be deprecated and eventually ditched in the following standard.

Personally, I hope that by 2017 we already have something much better in place, like D. :)

--
Paulo
November 20, 2012
 I'd expect it to outperform precompiled headers. With PCH you can only include certain headers that you rarely change, since they all get packaged up into a single blob (slow). With this new module approach you effectively get PCH for each module separately, so even frequently changed files will benefit.

 And no more dicking around with PCH and having to include 'stdafx.h' everywhere, etc.
November 20, 2012
On 11/20/2012 4:33 AM, Paulo Pinto wrote:
> On Tuesday, 20 November 2012 at 09:53:31 UTC, Walter Bright wrote:
>> On 11/19/2012 11:17 PM, Jacob Carlborg wrote:
>>> On 2012-11-20 04:01, Walter Bright wrote:
>>>
>>>> I know. I just pointed this out as I suspect this will not improve
>>>> compile times more than precompiled headers do.
>>>
>>> The compiler will compile the header and create some kind of map file from it.
>>> This map file will be cached and later used during the compilation process. I
>>> don't know how this compares to precompiled headers.
>>
>> It's really what precompiled headers are.
>
> Except there is no standard way of doing it.

Exactly, hence my comment about it "legitimizing" them.


> The groundwork being laid by Clang as the basis for C++ modules is a way to
> use map files as a transition into a full module system.
>
> It is to be expected that if C++17 gets modules, and C++ still matters by that
> time, header and map files could be deprecated and eventually ditched in the
> following standard.
>
> Personally, I hope that by 2017 we already have something much better in place,
> like D. :)

Since people already use precompiled headers with C++, I don't think this change has much chance of making it compile faster.

November 20, 2012
Am 20.11.2012 21:57, schrieb Walter Bright:
>
> Since people already use precompiled headers with C++, I don't think
> this change has much chance of making it compile faster.
>

Is it really so?

I would expect that with proper modules, C++ compilers could achieve compile times similar to what other module-based languages offer, especially if templates are also stored in a module-friendly format.

But then again I lack enough compiler development experience to be able to judge that.

Assuming you're right, then C++ is really a lost cause, and the current trend of standards might follow what happened to Extended ISO Pascal, which vendors ignored in favour of Turbo Pascal as the de facto standard.

--
Paulo
November 20, 2012
On Tuesday, November 20, 2012 23:32:47 Paulo Pinto wrote:
> Am 20.11.2012 21:57, schrieb Walter Bright:
> > Since people already use precompiled headers with C++, I don't think this change has much chance of making it compile faster.
> 
> Is it really so?
> 
> I would expect that with proper modules, C++ compilers could achieve compile times similar to what other module-based languages offer, especially if templates are also stored in a module-friendly format.
> 
> But then again I lack enough compiler development experience to be able to judge that.
> 
> Assuming you're right, then C++ is really a lost cause, and the current trend of standards might follow what happened to Extended ISO Pascal, which vendors ignored in favour of Turbo Pascal as the de facto standard.

You should read this:

http://www.drdobbs.com/cpp/c-compilation-speed/228701711

It's an article by Walter explaining why C++ compilation speeds are so slow. Pre-compiled headers help in some circumstances, but in others they can't (because recompilation is required due to different preprocessor macros or whatnot). And there are issues intrinsic to the language which make compilation slower even if you were able to compile each file only once. C/C++ are just plain badly designed when it comes to compilation speed. Textual inclusion is a horrible idea in that regard (though it may have been required at the time those languages were created, due to the memory constraints of the systems of the day).

So, while smart people may be able to make some improvements to C++ to shorten compilation times (and we all hope that they succeed), they can never entirely fix them. For that, among other things, you'd need a language which didn't use textual inclusion, and those sorts of changes would be too big for C/C++ at this point. I think that that's on the list of changes which are pointless to make to C/C++, because if you were going to break backwards compatibility on that level, you might as well just create a new language. The challenge that the C/C++ folks have is improving it without breaking backwards compatibility, and that's incredibly constraining, since so many of C/C++'s problems are very intrinsic to how they're designed.

- Jonathan M Davis
November 20, 2012
On 11/20/2012 2:45 PM, Jonathan M Davis wrote:
> http://www.drdobbs.com/cpp/c-compilation-speed/228701711

One thing I neglected to mention is that template instantiation is terribly, fundamentally, slow. This problem comes to the fore when templates are used to implement computations, such as doing a compile-time factorial using a template.

No amount of module design can possibly fix that.

D fixes it by using CTFE for compile-time computations. (You can still do it with templates, but you'll be sorry.) CTFE is still slow, but it's a hundred times (made-up number) faster than using templates.
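
To make the difference concrete, here is a minimal D sketch (an illustrative example with hypothetical names, not something from the post above): a factorial computed by recursive template instantiation versus the same computation as an ordinary function evaluated through CTFE.

// Template recursion: every distinct n creates a new template
// instantiation that the compiler must build and track.
template Factorial(ulong n)
{
    static if (n <= 1)
        enum Factorial = 1UL;
    else
        enum Factorial = n * Factorial!(n - 1);
}

// CTFE: an ordinary function, evaluated at compile time when its
// result is needed in a constant context.
ulong factorial(ulong n)
{
    ulong result = 1;
    foreach (i; 2 .. n + 1)
        result *= i;
    return result;
}

enum viaTemplate = Factorial!20;   // 20 template instantiations
enum viaCtfe     = factorial(20);  // one CTFE evaluation
static assert(viaTemplate == viaCtfe);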
November 21, 2012
Jonathan M Davis, on 20 November at 14:45, wrote:
> On Tuesday, November 20, 2012 23:32:47 Paulo Pinto wrote:
> > Am 20.11.2012 21:57, schrieb Walter Bright:
> > > Since people already use precompiled headers with C++, I don't think this change has much chance of making it compile faster.
> > 
> > Is it really so?
> > 
> > I would expect that with proper modules, C++ compilers could achieve compile times similar to what other module-based languages offer, especially if templates are also stored in a module-friendly format.
> > 
> > But then again I lack enough compiler development experience to be able to judge that.
> > 
> > Assuming you're right, then C++ is really a lost cause, and the current trend of standards might follow what happened to Extended ISO Pascal, which vendors ignored in favour of Turbo Pascal as the de facto standard.
> 
> You should read this:
> 
> http://www.drdobbs.com/cpp/c-compilation-speed/228701711
> 
> It's an article by Walter explaining why C++ compilation speeds are so slow. Pre-compiled headers would help in some circumstances, but in others, they can't (because recompilation is required due to different preprocessor macros or whatnot).

Did you ever bother to read those slides?!?!? You keep talking about problems with pre-compiled headers, and what Doug Gregor is suggesting is NOT pre-compiled headers. Those are already in clang AFAIK.

What he is proposing is a real module system; macros will not be re-evaluated inside modules. The symbols being global has nothing to do with this being pre-compiled headers.

Will this solve all the problems of C++ and make its compile times blazingly fast? Probably not, but it will surely help, not only by avoiding reading the same header over and over, but also by saving memory. But one thing is certain: THIS IS NOT PRE-COMPILED HEADERS (he even mentions pre-compiled headers in the slides).

For f*ck's sake... Please, stop this misinformation madness.

Thanks :)

-- 
November 21, 2012
Walter Bright:

> CTFE is still slow,

I don't agree. What's run at compile time is statically typed code that's easy to optimize, so with techniques simpler than the ones used by LuaJIT or the JavaScript V8 JIT (which work on dynamically typed code that requires a lot of work to optimize well), CTFE could become almost as fast as regular D code (such JITs compile only when a computation/loop takes a large enough amount of time; otherwise they interpret efficiently).

Bye,
bearophile
November 21, 2012
On 11/20/2012 3:51 PM, Leandro Lucarella wrote:
> Did you ever bother to read those slides?!?!? You keep talking about
> problems with pre-compiled headers, and what Doug Gregor is suggesting is NOT
> pre-compiled headers. Those are already in clang AFAIK.
>
> What he is proposing is a real module system; macros will not be re-evaluated
> inside modules. The symbols being global has nothing to do with this being
> pre-compiled headers.


Modules *are* a form of precompiled headers.


> Will this solve all the problems of C++ and make its compile times blazingly
> fast? Probably not, but it will surely help, not only by avoiding reading the
> same header over and over, but also by saving memory. But one thing is certain:
> THIS IS NOT PRE-COMPILED HEADERS (he even mentions pre-compiled headers in the
> slides).
>
> For f*ck's sake... Please, stop this misinformation madness.

Precompiled headers are:

1. compile a bunch of .h files into a symbol table
2. cache the symbol table (in memory or on disk)
3. read the symbol table instead of reparsing the .h files

Modules are:

1. compile a bunch of .h files into a symbol table
2. cache the symbol table (in memory or on disk)
3. read the symbol table instead of reparsing the .h files

Yes, I understand that there are semantic differences, and many differences in detail. C++11 does not support precompiled headers; all PCH implementations are a kludge and are not standard compliant. The module proposal "legitimizes" them, i.e., changes the standard so that PCH can be compliant. Yes, additional goodies are added, like a separate scope for macros, an explicit syntax for them, etc.

The speed improvement should be comparable to what can be achieved with a good PCH system.
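
For concreteness, here is a small D sketch of that shared mechanism (hypothetical names only; a real compiler stores full semantic information, not lines of text): each header or module is compiled into a symbol table once, cached by path, and later imports read the cached table instead of reparsing.

import std.file : readText;
import std.string : splitLines;

struct SymbolTable
{
    string[] symbols;   // stand-in for real declarations
}

SymbolTable[string] cache;   // path -> compiled symbol table

SymbolTable importHeader(string path)
{
    if (auto cached = path in cache)
        return *cached;                    // 3. reuse instead of reparsing

    auto table = SymbolTable(readText(path).splitLines());  // 1. "compile"
    cache[path] = table;                   // 2. cache the symbol table
    return table;
}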

Is this module proposal an improvement? I'd say yes. Is it going to solve C++'s compile speed problems? I doubt it. Is it a "true" module system? I don't know about that, but it doesn't address things like name collisions between symbols imported from different modules, at least not from what I saw in the slides.

November 21, 2012
On Monday, 19 November 2012 at 20:43:21 UTC, Walter Bright wrote:
> On 11/17/2012 3:30 AM, bearophile wrote:
>> http://llvm.org/devmtg/2012-11/Gregor-Modules.pdf
>
> One thing to note what it doesn't do - it doesn't produce a "module" scope. As far as I can tell, the symbols in imported modules all go into the global scope.
>
> It seems to be mainly a way of legitimizing precompiled headers.

On slide #39 there is `public:`. So I guess they are considering module scope, at least in future versions of the module system.
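
For reference, this is roughly what module scope means on the D side (a runnable sketch, not something from the slides): with `static import`, imported symbols stay behind their module name instead of being dumped into the importer's scope.

static import std.stdio;

void main()
{
    // writeln("hello");  // error: writeln is not in scope without the prefix
    std.stdio.writeln("symbols stay in module scope, not the global scope");
}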