September 17, 2011
On 09/17/2011 12:55 PM, Peter Alexander wrote:
> On 17/09/11 4:28 PM, Andrei Alexandrescu wrote:
>> On 9/17/11 10:08 AM, Peter Alexander wrote:
>>> On 17/09/11 6:53 AM, Nick Sabalausky wrote:
>>>> And then there's the enormous savings in build times alone. Full
>>>> recompiles
>>>> of AAA C++ games are known to take upwards of a full day (not sure
>>>> whether
>>>> that's using a compile farm, but even if it is, D could still cut
>>>> down on
>>>> compile farm expenses, or possibly even the need for one).
>>>
>>> This is false. You can easily build several million lines of code in
>>> several minutes using unity files and distributed building. There need
>>> not be any build farm expenses, the build machines can just be
>>> everyone's dev machines.
>>
>> Then Facebook would love your application (I'm not kidding; send me
>> private email if interested). We have a dedicated team of bright
>> engineers who worked valiantly on this (I also collaborated), and a
>> dedicated server farm. Compile times are still a huge bottleneck.
>>
>> Andrei
>
> It's not my application.

I meant employment application.

> We use Incredibuild: http://www.xoreax.com/

Is it available on Linux?

> And I'm sure you know what unity builds are. For those that don't:
> http://buffered.io/2007/12/10/the-magic-of-unity-builds/

That thing where you concatenate everything to be compiled, right? We don't do that, although the technique is known. I'll ask why. Off the top of my head, incremental compilation becomes difficult. I also wonder how the whole thing can be distributed if it's all in one file.


Thanks,

Andrei
September 17, 2011
On Sat, 17 Sep 2011 20:55:38 +0300, Peter Alexander <peter.alexander.au@gmail.com> wrote:

> And I'm sure you know what unity builds are. For those that don't: http://buffered.io/2007/12/10/the-magic-of-unity-builds/

I've always wondered whether the overhead that unity builds are supposed to reduce comes mainly from build tools forcing the OS to flush intermediate files to disk. Has anyone compared the performance advantages of unity builds versus building everything on a RAM drive?

-- 
Best regards,
 Vladimir                            mailto:vladimir@thecybershadow.net
September 17, 2011
On 17/09/11 8:52 PM, Andrei Alexandrescu wrote:
> On 09/17/2011 12:55 PM, Peter Alexander wrote:
>> On 17/09/11 4:28 PM, Andrei Alexandrescu wrote:
>>> On 9/17/11 10:08 AM, Peter Alexander wrote:
>>>> On 17/09/11 6:53 AM, Nick Sabalausky wrote:
>>>>> And then there's the enormous savings in build times alone. Full
>>>>> recompiles
>>>>> of AAA C++ games are known to take upwards of a full day (not sure
>>>>> whether
>>>>> that's using a compile farm, but even if it is, D could still cut
>>>>> down on
>>>>> compile farm expenses, or possibly even the need for one).
>>>>
>>>> This is false. You can easily build several million lines of code in
>>>> several minutes using unity files and distributed building. There need
>>>> not be any build farm expenses, the build machines can just be
>>>> everyone's dev machines.
>>>
>>> Then Facebook would love your application (I'm not kidding; send me
>>> private email if interested). We have a dedicated team of bright
>>> engineers who worked valiantly on this (I also collaborated), and a
>>> dedicated server farm. Compile times are still a huge bottleneck.
>>>
>>> Andrei
>>
>> It's not my application.
>
> I meant employment application.
>
>> We use Incredibuild: http://www.xoreax.com/
>
> Is it available on Linux?

From what I can tell from the website, it's Windows only.


>> And I'm sure you know what unity builds are. For those that don't:
>> http://buffered.io/2007/12/10/the-magic-of-unity-builds/
>
> That thing when you concatenate everything to be compiled, right? We
> don't do that although the technique is known. I'll ask why. Off the top
> of my head, incremental compilation is difficult. I also wonder how the
> whole thing can be distributed if it's all in one file.

You don't need to concatenate everything into a single file. Just put maybe 50 source files per unity file (group ones that #include common headers) and then compile all the unity files separately. That way you can distribute individual unity files, and it also means that you don't have to rebuild the entire solution when changing a single source file.

It's a bit of a balancing act getting the right number of unity files.

September 17, 2011
On 17/09/11 8:59 PM, Vladimir Panteleev wrote:
> On Sat, 17 Sep 2011 20:55:38 +0300, Peter Alexander
> <peter.alexander.au@gmail.com> wrote:
>
>> And I'm sure you know what unity builds are. For those that don't:
>> http://buffered.io/2007/12/10/the-magic-of-unity-builds/
>
> I've always wondered if the overhead that unity builds are supposed to
> reduce was mainly because of all the build tools which force the OS to
> flush intermediate files to disk. Has anyone compared the performance
> advantages of using unity builds versus building everything on a RAM drive?

I have no idea if anyone has done that comparison.

That's certainly not the only advantage though. With the single unity file approach, each header only needs to be parsed once. It also means that unique template instantiations are only emitted once, making things easier for the linker. I once measured that it takes GCC 30ms on my laptop to emit a std::sort of an int array.
September 17, 2011
On 17/09/11 7:52 PM, Marco Leise wrote:
> Am 17.09.2011, 20:01 Uhr, schrieb Adam D. Ruppe
> <destructionator@gmail.com>:
>
>> Ah, that explains it. I usually don't use the -O switch.
>
> During development when you recompile several times an hour, you really
> don't need -O either. For my hobby projects I usually set up the IDE
> with a compile command without -O and place a Makefile in the directory
> that has optimizations enabled. Sure, there are times when you run
> performance tests, but they aren't the usual case, so I think it is fair
> to compare compiles without optimizations in this context.

I suppose that's true for most people, but not for game developers. When testing changes, you need the game to be running at interactive framerates, and it's very difficult to achieve that in a debug build, so we generally always run optimized, and just use MSVC++'s #pragma optimize("", off) directive to unoptimize specific sections of code for debugging.

To be fair, my hobby project still runs fast without optimizations, but I definitely need to repeatedly compile with optimizations on when performance tuning.
September 17, 2011
On Sep 16, 2011, at 11:24 PM, Nick Sabalausky wrote:
> 
> Conclusion: High schools specifically cultivate sheeple, which is a quality preferred by "respectable" colleges.

Depends on the college and even on the professor, though it's obviously difficult for universities with large class sizes to accommodate any form of teaching and grading other than basic regurgitation of facts, particularly in entry-level courses where the point is to gain a basic knowledge of the field.
September 17, 2011
On Sep 17, 2011, at 8:08 AM, Peter Alexander wrote:

> On 17/09/11 6:53 AM, Nick Sabalausky wrote:
> 
>> And then there's the enormous savings in build times alone. Full recompiles of AAA C++ games are known to take upwards of a full day (not sure whether that's using a compile farm, but even if it is, D could still cut down on compile farm expenses, or possibly even the need for one).
> 
> This is false. You can easily build several million lines of code in several minutes using unity files and distributed building. There need not be any build farm expenses, the build machines can just be everyone's dev machines.

Linking can still be pretty time consuming.  My last big project was several million LOC broken into a tree of projects that all eventually produced a handful of large applications.  This was originally a SPARC Solaris app, so we couldn't spread the build across PCs, but rather built in parallel on big fancy machines.  When I started, a full build took the better part of a work day, and before I left a full build was perhaps 30 minutes (my memory is a bit fuzzy here, but that sounds like a reasonable ballpark).  The average level of parallelism was 10-20 cores working on the build using make -j.  I suppose it's worth mentioning that building on Opteron was significantly faster, even using fewer cores.  The code used almost no templates, which is a significant factor in total compile time.

> In contrast, my D hobby project at only a few thousand lines of code already takes 11s to build and doesn't do any fancy metaprogramming or use CTFE. I am unaware of any distributed, incremental build systems for D, so I see no particular speed advantage to using D (certainly not orders of magnitude anyway).

My current project builds in 15 minutes on its current, ancient build machine at work.  Written in C.  A full build on my PC is under 5 minutes.  Obviously, compile time isn't the only reason to choose D over some other language though.

>> I'm sure there are smaller reasons too, but I'm convinced the primary reason why AAA game dev is C++ instead of D is ultimately because of inertia, not the languages themselves, or even the tools (If the AAA game dev industry genuinely wanted to be using D, you can bet that any tools they needed would get made).
> 
> Tools are not free. Don't assume just because a company is large that it has unlimited funds. Creating tools, converting libraries all take lots of time and money that have to be justified.
> 
> I work at a very large game studio and I can assure you that I would *never* be able to justify using D for a project. Even if all our code magically transformed into D, and all our programmers knew D, I still wouldn't be able to justify the creation of all the necessary tools and dev systems to do something that we can already do.

It's absolutely about the toolchain.  I'd be curious to hear from anyone who has tried developing for the Xbox 360 using D, or any of the big 3 consoles, really.
September 17, 2011
On Sep 16, 2011, at 10:28 PM, Josh Simmons wrote:

> On Sat, Sep 17, 2011 at 2:55 PM, Sean Kelly <sean@invisibleduck.org> wrote:
>> On Sep 16, 2011, at 7:09 PM, Xavier wrote:
>> 
>>> Peter Alexander wrote:
>>>> I recently stumbled across this (old) blog post:
>>>> http://prog21.dadgum.com/13.html
>>>> 
>>>> In summary, the author asks if you were offered $100,000,000 for some big software project,
>>> 
>>> While this is a "silly little hypothetical thread" (and it is Friday after all, so that probably explains the OP), I cannot fathom that amount being spent on just software on one project (though I've worked on one system, i.e., software + hardware, project worth tens of millions). Maybe someone here can? Examples please, or give the largest one you can think of (it can be hypothetical). Remember, it's just software, not a system.
>> 
>> Top-tier computer game budgets are tens of millions of dollars.
>> 
> 
> Writing a AAA game in D would mean fixing a whole bunch of D; it's way easier to stick to what's proven.
> 
> You'd have to disable the collector or make it better than every existing one, which in turn means you're not using most of the standard library. This is OK though since AAA games generally don't use standard library stuff anyway. You'd have to fix the codegen too (or maybe develop further ldc or gdc) and build new tools for just about everything.
> 
> So basically sure you could do anything with enough money, but why would you do it the hard way?

I didn't say I would.  That was merely an example of a multi-million dollar software project.
September 18, 2011
"Nick Sabalausky" <a@a.a> wrote in message news:j51mq9$2r1t$1@digitalmars.com...
> "Xavier" <xman@nospam.net> wrote in message news:j51jsp$2lln$2@digitalmars.com...
>>
>> "Nick Sabalausky" <a@a.a> wrote in message news:j51h52$2h0e$1@digitalmars.com...
>>> "Jonathan M Davis" <jmdavisProg@gmx.com> wrote in message news:mailman.2921.1316239886.14074.digitalmars-d@puremagic.com...
>>>> On Saturday, September 17, 2011 01:53:07 Nick Sabalausky wrote:
>>>>> People who are *good* at C++ are hard to find, and even harder to
>>>>> cultivate.
>>>>> And that's never going to change. It's a fundamental limitation of
>>>>> the
>>>>> language (at least until the Vulcans finally introduce themselves
>>>>> to us).
>>>>> But D's a lot easier for people to become good at.
>>>>
>>>> It's a _lot_ easier to find good C++ programmers than good D programmers,
>>>
>>> Oh, definitely. But what I meant was that good D programmers can be cultivated. People can learn to be good at D. And while the same might *technically* be true of C++, the curve is so steep that it may as well be "what's out there is what's out there". It's, more or less, a non-renewable resource.
>>
>> It's not nearly as "steep" as it used to be for C++; the tools, the techniques, the documentation, and the users have matured, and one need not struggle through everything on one's own anymore while learning it, but rather just go look up or ask for the answer, and it is still improving. Sure, if one exploits every "stupid template trick" and similarly with the other language features, then you will have "steep", but it is quite tractable these days if one isn't overzealous and is able to separate all the jabber about "metaprogramming" and the like from the meat of the language. It will always have its warts, but D has many of the same ones.
>>
>
> In other words, "C++ is easy^H^H^H^Hless hard than it used to be, as long as you don't use any of the advanced features that are already trivial in D anyway."

No, but rather that most programmers don't know how to program yet and they think they need those things all the time.

>
>
>>> I realize I've said this other times in the past, but I find that the compiler bugs in DMD are much less severe than the language deficiencies of a fully-bug-free C++ implementation.
>>>
>>
>> That's an interesting, if not odd, statement considering that C++ are more alike than they are different.
>>
>
> I don't understand what you're saying here. Did you mean "D and C++ are more alike than different", or "C++ implementations are more alike than are different". Either way, it doesn't make much sense.

The first one.

>
>
>>> Plus there's the idea of investing in the future to keep in mind. It's like the old quote: "I may be fat, but you're stupid. I can exercise and diet, but stupid will always be stupid."
>>
>> The truth of the matter is, though, that she won't exercise to any significant degree and has been on a diet her whole life and her weight has continually increased. On top of that, the fact that one can study, research and learn escapes the fat dumb blonde bimbo because she indeed is stupid, and that's why her "dieting" causes her to gain weight instead of lose it.
>>
>
> You've just completely broken the analogy because D's bugs *DO* get fixed. And they're getting fixed rather quickly now, too.

To be honest, I was just spouting on, and having fun with, the phrase and not its applicability to anything in this thread.

>
>
>>>  D may have some bugs, but investing the effort to deal with them
>>> will lead to further improvements. Dealing with C++'s problems, OTOH,
>>> will hardly do a damn thing.
>>
>> Again, I find that a curious statement for reason noted. The language names even fit together: C/C++/D. There is no denying that they are all related. Just look at those noses! C'mon!
>>
>
> Umm, yea, they're related. So what? Don't tell me you're trying to imply that just because they're related they're inherently equal in everything but implementation.

Ah, see, now you're backing down. Now you are just trying to prove inequality rather than significant difference.

>
>
>>> Sure, a few things can be mitigated somewhat, such as the C++0x^H^H1x^H^H2x^H^H3x improvements. But in general, investing the effort to deal with C++'s shortcomings won't lead to significant improvements - it *can't*, because it's constrained by its existing legacy design (not that that won't eventually happen to D, too, but D is one generation past C++).
>>
>> One generation away, but still the same family. So what?
>>
>>>  Ie., D may be buggy, but C++ is crappy. Bugs can be fixed, but
>>> crappy will always be crappy.
>>
>> All adolescents conflict with their parents and say things like that. When D grows up, the D++ or E kids will be maligning D and then D will remember back how it was just the same when it was just a youngster.
>>
>
> Are you seriously trying say that that implies each successive one is inherently no better than the previous?

I was alluding to the fact that you are overstating the significance of the difference between the two languages. That is to say that the differences are rather "cosmetic" more than they are meaningful. Oh, in a few areas, sure, there are differences, but overall, "same old stuff".

>  If so, then that's just patently absurd. If not, then what in the
> world *is* your point? Just to troll?

>>>>
>>>> I definitely prefer D to C++, but I honestly think that your hatred
>>>> of C++
>>>> (which you have expressed on several occasions) clouds your
>>>> judgement on the
>>>> matter.
>>>
>>> FWIW, I had been a huge fan of C++ for many years and used it extensively ('course, that was quite a while ago now...). And I *do* think it was a great language back in its time. I just think that time is long since past.
>>
>> I think C++ is now coming into its own and it sucked much more in the past. D is now in its sucky period IMO, and may have its day in the future. Time will tell.
>>
>
> Well, like I've said, I'd rather have something with better language design and a few implementation warts than something with inferior language design and a perfect implementation.
>

Yeah, yeah.. OK "fan boy". ;)

>
>>> When I say "C++ is crappy", I mean "within today's context, and moving forward from here".
>>
>> Tomorrow is surely something else, probably not D, IMO, but today is all C++.
>>
>>> I'm certainly aware of all that, and I do understand. But the question here wasn't "Do you think OTHER people feel language X is suitable for serious work?" It was "Do YOU think language X is suitable for serious work?" I don't doubt other people would disagree with me (especially people who haven't used D, and even probably some who have), but my own answer is "Yes, I think D is suitable for such projects, and in such a situation, yes, I would be willing to put my money where my mouth is."
>>
>> Ha! I inadvertently just answered those questions. Well, I guess you know what I think now (not that I was going to hide it).
>
> You mean that you're just here to troll?
September 18, 2011
"Josh Simmons" <simmons.44@gmail.com> wrote in message news:mailman.2925.1316249875.14074.digitalmars-d@puremagic.com...
> On Sat, Sep 17, 2011 at 6:46 PM, Nick Sabalausky <a@a.a> wrote:
>>
>> Are you seriously trying say that that implies each successive one is
>> inherently no better than the previous? If so, then that's just
>> patently
>> absurd. If not, then what in the world *is* your point? Just to troll?
>>
>
> No I believe the implication is that absolute quality is so absurdly impossible to define that it's somewhat irrelevant to even contemplate it. And it's certainly overly simplistic to consider it without putting it in the context of a given problem.
>
> Yes C++ is crap, but so is D; they're both crappy in their own ways. To suggest otherwise is to assume that you're so much more intelligent than all that have come before you that you've managed to create a perfect product when all else have failed. To make an analogy, it's like saying that OOP is inherently better than any paradigm before it.
>
> Ultimately though the issue is that C++'s crap is well explored and known, D's crap is significantly less so. Whether this is an issue for you depends entirely on your context.

See Nick, I'm not the only one thinking it.