September 17, 2011 Re: Would You Bet $100,000,000 on D?
Posted in reply to Peter Alexander

Peter Alexander wrote:
> In contrast, my D hobby project at only a few thousand lines of code already takes 11s to build and doesn't do any fancy metaprogramming or use CTFE.
Curious, did you use a library like QtD?
My slowest D compile, aside from my one attempt at QtD, is about 30,000 lines of template-using code that takes about 5 seconds on my computer. (I compile and link it all at once.)
I could see this getting annoying if it continued to scale that way up to 3 million lines, but that's still the exception in my experience: my typical D program builds in under one second, including a 14,000-line hobby game.
September 17, 2011 Re: Would You Bet $100,000,000 on D?
Posted in reply to Adam D. Ruppe

On Sun, Sep 18, 2011 at 1:32 AM, Adam D. Ruppe <destructionator@gmail.com> wrote:
> Peter Alexander wrote:
>> In contrast, my D hobby project at only a few thousand lines of code already takes 11s to build and doesn't do any fancy metaprogramming or use CTFE.
>
> Curious, did you use a library like QtD?
>
> My slowest D compile except my one attempt into qtd is about 30,000 lines of template using code that takes about 5 seconds on my computer. (I compile and link it all at once)
>
>
> I could see this getting annoying if it continued to scale that way to 3 million, but that's still the exception in my experience: my typical D program builds in under one second, including a 14,000 line hobby game.
>
As a general rule, I think most things don't scale linearly; they scale considerably worse.
September 17, 2011 Re: Would You Bet $100,000,000 on D?
Posted in reply to Josh Simmons

Josh Simmons wrote:
> As a general rule I think, most things don't scale linearly, they scale considerably worse.

Let's try something:

===
import std.file;
import std.conv;
import std.string;

void main() {
    string m;
    foreach(i; 0 .. 1000) {
        string code;
        foreach(a; 0 .. 1000) {
            code ~= format(q{
                int someGlobal%d;
                void someFunction%d() {
                    someGlobal%d++;
                }
            }, a, a, a);
        }
        std.file.write("file" ~ to!string(i) ~ ".d", code);
        m ~= "import file" ~ to!string(i) ~ ";";
    }
    std.file.write("main.d", m ~ " void main() {} ");
}
===

$ ./generate
$ wc *.d
  5000000  7001004 83684906 total

$ time dmd *.d
Segmentation fault

real    0m29.915s
user    0m26.208s
sys     0m3.684s

Holy shit, 3.6 GB of memory used! Aaaand segmentation fault. OK, something's not good here :-P

Let's try this:

$ time ../dmd2/linux/bin64/dmd *.d

real    0m45.363s
user    0m26.009s
sys     0m14.193s

About 6 GB total memory used at the peak. Wow. I never thought I'd actually use that much RAM (but it was cheap).

$ ls -lh *.o
-rw-r--r-- 1 me users 371M 2011-09-17 12:40 file0.o

For some reason, there's no executable...

$ time ld file0.o ../dmd2/linux/lib64/libphobos2.a -lm -lpthread
ld: warning: cannot find entry symbol _start; defaulting to 0000000000400f20

real    0m10.439s
user    0m9.304s
sys     0m0.915s

$ ls -lh ./a.out
-rwxr-xr-x 1 me users 102M 2011-09-17 12:43 ./a.out

Wow. Anyway, one minute to compile 5,000,000 lines of (bullshit) code isn't really bad. It took a lot of memory, but that's not a dealbreaker - I got this 8 GB dirt cheap, and the price has gone down even more since then. Worst case, we can just throw more hardware at it.

This code is nothing fancy, of course. Now, what about an incremental build?

$ echo 'rm *.o; for i in *.d; do dmd -c $i; done; dmd *.o' > build
$ time bash build

Waiting... lots of hard drive activity here. (BTW, I realize in a real build, you could do a lot of this in parallel, so this isn't really a fair scenario. I'm just curious how it will turn out.) ld is running now. I guess dmd did its thing in about one minute.

Anyway, it's complete:

$ time bash build
rm: cannot remove `*.o': No such file or directory

real    1m44.632s
user    1m17.358s
sys     0m10.275s

Two minutes for compile+link incrementally. The memory usage never became significant.

$ ls -l file0
-rwxr-xr-x 1 me users 214M 2011-09-17 12:50 file0

This is probably doubly unrealistic since I didn't have any of the modules import other modules. But I did feed 5,000,000 lines of code spread over 1,000 modules to the D compiler, and it managed to work in a fairly reasonable time - one minute is decent.

Of course, if you bring in fancier things than this trivial example, who knows.
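[Editor's note: Adam remarks that a real build could run the per-module compiles in parallel. A minimal sketch of that idea using `xargs -P`, with `echo dmd -c` as a dry-run stand-in for the compiler so it runs without a D toolchain, and three stand-in module names instead of the 1,000 generated ones:]

```shell
# Sketch: fan per-module compiles out across cores with xargs -P.
# "echo dmd -c" is a dry-run stand-in for the real compiler invocation;
# file0.d .. file2.d stand in for the generated modules.
printf 'file%d.d\n' 0 1 2 | xargs -P 4 -n 1 echo dmd -c
# A real build would drop the echo, then link in one step: dmd *.o
```

With the `echo` removed, this is the same work as the sequential `for` loop above, spread over 4 processes; the single final link step remains serial.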
September 17, 2011 Re: Would You Bet $100,000,000 on D?
Posted in reply to Andrei Alexandrescu

On 17/09/11 4:28 PM, Andrei Alexandrescu wrote:
> On 9/17/11 10:08 AM, Peter Alexander wrote:
>> On 17/09/11 6:53 AM, Nick Sabalausky wrote:
>>> And then there's the enormous savings in build times alone. Full
>>> recompiles of AAA C++ games are known to take upwards of a full day
>>> (not sure whether that's using a compile farm, but even if it is, D
>>> could still cut down on compile farm expenses, or possibly even the
>>> need for one).
>>
>> This is false. You can easily build several million lines of code in
>> several minutes using unity files and distributed building. There need
>> not be any build farm expenses, the build machines can just be
>> everyone's dev machines.
>
> Then Facebook would love your application (I'm not kidding; send me
> private email if interested). We have a dedicated team of bright
> engineers who worked valiantly on this (I also collaborated), and a
> dedicated server farm. Compile times are still a huge bottleneck.
>
> Andrei

It's not my application. We use Incredibuild: http://www.xoreax.com/

And I'm sure you know what unity builds are. For those that don't: http://buffered.io/2007/12/10/the-magic-of-unity-builds/
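[Editor's note: for readers who skip the link — a unity build folds many C++ translation units into one file of #includes, so shared headers are parsed once rather than once per source file. A minimal self-contained sketch with hypothetical stub sources:]

```shell
# Sketch of a unity build with hypothetical stub sources: instead of
# compiling a.cpp and b.cpp separately, generate one file that #includes
# both, so shared headers would be parsed only once.
mkdir -p src
printf 'int a() { return 1; }\n' > src/a.cpp
printf 'int b() { return 2; }\n' > src/b.cpp
ls src/*.cpp | sed 's|.*|#include "&"|' > unity.cpp
cat unity.cpp
# A real build then compiles only the unity file, e.g.: c++ -c unity.cpp
```

The trade-off, per the linked article, is faster full builds at the cost of coarser incremental rebuilds: touching any included file recompiles the whole unity unit.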
September 17, 2011 Re: Would You Bet $100,000,000 on D?
Posted in reply to Adam D. Ruppe

On 17/09/11 4:32 PM, Adam D. Ruppe wrote:
> Peter Alexander wrote:
>> In contrast, my D hobby project at only a few thousand lines of code
>> already takes 11s to build and doesn't do any fancy metaprogramming
>> or use CTFE.
>
> Curious, did you use a library like QtD?
Nope.
I use some parts of the standard library, not much of it though. I also use Derelict, but again, not much of it.
I'm talking about a -release -inline -O build btw.
For a normal build it's only 1.7 seconds.
September 17, 2011 Re: Would You Bet $100,000,000 on D?
Posted in reply to Peter Alexander

Ah, that explains it. I usually don't use the -O switch.
September 17, 2011 Re: Would You Bet $100,000,000 on D?
Hmmm. If $100M was on the line, the project code base must be extremely large, correct?

With a code base of that size, more than half would be common or boilerplate functionality, e.g. read a config file, read a data file, write/update a file, parse the command line, maintain a list, put up a window, etc. All been done before, all mundane, all "boring".

There would be 20-30% of truly new code specific to the project. Not really boring code, not really exciting code. It probably wouldn't require any specific language features. I would worry about large-scale project support from the language, e.g. package/module isolation, minimal compilation time, debug support, etc. IDE availability and tool support in general would be factors too.

And finally there would be 5-10%, perhaps less, of the code base which was "exciting". It would require certain language features or capabilities that only a language like D could provide.

Given that project layout, what I would want is a language *and* development kit that had the full project requirements covered. If the "exciting" stuff could be covered by Java or C#, I'd use Java/C#, since the vast majority of the "boring" functionality would already be available to me in the JDK/CLR. If the "exciting" stuff could only be covered by D, then I'd worry about how I was going to write all that "boring" code in time, especially if I had to guarantee some level of defect rate.

The JDK/CLR rides on top of Java/C#, both OK languages with OK features. Having the JDK/CLR available and tested by millions of developers? Very, very appealing.

I could of course redevelop or convert, for example, a DOM XML parser in D, but that takes time. Would I want to spend the development and debug time in this project to hit a low defect rate on boring code? Or would I just go with a language & development kit that already had a wide code base and a known defect rate?

Generally speaking, I believe low defect rates are due to time passing - get a large number of people kicking at a bunch of code over a long period of time, and eventually the bugs get fixed. The concern is: will there be enough time in the project for that effect to naturally run its course?

So D right now has Phobos and Tango. Both are good, but not fully featured and, relatively speaking, untried. I could plan for a roll-yer-own development kit from scratch. Daunting. I could plan to patch together a whole set of converted C/C++ libraries. I could start with a conversion of Boost or something similar to D and add to it as I needed.

But all this pales when compared to the 5,000/3,000 classes already written for me in the JDK/CLR. That's a heck of a lot of code, all with a relatively low (and at least known) defect rate, that I don't have to write. The less code I write, the more of that $100M stays in my pocket, right?

In short, it's not D itself that would drive my decision to use or not use D. It is the extent and quality of the development kit that goes along with it. Of course, if the "exciting" part of the project was a solid fit with D, then my decision would naturally swing that way. But if a language like Java/C# could do that part for me, I'd go with it and its JDK/CLR in a heartbeat.

As a side note: the interesting twist here to me is that D language features themselves promote the possibility of a very high quality DeeDK. It would certainly be faster, and with enough unit testing and diligence, of a better quality than the JDK/CLR could ever hope to be.

John
September 17, 2011 Re: Would You Bet $100,000,000 on D?
Posted in reply to Adam D. Ruppe

On 17.09.2011 at 20:01, Adam D. Ruppe <destructionator@gmail.com> wrote:
> Ah, that explains it. I usually don't use the -O switch.
During development when you recompile several times an hour, you really don't need -O either. For my hobby projects I usually set up the IDE with a compile command without -O and place a Makefile in the directory that has optimizations enabled. Sure, there are times when you run performance tests, but they aren't the usual case, so I think it is fair to compare compiles without optimizations in this context.
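[Editor's note: the split Marco describes can be sketched as two invocations (hypothetical `src/` layout and `game` output name; `echo dmd` is a stand-in so the sketch runs without a D toolchain - drop the echo for a real build):]

```shell
# Hypothetical dev/release split: the fast edit-compile loop omits -O;
# the slow flags are reserved for occasional performance builds.
# "echo dmd" is a dry-run stand-in for the real compiler.
DMD="echo dmd"
$DMD src/*.d -ofgame                      # dev build: ~1.7s in Peter's case
$DMD src/*.d -ofgame -release -inline -O  # release build: ~11s in Peter's case
```

The IDE gets the first command; the second lives in the Makefile that Marco mentions.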
September 17, 2011 Re: Would You Bet $100,000,000 on D?
Posted in reply to Adam D. Ruppe

"Adam D. Ruppe" <destructionator@gmail.com> wrote in message news:j52eia$133v$1@digitalmars.com...
> Peter Alexander wrote:
>> In contrast, my D hobby project at only a few thousand lines of code
>> already takes 11s to build and doesn't do any fancy metaprogramming or
>> use CTFE.
>
> Curious, did you use a library like QtD?
>
> My slowest D compile except my one attempt into qtd is about 30,000
> lines of template using code that takes about 5 seconds on my computer.
> (I compile and link it all at once)

DDMD takes 1-2 minutes to build for me.
September 17, 2011 Re: Would You Bet $100,000,000 on D?
Posted in reply to Peter Alexander

On 9/17/2011 8:08 AM, Peter Alexander wrote:
> I work at a very large game studio and I can assure you that I would *never* be
> able to justify using D for a project. Even if all our code magically
> transformed into D, and all our programmers knew D, I still wouldn't be able to
> justify the creation of all the necessary tools and dev systems to do something
> that we can already do.
This is true in any industry that has a large investment in an existing technology. Eventually the pressure to make the change becomes overwhelming.
Copyright © 1999-2021 by the D Language Foundation