September 18, 2011 Re: Would You Bet $100,000,000 on D?
Posted in reply to Timon Gehr

Timon Gehr wrote:
> On 09/17/2011 10:57 AM, Josh Simmons wrote:
>> On Sat, Sep 17, 2011 at 6:46 PM, Nick Sabalausky<a@a.a> wrote:
>>>
>>> Are you seriously trying to say that that implies each successive one is inherently no better than the previous? If so, then that's just patently absurd. If not, then what in the world *is* your point? Just to troll?
>>
>> No, I believe the implication is that absolute quality is so absurdly impossible to define that it's somewhat irrelevant to even contemplate it. And it's certainly overly simplistic to consider it without putting it in the context of a given problem.
>
> Well, my pragmatic and simplistic definition of language quality is
Oh curb it already.

---
September 18, 2011 Re: Would You Bet $100,000,000 on D?
Posted in reply to J Arrizza

J Arrizza wrote:
> Hmmm. If $100M were on the line, the project code base must be extremely large. Correct?
Hello. Next!

---
September 18, 2011 Re: Would You Bet $100,000,000 on D?
Posted in reply to Peter Alexander

Peter Alexander wrote:
> This is false.

---
September 18, 2011 Re: Would You Bet $100,000,000 on D?
Posted in reply to Andrei Alexandrescu

Andrei Alexandrescu wrote:
> My doctorate
And what about it?

---
September 18, 2011 Re: Would You Bet $100,000,000 on D?
Posted in reply to Peter Alexander

On 17.09.2011 at 22:33, Peter Alexander <peter.alexander.au@gmail.com> wrote:

> On 17/09/11 7:52 PM, Marco Leise wrote:
>> On 17.09.2011 at 20:01, Adam D. Ruppe <destructionator@gmail.com> wrote:
>>>
>>> Ah, that explains it. I usually don't use the -O switch.
>>
>> During development, when you recompile several times an hour, you really
>> don't need -O either. For my hobby projects I usually set up the IDE
>> with a compile command without -O and place a Makefile in the directory
>> that has optimizations enabled. Sure, there are times when you run
>> performance tests, but they aren't the usual case, so I think it is fair
>> to compare compiles without optimizations in this context.
>
> I suppose that's true for most people, but not for games developers. When
> testing changes, you need the game to be running at interactive frame
> rates, and it's very difficult to achieve that in a debug build, so we
> generally always run optimized and just use MSVC++'s
> #pragma optimize("", off) directive to unoptimize specific sections of
> code for debugging.
>
> To be fair, my hobby project still runs fast without optimizations, but I
> definitely need to repeatedly compile with optimizations on when
> performance tuning.

May I ask how much slower the frame rate is with the debug build? I would
expect certain modules like physics or vegetation to be in precompiled
external libraries (or do you use source code versions?), and the graphics
card to be doing a lot of the work. I also remember a game developer saying
a while back, "we are not CPU bound at all". Has that changed again? Or is
it maybe because your project has not been optimized yet? Or is it simply
that you are developing for the next generation of hardware? My definition
of an interactive frame rate starts at ~20 FPS, I guess, which sounds like
it would be easy to achieve. Then again, the only game I ever finished was
a random-maze Pac-Man clone with 16-color Windows .ico graphics *g*

---
September 18, 2011 Re: Would You Bet $100,000,000 on D?
Posted in reply to Marco Leise

On 18/09/11 8:59 AM, Marco Leise wrote:

> On 17.09.2011 at 22:33, Peter Alexander <peter.alexander.au@gmail.com> wrote:
>
>> On 17/09/11 7:52 PM, Marco Leise wrote:
>>> On 17.09.2011 at 20:01, Adam D. Ruppe <destructionator@gmail.com> wrote:
>>>
>>>> Ah, that explains it. I usually don't use the -O switch.
>>>
>>> During development, when you recompile several times an hour, you really
>>> don't need -O either. For my hobby projects I usually set up the IDE
>>> with a compile command without -O and place a Makefile in the directory
>>> that has optimizations enabled. Sure, there are times when you run
>>> performance tests, but they aren't the usual case, so I think it is fair
>>> to compare compiles without optimizations in this context.
>>
>> I suppose that's true for most people, but not for games developers.
>> When testing changes, you need the game to be running at interactive
>> frame rates, and it's very difficult to achieve that in a debug build,
>> so we generally always run optimized and just use MSVC++'s
>> #pragma optimize("", off) directive to unoptimize specific sections of
>> code for debugging.
>>
>> To be fair, my hobby project still runs fast without optimizations,
>> but I definitely need to repeatedly compile with optimizations on when
>> performance tuning.
>
> May I ask how much slower the frame rate is with the debug build? I
> would expect certain modules like physics or vegetation to be in
> precompiled external libraries (or do you use source code versions?),
> and the graphics card to be doing a lot of the work.

My D project is between 1.5 and 2.0x slower in debug builds vs. release. At
work it's about 4x slower on consoles in full debug vs. full opt, and 3x
slower vs. partial optimisation.

> I also remember a game developer saying a while back, "we are not CPU
> bound at all". Has that changed again? Or is it maybe because your
> project has not been optimized yet? Or is it simply that you are
> developing for the next generation of hardware?

It depends on the game and, of course, on the settings the player is
playing at. At the lowest graphics fidelity you will probably always be
CPU bound, and at the highest fidelity you will always be GPU bound.

> My definition of an interactive frame rate starts at ~20 FPS, I guess,
> which sounds like it would be easy to achieve. Then again, the only game
> I ever finished was a random-maze Pac-Man clone with 16-color Windows
> .ico graphics *g*

Well, most console games run at 30 fps these days, so in debug builds you
are often dropping below 20, or even 10, fps. It's not just the frame rate
either: we have a lot of assets to load in, and they all take longer to
load in non-optimized builds, which increases the time it takes to iterate
on changes.
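To make the pragma trick above concrete: MSVC++ lets you switch optimization off for just the region of code under investigation while the rest of the build stays fully optimized. A minimal sketch, assuming the Microsoft compiler; the function and values are hypothetical, and note that the pragma takes the option string first and the on/off switch second:

```cpp
// Compiled project-wide with /O2 (full optimization).
#include <cstdio>

// Disable optimization from here to the matching "on" pragma, so this
// region stays debuggable (no inlining, locals inspectable) even in an
// optimized build. #pragma optimize must appear outside any function.
#pragma optimize("", off)
static int scoreForMove(int move) { // hypothetical function being debugged
    int score = move * 3 + 1;
    return score;
}
#pragma optimize("", on) // restore the settings given on the command line

int main() {
    std::printf("score = %d\n", scoreForMove(7));
    return 0;
}
```

---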
September 18, 2011 Re: Would You Bet $100,000,000 on D?
Posted in reply to Sean Kelly

On 18/09/11 12:12 AM, Sean Kelly wrote:

> On Sep 17, 2011, at 8:08 AM, Peter Alexander wrote:
>> This is false. You can easily build several million lines of code in
>> several minutes using unity files and distributed building. There need
>> not be any build farm expenses; the build machines can just be
>> everyone's dev machines.
>
> Linking can still be pretty time consuming.

Unity builds help a lot with link time.

> I'd be curious to hear from anyone who has tried developing for the
> Xbox 360 using D, or any of the big 3 consoles, really.

Well, you'd need to target PowerPC for all current-gen consoles. Can GDC do that?
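Since unity files keep coming up: a unity (or "jumbo") build concatenates many translation units into one by #including the .cpp files themselves, so headers are parsed once per group rather than once per file, and the linker sees far fewer, larger object files. A minimal sketch, with hypothetical file names:

```cpp
// unity_gameplay.cpp - the only file in this group passed to the compiler.
// Each included file is an ordinary .cpp from the project; compiling them
// as one unit amortizes header parsing and shrinks the linker's input set.
#include "player.cpp"
#include "enemy.cpp"
#include "projectile.cpp"
#include "pickup.cpp"

// Caveat: symbols with internal linkage (static functions, anonymous
// namespaces) from these files now share one translation unit, so name
// collisions that were previously harmless become compile errors.
```

---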
September 18, 2011 Re: Would You Bet $100,000,000 on D?
Posted in reply to Peter Alexander

Peter Alexander wrote:

> On 18/09/11 12:12 AM, Sean Kelly wrote:
>> On Sep 17, 2011, at 8:08 AM, Peter Alexander wrote:
>>> This is false. You can easily build several million lines of code in
>>> several minutes using unity files and distributed building. There need
>>> not be any build farm expenses; the build machines can just be
>>> everyone's dev machines.
>>
>> Linking can still be pretty time consuming.
>
> Unity builds help a lot with link time.

Note that you can also use partial linking to help with link times while iterating development on large projects, at least on Linux (with the -r or -i options to ld); I do not know about optlink.

Jerome

--
mailto:jeberger@free.fr
http://jeberger.free.fr
Jabber: jeberger@jabber.fr
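For context, partial linking combines a group of object files into a single relocatable object that can itself be fed to a later link, so unchanged subsystems don't have to be re-linked file by file. A rough sketch of what Jerome describes, assuming GNU ld and hypothetical object files:

```sh
# Pre-link the physics subsystem's objects into one relocatable object.
# -r (synonym: -i) tells GNU ld to emit an object file usable as input
# to a later link, rather than a final executable.
ld -r -o physics_partial.o collision.o rigidbody.o broadphase.o

# The final link consumes one pre-linked object per subsystem instead of
# every individual .o, which speeds up iteration when only one subsystem
# has changed.
g++ -o game main.o render_partial.o physics_partial.o audio_partial.o
```

---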
September 18, 2011 Re: Would You Bet $100,000,000 on D?
Posted in reply to Xavier

On 09/18/2011 05:41 AM, Xavier wrote:
> Timon Gehr wrote:
>> On 09/17/2011 10:57 AM, Josh Simmons wrote:
>>> On Sat, Sep 17, 2011 at 6:46 PM, Nick Sabalausky<a@a.a> wrote:
>>>>
>>>> Are you seriously trying to say that that implies each successive one
>>>> is inherently no better than the previous? If so, then that's just
>>>> patently absurd. If not, then what in the world *is* your point?
>>>> Just to troll?
>>>
>>> No, I believe the implication is that absolute quality is so absurdly
>>> impossible to define that it's somewhat irrelevant to even
>>> contemplate it. And it's certainly overly simplistic to consider it
>>> without putting it in the context of a given problem.
>>
>> Well, my pragmatic and simplistic definition of language quality is
>
> Oh curb it already.
>
>
The only difference between that definition and most of the contents of your posts in this thread is that it explicitly introduces itself as perhaps too simplistic, and therefore possibly not appropriate for a given situation. That is a strength, not a weakness. Please think before you post.