October 06, 2009 Re: dmd 1.048 and 2.033 releases
Posted in reply to grauzone
On Wed, 07 Oct 2009 00:54:22 +0400, grauzone <none@example.net> wrote:
> Jarrett Billingsley wrote:
>> On Tue, Oct 6, 2009 at 3:08 PM, Walter Bright
>> <newshound1@digitalmars.com> wrote:
>>> Lutger wrote:
>>>> Walter Bright wrote:
>>>>
>>>>> Don wrote:
>>>>>> It's pretty standard, though. For example, there are some bugs which
>>>>>> Visual C++ detects only when the optimiser is on. From memory, they are
>>>>>> all flow-related. The MS docs recommend compiling a release build
>>>>>> occasionally to catch them.
>>>>> The flow analysis could be run on every compile by default, but it would
>>>>> make for pretty slow turnaround.
>>>> Is it possible / reasonable to run flow analysis but still have a build
>>>> that can be properly debugged? If yes, wouldn't it be nice to have it as a
>>>> separate compiler option? Some people with build slaves, fast CPUs or
>>>> smallish projects won't care that much about the performance.
>>> Just compile with:
>>> -debug -O
>> You don't seem to be grasping the issue here. It's not using -O with
>> -debug that's the problem, it's using it with -g. You can't reasonably
>> expect someone to put an optimized executable through a debugger.
>
> As I understand, -0 just enables flow analysis (which is slow and thus shouldn't be run normally), not full optimization.
No, -O means "optimize". It's just much easier to check code flow when an executable is optimized, that's it.
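To make the exchange above concrete, here is a minimal D sketch of the kind of "flow-related" bug being discussed: a local variable that is written on only one path and then read. The use of = void (to opt out of D's default initialization), the file name example.d, and the expectation that dmd would flag the unset read only when flow analysis runs alongside -O are illustrative assumptions, not documented dmd behaviour.

// example.d -- hypothetical file name; a sketch of a flow-related bug
int pick(bool flag)
{
    int value = void;   // explicitly opt out of D's default initialization
    if (flag)
        value = 1;
    return value;       // when flag is false, value is read without ever being set
}

void main()
{
    // The scenario under discussion: dmd -g example.d for everyday builds,
    // dmd -O -g example.d occasionally, so that optimizer-time flow analysis runs.
    auto r = pick(false);
}

The trade-off discussed in the thread is that this kind of analysis naturally piggy-backs on the optimizer's data-flow work, so running it on every compile would mean paying part of the optimizer's cost even for ordinary debug builds.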
October 06, 2009 Re: dmd 1.048 and 2.033 releases
Posted in reply to Denis Koroskin
Denis Koroskin wrote:
> On Wed, 07 Oct 2009 00:54:22 +0400, grauzone <none@example.net> wrote:
>
>> Jarrett Billingsley wrote:
>>> On Tue, Oct 6, 2009 at 3:08 PM, Walter Bright
>>> <newshound1@digitalmars.com> wrote:
>>>> Lutger wrote:
>>>>> Walter Bright wrote:
>>>>>
>>>>>> Don wrote:
>>>>>>> It's pretty standard, though. For example, there are some bugs which
>>>>>>> Visual C++ detects only when the optimiser is on. From memory, they are
>>>>>>> all flow-related. The MS docs recommend compiling a release build
>>>>>>> occasionally to catch them.
>>>>>> The flow analysis could be run on every compile by default, but it would
>>>>>> make for pretty slow turnaround.
>>>>> Is it possible / reasonable to run flow analysis but still have a build
>>>>> that can be properly debugged? If yes, wouldn't it be nice to have it as a
>>>>> separate compiler option? Some people with build slaves, fast CPUs or
>>>>> smallish projects won't care that much about the performance.
>>>> Just compile with:
>>>> -debug -O
>>> You don't seem to be grasping the issue here. It's not using -O with
>>> -debug that's the problem, it's using it with -g. You can't reasonably
>>> expect someone to put an optimized executable through a debugger.
>>
>> As I understand, -0 just enables flow analysis (which is slow and thus shouldn't be run normally), not full optimization.
>
> No, -O means "optimize". It's just much easier to check code flow when an executable is optimized, that's it.
And I thought it was a zero... never mind, then.
October 06, 2009 Re: dmd 1.048 and 2.033 releases
Posted in reply to Walter Bright
Walter Bright:
> The flow analysis could be run on every compile by default, but it would make for pretty slow turnaround.
With GCC, if you want a safer compilation you add flags like -Wall, -Wextra, etc.
In D it is better for the default to be the safe one (just as you add -release to remove some safety checks), so flow analysis should be active by default. Anyone who needs a faster but less safe compilation could then add a flag to disable flow analysis, something like -noflow.
Bye,
bearophile