September 19, 2019
On Wednesday, 18 September 2019 at 09:52:34 UTC, John Colvin wrote:
> On Tuesday, 17 September 2019 at 19:31:49 UTC, Brett wrote:
>> On Tuesday, 17 September 2019 at 19:19:46 UTC, Timon Gehr wrote:
>>> On 17.09.19 19:34, Brett wrote:
>>>> 
>>>> What's more concerning to me is how many people defend the compiler's behavior.
>>>> ...
>>>
>>> What you apparently fail to understand is that there are trade-offs to be considered, and your use case is not the only one supported by the language. Clearly, any wraparound behavior in an "integer" type is stupid, but the hardware has a fixed word size, programmers are lazy, compilers are imperfect and efficiency of the generated code matters.
>>
>> And this is why compilers should do everything they can to reduce problems... it doesn't just affect one person but everyone that uses the compiler. If the onus is on the programmer, it means that a very large percentage of people (thousands, tens of thousands, millions) are going to have to deal with it. As you've already said, they are lazy, so they won't.
>
> Carelessly doing everything you can to reduce problems is a good way to create lots of problems. For example, there can be a trade-off between consistently (and therefore predictably) wrong and inconsistently right.
>
>> No, that is not the right behavior, because you've already said that wrapping is *defined* behavior... and it is not! What if we multiply two numbers that were generated at CTFE using mixins, or by a complex constant expression near the upper bound, and the result happens to overflow? Then what?
>>
>> You are saying it is OK for undefined behavior to exist in a program, and that is never true! Undefined behavior accounts for 100% of all program bugs. Even a perfectly written program is undefined behavior if it doesn't do what the user or programmer wants.
>>
>> The compiler can warn us at compile time for ambiguous cases; that is the best solution. Saying it is not, because wrapping is "defined behavior", is the thing that creates inconsistencies.
>
> Just to make sure you don't misunderstand:
>
> For better or worse, integer overflow is defined behaviour in D; the reality of the overwhelming majority of CPU hardware is encoded in the language.
>
> That is using the meaning of the term "defined" as it is used in, e.g., the C standard.

I do not care if it is defined; it is wrong. Things that are wrong should be righted... Few here seem to get that.

You can claim that it is right because that is how it is done, but you fail to realize that the logic you used to come to that conclusion is wrong. Two wrongs do not make a right, no matter how hard you try.

See, at worst we get a warning. You want that warning to be suppressed; I want it to be explicit. You want obscure errors to exist; I do not. You are wrong, I'm right. You can huff and puff and try to blow the house down, but you will still be wrong.
