October 27, 2008
Hxal Wrote:
> As long as range checked types were part of the standard library and there weren't dozens different implementations from different libraries, then a people would probably use them.

Bah, the apparent grammar mistake is the result of my touchpad posting the message before I finished editing :P
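
For concreteness, here is a rough sketch of what such a library range-checked type could look like in D. The name Ranged and its interface are made up for illustration; this is not an existing library type:

struct Ranged(int lo, int hi)
{
    private int value = lo;

    this(int v) { opAssign(v); }

    void opAssign(int v)
    {
        assert(v >= lo && v <= hi, "value out of range");
        value = v;
    }

    int get() { return value; }
}

unittest
{
    Ranged!(0, 100) percent = 42;   // fine
    // percent = 150;               // would fail the assert at runtime
}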

October 27, 2008
On Mon, 27 Oct 2008 17:40:44 +0100, Hxal <Hxal@freenode.irc> wrote:

> bearophile Wrote:
>> Unfortunately I think such a library solution is nearly useless. Programmers are lazy, and a lot of them even actively resist changes and ideas that may improve their programs.
>
> Well then, that's their problem, isn't it? I mean, there's no point making their programs better against their will. :P

As was mentioned here, the 'int foo = void;' syntax is an example of the
above and, I feel, works great. It forces programmers to be explicit about
their intentions without adding too much syntactic overhead. The same could
be said of using int_unsafe instead of int.
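
For readers who haven't seen it, a small self-contained illustration (the buffer is a made-up example of the typical use case):

void main()
{
    int x;                   // default-initialized to 0 (int.init)
    int y = void;            // explicitly uninitialized: garbage until assigned

    ubyte[1024] buf = void;  // typical use: a scratch buffer that is
                             // fully overwritten right away
    buf[] = 0;               // overwrite before any read

    y = x + 1;               // y is only read after being assigned
    assert(y == 1);
}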

-- 
Simen
October 27, 2008
Hxal:
> Well then, that's their problem, isn't it?

I can't agree. While it's generally impossible to engineer a truly "fool-proof" system, a well-designed system (I am talking about machines too: car controls, aeroplane pilot levers and commands) must take into account the basics of what we know today about human nature, and prevent human errors whenever doing so is possible and not too costly. You can read the books written by Donald Norman (www.jnd.org) about this topic.

What we today often call "human errors" (on the part of some user) are really errors made by the human designers (who haven't done what I have just described), not by the humans who use the system. On the other hand, those designers were often working in situations where their bosses didn't want a better design, etc. Basic social psychology studies suggest that the cause of such mistakes is often politics inside a hierarchical group of working people... But I don't want to slip into that can of worms now.

Bye,
bearophile
October 28, 2008
bearophile wrote:
> In a modern language it's probably good for ALL integral numeric types to be range checked 

I've noticed you've argued for range checking on integral types many times. Have you found overflow to be a common bug?
I.e., is your comment mainly based on experience, or mainly on theory?

I ask this because I've very rarely encountered that type of bug.
(It could be that it occurs frequently in some problem domains; maybe we could work out what they are).
October 28, 2008
Don:
> I've noticed you've argued for range checking on integral types many times.

You are right, and I am sorry for spamming this newsgroup (and boring people). I have noticed that a lot of people don't follow this newsgroup closely, so saying the same thing every once in a while makes more people read it. But for someone like you, who probably reads every post, it becomes boring...


> Have you found overflow to be a common bug?

Not very common, but I have had two bugs caused by mixing signed and unsigned types (once involving array.length). I have also had one or two bugs caused by applying map() to an array of bytes and returning a byte that contained a bogus value.
I have stopped using unsigned values whenever I don't strictly need them, because instead of being safer when used to represent nonnegative numbers, they are actually much less safe.
So I think integral values are a source of trouble.
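
Here is a minimal, made-up sketch of the two failure modes I mean (simplified; a plain cast stands in for the map() case):

import std.stdio;

void main()
{
    // 1) Mixing signed and unsigned: arr.length is size_t (unsigned),
    //    so for an empty array arr.length - 1 wraps around to a huge
    //    value instead of becoming -1.
    int[] arr;
    if (arr.length - 1 > 0)
        writeln("runs even though arr is empty");

    // 2) Narrowing back to byte: the sum is computed as int, and
    //    squeezing it back into a byte silently yields a bogus value.
    byte a = 100;
    byte b = 50;
    byte c = cast(byte)(a + b);   // 150 doesn't fit in a byte; c is -106
    writeln(c);
}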

This document, drawing on real-world experience, says that integer overflow bugs are a significant percentage of the total: http://www.st.cs.uni-sb.de/edu/seminare/2005/advanced-fp/docs/sweeny.pdf

Bye,
bearophile
October 28, 2008
On Tue, Oct 28, 2008 at 7:21 AM, bearophile <bearophileHUGS@lycos.com> wrote:
> This document from experience says that integral overflow bugs are a significant percentage of the total: http://www.st.cs.uni-sb.de/edu/seminare/2005/advanced-fp/docs/sweeny.pdf

No, it says array-out-of-bounds errors, dereferencing null pointers, accessing uninitialized variables _and_ integer overflows together represent 50% of the bugs.  I don't know about you but I run into those first three cases (well.. two, since there aren't uninitialized variables in D) waaaaay more than I do integer overflows.
October 28, 2008
Jarrett Billingsley wrote:
> On Tue, Oct 28, 2008 at 7:21 AM, bearophile <bearophileHUGS@lycos.com> wrote:
>> This document from experience says that integral overflow bugs are a significant percentage of the total:
>> http://www.st.cs.uni-sb.de/edu/seminare/2005/advanced-fp/docs/sweeny.pdf
> 
> No, it says array-out-of-bounds errors, dereferencing null pointers,
> accessing uninitialized variables _and_ integer overflows together
> represent 50% of the bugs.  I don't know about you but I run into
> those first three cases (well.. two, since there aren't uninitialized
> variables in D) waaaaay more than I do integer overflows.

I think the frequency of a bug should be multiplied by the trouble it takes to fix it. Frequency alone isn't terribly relevant.

Andrei
October 28, 2008
Don Wrote:

> bearophile wrote:
> > In a modern language it's probably good for ALL integral numeric types to be range checked
> 
> I've noticed you've argued for range checking on integral types many
> times. Have you found overflow to be a common bug?
> IE, is your comment mainly based on experience, or mainly on theory?

I hit an overflow bug in my D code 2 weeks ago. An intermediate value overflowed.
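Something along these lines (a simplified, made-up example, not my actual code): the destination type is wide enough, but the intermediate result is not.

void main()
{
    int width  = 60_000;
    int height = 60_000;

    // The multiplication is performed in int and wraps around *before*
    // the result is widened to long.
    long wrong   = width * height;             // bogus negative value
    long correct = cast(long) width * height;  // widen first: 3_600_000_000

    assert(wrong != correct);
}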

> 
> I ask this because I've very rarely encountered that type of bug.
> (It could be that it occurs frequently in some problem domains; maybe we
> could work out what they are).

November 02, 2008
Andrei Alexandrescu wrote:
> Jarrett Billingsley wrote:
>> On Tue, Oct 28, 2008 at 7:21 AM, bearophile <bearophileHUGS@lycos.com> wrote:
>>> This document from experience says that integral overflow bugs are a significant percentage of the total:
>>> http://www.st.cs.uni-sb.de/edu/seminare/2005/advanced-fp/docs/sweeny.pdf
>>
>> No, it says array-out-of-bounds errors, dereferencing null pointers,
>> accessing uninitialized variables _and_ integer overflows together
>> represent 50% of the bugs.  I don't know about you but I run into
>> those first three cases (well.. two, since there aren't uninitialized
>> variables in D) waaaaay more than I do integer overflows.
> 
> I think the frequency of a bug should be multiplied with the trouble it takes to fix it. Frequency alone isn't terribly relevant.
> 
> Andrei

And the time it takes to find the source of the bug. It doesn't matter if it's a one-character fix if you have to go through 10 KLOC to find where the problem is.

For dereferencing null, you can look at the call stack and add contracts to find where null's being passed in. For integer overflows, it's a bit more difficult.
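
For example, a minimal sketch of the null case (Widget and process are made-up names), using an in-contract so the failure points at the call boundary rather than at a later dereference deep inside:

class Widget
{
    void run() {}
}

void process(Widget w)
in
{
    assert(w !is null, "process: received a null Widget");
}
body
{
    w.run();
}

void main()
{
    process(new Widget);   // fine
    // process(null);      // would fail the in-contract right at the call
}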
November 04, 2008
Andrei Alexandrescu wrote:
> Jarrett Billingsley wrote:
>> On Tue, Oct 28, 2008 at 7:21 AM, bearophile <bearophileHUGS@lycos.com> wrote:
>>> This document from experience says that integral overflow bugs are a significant percentage of the total:
>>> http://www.st.cs.uni-sb.de/edu/seminare/2005/advanced-fp/docs/sweeny.pdf
>>
>> No, it says array-out-of-bounds errors, dereferencing null pointers,
>> accessing uninitialized variables _and_ integer overflows together
>> represent 50% of the bugs.  I don't know about you but I run into
>> those first three cases (well.. two, since there aren't uninitialized
>> variables in D) waaaaay more than I do integer overflows.
> 
> I think the frequency of a bug should be multiplied with the trouble it takes to fix it. Frequency alone isn't terribly relevant.
> 
> Andrei

Indeed!

-- 
Bruno Medeiros - Software Developer, MSc. in CS/E graduate
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D