March 03, 2009 Null references (oh no, not again!)
Just noticed this hit Slashdot, and thought I might repost the abstract here.

http://qconlondon.com/london-2009/presentation/Null+References:+The+Billion+Dollar+Mistake

> I call it my billion-dollar mistake. It was the invention of the null reference in 1965. [...] This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years. [...] More recent programming languages like Spec# have introduced declarations for non-null references. This is the solution, which I rejected in 1965.

-- Sir Charles Hoare, Inventor of QuickSort, Turing Award Winner

Serendipitous, since I just spent today trying to track down an (expletives deleted) obscure null dereference problem. I figure I must be in good company if even the guy who invented null doesn't like it...

It also made me look up this old thing; it's several years old now, but I still think it's got some good points in it.

http://www.st.cs.uni-saarland.de/edu/seminare/2005/advanced-fp/docs/sweeny.pdf

> * Accessing arrays out-of-bounds
> * Dereferencing null pointers
> * Integer overflow
> * Accessing uninitialized variables
>
> 50% of the bugs in Unreal can be traced to these problems!

Tim Sweeney isn't an amateur; he's responsible, at least in part, for one of the most commercially successful game engines ever. I figure if even he has trouble with these things, it's worth trying to fix them.

Note that D already solves #1 and #4, LDC could give us #3... that just leaves #2. :D

-- Daniel
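A minimal sketch of how D handles #1 and #4 today (assuming a compiler build with array bounds checks enabled, as in non-release builds):

```d
import std.stdio;

void main()
{
    int x;                 // #4: no uninitialized variables; int.init == 0
    writeln(x);            // prints 0, never garbage

    int[] a = new int[4];  // array elements are default-initialized too
    writeln(a[2]);         // prints 0

    // #1: an out-of-bounds access is caught at runtime,
    // rather than silently reading or corrupting memory:
    // writeln(a[10]);     // would throw core.exception.RangeError
}
```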
March 03, 2009 Re: Null references (oh no, not again!)
Posted in reply to Daniel Keep | Daniel Keep:
> > * Accessing arrays out-of-bounds
> > * Dereferencing null pointers
> > * Integer overflow
> > * Accessing uninitialized variables
> ...
> Note that D already solves #1 and #4, LDC could give us #3... that just
> leaves #2. :D
Nice to see another person who appreciates my desire to avoid those bugs as much as possible. Eventually I hope to see D avoid all four of them.
There are other bugs too, like ones caused by out-of-bounds pointer arithmetic, but they are less common (Cyclone and other languages are able to avoid them too, but at some cost).
Do you remember the ptr==null thing? As soon as D2 has full integral overflow checks, people will find several bugs in their "large" D2 programs. I have seen this many times when switching on such overflow checks in Delphi programs :-)
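The kind of bug such checks catch is a silent wraparound; a minimal sketch (32-bit int arithmetic, which D does not check by default):

```d
import std.stdio;

void main()
{
    int price = 2_000_000_000;
    int total = price * 2;  // wraps around silently: no overflow check
    writeln(total);         // prints -294967296, not 4000000000
}
```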
Bye,
bearophile
March 03, 2009 Re: Null references (oh no, not again!)
Posted in reply to Daniel Keep | Daniel Keep wrote:
> Just noticed this hit Slashdot, and thought I might repost the abstract
> here.
>
> http://qconlondon.com/london-2009/presentation/Null+References:+The+Billion+Dollar+Mistake
>
>> I call it my billion-dollar mistake. It was the invention of the null
>> reference in 1965. [...] This has led to innumerable errors,
>> vulnerabilities, and system crashes, which have probably caused a
>> billion dollars of pain and damage in the last forty years. [...] More
>> recent programming languages like Spec# have introduced declarations
>> for non-null references. This is the solution, which I rejected in
>> 1965.
>
> -- Sir Charles Hoare, Inventor of QuickSort, Turing Award Winner
I suggested to Walter an idea he quite took to: offering the ability of disabling the default constructor. This is because at root any null pointer was a pointer created with its default constructor. The feature has some interesting subtleties to it but is nothing out of the ordinary and the code must be written anyway for typechecking invariant constructors.
That, together with the up-and-coming alias this feature, will allow the creation of the "perfect" NonNull!(T) type constructor (along with many other cool things). I empathize with those who think non-null should be the default, but probably that won't fly with Walter.
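A rough sketch of what such a NonNull!(T) might look like. This assumes the proposed ability to disable default construction (spelled here with a hypothetical @disable) plus a working alias this; neither detail is settled:

```d
struct NonNull(T)
    if (is(T == class))
{
    private T payload;

    @disable this();          // no default constructor, hence no null state

    this(T value)
    {
        assert(value !is null, "NonNull constructed from null");
        payload = value;
    }

    alias payload this;       // a NonNull!(T) is usable wherever a T is
}

class Widget { void poke() {} }

void use()
{
    auto w = NonNull!(Widget)(new Widget);
    w.poke();                    // forwarded through alias this
    // NonNull!(Widget) broken;  // would not compile: default ctor disabled
}
```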
Andrei
March 03, 2009 Re: Null references (oh no, not again!)
Posted in reply to Andrei Alexandrescu | On Tue, 03 Mar 2009 21:59:16 +0300, Andrei Alexandrescu <SeeWebsiteForEmail@erdani.org> wrote:

> Daniel Keep wrote:
>> Just noticed this hit Slashdot, and thought I might repost the abstract here.
>> http://qconlondon.com/london-2009/presentation/Null+References:+The+Billion+Dollar+Mistake
>>
>>> I call it my billion-dollar mistake. It was the invention of the null reference in 1965. [...] This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years. [...] More recent programming languages like Spec# have introduced declarations for non-null references. This is the solution, which I rejected in 1965.
>>
>> -- Sir Charles Hoare, Inventor of QuickSort, Turing Award Winner
>
> I suggested to Walter an idea he quite took to: offering the ability of disabling the default constructor. This is because at root any null pointer was a pointer created with its default constructor. The feature has some interesting subtleties to it but is nothing out of the ordinary and the code must be written anyway for typechecking invariant constructors.
>
> That, together with the up-and-coming alias this feature, will allow the creation of the "perfect" NonNull!(T) type constructor (along with many other cool things). I empathize with those who think non-null should be the default, but probably that won't fly with Walter.
>
> Andrei

If nullable is the default and NonNull!(T) has no syntactic sugar, I bet it won't be used at all. I know I wouldn't use it, even though I'm one of the biggest advocates of introducing non-nullable types in D.

In my opinion, you should teach novices safe practices first and dangerous tricks last, not vice versa.

If using nullable types is easier than using non-nullable ones, non-nullable types won't be widely used. The syntax ought to be less verbose and clearer to get attention.

I hope that this great idea won't get spoiled by a broken implementation...
March 03, 2009 Re: Null references (oh no, not again!)
Posted in reply to Denis Koroskin | On Tue, Mar 3, 2009 at 3:08 PM, Denis Koroskin <2korden@gmail.com> wrote:
>
> If nullable is the default and NonNull!(T) has no syntactic sugar, I bet it won't be used at all. I know I woudn't, even though I'm one of the biggest advocates of introducing non-nullable types in D.
>
> In my opinion, you should teach novices safe practices first, and dangerous tricks last. Not vice-versa.
Exactly. I thought one of the ideas behind D was to have "safe" defaults. Yeah, I know, null references can't actually do damage to your computer because of virtual memory, but neither can concurrent access to shared data or accessing uninitialized variables, and those are taken care of.
If nonnull types were the default, Nullable!(T) would be implementable
as an Algebraic type, just like in Haskell. One more potential
bragging point ;)
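A sketch of that idea, assuming std.variant's Algebraic and a hypothetical None marker type (much like Haskell's `data Maybe a = Nothing | Just a`):

```d
import std.stdio;
import std.variant;

struct None {}                             // hypothetical "no value" marker

// With non-null the default, "maybe null" is just an explicit sum type:
alias Nullable(T) = Algebraic!(None, T);

void describe(T)(Nullable!T v)
{
    v.visit!(
        (None _) { writeln("nothing here"); },
        (T x)    { writeln("got: ", x); },
    );
}
```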
March 03, 2009 Re: Null references (oh no, not again!)
Posted in reply to Denis Koroskin | Denis Koroskin wrote:

> On Tue, 03 Mar 2009 21:59:16 +0300, Andrei Alexandrescu <SeeWebsiteForEmail@erdani.org> wrote:
>
>> Daniel Keep wrote:
>>> Just noticed this hit Slashdot, and thought I might repost the abstract here.
>>> http://qconlondon.com/london-2009/presentation/Null+References:+The+Billion+Dollar+Mistake
>>>
>>>> I call it my billion-dollar mistake. It was the invention of the null reference in 1965. [...] This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years. [...] More recent programming languages like Spec# have introduced declarations for non-null references. This is the solution, which I rejected in 1965.
>>>
>>> -- Sir Charles Hoare, Inventor of QuickSort, Turing Award Winner
>>
>> I suggested to Walter an idea he quite took to: offering the ability of disabling the default constructor. This is because at root any null pointer was a pointer created with its default constructor. The feature has some interesting subtleties to it but is nothing out of the ordinary and the code must be written anyway for typechecking invariant constructors.
>>
>> That, together with the up-and-coming alias this feature, will allow the creation of the "perfect" NonNull!(T) type constructor (along with many other cool things). I empathize with those who think non-null should be the default, but probably that won't fly with Walter.
>>
>> Andrei
>
> If nullable is the default and NonNull!(T) has no syntactic sugar, I bet it won't be used at all. I know I wouldn't use it, even though I'm one of the biggest advocates of introducing non-nullable types in D.
>
> In my opinion, you should teach novices safe practices first and dangerous tricks last, not vice versa.
>
> If using nullable types is easier than using non-nullable ones, non-nullable types won't be widely used. The syntax ought to be less verbose and clearer to get attention.
>
> I hope that this great idea won't get spoiled by a broken implementation...

I did some more research and found a study:

http://users.encs.concordia.ca/~chalin/papers/TR-2006-003.v3s-pub.pdf

Very interestingly (and exactly the kind of info I was looking for), the study measures how references are meant to be used in a real application of medium-large size.

Turns out that in 2/3 of cases, references are really meant to be non-null... not exactly a landslide, but a comfortable majority.

Andrei
March 03, 2009 Re: Null references (oh no, not again!)
Posted in reply to Andrei Alexandrescu | On Tue, 3 Mar 2009, Andrei Alexandrescu wrote:
> I did some more research and found a study:
>
> http://users.encs.concordia.ca/~chalin/papers/TR-2006-003.v3s-pub.pdf
>
> Very interestingly (and exactly the kind of info I was looking for), the study measures how references are meant to be in a real application of medium-large size.
>
> Turns out in 2/3 of cases, references are really meant to be non-null... not really a landslide but a comfortable majority.
>
>
> Andrei
I'd love to see a similar study for smart pointers. Are they more like pointers or references? My assumption is references, which would make them poorly named. :)
Later,
Brad
March 03, 2009 Re: Null references (oh no, not again!)
Posted in reply to Daniel Keep | Daniel Keep wrote:
> Just noticed this hit Slashdot, and thought I might repost the abstract
> here.
>
> http://qconlondon.com/london-2009/presentation/Null+References:+The+Billion+Dollar+Mistake
>
>> I call it my billion-dollar mistake. It was the invention of the null
>> reference in 1965. [...] This has led to innumerable errors,
>> vulnerabilities, and system crashes, which have probably caused a
>> billion dollars of pain and damage in the last forty years. [...] More
>> recent programming languages like Spec# have introduced declarations
>> for non-null references. This is the solution, which I rejected in
>> 1965.
>
> -- Sir Charles Hoare, Inventor of QuickSort, Turing Award Winner
>
> Serendipitous, since I just spent today trying to track down an
> (expletives deleted) obscure null dereference problem. I figure I must
> be in good company if even the guy who invented null doesn't like it...
There are issues shoe-horning non-nullables into a nullable world:
- preallocating arrays (or static arrays)
- structs with non-nullable fields
- pointers to non-nullables
These issues were enough that I gave up on my attempts to implement it.
If it is implemented, non-nullable absolutely must be the default. I'm still sad about mutable being the default in D2.
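The array problem in particular is easy to see; a sketch (NonNull here stands for any hypothetical non-nullable wrapper):

```d
class C {}

void main()
{
    // With nullable references, new storage has an obvious default: null.
    C[] a = new C[8];        // fine today: every element starts as null

    // With non-nullable references there is no valid default element,
    // so a plain preallocation like this has no meaning:
    // NonNull!(C)[] b = new NonNull!(C)[8];   // what would b[0] hold?
}
```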
March 04, 2009 Re: Null references (oh no, not again!)
Posted in reply to Christopher Wright | Christopher Wright:
>I'm still sad about mutable being the default in d2.<
Maybe D3 will move more in that direction, I don't know. It's a really big jump from C/C++, quite a bit bigger than non-nullable by default :-)
Bye,
bearophile
March 04, 2009 Re: Null references (oh no, not again!)
Posted in reply to Andrei Alexandrescu | Andrei Alexandrescu Wrote:
> Daniel Keep wrote:
> > Just noticed this hit Slashdot, and thought I might repost the abstract here.
> >
> > http://qconlondon.com/london-2009/presentation/Null+References:+The+Billion+Dollar+Mistake
> >
> >> I call it my billion-dollar mistake. It was the invention of the null reference in 1965. [...] This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years. [...] More recent programming languages like Spec# have introduced declarations for non-null references. This is the solution, which I rejected in 1965.
> >
> > -- Sir Charles Hoare, Inventor of QuickSort, Turing Award Winner
>
> I suggested to Walter an idea he quite took to: offering the ability of disabling the default constructor. This is because at root any null pointer was a pointer created with its default constructor. The feature has some interesting subtleties to it but is nothing out of the ordinary and the code must be written anyway for typechecking invariant constructors.
>
> That, together with the up-and-coming alias this feature, will allow the creation of the "perfect" NonNull!(T) type constructor (along with many other cool things). I empathize with those who think non-null should be the default, but probably that won't fly with Walter.
>
>
> Andrei
Alias this?
Copyright © 1999-2021 by the D Language Foundation