Null References and related matters

Thread overview:
Dec 23, 2008 - bearophile
Dec 23, 2008 - bearophile
Dec 23, 2008 - Bill Baxter
Dec 24, 2008 - Nick Sabalausky
December 23, 2008
This has already been discussed in the past, but I think it doesn't hurt to rehash it a little, now that there's an opinion from a famous computer scientist and programmer like Tony Hoare, in his "Null References: The Billion Dollar Mistake" presentation:

>I call it my billion-dollar mistake. It was the invention of the null reference in 1965. At that time, I was designing the first comprehensive type system for references in an object oriented language (ALGOL W). My goal was to ensure that all use of references should be absolutely safe, with checking performed automatically by the compiler. But I couldn't resist the temptation to put in a null reference, simply because it was so easy to implement. This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years. In recent years, a number of program analysers like PREfix and PREfast in Microsoft have been used to check references, and give warnings if there is a risk they may be non-null. More recent programming languages like Spec# have introduced declarations for non-null references. This is the solution, which I rejected in 1965.<

That's why a day ago I said that sooner or later D will have something like the ? syntax of Delight (and C#): http://delight.sourceforge.net/null.html
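
As a rough illustration of the idea (just a sketch in today's D, with a made-up Maybe wrapper; Delight and Spec# get compiler checking for this, which a library type can't fully give you):

// Hypothetical wrapper showing the intent behind Delight's "Type?":
// bare references are assumed never null, and anything that can be
// absent must be wrapped and checked before use.
struct Maybe(T)
{
    private T value;
    private bool present;

    // wrap a value that is known to be good
    static Maybe opCall(T v)
    {
        Maybe m;
        m.value = v;
        m.present = true;
        return m;
    }

    bool isNull() { return !present; }

    // the only way to reach the value; fails loudly here instead of
    // letting a null escape into the rest of the program
    T get()
    {
        assert(present, "attempted to use an empty Maybe");
        return value;
    }
}

With non-null as the default, a bare reference never needs a check, and the few places that can legitimately be empty are visible right in the signature.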

Bye,
bearophile
December 23, 2008
> Tony Hoare: "Null References: The Billion Dollar Mistake" presentation:

Sorry, I forgot the link: http://qconlondon.com/london-2009/speaker/Tony+Hoare

Bye,
bearophile
December 23, 2008
On Tue, Dec 23, 2008 at 10:36 PM, bearophile <bearophileHUGS@lycos.com> wrote:

> system crashes, which have probably caused a billion dollars of pain and damage in the last forty years. In recent years, a number of program analysers like PREfix and PREfast in Microsoft have been used to check references, and give warnings if there is a risk they may be non-null.

Is this a typo?  Why would you want to give warnings about things that "may be non-null"?

--bb
December 24, 2008
"bearophile" <bearophileHUGS@lycos.com> wrote in message news:giqphn$1aog$1@digitalmars.com...
> This has already been discussed in the past, but I think it doesn't
> hurt to rehash it a little, now that there's an opinion from a famous
> computer scientist and programmer like Tony Hoare, in his "Null
> References: The Billion Dollar Mistake" presentation:
>
>>I call it my billion-dollar mistake. It was the invention of the null reference in 1965. At that time, I was designing the first comprehensive type system for references in an object oriented language (ALGOL W). My goal was to ensure that all use of references should be absolutely safe, with checking performed automatically by the compiler. But I couldn't resist the temptation to put in a null reference, simply because it was so easy to implement. This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years. In recent years, a number of program analysers like PREfix and PREfast in Microsoft have been used to check references, and give warnings if there is a risk they may be non-null. More recent programming languages like Spec# have introduced declarations for non-null references. This is the solution, which I rejected in 1965.<
>
> That's why a day ago I said that sooner or later D will have something like the ? syntax of Delight (and C#): http://delight.sourceforge.net/null.html
>

Interesting. And now that I think about it, null references do seem to be little more than the reference equivalent of sentinel values, which I've never been a big fan of (i.e., reserving special values to indicate something other than what the variable normally represents; for instance, embedding error codes in the return value of a function that normally returns a meaningful value).
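
To make the analogy concrete, here's a quick sketch (function names made up for the example) contrasting the two styles in D:

// Sentinel style: -1 travels in the same channel as real indices,
// so a caller who forgets the check quietly uses a bogus value.
int indexOf(int[] haystack, int needle)
{
    foreach (int i, int x; haystack)
        if (x == needle)
            return i;
    return -1; // "not found" pretending to be an index
}

// Separated style: success/failure gets its own channel, and the
// index is only written when it is actually valid.
bool tryIndexOf(int[] haystack, int needle, out int index)
{
    foreach (int i, int x; haystack)
        if (x == needle)
        {
            index = i;
            return true;
        }
    return false;
}

A non-null reference type does for pointers what the bool does here: it moves "might be absent" out of the value itself and into the type, where the compiler can see it.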