November 16, 2013
On Sat, 2013-11-16 at 00:04 -0500, Jonathan M Davis wrote: […]
> I really don't understand this. Optional<T> is one of the most useless ideas that I've ever seen in Java. Just use null. It's built into the language. It works just fine. And wrapping it in a class isn't going to make it go away. Just learn to deal with null properly. I've known many programmers who have no problems with null whatsoever, and I'd be worried about any programmer who is so scared of it that they feel the need to wrap nullable objects in another type which has its own concept of null.
> 
> The only types which aren't nullable in Java are the primitive types, and if you use them in generics (like Optional<T> does), then you get a class that boxes the primitive type rather than using the primitive type directly, and that object is of course nullable. So, you might as well just use the class that boxes the primitive type directly and set its reference to null when you need it to be null. And Optional<T> doesn't even protect against null, since it's perfectly possible to make its contents null. So, as far as I can see, Optional<T> is utterly pointless. IMHO, it's outright bad software design.

Maybe and Optional are far from pointless, and returning null is anathema if you want to use modern fluent approaches and streaming, and generally to get things right.

Haskell has to do things this way because there is really no choice in pure functional languages.

Scala, Ceylon, and Kotlin do things analogously, so as to integrate the OO and functional approaches into what is rapidly becoming the programming norm. Java 8 is the beginning of Java requiring that null never be returned from a method call. Likely to follow soon is the demise of all primitive types in favour of the efficient boxing and unboxing of reference types (unlike what happens at present).

"Dealing with null properly" is simply a remnant of 1994 era thinking, we have long-ago moved past this.

-- 
Russel.
=============================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder@ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel@winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder

November 16, 2013
On 11/16/13 7:41 AM, Max Klyga wrote:
> On 2013-11-16 05:04:20 +0000, Jonathan M Davis said:
>
>> I really don't understand this. Optional<T> is one of the most useless
>> ideas
>> that I've ever seen in Java. Just use null.
>
> Optional specifies explicitly that the value can be absent and forces the
> client to check before using the value.
> Also, if Optional implements a monadic interface, one can easily chain
> computations depending on the state of optional values.
>
> Even if references are nullable by default, users do not check them for
> null on every usage. NullPointerException and the like are almost always
> indicators of a programming error.
>
> Null is just horrible.

Null is only horrible if the compiler lets you dereference it... something that happens in almost every language out there.
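To make the chaining point concrete, a minimal sketch using java.util.Optional's flatMap/map; the Customer and Address types are hypothetical:

import java.util.Optional;

public class ChainingExample {
    static class Address {
        final String city;
        Address(String city) { this.city = city; }
    }

    static class Customer {
        private final Address address; // may legitimately be absent
        Customer(Address address) { this.address = address; }
        Optional<Address> getAddress() { return Optional.ofNullable(address); }
    }

    // Each step runs only if the previous one produced a value; absence
    // short-circuits the whole chain instead of throwing NullPointerException.
    static String cityOf(Customer customer) {
        return Optional.ofNullable(customer)
                       .flatMap(Customer::getAddress)
                       .map(a -> a.city)
                       .orElse("unknown");
    }

    public static void main(String[] args) {
        System.out.println(cityOf(new Customer(new Address("London")))); // London
        System.out.println(cityOf(new Customer(null)));                  // unknown
        System.out.println(cityOf(null));                                // unknown
    }
}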

November 16, 2013
On Saturday, November 16, 2013 11:18:38 Meta wrote:
> On Saturday, 16 November 2013 at 05:04:42 UTC, Jonathan M Davis wrote:
> > I really don't understand this. Optional<T> is one of the most useless ideas that I've ever seen in Java. Just use null. It's built into the language. It works just fine. And wrapping it in a class isn't going to make it go away. Just learn to deal with null properly. I've known many programmers who have no problems with null whatsoever, and I'd be worried about any programmer who is so scared of it that they feel the need to wrap nullable objects in another type which has its own concept of null.
> 
> The value of an Option<T> type is that it moves checking for null into the type system. It forces you to check for null before you perform any potentially NullPointerException-throwing operations, whereas with naked class references in Java, it's easy to forget a null check, miss one, or just ignore it and hope everything is fine. With Option<T>, you have no choice but to check for null before you perform an operation on the wrapped class reference.

If you want to use the type system to try and protect against dereferencing null, then having a wrapper which guarantees that the object _isn't_ null makes a lot more sense, particularly since using Optional<T> instead of T makes no guarantees whatsoever that all of the other T's in the program are non-null. At best, if Optional<T> is used 100% consistently, you know that when a naked T is null, it's a bug.
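A short Java sketch of that caveat (the lookup method and its names are invented): the Optional in a signature documents possible absence, but it does nothing for any other reference in the program, including the Optional reference itself.

import java.util.Optional;

public class NoGuaranteeExample {
    // The signature advertises that the result may be absent...
    static Optional<String> lookup(boolean found) {
        return found ? Optional.of("value") : Optional.empty();
    }

    public static void main(String[] args) {
        System.out.println(lookup(true).orElse("absent"));   // value
        System.out.println(lookup(false).orElse("absent"));  // absent

        // ...but any naked reference elsewhere can still be null:
        String plain = null;
        // plain.length();      // would throw NullPointerException

        // and the Optional reference itself is just another nullable object:
        Optional<String> oops = null;
        // oops.isPresent();    // would also throw NullPointerException
    }
}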

> > The only types which aren't nullable in Java are the primitive types, and if you use them in generics (like Optional<T> does), then you get a class that boxes the primitive type rather than using the primitive type directly, and that object is of course nullable. So, you might as well just use the class that boxes the primitive type directly and set its reference to null when you need it to be null. And Optional<T> doesn't even protect against null, since it's perfectly possible to make its contents null. So, as far as I can see, Optional<T> is utterly pointless. IMHO, it's outright bad software design.
> 
> I would have to respectfully disagree with that. Proper use of an Option<T> type can dramatically reduce the chance of calling a method on a null object, or even eliminate it entirely.

Honestly, I pretty much never have problems with null pointers/references, and my natural reaction when I see people freaking out about them is to think that they don't know what they're doing or that they're just plain paranoid. That doesn't mean that my natural reaction is right. It could easily be the case that many such people are merely programming in environments different enough from anything I've had to deal with that null is actually a real problem for them and that it would be a real problem for me in the same situation. But in my experience, null really isn't a problem, and it can be very useful. So, when people freak out about it and insist on trying to get the type system to protect them from it, it really baffles me. It feels like they're trying to take a very useful tool out of the toolbox just because they weren't careful and managed to scratch themselves with it once or twice.

And Java's Optional seems even more useless, because it doesn't actually protect you against dereferencing null, and because it doesn't prevent anything which isn't in an Optional from being null.

Much as I don't think that it's worth it, I can at least see arguments for using NonNullable (which will end up in std.typecons eventually) to guarantee that the object isn't null, but I really don't think that using Optional or Nullable on a nullable type gains you anything except the illusion of protection.
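For contrast, a minimal Java sketch of the kind of non-null-guaranteeing wrapper being described; the NonNull name is made up (D's planned NonNullable in std.typecons plays the same role):

import java.util.Objects;

// A wrapper that can never hold null: the invariant is established once,
// at construction, so every later get() is safe to dereference.
final class NonNull<T> {
    private final T value;

    private NonNull(T value) {
        // Objects.requireNonNull throws NullPointerException eagerly, at
        // the point of wrapping rather than at some distant later use site.
        this.value = Objects.requireNonNull(value, "NonNull cannot wrap null");
    }

    static <T> NonNull<T> of(T value) {
        return new NonNull<>(value);
    }

    T get() {
        return value;
    }
}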

Oh, well. null seems to be a very divisive topic. There are plenty of folks who are convinced that it's a huge disaster, and plenty of others who have no problems with it at all and consider it to be useful. And for some reason, it seems like the folks in Java land freak out over it a lot more than the folks in C++ land; aside from D, C++ is definitely the language that I've used the most, am most comfortable with, and whose proponents I tend to agree with the most (though obviously, it has plenty of flaws - hence why I prefer D).

- Jonathan M Davis

November 17, 2013
On Saturday, 16 November 2013 at 23:34:55 UTC, Jonathan M Davis wrote:
> If you want to use the type system to try and protect against dereferencing
> null, then having a wrapper which guarantees that the object _isn't_ null
> makes a lot more sense, particularly since using Optional<T> instead of T
> makes no guarantees whatsoever that all of the other T's in the program are
> non-null. At best, if Optional<T> is used 100% consistently, you know that
> when a naked T is null, it's a bug.

You're right, it's better to ensure that the object is not null in the first place, which is what languages like Haskell, Spec#, Kotlin, and Rust do. D currently doesn't do this, and most developers probably won't have the discipline to use NonNull consistently throughout their code. The best we can do on that front is make sure it's used consistently within Phobos, so we can guarantee that we'll never give a user a null value.

> Honestly, I pretty much never have problems with null pointers/references, and
> my natural reaction when I see people freaking out about them is to think that
> they don't know what they're doing or that they're just plain paranoid. That
> doesn't mean that my natural reaction is right.

I think in this case, your natural reaction is wrong, because you've used mostly languages with nullable references. It's a case of the blub fallacy: "Nullable references are good enough. Why bother with all that hairy non-nullable stuff?"

> It could easily be the case
> that many such people are merely programming in environments different enough
> from anything I've had to deal with that null is actually a real problem for
> them and that it would be a real problem for me in the same situation. But in
> my experience, null really isn't a problem, and it can be very useful. So,
> when people freak out about it and insist on trying to get the type system to
> protect them from it, it really baffles me. It feels like they're trying to take
> a very useful tool out of the toolbox just because they weren't careful and
> managed to scratch themselves with it once or twice.

I don't think anyone's freaking out about null, and you're right that null is useful. The question is, why do we need object references to be nullable by default? If they were non-nullable by default, we could eliminate a whole class of errors for free. Not for some arcane definition of free. This is a free lunch that is being refused. You seem to be asking the question "why do we need them", when you should be asking "what do we lose by not having them".

Note that I'm arguing for non-nullable references here, which D is obviously never going to have. The next best thing is, as you suggested, having a wrapper type that we can use to be reasonably sure never holds a null reference. Again, the problem with that is that it requires programmer discipline.

> And Java's Optional seems even more useless, because it doesn't actually
> protect you against dereferencing null, and because it doesn't prevent
> anything which isn't in an Optional from being null.

See, that's the problem. References are nullable by default in Java, so even with an Optional type and a NonNullable wrapper you can never be 100% sure that you're not dealing with null masquerading as an object. The truly safe thing would be to have the compiler enforce that all references are wrapped in Optional, or to change the language to disallow null references, but doing either of those is not at all realistic. Still, creating a convention among Java developers of avoiding objects that aren't wrapped in Optional could get you pretty close.

> Much as I don't think that it's worth it, I can at least see arguments for
> using NonNullable (which will end up in std.typecons eventually) to guarantee
> that the object isn't null, but I really don't think that using Optional or
> Nullable on a nullable type gains you anything except the illusion of
> protection.

Well, again, Optional would force you to check whether the underlying object was null before you used it. You simply can't call, say, calculatePrice() on a Nullable!SalesGood (well, you actually can, due to the fact that Nullable aliases itself to the wrapped object, which is a huge mistake IMO).
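In Java terms, reusing the post's hypothetical SalesGood, the wrapper forces the check because the wrapped method simply isn't reachable through it:

import java.util.Optional;

public class ForcedCheckExample {
    static class SalesGood {
        double calculatePrice() { return 9.99; }
    }

    public static void main(String[] args) {
        Optional<SalesGood> good = Optional.of(new SalesGood());

        // good.calculatePrice();   // does not compile: Optional has no such method

        // The only route to calculatePrice() is through the wrapper, which
        // makes the absent case explicit at the call site:
        double price = good.map(SalesGood::calculatePrice).orElse(0.0);
        System.out.println(price);
    }
}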

> Oh, well. null seems to be a very divisive topic. There are plenty of folks
> who are convinced that it's a huge disaster, and plenty of others who have no
> problems with it at all and consider it to be useful. And for some reason, it
> seems like the folks in Java land freak out over it a lot more than the folks
> in C++ land, and aside from D, C++ is definitely the language that I've used
> the most and am most comfortable with, as well as tend to agree with the
> proponents of the most (though obviously, it has plenty of flaws - hence why I
> prefer D).

I think "huge disaster" might be a mischaracterization on your part. There is no worldwide hysteria over nullable references, just a growing realization that we've been doing it wrong for the past 20 years. And yes, null is useful to indicate the absence of a value, but objects don't have to be nullable by default for you to use null. Many languages make the programmer ask for a nullable reference specifically by appending ? to the type, which makes everyone reading the code aware that the reference you have might be null, and to take appropriate care.
