February 03, 2004
In article <bvjq6d$tuq$1@digitaldaemon.com>, Ben Hinkle says...
>
>If everything were an Object (in D's current definition of Object), then it would have to have reference semantics. In particular, ints and arrays would change behavior. For int the change would be drastic.

This would obviously lead to a major rethink of parts of the language. If we were to do it, then we could consider having different assignment operators for references and values.

Heron does this right, I think.
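
To make the distinction concrete, here is a minimal sketch of how D's single = already means two different things depending on the type (plain D; nothing in the code itself is hypothetical):

class C { int x; }

void main()
{
    C a = new C();
    C b = a;            // reference assignment: b aliases the same object as a
    b.x = 5;
    assert(a.x == 5);   // a sees the change through b

    int i = 1;
    int j = i;          // value assignment: j is an independent copy
    j = 5;
    assert(i == 1);     // i is untouched
}

A separate operator for each case would let the reader see, at the assignment site, whether an alias or a copy is being created.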


February 03, 2004
In article <bvjtdl$13d7$1@digitaldaemon.com>, Sean L. Palmer says...
>
>See, that's the problem.  Reference semantics should be in usage code, not an inherent property of Object.  C++ is proof that this is viable.
>
>You should be able to subclass int just like you can subclass any other class.  I run into this in C++ a lot.  I want to make a sort of streaming adapter template, with decorated objects that act like some other type but know how to stream themselves to and from binary files.  But I cannot inherit from int, so I have to break the illusion that my adapter *is* the kind of object it decorates.  Then either the user code has to go through another level of indirection, cluttering up the entire program, *or* I am forced to make special cases for every single builtin type, of which there are many in C++, including pointers, which can be to any user-defined type.  Needless to say, this makes decorating types in a generic manner an unnecessarily painful endeavor.
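
The same wall exists in D: int is not a class, so a decorator has to wrap the builtin rather than extend it, and every caller pays for the wrapping. A minimal sketch of the problem (the names Streamed and value are invented for illustration):

// A decorator that wants to *be* an int but can only *hold* one.
class Streamed
{
    int value;                  // the wrapped builtin
    this(int v) { value = v; }

    ubyte[] toBytes()           // the added streaming behavior
    {
        return (cast(ubyte*) &value)[0 .. value.sizeof].dup;
    }
}

void main()
{
    Streamed s = new Streamed(7);
    int n = s.value + 1;    // caller must write s.value, not just s;
                            // this is the extra level of indirection
}
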
>
>With modern-day optimizing compilers, we *should* be able to treat every object semantically as if it were a heap object, and have the compiler "optimize" the object down onto the stack or into registers when it can safely do so.  And once we're treating all objects uniformly, we can stop having to explicitly specify that they go "on the heap" and just use objects in a simple way, letting the compiler take care of the details of where and how to allocate them.
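
As an illustration of the transformation being asked for (escape analysis is the usual name for it; whether any given compiler performs it is an assumption, not a promise):

class Point { int x, y; }

int lengthSquared()
{
    Point p = new Point();   // nominally a heap allocation
    p.x = 3;
    p.y = 4;
    return p.x * p.x + p.y * p.y;
}   // p never escapes this function, so a compiler that proves that
    // could put the object on the stack or in registers instead
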
>
>As has been said by others many times, the language should be defined by its semantics.  Such semantics should be chosen so that efficient implementations can be made, without exposing the programmer to the gory details of such implementations.  The less explicit you force the programmers to be, the less opportunity they have to screw up, and the more opportunity the compiler vendor has to do things The Right Way.
>
>Boxing/unboxing is an ugly implementation detail that has "escaped" and flutters about distracting programmers from the task at hand.  It is a sign of bad language design.
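
For contrast, here is what manual boxing looks like in D today, precisely because int is not an Object (Box is an invented name, not a library type):

class Box
{
    int value;
    this(int v) { value = v; }
}

void main()
{
    Object[] list = new Object[1];
    list[0] = new Box(42);                  // box: wrap the int to store it
    int n = (cast(Box) list[0]).value;      // unbox: cast back and unwrap
    assert(n == 42);
}
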
>
>Likewise with memory allocation.  If you want programs that never leak, express memory allocation and deallocation in a way that can't possibly leak.  If you want programs that never crash, find ways to do things that can't possibly crash (checking at runtime is a poor substitute for *proving* that the program cannot misbehave).
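
One well-established way to express allocation so it cannot leak is to tie deallocation to scope exit. A sketch in present-day D syntax (the Buffer type is invented for illustration):

import core.stdc.stdlib : malloc, free;

struct Buffer
{
    void* ptr;
    this(size_t n) { ptr = malloc(n); }
    ~this() { free(ptr); }      // runs automatically at every scope exit
    @disable this(this);        // forbid copies that could double-free
}

void useBuffer()
{
    auto buf = Buffer(1024);
    // ... use buf.ptr ...
}   // the destructor frees the memory on every path out, exceptions included

void main() { useBuffer(); }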

The quality of discussion in this newsgroup has risen tremendously during January. This is an excellent example of it!

>This is one area where I fear Walter may actually be too stuck in the old-school mindset to innovate.  That doesn't mean D won't be a good language, it just means it won't be a cutting-edge language.  There is a revolution brewing, and I feel a language will have to be very solidly designed to survive.  Adding some patches to C won't be enough.

