November 07, 2002
"Mike Wynn" <mike.wynn@l8night.co.uk> wrote in message news:aqcmgq$8qe$1@digitaldaemon.com...
> I find it interesting that you oppose D having strict well-defined
> semantics, and yet on the D overview you say it is aimed at "Numerical
> programmers" and implement lots of floating point NaN and infinity
> behaviours, but are unwilling to fix the floats and doubles to defined
> standards. Are these behaviours part of the "bendability"?

The floating point behavior for D will conform to IEEE 754 if the underlying hardware supports it. If the underlying hardware does not, then D's semantics on that platform will have to bend. This is not as big a problem as it sounds, since non-IEEE floating point is a thing of the past, not the future. Interestingly, IEEE floating point has been on the PC for 20 years now, and VERY few languages use its capability! This failure includes nearly all C/C++ compilers (except for DMC++). DMC++ and D allow the programmer to exploit the floating point hardware (such as IEEE comparisons and 80 bit extendeds).
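
A minimal C sketch of the IEEE behaviours in question (C here and in the examples below, since the thread measures D against it; C99's NAN and INFINITY macros are assumed to be available):

    #include <stdio.h>
    #include <math.h>     /* C99: NAN, INFINITY */

    int main(void)
    {
        double x = NAN, inf = INFINITY;

        /* IEEE 754: NaN is unordered; it compares unequal to
           everything, itself included. */
        printf("x == x: %d\n", x == x);      /* prints 0 */

        /* Infinities obey the usual arithmetic identities. */
        printf("1/inf: %g\n", 1.0 / inf);    /* prints 0 */
        printf("inf+1: %g\n", inf + 1.0);    /* prints inf */
        return 0;
    }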

> I've been reading and rereading the D overview to try to gain an
> understanding of why you might oppose semantics over implementation, and
> apart from low-down and dirty programming, which is not that affected
> (you do need an integer that can hold a pointer), I see nothing there; in
> fact much of what I read says to me D will be defined by its semantics.
> (I assume it's a bit out of date, as it also says all objects are on the
> heap, but I thought RAII put objects onto the stack?)

RAII does not put objects on the stack; they are still accessed by reference.

> I believe that if you asked the programmers who fit into the "who is D
> for" list, then 95% would prefer D to be the same D on every supported
> platform and to be freed from the implementation, at the expense of some
> platforms being harder to support than others.

The most popular platforms should, of course, be the ones most in line with D semantics. D also explicitly abandons 16 bit architectures in its semantics.


November 07, 2002
Mike, right on.  However, please improve your grammar and separate your paragraphs.  Your comments are hard to read!

Under .NET it will be interesting to see many languages targeting the *same* virtual machine.

Particulars of the Intel chips are important, but we should recognize that, like Windows itself, those chips are hardly the best design.  It would help to step back and contemplate future ports to other chips.  As Mike says, distinguish the semantics from the implementation.

The Intel chip is a piece of work.  It has about a dozen different modes, gets more encrusted with each generation (MMX, etc.), and really amounts to a crazy Rube Goldberg machine whose major attraction is backward compatibility all the way back to the ancient 8088 -- in other words, something that keeps making money for Intel.  Just because the dang chip has some weird issue that affects language design does not invalidate the language design principle at stake.  It just means the language is targeting a crummy chip and needs to work around it.

I see similar problems affecting D in relation to C++, but have already spouted my opinions on that.  Backward compatibility is a huge limitation when it drives the design, as it did with the Intel chip line.


>> I'm not convinced that Java's success on embedded systems is due to strict semantics.

Well, at least Java's success shows that strict semantics do not spell doom for D, as some people seem to think they will.


> the embedded Java VMs ... are all different ..., but all of this is only
> possible because there are strict rules imposed

Right.


>Java may be a bad language to compare D with, because many people merge the Java Language and the Java VM together; they are in fact completely separate


Right.  Think about .NET too.


>I would not consider bugs in a VM a valid argument against semantics over implementation.

Right.


>as for different behaviours, that agrees with my argument for D to be defined by its semantics and not its implementation.

Yes, I think so.


>> D will have bendable semantics so that it can be
>> efficiently implemented on a wide variety of machines. But not quite as
>> bendable as C is (no one's complement!).

Please provide three examples of bendable semantics.  I do not see a conflict between consistent semantics and efficiency.

Are you saying that good semantics will "hide the chip," whereas D wants to "permit full access"?  When I type "double" I mean an IEEE double-precision floating point number.  If the chip doesn't support that data type, then it really doesn't support the language I am using.  Conversely, if there is some bizarre feature unique to a particular chip, I would expect a good language to treat it as a bizarre feature, such that it's not a part of the language per se, but a secondary layer of capabilities accessible from the main language on that one platform only.

If I sound confused, maybe I am.  Please offer some examples of what you mean to help me think it through.  Give some examples of bendable semantics helping efficiency on two different chips (even if hypothetical ones).


>in fact much
>of what I read says to me D will be defined by its semantics.

Let's hope so!


>I believe that if you asked the programmers who fit into the "who is D for" list then 95% would prefer D to be the same D on every supported platform

Yes, and the more platforms, the better.



November 10, 2002
"Mark Evans" <Mark_member@pathlink.com> wrote in message news:aqd4pp$me2$1@digitaldaemon.com...
> >> D will have bendable semantics so that it can be
> >> efficiently implemented on a wide variety of machines. But not quite as
> >> bendable as C is (no one's complement!).
> Please provide three examples of bendable semantics.  I do not see a
> conflict between consistent semantics and efficiency.

Bendable semantics would be things like the size of a pointer, the size of an extended float value, and byte ordering.
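
For instance, a program like the following prints different answers on different targets, with no change to its source (a minimal C sketch):

    #include <stdio.h>

    int main(void)
    {
        unsigned int x = 1;

        /* Pointer size: typically 4 bytes on 32-bit targets, 8 on
           64-bit ones. */
        printf("pointer:     %u bytes\n", (unsigned)sizeof(void *));

        /* Extended float: long double is an 80-bit x87 extended on
           x86 (often padded to 12 or 16 bytes), but a plain 64-bit
           double on many other chips. */
        printf("long double: %u bytes\n", (unsigned)sizeof(long double));

        /* Byte ordering: look at the first byte of a multi-byte value. */
        printf("byte order:  %s\n",
               *(unsigned char *)&x ? "little-endian" : "big-endian");
        return 0;
    }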


November 10, 2002
Walter says...
>
>Bendable semantics would be things like the size of a pointer, the size of an extended float value, and byte ordering.
>

That stuff is not semantics, just implementation detail.  So I would merely repeat, "As Mike says, distinguish the semantics from the implementation."

No language, not even C or D, asks the programmer to state what size pointer he wants, what size extended float, or what (native) byte ordering.  These details are not language semantics because no expression changes them.  You can change them all around by porting to another platform, but that program will retain identical semantic content.  Meaning, you run the program, and identical inputs produce identical outputs.  The underlying representation details of inputs and outputs have nothing to do with semantics per se.

Semantics is about the meaning of expressions.  Obviously one has to draw a line somewhere.  To carry this implementation confusion to its bitter end, we must include, under the heading "semantics," all voltage levels and current flows through every CPU transistor.  By this definition the same program, running on different Pentiums, induces different semantics, because the underlying transistor layouts are different.

The reality is that we draw the line ("semantic domain") where it stops helping our understanding of a program's purpose.  Granted it can be fuzzy, but most programmers think about questions like: "Does the loop in the program above execute at all? If it executes, does it terminate? What is the value of the variable after the while loop?"

http://www.sil.org/linguistics/GlossaryOfLinguisticTerms/WhatIsSemantics.htm
http://www.cs.du.edu/~ramki/courses/3351/2002Autumn/notes/Lectures4and5.pdf
http://www4.informatik.tu-muenchen.de/papers/RUM98a.ps.gz

Mark


November 11, 2002
"Mark Evans" <Mark_member@pathlink.com> wrote in message news:aqmr8u$21dp$1@digitaldaemon.com...
> Walter says...
> >Bendable semantics would be things like the size of a pointer, the size
> >of an extended float value, and byte ordering.
> That stuff is not semantics, just implementation detail.  So I would
> merely repeat, "As Mike says, distinguish the semantics from the
> implementation."

Consider that Java specifies these things (at least the byte ordering <g>).


> No language, not even C or D, asks the programmer to state what size
> pointer he wants, what size extended float, or what (native) byte
> ordering.  These details are not language semantics because no expression
> changes them.  You can change them all around by porting to another
> platform, but that program will retain identical semantic content.
> Meaning, you run the program, and identical inputs produce identical
> outputs.  The underlying representation details of inputs and outputs
> have nothing to do with semantics per se.

I have plenty of experience with multiple pointer sizes, programmer-specified pointer sizes, and code breaking when pointer sizes are assumed. <g> Unless you carefully construct your code to be independent of pointer size, it can and will break when ported. Related things will break too, such as: what integral type can hold an offset to a pointer?
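
A minimal C sketch of the offset problem; the int cast is fine on a 32-bit target and silently lossy where pointers are wider, while ptrdiff_t is the type C itself provides for exactly this:

    #include <stdio.h>
    #include <stddef.h>   /* ptrdiff_t */

    int main(void)
    {
        char buf[64];
        char *lo = buf, *hi = buf + 40;

        /* Fragile: assumes any pointer difference fits in an int. */
        int fragile = (int)(hi - lo);

        /* Portable: ptrdiff_t is defined to hold the difference of
           two pointers into the same object, whatever their size. */
        ptrdiff_t portable = hi - lo;

        printf("%d %ld\n", fragile, (long)portable);
        return 0;
    }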


> Semantics is about the meaning of expressions.  Obviously one has to draw
> a line somewhere.  To carry this implementation confusion to its bitter
> end, we must include, under the heading "semantics," all voltage levels
> and current flows through every CPU transistor.  By this definition the
> same program, running on different Pentiums, induces different semantics,
> because the underlying transistor layouts are different.

Java is a language that attempts to have no bendable semantics. It succeeds in some areas, but fails in others (notably on timing issues between threads, when gc finalization happens, etc.) Having no bendable semantics is necessary to deliver on the "write once, run everywhere" paradigm. With C code, we accept the notion that a little bit of debugging and tweaking of source will be necessary for each port to deal with inadvertent reliance on bendable semantics.

> The reality is that we draw the line ("semantic domain") where it stops
> helping our understanding of a program's purpose.  Granted it can be
> fuzzy, but most programmers think about questions like: "Does the loop in
> the program above execute at all? If it executes, does it terminate? What
> is the value of the variable after the while loop?"
>
> http://www.sil.org/linguistics/GlossaryOfLinguisticTerms/WhatIsSemantics.htm
> http://www.cs.du.edu/~ramki/courses/3351/2002Autumn/notes/Lectures4and5.pdf
> http://www4.informatik.tu-muenchen.de/papers/RUM98a.ps.gz

In some languages, we draw the line on semantic specification when it would put an intolerable burden on the resulting code - for example, attempting to emulate a 32 bit flat pointer model in 16 bit C code is just not worth it (some people did try it). This means, for example, that a garbage collector written for a 32 bit flat memory model will have to be scrapped and completely redone for the "far" memory model on the PC.


November 16, 2002
Walter says...
>
>> "...distinguish the semantics from the implementation."
>
>Consider that Java specifies these things
>(at least the byte ordering <g>).

Java byte ordering doesn't change during the execution of a program!!!  This is just a configuration flag for the implementation.

Semantics is stuff that makes one program logically different from another. Whether I compute x=2+2 in LSB or MSB byte-ordering is irrelevant to the logical result.  The output is still x=4 in whatever representation.  That is the semantic meaning of the program.
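
In code, the representation varies while the meaning does not (a minimal C sketch):

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        int x = 2 + 2;                  /* the semantics: x is 4 */
        unsigned char repr[sizeof(int)];

        memcpy(repr, &x, sizeof(int));  /* the representation */
        /* repr[0] is 0x04 on a little-endian host and 0x00 on a
           big-endian one, but x is 4 either way; only the byte
           inspection peeks below the semantic level. */
        printf("repr[0] = %02x, x = %d\n", repr[0], x);
        return 0;
    }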


>I have plenty of experience with multiple pointer sizes, programmer specified pointer sizes, and code breaking happening when pointer sizes are assumed. <g>

You've spent years doing 16 and 32 bit compilers in a world dominated by Microsoftisms, so take a step back and think through that fog. <g>

Programmers should not assume what languages don't support.  That is what we call "bad code."  Even C offers sizeof(); for example sizeof(void*) would be "good code" entailing no assumption.  Get a better programmer!

Win16 versus Win32 was a Microsoftism, not a language feature of C.  Even at that, programs were either all 16-bit or all 32-bit (though there was a transition period) and did not care about pointer size.

It is possible to write good C code without knowing the byte ordering, pointer size, or anything of the sort.  That is why we have compilers to deal with such mundane nonsense -- instead of writing assembly.
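
For example, a fixed wire format can be produced with shifts alone, never asking the host for its native ordering (a minimal C sketch):

    /* Emit a 32-bit value in big-endian wire order on any host; the
       shifts operate on the value, not on its stored bytes, so the
       result is identical whatever the native byte ordering. */
    void put_u32(unsigned char out[4], unsigned long v)
    {
        out[0] = (unsigned char)(v >> 24);
        out[1] = (unsigned char)(v >> 16);
        out[2] = (unsigned char)(v >> 8);
        out[3] = (unsigned char)(v);
    }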


>Unless you carefully construct your code to be independent of pointer size

Carefully?  Almost all code is independent by default.  Code should be careless about such things, because they are compiler implementation issues.  That's why we have compilers.


>it can and will break when ported. Related things will break, too, such as what integral type can hold an offset to a pointer?

It will break when ported for reasons having little to do with pointer sizes, and much to do with OS issues (which are *not* language issues).



>Java is a language that attempts to have no bendable semantics.

On the contrary, by your definition Java semantics do backflips and change colors.  Presumably you include the VM implementation in your definition of semantics because that is what determines byte ordering, pointer sizes, etc. Byte orderings change, pointer sizes change, and everything still works in Java.

By my definition:  Java has solid semantics, and the implementations do backflips.  Same semantics, varying implementations.  That's true cross-platform code.


>In some languages, we draw the line on semantic specification when it would put an intolerable burden on the resulting code - for example, attempting to emulate a 32 bit flat pointer model in 16 bit C code is just not worth it (some people did try it). This means, for example, that a garbage collector written for a 32 bit flat memory model will have to be scrapped and completely redone for the "far" memory model on the PC.

The stated problem is to emulate one implementation of a language using a different implementation, a rather artificial corner case having nothing to do with semantics, and everything to do with implementation details that most people don't care about.  The C language was not designed for such trickery. You'll have to use assembly for that.

Thanks Walter,
Mark

