June 09, 2005
On Thu, 09 Jun 2005 00:40:19 -0700, Unknown W. Brackets wrote:

> The point is it would still have to analyze it.

Yeah, whatever. Look, I'm over it.

Personally I don't think there is a solution, so I'm resigned to using some other tool to develop with from now on. If someone comes up with a great way of achieving the 'grand' aim of only maintaining one set of source files per application, just let us know. Currently, D alone is not enough; it falls short of ever being able to do this, unlike C and C++. In short, I need an additional tool so I can reduce my development costs.

-- 
Derek Parnell
Melbourne, Australia
9/06/2005 7:44:59 PM
June 09, 2005
Hasan Aljudy wrote:
<snip>
> And, what about this example:
> arr["key"] = arr["key"];
> or this:
> arr["key"] = arr["key"] + 1;
> 
> What would happen if the key "key" didn't exist? would it be created as in
> arr["key"] = value;
> or would it cause an error as in
> if( arr["key"] ) ...

The latter.  The rvalue is evaluated and then assigned to the lvalue.  So at the point when the rvalue is evaluated, the key isn't in the array yet, and the lookup throws an ArrayBoundsError.

> I don't really care if the questions were answered ..
> I just wanted to point out the confusion.
> 
> Why shouldn't arr["key"] create the key "key"?
> 
> I guess this was mainly done to avoid creating a new key when the user only wants to check if it existed
> if( arr["key"] ) //shouldn't create the key "key"
> but isn't this solved with the "in" keyword? so the above line becomes deprecated, and the new way is:
> if( "key" in arr )

No, that has always been the way.

The two statements had different meanings then and have different meanings now.

Previously,

    if (arr["key"])

would succeed if the value of arr["key"] has a 'true' boolean interpretation.  If arr["key"] doesn't exist prior to the lookup, it would succeed if the value type's .init has a 'true' boolean interpretation.  At least that's the way it was supposed to behave - there seems to be a bug as I test it in GDC 0.11.

On the other hand,

    if ("key" in arr)

succeeds iff the key is present in the AA.

Now,

    if (arr["key"])

would have the same effect if the key is in the array, or throw an ArrayBoundsError if it isn't.  OTOH,

    if ("key" in arr)

behaves in the same way as it did before.

Stewart.

-- 
My e-mail is valid but not my primary mailbox.  Please keep replies on the 'group where everyone may benefit.
June 09, 2005
"Walter" <newshound@digitalmars.com> wrote in message news:d88dou$961$3@digitaldaemon.com...
> AAIIIIEEEEEEE !!!!!!

Lol, I take it that's a "no" ;)


June 09, 2005
> I was actually using all the now deprecated features.
> Guess how painful it was to upgrade to dmd 0.126.
> Walter, could you please *not* roll out too many major changes in one
> release?

I actually prefer upgrading my code once rather than stringing it out over several releases. I've been upgrading to 126, and the error messages make it obvious what has changed, so it isn't that bad IMO.


June 09, 2005
"Derek Parnell" <derek@psych.ward> wrote in message news:1he6yst5qhjp2.9hzslvyydu8u$.dlg@40tude.net...
> On Tue, 7 Jun 2005 22:17:50 -0700, Walter wrote:
>
>> Added inner classes.
>>
>> http://www.digitalmars.com/d/changelog.html
>
> Is there a reason for cstream.d having public imports? I think private
> ones
> will cause less potential hiccups.

Since cstream is designed to make mixing std.c.stdio and std.stream easy, any hiccup should preferably be fixed rather than avoided. The behavior is similar to how std.stdio publicly imports std.c.stdio. Are there any issues in particular you are worried about?


June 09, 2005
Derek Parnell wrote:
<snip>
> The problem is, that not all compilers are at the same release level, or
> will share the same features. So if one wants one's product to be
> supported with multiple compilers, how does one code their source files?

By using deprecated features.

AIUI the whole point of the -d option is to enable legacy code to work on modern compilers.  I realise you want to modernise your code, but you're rushing into it too quickly.

Code for the lowest common denominator of compilers you want to support.  As that lowest common denominator becomes more modernised, _then_ modernise your code to keep up with it.

Stewart.

-- 
My e-mail is valid but not my primary mailbox.  Please keep replies on the 'group where everyone may benefit.
June 09, 2005
In article <19gr67kb4q1fw.vwbenzrtgmvv$.dlg@40tude.net>, Derek Parnell says...
>
>> And because D is pre-1.0, there's little reason to move slowly with changes.
>
>Okay, so along comes 1.01 with a new feature. How long should I wait for all the other compilers I want to support to catch up? What if some of those never catch up but are still in use? Borland C v5.5 has lots of adherents.

True enough.  But should the development of a language be halted for the sake of backwards compatibility?  Or do you have a suggestion for how this could be resolved another way?  I'm drawing a blank.

>Yes, but irrelevant. How does one support in a *single* source base, the various *incompatible* syntaxes that may exist at *any* given time? Think in general terms and not specific compilers or language features.

For D, my only suggestion would be to ignore version blocks even for syntax checking (assuming they aren't already).  If the version flag isn't set, the block doesn't exist at all.  This doesn't solve the more general problem of new operators and such that can't be quite so easily isolated, but I don't see the language changing much at that level once we hit 1.0.  I'm personally not willing to introduce macros into the language just to solve this particular problem.  I'd rather compiler implementors have some incentive to update their code within a reasonable timeframe.  And when you get down to it, no one says you have to use the new features anyway.  Perhaps not an appealing option (this still bugs me with C++ compiler conformance) but at least it's one we're used to.


Sean


June 09, 2005
In article <d88733$1t9$1@digitaldaemon.com>, Ben Hinkle says...
>
>>> <quote>
>>> Now throws an ArrayBoundsError if accessing an associative array with a
>>> key that is not already in the array. Previously, the key would be added
>>> to the array.
>>> </quote>
>>> Execuse my n00bness, but how do you add keys to the associative array
>>> then?
>>
>> int[char[]] arr;
>>
>>  arr["these"] = 1;
>>  arr["are"] = 3;
>>  arr["new"] = 6;
>>  arr["keys"] = 2;
>
>I'm happy to see Walter was able to also keep the samples/d/wc2.d style of adding when used in an lvalue operator like ++. The word-count code uses dictionary[word]++ to insert if not present.

So insertion just doesn't occur when the AA is treated as an rvalue?  This is good news.  I was worried we wouldn't be able to do this.


Sean


June 09, 2005
In article <tb0gyescbf38.hee66wywoqdw.dlg@40tude.net>, Derek Parnell says...
>
>On Wed, 08 Jun 2005 16:48:14 -0700, Unknown W. Brackets wrote:
>
>> It would still have to count curly braces, at the least, because a literal reading of your suggestion means:
>> 
>> version (asdf)
>> {
>>    if (true)
>>    {
>>       writef("only shown for version=asdf!");
>>    }
>> 
>>    writef("shown always.");
>> } // <-- error.
>
>I said the stuff in between the braces, that is the version statement's braces would be ignored. Thus your example above is equivalent to ...
>
>  version (asdf)
>  {
>     Any text at all can go here (except unmatched braces)
>  }
>
>I think you were trying to say ...
>
> version (asdf)
> {
>    writef("only shown for version=asdf!");
> }
> writef("shown always.");
>
>I know it's not perfect but it could be a starting point for discussion and thought. Though I believe it's a hopeless situation now.

Derek, I'm jumping into the middle of the thread here, please feel free to correct me if I'm way off base here. :)

Part of the problem is that DMD (at least by inspecting the front-end) has a lexer that provides tokens to the parser in a limited read-ahead fashion.  I don't think it would be hard to get the lexer to do a deeper scan (like you propose) ahead of any given parser production, but it would still crumble when it encounters a non-supported token.

IMO, the only time this would fail would be in the case of '$' being introduced as you mentioned earlier.  Even with version(), the older lexer is going to trip on any new tokens should they be introduced.  One way to escape the problem is to make DMD 'preprocess' version blocks as to avoid a full-on lex of them. Somehow, I don't think this is going to fly with Walter. :(

A compromise would be for DMD to produce 'unknown' tokens from the lexer, to let the parser cough up an error upon encountering them.  *That* would solve the problem now and from here on out, since the parser would skip version() branches (with all their possibly unsupported tokens) that it doesn't need to /parse/.  I also think that it would be easier to implement into both DMD and GDC without disrupting the overall compiler design.

Now, will Walter go for it?

- EricAnderton at yahoo
June 09, 2005
Forget everything I said.

I'm looking at the frontend source now.  Derek is right, there's no way to side-step the introduction of a new token (say '@' or '#'), or new variation on an existing production (like 'x = foo ?: bar;') in D without introducing some rudimentary preprocessing behavior.

The problem lies in the fact that all version() branches must be valid D code in order to make their way into the parse tree.  After several semantic passes, the tree is then evaluated, and steers itself down the appropriate version() branches.  Then we get binary output from that last, version()-sensitive, pass.

So the only way to avoid these problems with the existing rules and compiler family is to code to the least-common-denominator of all compilers, which defeats some of the purpose behind version().

