November 12, 2006
Carlos Santander wrote:
> I don't think they're valid concerns (meaning they're subjective, not everyone will have those ideas), but I think we (the D community) just want a way to clearly differentiate both meanings of auto. So, choose any two words: auto/scope, var/auto, def/scoped, foo/bar, and let's move on.

I think the auto/scope is probably the best idea.

> Also keep in mind that your option #5 (auto c = Class()) doesn't seem to be a popular proposal (based on what has been said in the ng).

I'm a bit surprised at that, but the negative reaction to it is pretty clear.
November 12, 2006
Andrey Khropov wrote:
> Well, 'var' is used in C# 3.0 for this purpose (which is obviously a C-family
> language and a popular one)

C# was designed by Anders Hejlsberg, who wrote Pascal compilers and designed Delphi.
November 12, 2006
Walter Bright wrote:
> Sean Kelly wrote:
> 
>> Walter Bright wrote:
>>
>>> The auto storage class currently is a little fuzzy in meaning, it can mean "infer the type" and/or "destruct at end of scope".
>>
>>
>> As Don explained to me, 'auto' is a storage class in D and in C along with 'const' and 'static', so type inference doesn't occur because 'auto' is present so much as because a type is omitted.  The presence of 'auto' merely serves to indicate that the statement is a declaration (since 'auto' is the default storage class and therefore otherwise optional).  Type inference occurs with the other storage classes as well.  I think this is an important distinction because it seems to be a common misconception that 'auto' means 'infer the type of this expression' and that a specific label is necessary for this feature.
> 
> 
> True. Consider that type inference works in these cases:
> 
>     static a = 3;    // a is an int
>     const b = '3';    // b is a char
> 
> So auto doesn't actually ever mean "infer the type", it's just needed because one of the other storage class keywords isn't there.

I think it's okay to "have a word" for "infer the type". Think about it like this: through the years, C folks have used for(int i=0; i<10; ++i) as the default loop construct, never suspecting how shallow their understanding of it really is. But only people with a deeper understanding could write something like for(Node p=list; p; p=p.next) off the top of their head.

Similarly, having "a word" for automatic type inference is okay, and it won't become any more of a hindrance on your way to becoming a top programmer than the similar misconception about for. Once you've understood the language well enough, you'll figure out that it's simply about not specifying a type, with 'auto' just a placeholder in the syntax when no other storage attribute happens to be needed.
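
For example (echoing Walter's cases above - the type is inferred in all three; 'auto' merely fills the empty slot):

    auto a = 3;      // no other keyword, so 'auto' marks the declaration
    static b = 4;    // type still inferred; no 'auto' needed
    const c = '5';   // likewise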

So, it should be no biggie.

---

That would be, IMHO, a better solution than the current one, where people instead run in circles chasing wild geese, and even highly admired D gurus like Sean have to ask Don what the hell is going on here. (I know I should have asked Don!)

THAT IS UNACCEPTABLE. It is imperative that the language be designed so that such massive confusion simply does not arise.

(I know this post may seem self-contradictory at first reading. :-) )
November 12, 2006
Don Clugston wrote:
> Dave wrote:
>> Kyle Furlong wrote:
>>> Chris Miller wrote:
>>>> On Sat, 11 Nov 2006 13:48:00 -0500, Walter Bright <newshound@digitalmars.com> wrote:
>>>>
>>>>> The auto storage class currently is a little fuzzy in meaning, it can mean "infer the type" and/or "destruct at end of scope". The latter only has meaning for class objects, so let's look at the syntax. There are 4 cases:
>>>>>
>>>>
>>>> Choosing from your list I guess "auto" for auto destruction and "infer" for auto type deduction.
>>>>
>>>> However, I tend to prefer "scope" (without parentheses following) for auto destruction, which frees up "auto" for auto type deduction:
>>>>
>>>>    scope Object o = new Object(); // Destructed at end of scope.
>>>>    scope auto q = new Foo(); // Auto type deduction and end-of-scope destruction.
>>>
>>> scope has my vote, it's elegant, as RAII is functionally similar to the scope(x) construct.
>>
>> Vote++.
>>
>> But Walter specifically (and I think purposefully) left out any mention of new keywords. Nonetheless I'd like to see auto deprecated in favor of 'infer'. 'scope' and 'infer' both describe exactly what they are used for. 
> Not so, 'auto' *never* means 'type inference'. It's the absence of a type that means type inference. But the usage of auto to mean both 'RAII' and 'local variable storage class' is really confusing people.
> 

I realize 'auto' is technically the default storage class, but I was suggesting that when both a storage class and a type are absent, 'infer' would be used instead of auto (and not as a storage class).
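
Something like this (hypothetical syntax, of course, since 'infer' is only a proposal):

    infer x = foo.length;    // no storage class and no type: 'infer' marks the declaration
    const y = 3.5;           // a storage class is present, so no extra keyword is needed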

>> 'auto' just seems too much like a deprecated artifact of C.
> 
> Agreed, but I don't know of anything better.
> 
> const infer x = 3.5;
> static infer x = 4.5L;
> 
> are too wordy.
> 

I think so too, but I don't think 'infer' would be necessary in those cases, even for the sake of consistency.

>> Based on previous discussions though, 'auto' is here to stay one way or the other forever. I sincerely don't understand Walter's infatuation with 'auto', but it is what it is.
November 12, 2006
I hope you don't mind a comment from a lurker. :)

I had assumed that there was some more important purpose for type inference, such as that it was useful in generics or something. But if it's just to avoid having to declare types, I wonder if it's really that important to have.

"Walter Bright" <newshound@digitalmars.com> wrote in message news:ej6vb3$2s5m$1@digitaldaemon.com...
>
> What is the type of foo.length? Is it an int? uint? size_t? class Abc? Without type inference, I'd have to go look it up. With type inference, I know I've got the *correct* type for len automatically.
>

However, you don't know what that type is. If you want to know, you still need to go look it up, and that's especially true if you're working on someone else's code. I think that's fine if you're working within an IDE that can tell you the type. When working within Visual Studio, for example, there are multiple ways to find the type of a variable easily: you can hover the mouse over it, or right-click on it and select "Go to Declaration". I'm not a fan of Hungarian notation, and I think tooling like this obviates it; it would also help with discovering the types of inferred variables.

But if you're using some editor like vi, then you may find it tedious and time-consuming to always search for the types of variables. I would be more likely to avoid type inference altogether and simply write the type just for the sake of documenting it.

> Now, suppose I change the type of foo.length. I've got to go through and check *every* use of it to get the types of the len's updated. I'll inevitably miss one and thereby have a bug.

I assume that if you've made an assignment to the wrong variable type, the compiler will catch it (unless there's an implicit cast). At least the compiler errors should help you find all occurrences easily. You'd still need to go through and manually change the type in each occurrence, though. I just feel that making these kinds of changes to code is more the job of development tools. Perhaps the line between language and tools can start to get a little blurry; that's another discussion. Anyway, I just wonder if there is a strong case for having type inference at all.

Jim



November 12, 2006
Jim Hewes wrote:
> I hope you don't mind a comment from a lurker. :)

Not at all.

> I had assumed that there was some more important purpose for type inference, such as that it was useful in generics or something. But if it's just to avoid having to declare types, I wonder if it's really that important to have.

I think I must have presented the case for it very poorly. The goal is to minimize bugs; avoiding having to declare types is a means to that goal, not the end itself.

> "Walter Bright" <newshound@digitalmars.com> wrote in message news:ej6vb3$2s5m$1@digitaldaemon.com...
>> What is the type of foo.length? Is it an int? uint? size_t? class Abc? Without type inference, I'd have to go look it up. With type inference, I know I've got the *correct* type for len automatically.
>>
> 
> However, you don't know what that type is. If you want to know, you still need to go look it up, and that's especially true if you're working on someone else's code. I think that's fine if you're working within an IDE that can tell you the type. When working within Visual Studio, for example, there are multiple ways to find the type of a variable easily: you can hover the mouse over it, or right-click on it and select "Go to Declaration". I'm not a fan of Hungarian notation, and I think tooling like this obviates it; it would also help with discovering the types of inferred variables.

If you're always viewing code by using such a tool, then discovering the type is trivial as you suggest. But even with such a tool available, lots of times you won't be using it to view the code.

And just to make it clear, I am not advocating Hungarian notation in the sense the term normally carries (I do think, however, that such notation used in the manner Joel Spolsky writes about is useful):
http://www.joelonsoftware.com/articles/Wrong.html


> But if you're using some editor like vi, then you may find it tedious and time-consuming to always search for the types of variables. I would be more likely to avoid type inference altogether and simply write the type just for the sake of documenting it.

I'd posit here that if type inference is possible, then the declaration is perhaps the wrong place to document the type.


>> Now, suppose I change the type of foo.length. I've got to go through and check *every* use of it to get the types of the len's updated. I'll inevitably miss one and thereby have a bug.
> 
> I assume that if you've made an assignment to the wrong variable type, the compiler will catch it (unless there's an implicit cast).

That's just the problem. There is a lot of implicit casting going on. Consider .length - it often does mean int, uint, size_t, etc., and there are implicit casts between them. So if you 'document' it as int, then there's a bug if it really should be size_t.
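
For example (a sketch; assume foo is a dynamic array, so foo.length is really a size_t):

    int len = foo.length;     // compiles anyway - the implicit cast hides the mismatch
    auto len2 = foo.length;   // len2 is size_t, the correct type, with no lookup needed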

> At least the compiler errors should help you find all occurrences easily.

Unfortunately, it doesn't. One could counter with "make all implicit casting illegal." Pascal tried that, and in my not-so-humble opinion that was a big reason for the failure of Pascal, and it's the primary reason why I dropped it as soon as I discovered C. Even so, who wants to go through their source, compiler error by compiler error, fixing it? If the work could be cut in half using type inference, why not?

> You'd still need to go through and manually change the type in each occurrence, though. I just feel that making these kinds of changes to code is more the job of development tools. Perhaps the line between language and tools can start to get a little blurry; that's another discussion. Anyway, I just wonder if there is a strong case for having type inference at all.

It is optional. The ability to explicitly type declarations is not going to go away.
November 12, 2006
Walter Bright wrote:
> Bill Baxter wrote:
> 
>> Which is good, since automatic type deduction is basically a typing-saving feature to begin with.
> 
> 
> No, it's way, way more than that. Auto type deduction gets us much closer to the goal that the type of something needs to be specified only once, and then anything that manipulates it infers the right type. This makes code much more robust.
> 
> For example,
> 
>     auto len = foo.length;
> 
> What is the type of foo.length? Is it an int? uint? size_t? class Abc? Without type inference, I'd have to go look it up. With type inference, I know I've got the *correct* type for len automatically.

From a software maintenance and robustness standpoint, I think that cuts both ways.  If I don't know the type, then it's easy to do something wrong with it, like

   auto len = foo.length;
   len = -1;  // oops! len is unsigned!  not what I intended!

From the point of view of someone reading and maintaining code, I hope the only places I see auto are where the type is obvious from the context, like auto foo = new SomeClassWithALongNameIDontWantToTypeTwice.

I think it's bad practice to use auto just because you're too lazy to go figure out what the right type is.  If you don't figure it out when writing it, then every person after you reading or maintaining the code is going to have to go figure it out.

--bb
November 12, 2006
Andrey Khropov wrote:
> Walter Bright wrote:
> 
> 
>>("let" is far worse, as it gives the impression that D is some sort of new
>>Basic language.)
> 
> 
> Forget about Basic, 'let' is used in ML, which pioneered the concept of type
> inference!

Forget about ML, 'let' is used in Lisp, the granddaddy of all functional programming languages!

> And there is also 'def' (from define),
> used in Nemerle for this purpose; in many other languages it indicates a
> function definition.

It's a slightly different meaning -- 'let' in functional programming languages and 'def' in Nemerle define something that's immutable.  But close enough.

> But I'd like to see it short (3-4 characters), otherwise typing 'int' would be
> simpler :).

Indeed.
November 12, 2006
Bill Baxter wrote:
> From a software maintenance and robustness standpoint, I think that cuts both ways.  If I don't know the type, then it's easy to do something wrong with it, like
> 
>    auto len = foo.length;
>    len = -1;  // oops! len is unsigned!  not what I intended!

If you wrote:

	int len = foo.length;

and foo.length was unsigned, isn't that just another sort of bug?

>  From the point of view of someone reading and maintaining code, I hope the only places I see auto are where the type is obvious from the context, like auto foo = new SomeClassWithALongNameIDontWantToTypeTwice.

I see it as much more useful than that.

> I think it's bad practice to use auto just because you're too lazy to go figure out what the right type is.  If you don't figure it out when writing it, then every person after you reading or maintaining the code is going to have to go figure it out.

Not necessarily. Think of it like writing templates that depend on a generic type T and assume certain properties of that type.
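
For instance (a minimal sketch with hypothetical names):

    // max never spells out the type of its arguments; it just assumes T
    // supports comparison - the same way code using auto assumes
    // properties of the inferred type rather than its exact name.
    T max(T)(T a, T b)
    {
        return a > b ? a : b;
    }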
November 12, 2006
Walter Bright wrote:
> Andrey Khropov wrote:
> 
>> Well, 'var' is used in C# 3.0 for this purpose (which is obviously a C-family
>> language and a popular one)
> 
> 
> C# was designed by Anders Hejlsberg, who wrote Pascal compilers and designed Delphi.

Good point.  But have you heard *anyone* complain that they don't want to use C# because it's too much like Pascal?

--bb