February 17, 2012
Re: Why is there no or or and ?
On 2/16/12 11:47 PM, F i L wrote:
> I would use them over '||' and '&&' for the reasons bearophile gave.
> Highlighted as keywords, they're easily set apart, easier to type, and
> more distinguished... then again if I had my way I'd remove the '('/')'
> brackets, ending marks, and auto keyword;

I figured I won't be able to unread the rest, so I stopped here :o).

Andrei
February 17, 2012
Re: Why is there no or or and ?
On 02/16/2012 08:53 PM, bearophile wrote:
> Caligo:
>
>> possible enhancement request?  or is there a good reason it is not in
>> the language?
>
> I have asked for them, but Walter doesn't want them; he thinks C/C++ programmers will not use them... :-( Despite D != C/C++.
>
> ----------------
>
> Jonathan M Davis:
>
>> And I'm actually mildly shocked that anyone (at least any programmer) would
>> think that "or" and "and" were more readable. The fact that operators aren't
>> words is a _major_ boon to code readability.
>
> This is very, very wrong. Of course "or" and "and" are more readable. When you read "and", it's immediately apparent that it relates to the logical (or bitwise) AND operation, far more than with meaningless symbols that have nothing to do with "AND".

'and' & 'or' are just patches of meaningless squiggles with no more or
less meaning than '&&' and '||' have. They have the meaning we impart to
them. Personally, I associate '||' with the idea of disjunction just as
much as I do 'or'.

Having the operators and the keywords/identifiers drawn from different
sets of characters, IMHO, makes things easier to read.

>
> "or" and "and" are about as long as those symbols in char count, quicker to write because they are lowercase letters instead of symbols, and they are much simpler told apart from bitwise&  |. This avoids some bugs where people use "&&" where they want to use"&" or the other way around. Such bugs are so common that D have had to introduce one or two rules to help avoid them.
>
> Python got this waaaaay much better than D. Using "&" for (uncommon, in Python) binary ops, and "and" for the common logic boolean operation.

In a debate about readability, I don't give much weight to an argument 
that holds up Python as an ideal. Semantically significant whitespace? 
Are you kidding me?
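
The &/&& mix-up bearophile mentions is real enough, though. A rough D 
sketch of the usual shape of that bug (the flag names are made up for 
illustration):

    enum READ  = 0x1;   // hypothetical permission bits
    enum WRITE = 0x2;

    bool canWrite(int flags)
    {
        // Bug: `flags && WRITE` only asks "are both nonzero?", so it is
        // true for any nonzero flags value, whether or not WRITE is set.
        // return flags && WRITE;

        return (flags & WRITE) != 0;    // intended bitwise test
    }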
February 17, 2012
Re: Why is there no or or and ?
On Friday, February 17, 2012 06:57:48 F i L wrote:
> I knew someone was going to say that >:{

Well, you _were_ asking for it. :)

But I do honestly think that it's ugly. Obviously, different people have 
different ideas of what a language's syntax should look like - frequently 
strongly influenced by what language or languages they used first. Personally, I 
_like_ the C style syntax and do not understand why people want to get rid of 
stuff like braces or semicolons (you didn't get rid of braces in your suggested 
syntax, but it's a common suggestion for people who don't like C style 
syntax). Overall, I think that D's syntax is very good.

- Jonathan M Davis
February 17, 2012
Re: Why is there no or or and ?
On 02/16/2012 09:16 PM, Nick Sabalausky wrote:
> "Jonathan M Davis"<jmdavisProg@gmx.com>  wrote in message
> news:mailman.450.1329455016.20196.digitalmars-d@puremagic.com...
>>
>> Seriously?&&  and || are _way_ more readible, because they're obviously
>> not
>> functions or variables. It's immediately obvious what the operators are
>> when
>> scanning code. That's not the case when the operators are words instead of
>> symbols. I'm certain that you'd have quite a few programmers up in arms if
>> you
>> tried to change&&  to "and" and || to "or." And having multiple operators
>> which do exactly the same thing is a horrible idea which reduces code
>> readibility. So, even adding them as alternate options is a really bad
>> idea
>> IMHO.
>>
>> I'm surprised that anyone would think that and was better than&&.
>>
>
> This is why I think people are nuts when they claim that english-like
> VB-style syntax is more readable than C-style.
>
> (Yea, to a grandmother with zero programming experience english-like
> languages are more readable. For a programmer it's worse becase code !=
> english.)

Any language that is designed to be easy for amateurs to use will be 
used by amateurs, and only by amateurs.

Yes, avoid making the language unnecessarily hard for beginners, but 
don't in any way compromise the language to do so.
February 17, 2012
Re: Why is there no or or and ?
On Fri, Feb 17, 2012 at 06:47:20AM +0100, F i L wrote:
> I would use them over '||' and '&&' for the reasons bearophile gave.
> Highlighted as keywords, they're easily set apart, easier to type,
> and more distinguished... then again if I had my way I'd remove the
> '('/')' brackets, ending marks, and auto keyword; switch the
> definition name-type placement and change if/else/return/contract
> syntax...

Well, if you're going to reinvent the language syntax, I'd like to
replace:

	=	with	:=
	==	with	=

These two are the most annoying syntax inherited from C, IMNSHO. I mean,
mathematically speaking, = is equality, not assignment. Traditionally :=
has been used for assignment; so why mix them up? Besides, what on earth
is == anyway? Equal-equal? It makes no sense. And even worse, languages
like Javascript that copied C's lousy choice of equality operator made
it worse by introducing ===, which is both nonsensical in appearance and
semantically a symptom of language maldesign.

Next on the list is, of course:

	&&	with	and	(or perhaps &)
	||	with	or	(or perhaps even |)

The symbol '&' is commonly used to mean 'and', such as "John & Wiley's".
So why the && stutter? Bitwise operations aren't used very much anyway,
so they shouldn't be hogging single-character operators. Let bitwise AND
be &&, and I'd be OK with that. But C has gotten it the wrong way round.

Similarly '|' *has* been used traditionally to separate alternatives,
such as in BNF notation, so there's no reason for that silly || stutter.
Bitwise OR isn't used very often anyway, so if anything, | should be
logical OR, and I suppose it's OK for || to be bitwise OR. Again, C has it
the wrong way round.

But more importantly:

	^^	with	^
	^	with something else altogether

I mean, c'mon. Everybody knows ^ means superscript, that is,
exponentiation. So why waste such a convenient symbol on bitwise XOR,
which is only rarely used anyway?! It should simply be called 'xor' at
best.  Nobody who hasn't learned C (or its derivatives) knows what '^'
means (in C) anyway.
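
To be fair, D already went halfway here: it has ^^ for exponentiation
while keeping ^ for XOR. A quick illustration (plain D, nothing
hypothetical):

	unittest
	{
	    auto x = 2 ^^ 10;   // exponentiation
	    auto y = 6 ^ 3;     // bitwise XOR: 0b110 ^ 0b011 == 0b101
	    assert(x == 1024 && y == 5);
	}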

And then:

	!	with	not

Everyone knows ! means exclamation mark, or factorial. Having it also
mean logical NOT is just needlessly confusing. What's wrong with 'not'?
Or, since we have Unicode, what about ¬? Much clearer.

As for bitwise NOT, '~' is about the most counterintuitive symbol for
such a thing. My presumptuous guess is that Kernighan ran out of symbols
on the keyboard for operators, so he resorted to ~. The symbol '~'
should've been reserved for an "approximately equal" operator, useful in
comparing floating-point numbers (which as we know usually shouldn't be
compared with equality due to roundoff errors), like this:

	if (a ~ b) { ... }
	
rather than today's baroque dance of:

	if (fabs(b-a) < EPSILON) { ... }
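
In D you could at least bury that dance in a tiny helper. A rough
sketch (approxEq and its epsilon default are made up here, and a
serious version would want a relative tolerance as well):

	import std.math : fabs;

	// Naive absolute-epsilon comparison; fine as a sketch, not for
	// values of wildly different magnitudes.
	bool approxEq(double a, double b, double eps = 1e-9)
	{
	    return fabs(a - b) < eps;
	}

	unittest
	{
	    assert(approxEq(0.1 + 0.2, 0.3));
	    assert(!approxEq(1.0, 1.001));
	}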

And what about:

	.	with	: or ;

OK. The symbol '.' is supposed to be used for the end of a sentence. At
least, so we were told in grade school. In the case of programming, it
should denote the end of a statement. So why is it that ';' is used to
end statements, and '.' to access struct/class members? It seems so
bass-ackwards. A semicolon (or a colon) is much more suitable for what
amounts to a name composed of parts (module:object:property), because
they signify partial stop, implying there's more to come. The period (or
full-stop for you brits) '.' should be used to *end* statements, not to
*continue* a multi-part name.

But who am I to speak out against more than four decades of historical
accidents, right? I think I'll shut up now.


T

-- 
If you want to solve a problem, you need to address its root cause, not
just its symptoms. Otherwise it's like treating cancer with Tylenol...
February 17, 2012
Re: Why is there no or or and ?
Andrei Alexandrescu wrote:
> I figured I won't be able to unread the rest, so I stopped here 
> :o).

Well... my syntax must require a finer taste </posh> ;-p


Jonathan M Davis wrote:
> On Friday, February 17, 2012 06:57:48 F i L wrote:
>> I knew someone was going to say that >:{
>
> Well, you _were_ asking for it. :)
>
> But I do honestly think that it's ugly. Obviously, different 
> people have different ideas of what a language's syntax should 
> look like - frequently strongly influenced by what language or 
> languages they used first. Personally, I _like_ the C style 
> syntax and do not understand why people want to get rid of 
> stuff like braces or semicolons (you didn't get rid of braces 
> in your suggested syntax, but it's a common suggestion for 
> people who don't like C style syntax). Overall, I think that 
> D's syntax is very good.

I started on C++ actually (ahh, CodeWarrior :)) and still find the
C (read D/C#) syntax to be the most readable overall, though,
after navigating the myriad of syntaxes and styles of web design,
I'm not too picky. But my syntax does have purpose behind its
brilliance! For instance, all identifier names are placed at the
beginning of lines, which eliminates eyeballing each line to find
its definition and keeps accessor keywords (static, public, etc.)
from getting in the way of variable grouping. 'case' is used
instead of 'if' to keep short, like-kind conditions aligned.
'else' can be used as 'else if' for the same reason. Contracts
are kept from bumping up against the definitions (important when
identifiers are first), and ending marks are largely unneeded in
the first place.

To eliminate Go's required "{" at the end of body definitions, I 
would make each "free scope" require a scope keyword. This might 
even be useful for something like functions derived from functions 
(not thoroughly thought out):

    foo( age:int )
    {
        before: scope {}

        // foo code

        after: scope
        {
            // default after code
        }
    }

    bar( name:text, age:int ): foo( age )
    {
        // inherited foo code

        scope( after )
        {
            // override after code
        }
    }

I know this is what polymorphism is for, but I think because 
object methods are conceptually "random access" and scopes are 
"in sequence" there might be a (any?) benefit to static function 
inheritance. Maybe in combination with structs or UFCS? Idk, 
maybe I'm just crazy :)
February 17, 2012
Re: Why is there no or or and ?
"F i L" <witte2008@gmail.com> wrote in message 
news:simejelbyihexcsbkoyl@forum.dlang.org...
>I would use them over '||' and '&&' for the reasons bearophile gave. 
>Highlighted as keywords, they're easily set apart, easier to type, and 
>more distinguished... then again if I had my way I'd remove the '('/')' 
>brackets, ending marks, and auto keyword; switch the definition name-type 
>placement and change if/else/return/contract syntax...
>
>     foo( a, b: float ): int

In other languages, I can live with JS-style "var:Type" but I've never 
really liked it. Just seems totally backwards to me:

1. When I declare a variable, I normally know the type I want before I know 
what to name it, so just typing it in is backwards.

2. With function definitions, why is the return type so ridiculously far 
away from the function name? Should be "foo:int( a, b: float )". Besides, 
when you call the func and assign the return value to a variable, the return 
value is going to the left, not the right. In C-style, return values/types 
move "left". In JS-style, it's all willy-nilly.

3. Makes it harder to distinguish declarations from assignments at a glance. 
You have to look in the middle of the statement to see what the heck it is. 
With C-style you only have to look at the beginning (which are conveniently 
all lined up): Starts with a variable? Assignment. Starts with a type or 
attribute? Declaration. Starts with colored text? *Definitely* declaration. 
Don't see why languages keep trying to marginalize the idea of declarations.

4. Initializers are just downright goofy:

a:int = 5;

Looks like it's assigning 5 to "int" instead of to "a", which is completely 
nonsensical.
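
For what it's worth, the distinction in point 3 reads like this in plain
D (ordinary code, nothing invented):

    void example()
    {
        int a = 5;        // declaration: the line starts with a type
        a = 6;            // assignment:  the line starts with a variable
        auto b = a + 1;   // declaration: still obvious from the leading keyword
    }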
February 17, 2012
Re: Why is there no or or and ?
"H. S. Teoh" <hsteoh@quickfur.ath.cx> wrote in message 
news:mailman.460.1329459948.20196.digitalmars-d@puremagic.com...
> On Fri, Feb 17, 2012 at 06:47:20AM +0100, F i L wrote:
>> I would use them over '||' and '&&' for the reasons bearophile gave.
> Highlighted as keywords, they're easily set apart, easier to type,
> and more distinguished... then again if I had my way I'd remove the
> '('/')' brackets, ending marks, and auto keyword; switch the
>> definition name-type placement and change if/else/return/contract
>> syntax...
>
> Well, if you're going to reinvent the language syntax, I'd like to
> replace:
>
> = with :=
> == with =
>
> These two are the most annoying syntax inherited from C, IMNSHO. I mean,
> mathematically speaking, = is equality, not assignment. Traditionally :=
> has been used for assignment; so why mix them up? Besides, what on earth
> is == anyway? Equal-equal? It makes no sense. And even worse, languages
> like Javascript that copied C's lousy choice of equality operator made
> it worse by introducing ===, which is both nonsensical in appearance and
> semantically a symptom of language maldesign.
>
> Next on the list is, of course:
>
> && with and (or perhaps &)
> || with or (or perhaps even |)
>
> The symbol '&' is commonly used to mean 'and', such as "John & Wiley's".
> So why the && stutter? Bitwise operations aren't used very much anyway,
> so they shouldn't be hogging single-character operators. Let bitwise AND
> be &&, and I'd be OK with that. But C has gotten it the wrong way round.
>
> Similarly '|' *has* been used traditionally to separate alternatives,
> such as in BNF notation, so there's no reason for that silly || stutter.
> Bitwise OR isn't used very often anyway, so if anything, | should be
> logical OR, and I suppose it's OK for || to be bitwise OR. Again, C has it
> the wrong way round.
>
> But more importantly:
>
> ^^ with ^
> ^ with something else altogether
>
> I mean, c'mon. Everybody knows ^ means superscript, that is,
> exponentiation. So why waste such a convenient symbol on bitwise XOR,
> which is only rarely used anyway?! It should simply be called 'xor' at
> best.  Nobody who hasn't learned C (or its derivatives) knows what '^'
> means (in C) anyway.
>
> And then:
>
> ! with not
>
> Everyone knows ! means exclamation mark, or factorial. Having it also
> mean logical NOT is just needlessly confusing. What's wrong with 'not'?
> Or, since we have Unicode, what about ¬? Much clearer.
>
> As for bitwise NOT, '~' is about the most counterintuitive symbol for
> such a thing. My presumptuous guess is that Kernighan ran out of symbols
> on the keyboard for operators, so he resorted to ~. The symbol '~'
> should've been reserved for an "approximately equal" operator, useful in
> comparing floating-point numbers (which as we know usually shouldn't be
> compared with equality due to roundoff errors), like this:
>
> if (a ~ b) { ... }
>
> rather than today's baroque dance of:
>
> if (fabs(b-a) < EPSILON) { ... }
>
> And what about:
>
> . with : or ;
>
> OK. The symbol '.' is supposed to be used for the end of a sentence. At
> least, so we were told in grade school. In the case of programming, it
> should denote the end of a statement. So why is it that ';' is used to
> end statements, and '.' to access struct/class members? It seems so
> bass-ackwards. A semicolon (or a colon) is much more suitable for what
> amounts to a name composed of parts (module:object:property), because
> they signify partial stop, implying there's more to come. The period (or
> full-stop for you brits) '.' should be used to *end* statements, not to
> *continue* a multi-part name.
>
> But who am I to speak out against more than four decades of historical
> accidents, right? I think I'll shut up now.
>

Meh.

All of the syntaxes you're advocating are every bit as arbitrary as the ones 
you're against. So what if there's some other convention used in a 
completely different discipline? Code isn't english, and code isn't math: 
The needs, use cases, etc. are all totally different, so it makes sense that 
a different set of conventions would be much more appropriate.

These arguments remind me of a non-programmer I once talked to who was 
complaining that different languages don't all use the same syntax for 
comments. My reaction was: 1. Who the hell cares? 2. They're *different* 
languages, why can't they *be* different?
February 17, 2012
Re: Why is there no or or and ?
On Friday, 17 February 2012 at 06:25:49 UTC, H. S. Teoh wrote:
> On Fri, Feb 17, 2012 at 06:47:20AM +0100, F i L wrote:
>> I would use them over '||' and '&&' for the reasons bearophile gave.
>> Highlighted as keywords, they're easily set apart, easier to type,
>> and more distinguished... then again if I had my way I'd remove the
>> '('/')' brackets, ending marks, and auto keyword; switch the
>> definition name-type placement and change if/else/return/contract
>> syntax...
>
> Well, if you're going to reinvent the language syntax, I'd like to
> replace:
>
> 	=	with	:=
> 	==	with	=

I would agree with this, only there should be a distinction
between assignment and declaration, which in my syntax is ':'.
Maybe the keyword 'is' could apply to runtime conditions... that
might go nicely with the 'not' statement.


> These two are the most annoying syntax inherited from C, IMNSHO. I mean,
> mathematically speaking, = is equality, not assignment. Traditionally :=
> has been used for assignment; so why mix them up? Besides, what on earth
> is == anyway? Equal-equal? It makes no sense. And even worse, languages
> like Javascript that copied C's lousy choice of equality operator made
> it worse by introducing ===, which is both nonsensical in appearance and
> semantically a symptom of language maldesign.
>
> Next on the list is, of course:
>
> 	&&	with	and	(or perhaps &)
> 	||	with	or	(or perhaps even |)
>
> The symbol '&' is commonly used to mean 'and', such as "John & Wiley's".
> So why the && stutter? Bitwise operations aren't used very much anyway,
> so they shouldn't be hogging single-character operators. Let bitwise AND
> be &&, and I'd be OK with that. But C has gotten it the wrong way round.
>
> Similarly '|' *has* been used traditionally to separate alternatives,
> such as in BNF notation, so there's no reason for that silly || stutter.
> Bitwise OR isn't used very often anyway, so if anything, | should be
> logical OR, and I suppose it's OK for || to be bitwise OR. Again, C has
> it the wrong way round.

Agreed. Though '|' is used to accumulate bit flags, I guess
"flag1 || flag2 || etc" isn't so bad, especially since, as you
said, those situations aren't used nearly as much as conditional
OR. Still, I think the best option would be to simply use the
keywords and/or and leave &/| as bitwise operations.
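
Worth noting, though, that in D as it stands || collapses to a bool, so
accumulating flag bits still needs |. A tiny sketch with made-up flag
values:

    unittest
    {
        enum READ = 0x1, WRITE = 0x2;

        auto mask = READ | WRITE;    // 0x3: both bits preserved
        auto oops = READ || WRITE;   // true: just a bool, the bits are gone

        assert(mask == 3 && oops == true);
    }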


> But more importantly:
>
> 	^^	with	^
> 	^	with something else altogether
>
> I mean, c'mon. Everybody knows ^ means superscript, that is,
> exponentiation. So why waste such a convenient symbol on bitwise XOR,
> which is only rarely used anyway?! It should simply be called 'xor' at
> best. Nobody who hasn't learned C (or its derivatives) knows what '^'
> means (in C) anyway.
>
> And then:
>
> 	!	with	not
>
> Everyone knows ! means exclamation mark, or factorial. Having it also
> mean logical NOT is just needlessly confusing. What's wrong with 'not'?
> Or, since we have Unicode, what about ¬? Much clearer.
>
> As for bitwise NOT, '~' is about the most counterintuitive symbol for
> such a thing. My presumptuous guess is that Kernighan ran out of symbols
> on the keyboard for operators, so he resorted to ~. The symbol '~'
> should've been reserved for an "approximately equal" operator, useful in
> comparing floating-point numbers (which as we know usually shouldn't be
> compared with equality due to roundoff errors), like this:
>
> 	if (a ~ b) { ... }
> 	
> rather than today's baroque dance of:
>
> 	if (fabs(b-a) < EPSILON) { ... }

Yep! Though I like D's '~' as an append operator for arrays, I'm
not sure this wouldn't work better:

    a, b: [1, 2, 3, 4, 5]

    a += b[2]    // appends b[2] to a
    a[] += b[2]  // adds b[2]'s value to all of a

Seeing as how you're right that '~' means "about" in math.
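
For comparison, current D spells those two operations like this (real
operators; the surrounding test is just to show the results):

    unittest
    {
        int[] a = [1, 2, 3, 4, 5];
        int[] b = [1, 2, 3, 4, 5];

        a ~= b[2];                      // append one element
        assert(a == [1, 2, 3, 4, 5, 3]);

        a[] += b[2];                    // add b[2] to every element
        assert(a == [4, 5, 6, 7, 8, 6]);
    }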


> And what about:
>
> 	.	with	: or ;
> 
> OK. The symbol '.' is supposed to be used for the end of a sentence. At
> least, so we were told in grade school. In the case of programming, it
> should denote the end of a statement. So why is it that ';' is used to
> end statements, and '.' to access struct/class members? It seems so
> bass-ackwards. A semicolon (or a colon) is much more suitable for what
> amounts to a name composed of parts (module:object:property), because
> they signify partial stop, implying there's more to come. The period (or
> full-stop for you brits) '.' should be used to *end* statements, not to
> *continue* a multi-part name.

I don't think lines need ending marks at all. So, seeing as how 
':' takes two key presses and '.' only takes one, I'd opt to keep 
that as the default member-access operator.


> But who am I to speak out against more than four decades of historical
> accidents, right? I think I'll shut up now.

Nothing wrong with being creative ;-) Even if we know these 
changes will most likely never be used. I've been experimenting 
with LLVM to write a proof-of-concept for Tuple syntax, language 
State-objects, and a modularized compiler designed to also be 
an IDE parser. Just a simple test, obviously, but I'm using these 
syntax concepts. Thanks for the input.
February 17, 2012
Re: Why is there no or or and ?
> All of the syntaxes you're advocating are every bit as 
> arbitrary as the ones you're against.

Programming is logic largely based around math. Seeing as how
we're all educated with mathematical symbols as children, a
language design which reflects what is most familiar will be the
easiest to understand initially. Less friction means more
productivity.

Though, to be honest, I doubt we'll all still be designing
applications in text-only editors, even fancy ones, in the next
10-15 years. Software design is very modular, and even arbitrary
logic tools could be better at presenting this data. Simple
things like code completion have gone a long way toward flattening
the learning curve, and that can only get better when visual and
audio logic can be manipulated in like fashion.