July 10, 2016
On Sunday, 10 July 2016 at 11:21:49 UTC, Walter Bright wrote:
> On 7/9/2016 7:44 PM, Ola Fosheim Grøstad wrote:
>> Scheme is a simple functional language which is easy to extend.
>
> If they have to extend it, it isn't Scheme anymore.

You misunderstand the meaning of "extend" with respect to Scheme because of your lack of experience with it. Macros are the way of extending Scheme; you don't need to hack the compiler for that.

From Wikipedia:
---------------
Invocations of macros and procedures bear a close resemblance — both are s-expressions — but they are treated differently. When the compiler encounters an s-expression in the program, it first checks to see if the symbol is defined as a syntactic keyword within the current lexical scope. If so, it then attempts to expand the macro, treating the items in the tail of the s-expression as arguments without compiling code to evaluate them, and this process is repeated recursively until no macro invocations remain. If it is not a syntactic keyword, the compiler compiles code to evaluate the arguments in the tail of the s-expression and then to evaluate the variable represented by the symbol at the head of the s-expression and call it as a procedure with the evaluated tail expressions passed as actual arguments to it.
---------------

For example, an "if expression" is written as follows:

; Returns either settings or an error, depending on the condition
(if (authenticated)
  (load-settings)
  (error "cannot load settings, authentication required"))

Either branch is an expression, and the false branch can be omitted (in which case Scheme's equivalent of "null" is returned instead). If you need a "block", a sequence of expressions, you could write this:

(if (authenticated)
  (begin
    (display "loading settings")
    (load-settings))
  (begin
    (display "something went wrong")
    (error "cannot load settings, authentication required")))

When you specify only the true branch, it's tedious to wrap your sequence in a "begin" expression. But you can write a "when" macro that takes a condition and a sequence of expressions and generates the code for you:

(define-syntax when
  (syntax-rules ()
    ((when pred exp exps ...)
      (if pred (begin exp exps ...)))))

Now you can use it just as an ordinary "if":

(when (authenticated)
  (save-settings)
  (display "Saved settings"))

What about the false-branch-only "if"?

(define-syntax unless
  (syntax-rules ()
    ((unless pred exp exps ...)
      (if (not pred) (begin exp exps ...)))))

(unless (dead)
  (display "walking")
  (walk))

The only syntax Scheme has is S-expressions, which are used to represent both data and code, so there's nothing to be extended in the language itself. You just write a macro that generates the code you want. Macros are effectively AST transformers; it just so happens that in Scheme everything is represented as S-expressions, so the code you write is already the AST.

So if you "extend" Scheme by writing a macro, it's still Scheme. You can think of macros as D string mixins, but without the ugly stringiness.
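To make the analogy concrete, here is a rough D sketch of the "when" macro done with a string mixin (the helper and the names "authenticated" and "saveSettings" are made up for the example):

import std.stdio : writeln;

// Builds the source text of an "if" with a block; evaluated at compile time via CTFE.
string when(string pred, string stmts)
{
    return "if (" ~ pred ~ ") { " ~ stmts ~ " }";
}

bool authenticated() { return true; }
void saveSettings() { writeln("saving settings"); }

void main()
{
    mixin(when("authenticated()",
               q{ saveSettings(); writeln("Saved settings"); }));
}

The mixin assembles raw source text, whereas the Scheme macro operates on the same S-expressions the rest of the program is made of - that is the "ugly stringiness" being contrasted.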
July 10, 2016
On 7/10/2016 7:54 AM, Ola Fosheim Grøstad wrote:
> Ok. Those are syntactic conventions.

You're changing the subject.

> Does not affect the language design
> as such.

And changing the subject again.

Face it, your argument is destroyed :-)


July 10, 2016
On 7/10/2016 11:40 AM, burjui wrote:
> On Sunday, 10 July 2016 at 11:21:49 UTC, Walter Bright wrote:
>> On 7/9/2016 7:44 PM, Ola Fosheim Grøstad wrote:
>>> Scheme is a simple functional language which is easy to extend.
>>
>> If they have to extend it, it isn't Scheme anymore.
>
> You misunderstand the meaning of "extend" in respect to Scheme due to
> your lack of experience with it. Macros are the way of extending Scheme,
> you don't need to hack the compiler for that.

I don't know Scheme, but macros are not really extending the language. The Wikipedia article suggested much more, as in non-portable extensions and multiple dialects, not just macros.

July 10, 2016
On Friday, 8 July 2016 at 19:43:39 UTC, jmh530 wrote:
> On Friday, 8 July 2016 at 18:16:03 UTC, Andrei Alexandrescu wrote:
>>
>> You may well be literally the only person on Earth who dislikes the use of "static" in "static if". -- Andrei
>
> You have to admit that static is used in a lot of different places in D. It doesn't always mean something like compile-time either. For instance, a static member function is not a compile time member function. However, I doubt something like this is going to change, so it doesn't really bother me.
>
> I liked the way that the Sparrow language (from the presentation you posted a few weeks ago) did it. Instead of static if, they use if[ct].

I think it is a serious mistake to use the same word for different concepts.

In the case of 'static', the problem is that it started out meaning 'as at, or pertaining to, compile time', and then got additional meanings.
Therefore, I suggest we change the keyword 'static', as used for compile time, to 'ctime'.
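For illustration, a quick sketch of some of the unrelated meanings 'static' already carries in D (the names are made up):

struct S
{
    static int count;                      // per-type storage, shared by all instances
    static int bump() { return ++count; }  // member function callable without an instance
}

void f()
{
    static int calls;                 // function-local variable with program lifetime
    ++calls;

    static if (size_t.sizeof == 8)    // compile-time branch, unrelated to the above
        enum wordBits = 64;
    else
        enum wordBits = 32;
}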

July 11, 2016
On Sunday, 10 July 2016 at 21:35:16 UTC, Walter Bright wrote:
> On 7/10/2016 7:54 AM, Ola Fosheim Grøstad wrote:
>> Ok. Those are syntactic conventions.
>
> You're changing the subject.

What? Nope, but let's stick to what most people evaluate: the core language. Syntax isn't really the big blocker. Yes, it may be enough to annoy some people, but it is when the core language is different that programmers run into serious problems.

Some examples of somewhat elegant languages that are also useful:

Beta, everything is a pattern or an instance of a pattern.
Self, everything is an object.
Prolog, everything is a horn clause.
Scheme, everything is a list.

All of these are useful languages, but programmers have trouble getting away from the semantic model they have of how programs should be structured. Btw, C++ is increasingly moving towards the Beta model: everything is a class-object (including lambdas), but it is too late for C++ to get anywhere close to elegance.

> Face it, your argument is destroyed :-)

Of course not. Consistency and simplicity do not undermine usefulness. The core language should be simple. That has many advantages and is what most language designers strive for; unfortunately, the understanding of what the core language ought to be often comes too late.

July 11, 2016
On Friday, 8 July 2016 at 19:26:59 UTC, Andrei Alexandrescu wrote:
> On 07/08/2016 02:42 PM, deadalnix wrote:
>> It is meaningless because sometime, you have A and B that are both safe
>> on their own, but doing both is unsafe. In which case A or B need to be
>> banned, but nothing allows to know which one. This isn't a bug, this is
>> a failure to have a principled approach to safety.
>
> What would be a good example? Is there a bug report for it?
>

For instance:

@safe
int foo(int *iPtr) {
	return *iPtr;
}

@safe
int bar(int[] iSlice) {
	return foo(iSlice.ptr);
}

foo assumes that creating an invalid pointer is itself not safe (so it can never receive one), while bar assumes that .ptr is safe because it doesn't access memory. If the slice's length is 0, the combination is not safe.

This is one of those cases where each operation is safe granted some preconditions, but the two violate each other's preconditions, so using both together is unsafe.
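A minimal sketch of how the two pieces combine, assuming the foo/bar definitions above:

void main() @safe
{
    auto a = new int[4];
    bar(a[4 .. 4]); // a valid, empty slice, but its .ptr points one past the
                    // allocation, and foo then dereferences it
}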

>> The position is inconsistent because the dictatorship refuses to
>> compromise on mutually exclusive goals. For instance, @safe is defined
>> as ensuring memory safety. But not against undefined behaviors (in fact
>> Walter promote the use of UB in various situations, for instance when it
>> comes to shared). You CANNOT have undefined behavior that are defined as
>> being memory safe.
>
> I agree with that. What would be a good example? Where is the reference to Walter's promotion of UB in @safe code?
>

I don't have a specific reference to point to right now. However, there have been several instances of "@safe guarantees memory safety, it just doesn't protect against X", where X is undefined behavior most of the time.

July 11, 2016
On 7/11/16 1:50 PM, deadalnix wrote:
> On Friday, 8 July 2016 at 19:26:59 UTC, Andrei Alexandrescu wrote:
>> On 07/08/2016 02:42 PM, deadalnix wrote:
>>> It is meaningless because sometime, you have A and B that are both safe
>>> on their own, but doing both is unsafe. In which case A or B need to be
>>> banned, but nothing allows to know which one. This isn't a bug, this is
>>> a failure to have a principled approach to safety.
>>
>> What would be a good example? Is there a bug report for it?
>>
>
> For instance:
>
> @safe
> int foo(int *iPtr) {
>     return *iPtr;
> }
>
> @safe
> int bar(int[] iSlice) {
>     return foo(iSlice.ptr);
> }
>
> foo assume that creating an invalid pointer is not safe, while bar
> assume that .ptr is safe as it doesn't access memory. If the slice's
> size is 0, that is not safe.

That was reported and is being worked on:

https://github.com/dlang/dmd/pull/5860

-Steve
July 11, 2016
On Saturday, 9 July 2016 at 00:14:34 UTC, Walter Bright wrote:
> On 7/8/2016 2:58 PM, Ola Fosheim Grøstad wrote:
>> On Friday, 8 July 2016 at 21:24:04 UTC, Walter Bright wrote:
>>> On 7/7/2016 5:56 PM, deadalnix wrote:
>>>> While this very true, it is clear that most D's complexity doesn't come from
>>>> there. D's complexity come for the most part from things being completely
>>>> unprincipled and lack of vision.
>>>
>>> All useful computer languages are unprincipled and complex due to a number of
>>> factors:
>>
>> I think this is a very dangerous assumption. And also not true.
>
> Feel free to post a counterexample. All you need is one!
>

Lisp.

>
>> What is true is that it is difficult to gain traction if a language does not
>> look like a copy of a pre-existing and fairly popular language.
>
> I.e. Reason #2:
>
> "what programmers perceive as logical and intuitive is often neither logical nor intuitive to a computer"

That's why we have compiler writers and language designers.

July 11, 2016
On Monday, 11 July 2016 at 18:18:22 UTC, deadalnix wrote:
> Lisp.

Which one?
July 11, 2016
On Saturday, 9 July 2016 at 23:44:07 UTC, H. S. Teoh wrote:
> I find this rather disturbing, actually.  There is a fine line between taking advantage of assert's to elide stuff that the programmer promises will not happen, and eliding something that's defined to be UB and thereby resulting in memory corruption.
>
> [...]
>
>
> T

While I understand how frustrating it looks, there is simply no way around it in practice. For instance, the shift operation on x86 is essentially:

x >> (y & (typeof(x).sizeof * 8 - 1))

But it differs on other platforms. This means that, in practice, the compiler would have to add bounds checks on every shift. The performance impact would be through the roof, and in addition you'd have to specify what to do in case of an out-of-range shift.
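As a rough sketch of what that would mean, every shift whose count isn't statically known would have to turn into something like this (returning 0 for out-of-range counts is just one arbitrary choice of defined behavior):

uint checkedShr(uint x, uint y)
{
    // Hypothetical "defined" semantics: out-of-range counts yield 0.
    return y < uint.sizeof * 8 ? x >> y : 0;
}

That is an extra compare and branch on every such shift, which is exactly the cost being objected to.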

Contrary to popular belief, the compiler does not try to screw you with UB. There is no code of the form "if this is UB, then do this insanely stupid shit". What happens is that algorithm A does not explore the UB case - because it is UB - and just does nothing with it, while algorithm B, on its side, does not check for UB either, but reuses results from A and does something unexpected.

In Andrei's example, the compiler won't say, "fuck this guy, he wrote UB". What will happen is that the range-checking code will conclude that 9 >> something must be smaller than 10. Then the control-flow simplification code will use that range to conclude that the bounds check is always true and replace it with an unconditional branch.
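A hypothetical reconstruction of the kind of code being discussed (not Andrei's actual example):

int[10] table;

int pick(int y)
{
    int i = 9 >> y;      // undefined if y is out of range for the shift
    if (i < 10)          // range analysis: 9 >> anything fits in [0, 9],
        return table[i]; // so the check is deemed always true and dropped
    return -1;
}

Each pass is locally reasonable, but if y is out of range at run time, i need not actually land in [0, 9] on every target, and the check that would have caught it is gone.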

As you can see, the behavior of each component here is fairly reasonable. However, the end result may not be.