May 28, 2020
On 5/28/20 3:36 AM, Paolo Invernizzi wrote:
> On Thursday, 28 May 2020 at 01:23:31 UTC, Steven Schveighoffer wrote:
>> On 5/27/20 8:31 PM, Jonathan M Davis wrote:
>>> On Wednesday, May 27, 2020 5:57:00 PM MDT Meta via Digitalmars-d wrote:
>>>> On Wednesday, 27 May 2020 at 18:50:50 UTC, Jonathan M Davis wrote:
>>>>> Based on some of Walter's comments, it also sounds like he
>>>>> intends to make nothrow the default in another DIP, which is
>>>>> also a terrible idea. I'm increasingly worried about the future
>>>>> of D with some of where these DIPs are going.
>>>>>
>>>>> - Jonathan M Davis
>>>>
>>>> What's wrong with nothrow by default? Probably 97% of code
>>>> doesn't need to throw exceptions.
>>>
>>> If anything, I would say the opposite.
>>
>> It actually doesn't matter what's more common (and I agree with Jonathan, there's actually a lot of throwing calls because of the calls that you make into other functions). What matters is that there are functions that are actually nothrow that aren't marked nothrow. Hence the desire that these functions should actually be marked nothrow implicitly so people who care about that can just use the functions without issue.
>>
> 
> What makes me go "mhmm" is that the motivation is always "because nothrow is fastest, so it should be the default" ...

That's not the motivation for the default.

Throwing code can call nothrow code. Nothrow code cannot call throwing code (without doing something about the exceptions). If code is actually nothrow (meaning it never throws any exception) but is not marked as nothrow, then it's one attribute away from being more useful.

If we make nothrow the default, then code that already is nothrow, but simply not marked, now becomes more useful.

Same goes for @safe, @nogc, pure.

This is about properly marking code, not about speed.
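As a sketch of the calling rules being described (all function names here are illustrative, not from any real codebase):

```d
void work() nothrow {}                          // guaranteed not to throw

void mayFail() { throw new Exception("boom"); }

void anything()                                  // may throw: can call both
{
    work();
    mayFail();
}

void strict() nothrow
{
    work();                                      // OK: nothrow calling nothrow
    // mayFail();                                // error: mayFail is not nothrow
    try { mayFail(); } catch (Exception e) {}    // OK: the exception is handled
}
```

An unmarked function that happens to never throw is rejected in `strict` even though it would be safe to call, which is exactly the lost usefulness described above.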

But I'm thinking we are approaching this wrong. We should simply make inference the default, and opt out by using an attribute or some other mechanism (pragma?). It would have the same effect but not break all code in existence.

-Steve
May 28, 2020
On 5/28/20 10:53 AM, Joseph Rushton Wakeling wrote:
> On Thursday, 28 May 2020 at 13:53:39 UTC, Adam D. Ruppe wrote:
>> The beauty of inferred by default is neither are breaking changes.
> 
> Not sure I agree about that TBH.  Inferred-by-default just means the function signature doesn't by default tell you anything about what properties you can rely on, and any change to the implementation may alter the available properties, without any external sign that this has happened.

1. Templates already do this, and it has not been a problem (much of Phobos and much of what I've written is generally templates).

2. If we went to an "inferred-by-default" regime, there would have to be a way to opt-out of it, to allow for crafting attributes of public extern functions.

3. You would still need to specify exact attributes for virtual functions.

4. Documentation should show the inferred attributes IMO (not sure if this already happens for auto functions for example).

5. Yes, inferred attributes might change. This would be a breaking change. It might be a breaking change for others where it is not for the library/function in question. But it would still be something that IMO would require a deprecation period. For things outside our control, it's very possible that these changes would be done anyway even if they were actual attributes.

6. One might also take the view that a lack of attributes means the function may or may not have those attributes inferred in the future (i.e. it's not part of the API). I think much code is already written this way.
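For comparison with point 1, this is how templates already behave today (a minimal sketch; the names are made up):

```d
// No attributes written by the author: they are inferred per instantiation.
T twice(T)(T x) { return x + x; }

@safe pure nothrow @nogc unittest
{
    // The int instantiation infers all four attributes, so this compiles.
    assert(twice(21) == 42);
}
```

If the body of `twice` later gained, say, an allocation, the `@nogc` inference would silently disappear for that instantiation, which is the breaking-change concern raised in point 5.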

-Steve
May 29, 2020
On 29/05/2020 3:00 AM, Steven Schveighoffer wrote:
> But I'm thinking we are approaching this wrong. We should simply make inference the default, and opt out by using an attribute or some other mechanism (pragma?). It would have the same effect but not break all code in existence.

I had the same realization a few days ago.

The fact that you have to type @safe and @system at all is the real problem.

Apart from function pointers and explicit overrides, you should not normally need to write them.

.di files would be generated with the annotations regardless of whether they were supplied by the user.

The trick to get this to work well AND avoid false positives (i.e. a function that is @system but appears @safe, and vice versa) is to do the inferring as early as possible.

This sounds crazy, but it would force people to consider if it should be @trusted or if it was just something that needs fixing. The goal would be to make @safe transitively poison the call stack to require fixing. But if most D code already is @safe, what is there to worry about?
May 28, 2020
On Thursday, 28 May 2020 at 15:00:11 UTC, Steven Schveighoffer wrote:
> On 5/28/20 3:36 AM, Paolo Invernizzi wrote:
>> On Thursday, 28 May 2020 at 01:23:31 UTC, Steven Schveighoffer wrote:
>>> On 5/27/20 8:31 PM, Jonathan M Davis wrote:
>>>> On Wednesday, May 27, 2020 5:57:00 PM MDT Meta via Digitalmars-d wrote:
>>>>> On Wednesday, 27 May 2020 at 18:50:50 UTC, Jonathan M Davis wrote:
>>>>>> Based on some of Walter's comments, it also sounds like he
>>>>>> intends to make nothrow the default in another DIP, which is
>>>>>> also a terrible idea. I'm increasingly worried about the future
>>>>>> of D with some of where these DIPs are going.
>>>>>>
>>>>>> - Jonathan M Davis
>>>>>
>>>>> What's wrong with nothrow by default? Probably 97% of code
>>>>> doesn't need to throw exceptions.
>>>>
>>>> If anything, I would say the opposite.
>>>
>>> It actually doesn't matter what's more common (and I agree with Jonathan, there's actually a lot of throwing calls because of the calls that you make into other functions). What matters is that there are functions that are actually nothrow that aren't marked nothrow. Hence the desire that these functions should actually be marked nothrow implicitly so people who care about that can just use the functions without issue.
>>>
>> 
>> What makes me go "mhmm" is that the motivation is always "because nothrow is fastest, so it should be the default" ...
>
> That's not the motivation for the default.

DIP 1029, Rationale:

"The problem is that exceptions are not cost-free, even in code that never throws. Exceptions should therefore be opt-in, not opt-out. Although this DIP does not propose making exceptions opt-in, the throw attribute is a key requirement for it. The attribute also serves well as documentation that yes, a function indeed can throw."

Maybe I'm wrong, but when Walter says "not cost-free" he is seldom referring to anything other than ... speed.


May 28, 2020
On Thursday, 28 May 2020 at 12:21:31 UTC, welkam wrote:
> On Thursday, 28 May 2020 at 07:36:05 UTC, Paolo Invernizzi wrote:
>
>> tuning the hot path is still the way to go if you care for speed.
>
> You haven't done many optimizations, have you?

For sure I have ...

> The few hot spots happens when you have a simple program or there were no attempts made to optimize the code.

Granted

> If you profile say DMD code you will find that there are no hot spots. If you want the code to be fast you need to care about all of it. Some parts need more attention than others but you still need to care about it.

I'm in the same boat, but I don't think that's always the state of affairs; it really depends on the application domain.

The DMD codebase is well known, has been developed over a couple of decades, stands (for the backend, for example) on the shoulders of DMC ... and was written by one of the most brilliant programmers in the world ... Walter Bright. I'd be surprised if there's a hot path left to squeeze in it.

Anyway, I think you're right in the general case: you still need to care about the whole thing ... and, as it happens, we work in the real-time domain, as most of our customers are medical companies, so I understand very well what you are referring to ... :-P


May 28, 2020
On 5/28/20 11:16 AM, Paolo Invernizzi wrote:
> On Thursday, 28 May 2020 at 15:00:11 UTC, Steven Schveighoffer wrote:
>> On 5/28/20 3:36 AM, Paolo Invernizzi wrote:
>>> What makes me go "mhmm" is that the motivation is always "because nothrow is fastest, so it should be the default" ...
>>
>> That's not the motivation for the default.
> 
> DIP 1029, Rationale:
> 
> "The problem is that exceptions are not cost-free, even in code that never throws. Exceptions should therefore be opt-in, not opt-out. Although this DIP does not propose making exceptions opt-in, the throw attribute is a key requirement for it. The attribute also serves well as documentation that yes, a function indeed can throw."
> 
> Maybe I'm wrong, but when Walter uses "not cost-free" he seldom refers to something else than ... speed.

I should be clearer that MY motivation for having these things be the default is so that more code can be used in more situations. Walter's motivation may differ.

The fact that this function is not @nogc @safe pure nothrow is a failure of the language:

int multiply(int x, int y) { return x * y; }

I shouldn't have to have attribute soup everywhere, and most likely I'm not going to bother.
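Concretely, this is what the language requires today for the function to be usable from restricted contexts (a sketch; `square` is an invented caller for illustration):

```d
// The "attribute soup" you must write by hand today:
int multiply(int x, int y) @nogc @safe pure nothrow { return x * y; }

// Only because of those annotations is it callable from the most
// restrictive context:
int square(int x) @nogc @safe pure nothrow { return multiply(x, x); }
```

With inference as the default, the unannotated `multiply` would be just as usable from `square`.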

The motivation for the existence of nothrow in general is to avoid the cost of exception handling.

But the motivation of making it the *default* is because people just don't mark their nothrow functions nothrow. The easier default is nothrow because it is callable from either situation.

In other words, it enables more code.

-Steve
May 28, 2020
On Thursday, 28 May 2020 at 15:12:56 UTC, Steven Schveighoffer wrote:
>> Not sure I agree about that TBH.  Inferred-by-default just means the function signature doesn't by default tell you anything about what properties you can rely on, and any change to the implementation may alter the available properties, without any external sign that this has happened.
>
> 1. Templates already do this, and it has not been a problem (much of Phobos and much of what I've written is generally templates).

Yes, but the user tends to have a lot of control there in practice, by what template arguments they pass.  So the template is less of a black box of surprise.  I also think there's a bit of an implicit motivation for an author of templated code to try and _make_ the template args the dominant factor there (because that makes the template more usable).
May 28, 2020
On 5/28/20 11:30 AM, Joseph Rushton Wakeling wrote:
> On Thursday, 28 May 2020 at 15:12:56 UTC, Steven Schveighoffer wrote:
>>> Not sure I agree about that TBH.  Inferred-by-default just means the function signature doesn't by default tell you anything about what properties you can rely on, and any change to the implementation may alter the available properties, without any external sign that this has happened.
>>
>> 1. Templates already do this, and it has not been a problem (much of Phobos and much of what I've written is generally templates).
> 
> Yes, but the user tends to have a lot of control there in practice, by what template arguments they pass.  So the template is less of a black box of surprise.  I also think there's a bit of an implicit motivation for an author of templated code to try and _make_ the template args the dominant factor there (because that makes the template more usable).

I think a ton of templates get written without considering attribute inference at all. It just happens to work because the compiler is guaranteed to have all the source.

I've seen people change a function into a no-arg template function, not even for inference, but to ensure the compiler only generates code if it is called. All of a sudden, it gets a new set of attributes, and nobody complains.
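The no-arg template trick looks like this (an illustrative sketch): adding an empty compile-time parameter list turns the function into a template, so it is only compiled if actually called, and, as a side effect, its attributes become inferred.

```d
// Plain function: compiled unconditionally, with exactly the attributes
// written (here, none).
void helperFn() { }

// Same function as a no-arg template: only instantiated if called, and the
// compiler now infers @safe, pure, nothrow, @nogc from the (empty) body.
void helperTmpl()() { }
```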

The most pleasant thing about template inference is that it generally works out well because it provides the most restrictive attributes it can. It doesn't get in the way of the author who doesn't care about attributes or the user who does care.

But it can also blow up if you can't figure out why some inference is happening and you expect something different. I think in addition to such a change to inference by default the compiler should provide a mechanism to explain how it infers things so you can root out the cause of it. I'd like to have this feature regardless of any defaults.

-Steve
May 28, 2020
On Thursday, 28 May 2020 at 15:28:02 UTC, Steven Schveighoffer wrote:
> On 5/28/20 11:16 AM, Paolo Invernizzi wrote:
>> [...]
>
> I should be clearer that MY motivation for having these things be the default is so that more code can be used in more situations. Walter's motivation may differ.
>
> The fact that this function is not @nogc @safe pure nothrow is a failure of the language:
>
> int multiply(int x, int y) { return x * y; }
>
> I shouldn't have to have attribute soup everywhere, and most likely I'm not going to bother.
>
> The motivation for the existence of nothrow in general is to avoid the cost of exception handling.
>
> But the motivation of making it the *default* is because people just don't mark their nothrow functions nothrow. The easier default is nothrow because it is callable from either situation.
>
> In other words, it enables more code.
>
> -Steve

I can agree with you, but I would like to see a solid rationale for that kind of switch.
I guess we'll just have to wait for the discussion on the future DIP about it.

May 28, 2020
On Thursday, 28 May 2020 at 15:12:56 UTC, Steven Schveighoffer wrote:
> 2. If we went to an "inferred-by-default" regime, there would have to be a way to opt-out of it, to allow for crafting attributes of public extern functions.

You'd just have to write them out there.

> 4. Documentation should show the inferred attributes IMO (not sure if this already happens for auto functions for example).

Eeeeeeh, I'd be OK with that, but it would need to actually point out that it was inferred - that this is NOT a promise of forward compatibility, it just happens to be so in this version.

> 6. One might also take the view that a lack of attributes means the function may or may not have those attributes inferred in the future (i.e. it's not part of the API). I think much code is already written this way.

Yes, I tend to explicitly write it out if I'm making a point about it.