Thread overview
Why aren't function attributes inferred?
Aug 20, 2011 - Sean Eskapp
Aug 20, 2011 - Timon Gehr
Aug 20, 2011 - Sean Eskapp
Aug 20, 2011 - Nick Sabalausky
Aug 20, 2011 - Sean Eskapp
Aug 20, 2011 - Nick Sabalausky
Aug 20, 2011 - bearophile
Aug 21, 2011 - Jonathan M Davis
August 20, 2011
Since the compiler can clearly tell when a function is not const, safe, pure, or nothrow, why can't they just be assumed, unless proven otherwise?
August 20, 2011
On 08/20/2011 06:50 PM, Sean Eskapp wrote:
> Since the compiler can clearly tell when a function is not const, safe, pure,
> or nothrow, why can't they just be assumed, unless proven otherwise?

This sort of inference is already done for function/delegate literals and template functions.

It is not done for other functions, because, e.g., their code is not necessarily available.

int foo(int x) pure; // how would you infer purity here?
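
For contrast, a small sketch of the cases where inference already applies
today (the names here are made up for illustration):

T square(T)(T x) { return x * x; }    // template: attributes inferred per instantiation

void demo() pure @safe nothrow
{
    auto four = square(2);            // fine: square!int is inferred pure/@safe/nothrow
    auto inc  = (int x) => x + 1;     // function literal: attributes inferred as well
    auto five = inc(4);
}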

Also, if those attributes were inferred for normal functions, they would leak implementation details all over the place. If a pure implementation had to be changed to a non-pure one, all code that relied on the undocumented but inferred purity would break.
August 20, 2011
== Quote from Timon Gehr (timon.gehr@gmx.ch)'s article
> On 08/20/2011 06:50 PM, Sean Eskapp wrote:
> > Since the compiler can clearly tell when a function is not const, safe, pure, or nothrow, why can't they just be assumed, unless proven otherwise?
> This sort of inference is already done for function/delegate literals
> and template functions.
> It is not done for other functions, because, e.g., their code is not
> necessarily available.
> int foo(int x) pure; // how would you infer purity here?
> Also, if those attributes were inferred for normal functions, they would
> leak implementation details all over the place. If a pure implementation
> had to be changed to a non-pure one, all code that relied on the
> undocumented but inferred purity would break.

I understand your point about functions with no definition, but your point about
normal functions holds true anyway: if I have a pure function foo(), and a
function bar() which relies on the purity of foo(), then changing the purity of
foo() would break bar()'s internals regardless of whether that purity was written
out or inferred. Either way, purity should still be inferred at optimization
time, where it could really make a difference!
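
For example (hypothetical code):

int foo(int x) pure { return x * 2; }

int bar(int x) pure      // bar's declared purity leans on foo's purity
{
    return foo(x) + 1;   // compiles today; if foo() ever stops being pure,
}                        // bar() stops compiling, whether foo()'s purity was
                         // written out explicitly or inferred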
August 20, 2011
"Sean Eskapp" <eatingstaples@gmail.com> wrote in message news:j2ooko$15m4$1@digitalmars.com...
> Since the compiler can clearly tell when a function is not const, safe,
> pure,
> or nothrow, why can't they just be assumed, unless proven otherwise?

That would defeat the whole point.

Suppose it did work that way: If a function is *supposed* to be const, safe, pure, or nothrow, and you make a change that violates that, then you'll never know. It won't tell you. If it just simply decided "ok, so it's just not a safe/pure/whatever function", then what would be the point of having safe/pure/etc functions? They wouldn't serve any purpose. It would just be arbitrary metadata that sits around doing nothing.

The whole point of those attributes is that if a function is *supposed* to have certain guarantees, the compiler will actually *tell* you when you violate them.
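
A small (made-up) example of the kind of mistake the explicit attribute turns into a compile error:

int counter;

int next() pure
{
    return ++counter;    // rejected: a pure function cannot read or write
}                        // mutable static data like 'counter'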


August 20, 2011
== Quote from Nick Sabalausky (a@a.a)'s article
> "Sean Eskapp" <eatingstaples@gmail.com> wrote in message news:j2ooko$15m4$1@digitalmars.com...
> > Since the compiler can clearly tell when a function is not const, safe,
> > pure,
> > or nothrow, why can't they just be assumed, unless proven otherwise?
> That would defeat the whole point.
> Suppose it did work that way: If a function is *supposed* to be const, safe,
> pure, or nothrow, and you make a change that violates that, then you'll
> never know. It won't tell you. If it just simply decided "ok, so it's just
> not a safe/pure/whatever function", then what would be the point of having
> safe/pure/etc functions? They wouldn't serve any purpose. It would just be
> arbitrary metadata that sits around doing nothing.
> The whole point of those attributes is that if a function is *supposed* to
> have certain guarantees, the compiler will actually *tell* you when you
> violate them.

I was under the impression that it helped some with optimization - purity, for instance, can help with inlining and caching. Safety wouldn't be useful, as far as I can tell.
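
For example (made-up function), the compiler is allowed to reuse the result of a strongly pure call:

int cube(int x) pure nothrow { return x * x * x; }

int use(int x)
{
    // Both calls are to the same strongly pure function with the same
    // argument, so the compiler may evaluate cube(x) once and reuse it.
    return cube(x) + cube(x);
}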
August 20, 2011
"Sean Eskapp" <eatingstaples@gmail.com> wrote in message news:j2p7mh$1uk2$1@digitalmars.com...
> == Quote from Nick Sabalausky (a@a.a)'s article
>> "Sean Eskapp" <eatingstaples@gmail.com> wrote in message news:j2ooko$15m4$1@digitalmars.com...
>> > Since the compiler can clearly tell when a function is not const, safe,
>> > pure,
>> > or nothrow, why can't they just be assumed, unless proven otherwise?
>> That would defeat the whole point.
>> Suppose it did work that way: If a function is *supposed* to be const,
>> safe,
>> pure, or nothrow, and you make a change that violates that, then you'll
>> never know. It won't tell you. If it just simply decided "ok, so it's
>> just
>> not a safe/pure/whatever function", then what would be the point of
>> having
>> safe/pure/etc functions? They wouldn't serve any purpose. It would just
>> be
>> arbitrary metadata that sits around doing nothing.
>> The whole point of those attributes is that if a function is *supposed*
>> to
>> have certain guarantees, the compiler will actually *tell* you when you
>> violate them.
>
> I was under the impression that it helped some with optimization - purity,
> for
> instance, can help with inlining and caching. Safety wouldn't be useful,
> as far as
> I can tell.

Well, that too. But the main thing is the checked guarantees.


August 20, 2011
Nick Sabalausky:

> Well, that too. But the main thing is the checked guarantees.

Maybe @safe can be used to improve some optimizations too, I don't know. Maybe it simplifies pointer strictness analysis?

Bye,
bearophile
August 21, 2011
On Saturday, August 20, 2011 16:50:32 Sean Eskapp wrote:
> Since the compiler can clearly tell when a function is not const, safe, pure, or nothrow, why can't they just be assumed, unless proven otherwise?

As of 2.054, @safe, pure, and nothrow are inferred for delegates and templated functions. This is because whether they can be @safe, pure, or nothrow depends entirely on the types that they're instantiated with. For normal functions, there is no such inference. It's not needed.
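
For example (the types here are made up), the same template comes out pure for one instantiation and not for another, so it has to be decided per instantiation:

import std.stdio : writeln;

struct Quiet { int get() pure { return 1; } }
struct Noisy { int get() { writeln("side effect"); return 2; } }

int fetch(T)(T t) { return t.get(); }   // attributes inferred per instantiation

void caller() pure
{
    auto a = fetch(Quiet());     // fine: fetch!Quiet is inferred pure
    // auto b = fetch(Noisy());  // would not compile here: fetch!Noisy is impure
}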

As for just assuming attributes, the language was designed like pretty much every other C-based language (C++, Java, C#, etc.) in that unless a function is marked with an attribute, it doesn't have that attribute. It _could_ have been designed the other way around, but then all of your attributes become stuff like mutable, impure, and throw. It doesn't really buy you anything. It just changes what attributes you have to mark stuff with. And whether you'd be using more attributes or fewer attributes with such a scheme would depend entirely on the code. But such a scheme would be entirely foreign to most programmers of C-based languages.

- Jonathan M Davis
August 22, 2011
On Sat, 20 Aug 2011 20:23:44 -0400, Jonathan M Davis <jmdavisProg@gmx.com> wrote:

> On Saturday, August 20, 2011 16:50:32 Sean Eskapp wrote:
>> Since the compiler can clearly tell when a function is not const, safe,
>> pure, or nothrow, why can't they just be assumed, unless proven otherwise?
>
> As of 2.054, @safe, pure, and nothrow are inferred for delegates and templated
> functions. This is because whether they can be @safe, pure, or nothrow depends
> entirely on the types that they're instantiated with. For normal functions,
> there is no such inference. It's not needed.

Well, and also because you can't template purity, @safety, and um... nothrow-ity :)

-Steve