July 11, 2012
On Jul 11, 2012, at 12:58 AM, Don Clugston <dac@nospam.com> wrote:

> On 10/07/12 19:13, H. S. Teoh wrote:
>> On Tue, Jul 10, 2012 at 06:48:51PM +0200, Timon Gehr wrote:
>>> On 07/10/2012 06:45 PM, H. S. Teoh wrote:
>>>> Yeah, this is logical const. Unfortunately, D doesn't have logical const.
>>>> 
>>> 
>>> Then why on earth is druntime acting as if it does?
>> 
>> Y'know, this brings up an interesting question. Do methods like toString _need_ to be const? That is, _physical_ const?  Or are we unconsciously conflating physical const with logical const here?
>> 
>> Yes, certain runtime operations need to be able to work with const methods, but I wonder if those required const methods really belong to a core set of more primitive operations that guarantee physical const, and perhaps shouldn't be conflated with logical operations like "convert this object to a string representation", which _may_ require caching, etc.?
>> 
>> Or perhaps application code should define its own non-const versions of certain methods, so that it can do whatever it needs with logical const, without worrying about breaking physical const-ness.
>> 
>> I'm starting to think that D's hardline approach to const is clashing with the principle of information hiding. Users of a class shouldn't _need_ to know whether an object is caching the value of toString, toHash, or whatever. What they care about is that the object doesn't visibly change -- that is, logical const. Binary const implies logical const, but the implication doesn't work the other way round. While it's nice to have binary const (a strong, enforceable guarantee), it breaks encapsulation: just because a class needs to do caching, its methods can't be const, and this is a visible (and viral, no less) change in its external API. What should be a mere implementation detail has become a visible difference to the outside world -- encapsulation is broken.
>> 
>> I don't know how to remedy this. It's clear that physical const does have its value -- it's necessary to properly support immutable, allows putting data in ROM, etc.. But it's also clear that something is missing from the picture. Implementation details are leaking past object APIs, caching and other abstractions can't work with const, etc., and that's not a good thing.
>> 
>> 
>> T
> 
> I think you're right.
> Something I wonder about, though, is how many different use cases are we dealing with?
> 
> Suppose we had a caching solution (you could think of it as @cached, but it could be done in a library). The user would need to provide a const, pure function which returns the same value that is stored in the cache.
> This is enforceable. The only way to write to the cache is by calling the function.
> 
> How far would that take us? I don't think there are many use cases for logically pure, apart from caching, but I have very little idea about logical const.

Lazy loading, which I suppose is a type of caching. Use of synchronization primitives should be allowed as well; we currently get away with this only because the synchronized statement sidesteps const restrictions.
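The memoization pattern at issue can be made concrete with a short C++ sketch, since C++'s `mutable` is the escape hatch this thread keeps contrasting with D's transitive const. All names below (`Widget` and friends) are invented for illustration:

```cpp
#include <string>
#include <utility>

// Minimal sketch of memoization behind a const interface, legal in C++
// only because of the `mutable` keyword. D's const has no such escape
// hatch, which is what the @cached discussion is about.
class Widget {
    std::string name_;
    mutable std::string cache_;   // writable even from const methods
    mutable bool valid_ = false;
public:
    explicit Widget(std::string name) : name_(std::move(name)) {}

    // Logically const: callers observe no change, but the object's bits
    // change on the first call. D's const forbids exactly this.
    const std::string& toString() const {
        if (!valid_) {
            cache_ = "Widget(" + name_ + ")";
            valid_ = true;
        }
        return cache_;
    }
};
```

The equivalent D code is rejected: a const toString may not write to a member, which is exactly why the thread is looking for a @cached-style library or language answer.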
July 11, 2012
On 07/11/2012 02:39 PM, Andrei Alexandrescu wrote:
> On 7/10/12 10:59 PM, Jakob Ovrum wrote:
>> On Wednesday, 11 July 2012 at 02:02:52 UTC, Andrei Alexandrescu wrote:
>>> On 7/10/12 9:45 PM, Timon Gehr wrote:
>>>> I do not desire logical const as a language feature. But conservative
>>>> type systems are not good for everything. The root of the class
>>>> hierarchy needs to be good for everything. Object is not an adequate
>>>> root any more.
>>>
>>> How about we consider just stiffening that upper lip and implement
>>> comparison and hashing without modifying their target?
>>>
>>> Andrei
>>
>> It's more likely to go down like this: programmer attempts to write his
>> opEquals (or toString etc) within the restrictions of const, but fails
>> due to the requirements of the implementation (which can easily go
>> beyond simple performance measures like caching, as demonstrated). The
>> programmer then writes his own mutable member function and neglects
>> opEquals altogether. If the programmer is real nice, he/she will write a
>> throwing opEquals stub.
>
> I gave evidence on a large, high quality C++ codebase that the use of
> mutable (which is the solution of choice for memoization, caching, and
> lazy computation) is extremely scarce.
>

Unlike in D, C++ programmers have the choice of declaring a method const
or not.
Also, what percentage of the usages of 'const' would disappear, without
changing the code in any other way, if

- const was transitive?
- all usages of mutable disappeared?
- the social graph was infinite?
- any combination of the above?
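The first bullet can be illustrated concretely: C++'s const is shallow, so a const method may still mutate state reached through a pointer member, a pattern that transitive const eliminates outright. The example is hypothetical:

```cpp
#include <string>
#include <utility>

// C++ `const` applies to the pointer, not to what it points at, so a
// const method may freely mutate data behind a pointer member. Under
// D's transitive const the same code is a compile error.
struct Cache {
    std::string value;
    bool valid = false;
};

class Widget {
    Cache* cache_;     // only the pointer itself is const in a const method
    std::string name_;
public:
    Widget(std::string name, Cache* c) : cache_(c), name_(std::move(name)) {}

    std::string toString() const {
        if (!cache_->valid) {          // legal C++: *cache_ is not const
            cache_->value = "Widget(" + name_ + ")";
            cache_->valid = true;
        }
        return cache_->value;
    }
};
```

Every const annotation that relies on this shallow-const loophole would have to disappear under a transitive regime, which is the point of the question above.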

The data given so far is not evidence relevant to this discussion. (It is
not evidence in the traditional sense anyway, because statements about a
closed-source code base cannot be validated by a third party.)

> What evidence do you have for your prediction?
>

Attempts to impose restrictions without conceivable benefit always go
wrong at some point. I'd claim this to be general knowledge.

Almost nobody has responded to the RawObject proposal.

Jacob Carlborg has brought my attention to this:
http://www.ruby-doc.org/core-1.9.3/BasicObject.html

I still think that would be the best solution.
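The RawObject/BasicObject idea can be sketched as follows; this is a hypothetical C++ analogue, not druntime code, and all names are invented. The point is that the conventional operations (toString, hashing, comparison) are introduced in a subclass, so classes that cannot honor their const contracts simply derive from the bare root instead:

```cpp
#include <cstddef>
#include <string>

// A minimal root that promises nothing beyond destructibility.
class RawObject {
public:
    virtual ~RawObject() = default;
};

// The conventional operations live one level down, so only classes that
// can actually implement them const-correctly need to inherit them.
class Object : public RawObject {
public:
    virtual std::string toString() const { return "Object"; }
    virtual std::size_t toHash() const { return 0; }
    virtual bool opEquals(const Object& rhs) const { return this == &rhs; }
};

// A class whose operations cannot be const derives from RawObject and
// defines its own, unconstrained interface instead of fighting Object's.
class HandleRef : public RawObject {
public:
    std::string get() { return "mutating read"; }  // no const contract imposed
};
```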
July 11, 2012
On 07/11/2012 02:55 PM, Andrei Alexandrescu wrote:
> ...
> If this is a useful artifact, Walter had an idea a while ago that we can
> have the compiler help by using the per-object monitor pointer instead
> of the static hashtable. Right now the pointer points to a monitor
> object, but it could point to a little struct containing e.g. a Monitor
> and a void*, which opens the way to O(1) access to unbounded cached
> data. The compiler would then "understand" to not consider that date
> regular field accesses, and not make assumptions about them being
> immutable.
>

The additional indirection changes nothing essential. This is 'mutable' in disguise.


> Any takers for Cached? It would be good to assess its level of
> usefulness first.
>
>
> Andrei

Well, it is inefficient.
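For context, a hedged sketch of the "static hashtable" variant of Cached that the quoted monitor-pointer proposal would replace. Nothing here is from druntime; all names are invented. It shows both objections at once: the mutation has merely moved outside the object ('mutable' in disguise), and every access pays for a lock plus a hash lookup (the inefficiency):

```cpp
#include <mutex>
#include <string>
#include <unordered_map>
#include <utility>

class Widget {
    std::string name_;
public:
    explicit Widget(std::string name) : name_(std::move(name)) {}

    std::string toString() const {
        // The cache lives outside the object, keyed by address, so the
        // object's own bits stay untouched -- the mutation merely moved.
        static std::unordered_map<const Widget*, std::string> cache;
        static std::mutex m;
        std::lock_guard<std::mutex> lock(m);   // per-call locking: the cost
        auto it = cache.find(this);            // per-call hashing: more cost
        if (it == cache.end())
            it = cache.emplace(this, "Widget(" + name_ + ")").first;
        return it->second;
    }
};
```

A real implementation would also need to evict entries for destroyed objects; the monitor-pointer variant gets O(1) access and per-object lifetime for free, but it is still mutation behind const.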


July 11, 2012
On 07/11/2012 02:40 PM, Andrei Alexandrescu wrote:
> On 7/11/12 12:59 AM, H. S. Teoh wrote:
>> On Wed, Jul 11, 2012 at 04:59:28AM +0200, Jakob Ovrum wrote:
>>> On Wednesday, 11 July 2012 at 02:02:52 UTC, Andrei Alexandrescu
>>> wrote:
>>>> On 7/10/12 9:45 PM, Timon Gehr wrote:
>>>>> I do not desire logical const as a language feature. But
>>>>> conservative type systems are not good for everything. The root of
>>>>> the class hierarchy needs to be good for everything. Object is not
>>>>> an adequate root any more.
>>>>
>>>> How about we consider just stiffening that upper lip and implement
>>>> comparison and hashing without modifying their target?
>>>>
>>>> Andrei
>>>
>>> It's more likely to go down like this: programmer attempts to write
>>> his opEquals (or toString etc) within the restrictions of const, but
>>> fails due to the requirements of the implementation (which can
>>> easily go beyond simple performance measures like caching, as
>>> demonstrated). The programmer then writes his own mutable member
>>> function and neglects opEquals altogether. If the programmer is real
>>> nice, he/she will write a throwing opEquals stub.
>>
>> This is exactly what I was saying. All that beautiful, pristine, perfect
>> infrastructure we're building in druntime eventually just gets
>> sidestepped, because it is unable to cater for what the programmer
>> needs, and so the programmer ends up reimplementing his own
>> infrastructure, over and over again. I can't see how that is beneficial.
>
> How often do you need memoization?

Once is sufficient. It will invade the code base because const is
transitive.

> It's not even recognized by this mailer's editor.
>
> Andrei
>
>

'lazy computation' is.
July 11, 2012
On 7/11/12 10:33 AM, Timon Gehr wrote:
>> Any takers for Cached? It would be good to assess its level of
>> usefulness first.
>>
>>
>> Andrei
>
> Well, it is inefficient.

The idea here is to assess functionality provided in order to decide whether to go down this route or not.

Andrei
July 11, 2012
On Wednesday, 11 July 2012 at 14:42:43 UTC, Andrei Alexandrescu wrote:
> On 7/11/12 10:33 AM, Timon Gehr wrote:
>>> Any takers for Cached? It would be good to assess its level of
>>> usefulness first.
>>>
>>>
>>> Andrei
>>
>> Well, it is inefficient.
>
> The idea here is to assess functionality provided in order to decide whether to go down this route or not.
>
> Andrei

Probably using a separate thread would be better for voting or discussing, since this one already has many branches of ideas.
July 11, 2012
On Wednesday, 11 July 2012 at 13:55:30 UTC, deadalnix wrote:
> The cached value can in fact live outside the object.

For the umpteenth time, I am not talking about caching.
July 11, 2012
On 07/11/2012 04:16 PM, deadalnix wrote:
> On 11/07/2012 16:04, Timon Gehr wrote:
>> On 07/11/2012 02:58 PM, deadalnix wrote:
>>> I don't think that is a big issue.
>>>
>>> Either the value isn't accessed often, in which case it makes no sense to
>>> cache it, or it is accessed often, in which case it makes sense to
>>> precompute the cached value when building the object.
>>>
>>
>> And sometimes it is not known in advance if it will be accessed at all.
>> And this is somewhere in a class that seems unrelated to the object
>> that wants to compute a string representation/hash value/comparison.
>> It is not the simple cases that are an issue.
>
> Arguably, this is a software design problem.

Abstraction is a software design problem?

> I never encountered a case where I need to do such thing

This means it will bite you unexpectedly within the next 48 hours. I have
some experience with making statements like this.

> and never encountered something similar in
> code, except code that demonstrates tricks or subverts language
> features/libraries in unexpected ways (which is always fun to do, but
> should never end up in production or be promoted by a language/library).

I'll be happy to inspect e.g. SDC as soon as it reliably compiles D code.
July 11, 2012
On Wednesday, 11 July 2012 at 12:36:43 UTC, Andrei Alexandrescu wrote:
> I was a long-time proponent of this. It's less exciting than it may seem actually.
>
> (a) Classes that work with const just fine incur one extra virtual call. I think this can be avoided by having the compiler plant the same pointer for the const and non-const version in the vtable.
>
> (b) Classes that can't do as little as one of these four operations without mutating the object are completely excluded from the immutability system, even if they'd otherwise benefit from it. Even those that don't "care" they need to actively _work_ on not caring, which doesn't sit well.
>
> So I don't see this as a viable solution to people who are fine with const, but would like to use e.g. some lazy computation.
>
>
> Andrei

This solution is not for allowing people to use lazy computation in their const overrides, it's for allowing people to still use opEquals, toString etc. even if their implementations cannot and should not be const.

e.g. the LuaD function I posted earlier - it has nothing to do with caching or lazy computation, it's just that it's only logically constant and cannot ever be bitwise constant due to the underlying API. Immutable instances of such structures are next to useless, as every member function except for a single getter function uses mutation.
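A hypothetical C++ analogue of the LuaD situation: a wrapper over a stack-machine style C API in which even a plain read must push and pop an internal stack, mutating the handle. The wrapper is logically constant but can never be bitwise constant. The "API" below is invented purely for illustration:

```cpp
#include <string>
#include <utility>
#include <vector>

// Stand-in for the C library's interpreter state (e.g. a Lua-like VM).
struct StackVM {
    std::vector<std::string> stack;
};

class TableRef {
    StackVM* vm_;
    std::string key_;
public:
    TableRef(StackVM* vm, std::string key) : vm_(vm), key_(std::move(key)) {}

    // A pure read from the caller's point of view, yet it must mutate the
    // VM's stack in the process; marking it const would lie about the bits.
    std::string get() {
        vm_->stack.push_back(key_);                            // push the key
        std::string result = "value-of-" + vm_->stack.back();  // "call" the API
        vm_->stack.pop_back();                                 // restore state
        return result;
    }
};
```

The net state is restored after every call, so the type is logically const in exactly the sense discussed above, while remaining unusable from an immutable instance.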