July 16, 2016
On 7/16/2016 7:17 PM, Andrew Godfrey wrote:
> I'm more interested in engaging on "in how many years will the D leadership be
> interested in engaging on the topic of D3?" I feel this is a significant
> omission from the vision doc, and that omission inflames a lot of the recurring
> animosity I see on the forums. Even an answer of "never" would be a significant
> improvement over "we refuse to engage on that". And I doubt you're really
> thinking "never".

There are no plans for D3 at the moment. All plans for improvement are as backwards compatible as possible. D had its wrenching change with D1->D2, and it nearly destroyed us.


> I do think that ideas from academia will mostly cause a lot of unwanted noise in
> such a discussion - because academia, in my experience, is more focused on
> "software construction" than on "software evolution", and D takes an approach
> that is built on practical experience with evolution. But academia also has
> occasional nuggets of extreme value.

Academia certainly does have value for us. Andrei has a PhD in computer science, I have a BS in mechanical and aerospace engineering, and I believe the difference in our backgrounds makes for great complementary skills.
July 17, 2016
On Sunday, 17 July 2016 at 05:14:57 UTC, Walter Bright wrote:
> On 7/16/2016 7:17 PM, Andrew Godfrey wrote:
>> I'm more interested in engaging on "in how many years will the D leadership be
>> interested in engaging on the topic of D3?" I feel this is a significant
>> omission from the vision doc, and that omission inflames a lot of the recurring
>> animosity I see on the forums. Even an answer of "never" would be a significant
>> improvement over "we refuse to engage on that". And I doubt you're really
>> thinking "never".
>
> There are no plans for D3 at the moment. All plans for improvement are as backwards compatible as possible. D had its wrenching change with D1->D2, and it nearly destroyed us.
>

I think alienating the Tango crowd did way more in that regard than any breaking change could.

July 16, 2016
On Sun, Jul 17, 2016 at 02:59:42AM +0000, Nobody via Digitalmars-d wrote:
> On Saturday, 9 July 2016 at 00:14:34 UTC, Walter Bright wrote:
> > On 7/8/2016 2:58 PM, Ola Fosheim Grøstad wrote:
> > > On Friday, 8 July 2016 at 21:24:04 UTC, Walter Bright wrote:
> > > > On 7/7/2016 5:56 PM, deadalnix wrote:
> > > > > While this is very true, it is clear that most of D's complexity doesn't come from there. D's complexity comes for the most part from things being completely unprincipled and a lack of vision.
> > > > 
> > > > All useful computer languages are unprincipled and complex due to a number of factors:
> > > 
> > > I think this is a very dangerous assumption. And also not true.
> > 
> > Feel free to post a counterexample. All you need is one!
> > 
> 
> Perl 6.

Are you serious?  Perl is the *prime* example of "unprincipled and complex".  Larry Wall himself said (in print, no less):

	English is useful because it is a mess. Since English is a mess,
	it maps well onto the problem space, which is also a mess, which
	we call reality. Similarly, Perl was designed to be a mess,
	though in the nicest of all possible ways. -- Larry Wall


T

-- 
Being able to learn is a great learning; being able to unlearn is a greater learning.
July 17, 2016
On 7/15/16 10:43 AM, Andrew Godfrey wrote:
> On Friday, 15 July 2016 at 11:09:24 UTC, Patrick Schluter wrote:
>> On Friday, 15 July 2016 at 10:25:16 UTC, Shachar Shemesh wrote:
>>>
>>> I think the one that hurts the most is fixing "C++ fault" #3. It
>>> means there are many scenarios in which I could put const in C++, and
>>> I simply can't in D, because something somewhere needs to be mutable.
>>
>> Then it is not const and marking it as const is a bug. D enforces to
>> not write a bug, what's wrong with that?
>
> One example is if you make a class that has an internal cache of
> something. Updating or invalidating that cache has no logical effect on
> the externally-observable state of the class. So you should be able to
> modify the cache even on a 'const' object. This is not a bug and I've
> seen it have a huge effect on performance - probably a lot more than the
> const optimizations Walter is talking about here.

I suggest you take a look at http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2669.htm. It adds guarantees for STL containers that effectively prohibit them from using mutable. If they do use mutable, they are on their own in ensuring correctness. Also, although arguably all types should behave that way, there is no way to express anything like "this user-defined type satisfies N2669" within the C++ type system. Also, N2669 encodes existing practice; the whole business of logical const and surreptitious caches inside apparently const objects is liable to bring more problems than it solves (see e.g. the std::string reference counting fiasco). -- Andrei
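
To put the pattern being debated in concrete D terms, here is a minimal sketch (the names are made up): because D's const is transitive and has no `mutable` escape hatch, the caching accessor has to stay non-const, or the cache has to live outside the object.

    class Widget
    {
        private int cachedValue;
        private bool cacheValid;

        // Cannot be marked const: updating the cache mutates the object,
        // and D's const is transitive with no `mutable` escape hatch.
        int expensive()
        {
            if (!cacheValid)
            {
                cachedValue = compute();
                cacheValid = true;
            }
            return cachedValue;
        }

        private int compute() { return 42; } // stand-in for the real work
    }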

July 17, 2016
On 2016-07-17 05:35, Andrew Godfrey wrote:

> No it's not the same - void initialization leaves the variable
> uninitialized. I'm saying, something that still initialized, but marks
> that initial value as not to be used. Anyway... given the existence of
> void initialization (which I'd forgotten about), what I suggested would
> be very confusing to add.

I think annotating a variable with a UDA would be perfect for this. The static analyzer would recognize the UDA and do the proper analysis.
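
A minimal sketch of what such an annotation could look like (the UDA name and the analyzer behaviour are hypothetical, not an existing convention):

    // Hypothetical marker UDA; the name and the analyzer behaviour are
    // illustrative only.
    struct PlaceholderInit {}

    struct Settings
    {
        // The field still gets a well-defined default, but a static analyzer
        // that understands the UDA could warn whenever this default is read
        // without having been overwritten first.
        @PlaceholderInit int port = 0;
    }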

-- 
/Jacob Carlborg
July 17, 2016
On Sunday, 17 July 2016 at 12:38:46 UTC, Andrei Alexandrescu wrote:
> On 7/15/16 10:43 AM, Andrew Godfrey wrote:
>> On Friday, 15 July 2016 at 11:09:24 UTC, Patrick Schluter wrote:
>>> On Friday, 15 July 2016 at 10:25:16 UTC, Shachar Shemesh wrote:
>>>>
>>>> I think the one that hurts the most is fixing "C++ fault" #3. It
>>>> means there are many scenarios in which I could put const in C++, and
>>>> I simply can't in D, because something somewhere needs to be mutable.
>>>
>>> Then it is not const and marking it as const is a bug. D enforces to
>>> not write a bug, what's wrong with that?
>>
>> One example is if you make a class that has an internal cache of
>> something. Updating or invalidating that cache has no logical effect on
>> the externally-observable state of the class. So you should be able to
>> modify the cache even on a 'const' object. This is not a bug and I've
>> seen it have a huge effect on performance - probably a lot more than the
>> const optimizations Walter is talking about here.
>
> I suggest you take a look at http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2669.htm. It adds guarantees for STL containers that effectively prohibit them from using mutable. If they do use mutable, they are on their own in ensuring correctness. Also, although arguably all types should behave that way, there is no way to express anything like "this user-defined type satisfies N2669" within the C++ type system. Also, N2669 encodes existing practice; the whole business of logical const and surreptitious caches inside apparently const objects is liable to bring more problems than it solves (see e.g. the std::string reference counting fiasco). -- Andrei


It's certainly true that if I see "mutable" used in code, it catches my attention and engages my extreme skepticism. It is very hard to get right. Yet, in the handful of cases I've ever seen it used, the people who used it generally knew what they were doing and did get it right. And banning mutable in those situations would have caused a cascade of non-const reaching far up into the system, where it wasn't wanted and would remove important protections.


I read N2669 and it doesn't "effectively prohibit" mutable as far as I can see. It does mean that to use any mutable state you'd need protection, such as locks or lock-free trickery.

Generally, I suspect that the only allowable externally-observable effect of using "mutable" is improved performance. But perhaps there is some other valid use that I just haven't encountered.
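
As a point of comparison, one way to keep the caching benefit in D without mutating through const is to hold the cache outside the object, for example with std.functional.memoize. A sketch under that assumption (not a claim that it fits every case discussed here):

    import std.functional : memoize;
    import std.math : sqrt;

    struct Point
    {
        double x, y;
    }

    double distanceFromOrigin(Point p)
    {
        return sqrt(p.x * p.x + p.y * p.y);
    }

    // The cache lives in the memoized wrapper, not in the object, so Point
    // values (and functions taking them) can stay const.
    alias cachedDistance = memoize!distanceFromOrigin;

    void example()
    {
        const p = Point(3, 4);
        assert(cachedDistance(p) == 5); // computed on first call
        assert(cachedDistance(p) == 5); // served from the cache afterwards
    }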
July 18, 2016
On Sunday, 17 July 2016 at 02:17:52 UTC, Andrew Godfrey wrote:
> On Saturday, 16 July 2016 at 21:35:41 UTC, Walter Bright wrote:
>> On 7/16/2016 6:09 AM, Andrew Godfrey wrote:
>>> Walter called Prolog "singularly useless". You have been referring to changes
>>> that would amount to a new major version of D as "a cleanup". From the forums,
>>> my sense is that there IS a groundswell of opinion, that D2 has some major
>>> mistakes in it that can't be rectified without doing a D3, and there's a strong
>>> reaction to that idea based on experience with D1 -> D2. Perhaps what is needed
>>> is a separate area for discussion about ideas that would require a major version
>>> change. The thing about that is that it can't be done incrementally; it's the
>>> rare kind of thing that would need to be planned long in advance, and would have
>>> to amount to a huge improvement to justify even considering it.
>>
>> I agree that D2 has made some fundamental mistakes. But it also got a great deal right.
>>
>> I haven't banned Ola from the forums, he has done nothing to deserve that. He's welcome to post here, and others are welcome to engage him.
>
> I'm more interested in engaging on "in how many years will the D leadership be interested in engaging on the topic of D3?" I feel this is a significant omission from the vision doc, and that omission inflames a lot of the recurring animosity I see on the forums. Even an answer of "never" would be a significant improvement over "we refuse to engage on that". And I doubt you're really thinking "never".
>
> I do think that ideas from academia will mostly cause a lot of unwanted noise in such a discussion - because academia, in my experience, is more focused on "software construction" than on "software evolution", and D takes an approach that is built on practical experience with evolution. But academia also has occasional nuggets of extreme value.

The question is: what is D3 supposed to be? I'm neither for nor against D3; it pops up every once in a while when people are not happy with a feature. My questions are:

1. Is there any clear vision of what D3 should look like?

2. What exactly will it fix?

3. Is there a prototype (in progress) to actually prove it will fix those things?

4. If there is (real) proof[1], would it justify a break with D2 and risk D's death?

I think this topic is too serious to be just throwing in (partly academic) ideas that might or might not work in the real world. It's too serious to use D as a playground and later say "Ah well, it didn't work. [shrug]". D has left the playground and can no longer afford to just play around with ideas randomly. One has to be realistic.

I'd also like to add that if we had a "clean and compact" D3, it would become more complex over time and people would want D4 to solve this, then D5 and so forth. I haven't seen any software yet that hasn't become more complex over time.

Last but not least, it would help to make a list of the things D2 got right to put the whole D3 issue into proportion.

[1] I.e., let's not refer to other languages in an eclectic manner. I'm asking for proof that D3 would actually work and be superior to D2.
July 18, 2016
On Sunday, 17 July 2016 at 05:50:31 UTC, H. S. Teoh wrote:
> On Sun, Jul 17, 2016 at 02:59:42AM +0000, Nobody via Digitalmars-d wrote:
>> 
>> Perl 6.
>
> Are you serious?  Perl is the *prime* example of "unprincipled and complex".  Larry Wall himself said (in print, no less):
>
> 	English is useful because it is a mess. Since English is a mess,
> 	it maps well onto the problem space, which is also a mess, which
> 	we call reality. Similarly, Perl was designed to be a mess,
> 	though in the nicest of all possible ways. -- Larry Wall
>
>
> T

1. Perl 6 is not Perl.
2. Perl 6 is a better designed language than D will ever be.
3. Perl 6 is complex, but not complicated.  I think people sometimes confuse the two.
4. D is a failed language, regardless of how people choose to categorize its attributes.
July 18, 2016
On Monday, 18 July 2016 at 11:05:34 UTC, Bill Hicks wrote:
> On Sunday, 17 July 2016 at 05:50:31 UTC, H. S. Teoh wrote:
>> On Sun, Jul 17, 2016 at 02:59:42AM +0000, Nobody via Digitalmars-d wrote:
>>> 
>>> Perl 6.
>>
>> Are you serious?  Perl is the *prime* example of "unprincipled and complex".  Larry Wall himself said (in print, no less):
>>
>> 	English is useful because it is a mess. Since English is a mess,
>> 	it maps well onto the problem space, which is also a mess, which
>> 	we call reality. Similarly, Perl was designed to be a mess,
>> 	though in the nicest of all possible ways. -- Larry Wall
>>
>>
>> T
>
> 1. Perl 6 is not Perl.
> 2. Perl 6 is a better designed language than D will ever be.
> 3. Perl 6 is complex, but not complicated.  I think people sometimes confuse the two.
> 4. D is a failed language, regardless of how people choose to categorize its attributes.

There are some interesting discussions about Perl 6[1][2]. They remind me of the discussions about D. Apart from some irrational points (the logo!), the fact that it took 15 years figures prominently - and people complain about its features that were so carefully designed. I don't know Perl 6 and cannot comment on the validity of that criticism.

[1] http://blogs.perl.org/users/zoffix_znet/2016/01/why-in-the-world-would-anyone-use-perl-6.html
[2] https://www.quora.com/Why-is-Perl-6-considered-to-be-a-disaster
July 18, 2016
On Saturday, 16 July 2016 at 21:52:02 UTC, Walter Bright wrote:
> I've seen SAL before, but have not studied it. My impression is it is much more complex than necessary. For example,
>
>   https://msdn.microsoft.com/en-us/library/hh916383.aspx
>
> describes annotations to memcpy(). I believe these are better handled by use of dynamic arrays and transitive const.

I suppose that in the case of memcpy the compiler can catch (at the call site) the case where the destination buffer has insufficient size, while D can catch it only at runtime. It's a contract expressed with a simple grammar.
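
For comparison, a sketch of how that intent is usually expressed in D (the function name is illustrative): the slice lengths carry the buffer sizes, transitive const marks the source read-only, and the size check happens at runtime rather than at the call site as with SAL.

    // memcpy-style copy expressed with D slices.
    void copyInto(const(ubyte)[] src, ubyte[] dst)
    {
        assert(dst.length >= src.length, "destination buffer too small");
        dst[0 .. src.length] = src[]; // bounds-checked slice copy
    }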