March 13, 2014
On Wed, Mar 12, 2014 at 08:01:39PM -0700, Walter Bright wrote:
> On 3/12/2014 6:30 PM, Kapps wrote:
> >I used to get frustrated when my code would randomly break every compiler update (and it shows how much D has progressed that regressions in my own code are now a rare occurrence), but unexpected regressions such as the std.json regression are much different from intended changes with plenty of time and warning that provide an overall (even if slight in many cases) benefit to the end-user.
> 
> I got caught by breaking changes myself. I even approved the changes. But they unexpectedly broke projects of mine, and I had to go through updating & fixing them, supplying updates, etc.
> 
> It sux.
> 
> And it's much, much, much worse if you've got lots of legacy code with only a vague idea of how it works because the engineers who wrote it have moved on, etc.

Or you wrote that code but it has been so long ago that you don't remember the fine details of it to be able to judge what is the correct way to fix it. This doubly sux when the code is for a workhorse program that you're actually *using* on a daily basis, which has been working just fine for the last 2 years, and now it suddenly doesn't compile / doesn't work anymore, and you need it to get something done and don't have time to sit down and figure out why it broke (or how to fix it).


T

-- 
I am a consultant. My job is to make your job redundant. -- Mr Tom
March 13, 2014
On 3/12/14, 9:40 PM, Manu wrote:
> On 13 March 2014 12:48, Andrei Alexandrescu
> <SeeWebsiteForEmail@erdani.org <mailto:SeeWebsiteForEmail@erdani.org>>
> wrote:
>
>     On 3/12/14, 5:40 PM, Vladimir Panteleev wrote:
>
>         On Thursday, 13 March 2014 at 00:18:06 UTC, Andrei Alexandrescu
>         wrote:
>
>             On 3/12/14, 5:02 PM, Chris Williams wrote:
>
>                 As someone who would like to be able to use D as a language,
>                 professionally, it's more important to me that D gain
>                 future clients
>                 than that it maintains the ones that it has. Even more
>                 important is that
>                 it does both of those things.
>
>
>             The saying goes, "you can't make a bucket of yogurt without
>             a spoonful
>             of rennet". The pattern of resetting customer code into the next
>             version must end. It's the one thing that both current and
>             future
>             users want: a pattern of stability and reliability.
>
>
>         Doesn't this sort of seal the language's fate in the long run,
>         though?
>         Eventually, new programming languages will appear which will
>         learn from
>         D's mistakes, and no new projects will be written in D.
>
>
>     Let's get to the point where we need to worry about that :o).
>
>
>         Wasn't it here that I heard that a language which doesn't evolve
>         is a
>         dead language?
>
>
>     Evolving is different from incessantly changing.
>
>
> Again, trivialising the importance of this change.
>
>          From looking at the atmosphere in this newsgroup, at least to
>         me it
>         appears obvious that there are, in fact, D users who would be
>         glad to
>         have their D code broken if it means that it will end up being
>         written
>         in a better programming language.
>
>
>     This is not my first gig. Due to simple social dynamics, forum
>     participation saturates. In their heydays, forums like
>     comp.lang.c++.moderated, comp.lang.tex, and comp.lang.perl had
>     traffic comparable to ours, although their community was 1-2 orders
>     of magnitude larger. Although it seems things are business as usual
>     in our little hood here, there is a growing silent majority of D
>     users who aren't on the forum.
>
>
> Are you suggesting that only we in this thread care about this, at the
> expense of that growing silent majority?
>
> Many of the new users I've noticed appearing are from my industry.
> There are seemingly many new gamedevs or ambitious embedded/mobile
> users. The recent flurry of activity on the cross-compilers, Obj-C, is a
> clear demonstration of that interest.
> I suspect they are a significant slice of the growing majority, and
> certainly of the growing potential. They care about this, whether they
> know it or not. Most users aren't low-level experts, even though it
> matters to their projects.
>
> I want to know what you think the potential or likely future breakdown
> of industrial application of D will look like.
>
> I have a suspicion that when the cross compilers are robust and word
> gets out, you will see a surge of game/realtime/mobile devs, and I don't
> think it's unrealistic, or even unlikely, to imagine that this may be
> D's largest developer audience at some time in the (not too distant?)
> future.
> It's the largest native-code industry left by far, requirements are not
> changing, and there are no other realistic alternatives I'm aware of on
> the horizon. Every other facet of software development I can think of
> has competition in the language space.

I hear you. Time to put this in a nice but firm manner: your arguments were understood but did not convince. The matter has been settled. There will be no final by default in the D programming language. Hope you understand.


Thanks,

Andrei

March 13, 2014
On Thursday, 13 March 2014 at 00:02:13 UTC, Chris Williams wrote:
> On Wednesday, 12 March 2014 at 22:50:00 UTC, Walter Bright wrote:
>> But we nearly lost a major client over it.
>>
>> We're past the point where we can break everyone's code.
>
> As someone who would like to be able to use D as a language, professionally, it's more important to me that D gain future clients than that it maintains the ones that it has. Even more important is that it does both of those things.
>

You don't gain clients by losing clients.
March 13, 2014
On Thursday, 13 March 2014 at 03:15:25 UTC, Andrei Alexandrescu wrote:
>
>> I.e. the % of users who
>> wouldn't mind breaking changes is higher on the forum?
>
> I believe so, and I have many examples. Most people who don't hang out in the forum just want to get work done without minding every single language advocacy subtlety, and breakages prevent them from getting work done.
>
>
> Andrei

Yeah this pretty much defines me. I got other stuff to deal with but am very interested in using D. Just can't devote much time to make my voice heard that loudly on the forums.

From my point of view, D2 already has such a good set of features that it is sufficiently differentiated from the competition to make it a compelling product as-is. If the implementation could be polished to just work, that would help it to gain a reputation of stability. That in turn would allow D to gain traction in industry, publish a specification, build a third-party tooling and library ecosystem, etc.

I applaud the restraint Walter and Andrei are showing in keeping the implementation on track. I believe this is a win in the long run. New features like final-by-default are good to consider. And for that feature in particular, I think the reasoning backing it up is compelling. But I say tabling it for consideration until D-next is the way to go to win the race, rather than just the sprint.

Joseph
March 13, 2014
On 13 March 2014 14:37, Sarath Kodali <sarath@dummy.com> wrote:

> On Thursday, 13 March 2014 at 01:18:14 UTC, Chris Williams wrote:
>
>> On Thursday, 13 March 2014 at 00:48:15 UTC, Walter Bright wrote:
>>
>>> On 3/12/2014 5:18 PM, Andrei Alexandrescu wrote:
>>>
>>>> We are opposed to having compiler flags define language semantics.
>>>>
>>>
>>> Yeah, that's one of those things that always seems like a reasonable idea, but experience with it isn't happy.
>>>
>>
>> I would imagine that the reasons for this goal are 1) to keep the
>> compiler and language sane, and 2) insufficient personnel to maintain legacy
>> variants.
>>
>> I think the answer to #1 is to not introduce such changes lightly nor frequently.
>>
>> For #2, since the codebase is now open sourced and, I presume, your "clients" pay you to perform specific tasks, legacy compilation features will end up being maintained either by random people who fix them themselves, or by a client who based his code on an older version and pays you to go into the legacy branch/build-target code. This is the way most open-source software works. Linux, GCC, emacs, etc. are all constantly moving targets that only work together as a single whole because people pay Red Hat and others like them to make the insanity go away.
>>
>
> If I'm an enterprise customer I would be very angry if my code breaks with each new release of the compiler. I will be angry irrespective of whether I'm paying for the compiler or not. Because every time my code breaks, I will have to allocate resources to figure out why working production code is broken, and then have to test the new code, and testing can take months to complete.
>

That's not the way business works, at least, not in my neck of the woods. Having been responsible for rolling out many compiler/toolset/library upgrades personally, that's simply not how it's done.

It is assumed that an infrastructural update may cause disruption. No sane
business just rolls out updates without an initial testing and adaptation
period. Time is allocated to the upgrade process, and necessary changes to
workflow are made by the expert who performs the upgrade.
In the case of controlled language-feature deprecation (as opposed to the
std.json example), it should ideally be safe to assume that an alternative
recommendation is in place, and that it was designed to minimise disruption.
In the case we are discussing here, the disruption is small and easily
addressed.

> Languages are adopted by enterprises only when there is long term stability
> to it. C code written 30 years back in K&R style still compiles without any
> problem. Please enhance the language but don't break existing code.
>

In my experience, C/C++ is wildly unstable.
I've been responsible for managing C compiler updates on many occasions,
and they often cause complete catastrophe, with no warning or deprecation
path given!
Microsoft are notorious for this. Basically every version of MSC is
incompatible with the version prior in some annoying way.

I personally feel D has a major advantage here, since all 3 compilers share
the same front-end, D has a proper 'deprecated' concept (only recently
introduced to C), and it offers better compile-time detection and warning
opportunities.
Frankly, I think all this complaining about breaking changes in D is
massively overrated. C is much, much worse!
The only difference is that D releases are more frequent than C releases.
That will change as the language matures.
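The 'deprecated' concept mentioned above can be sketched roughly as follows. This is an illustration in C++ (using the C++14 `[[deprecated]]` attribute rather than D's actual `deprecated` keyword), and the function names are invented for the example:

```cpp
#include <string>

// Invented API names for illustration only. Marking the old entry point
// [[deprecated]] gives every caller a compile-time warning with a
// file:line and a migration hint, instead of a silent behaviour change.
[[deprecated("use parseConfig(const std::string&) instead")]]
inline int parseConfigOld(const char* text) {
    return static_cast<int>(std::string(text).size());
}

// The replacement stays available side by side during the deprecation window.
inline int parseConfig(const std::string& text) {
    return static_cast<int>(text.size());
}
```

D's `deprecated` attribute works the same way at the language level: the old symbol keeps compiling (with a warning) for a release window rather than disappearing overnight.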

> Also if something has to be deprecated, it should exist in that deprecated
> state for at least 5 years. Currently it is one year and for enterprise
> customers that is a very short period.


This is possibly true. It's a tricky balancing act.

I'd rather see D take a more strict approach here, so that we don't end up
in the position where 30-year-old D code still exists alongside 'modern' D
code written completely differently, requiring it to be compiled with a
bunch of different options.
The old codebases should be nudged to update along the way. I would
consider it a big mistake to retain the ancient-C backwards compatibility.


March 13, 2014
On Thursday, 13 March 2014 at 04:57:49 UTC, H. S. Teoh wrote:
> On Wed, Mar 12, 2014 at 08:01:39PM -0700, Walter Bright wrote:
>> On 3/12/2014 6:30 PM, Kapps wrote:
>> >I used to get frustrated when my code would randomly break every
>> >compiler update (and it shows how much D has progressed that
>> >regressions in my own code are now a rare occurrence), but unexpected
>> >regressions such as the std.json regression are much different from
>> >intended changes with plenty of time and warning that provide an
>> >overall (even if slight in many cases) benefit to the end-user.
>> 
>> I got caught by breaking changes myself. I even approved the changes.
>> But they unexpectedly broke projects of mine, and I had to go through
>> updating & fixing them, supplying updates, etc.
>> 
>> It sux.
>> 
>> And it's much, much, much worse if you've got lots of legacy code
>> with only a vague idea of how it works because the engineers who
>> wrote it have moved on, etc.
>
> Or you wrote that code but it has been so long ago that you don't
> remember the fine details of it to be able to judge what is the correct
> way to fix it. This doubly sux when the code is for a workhorse program
> that you're actually *using* on a daily basis, which has been working
> just fine for the last 2 years, and now it suddenly doesn't compile /
> doesn't work anymore, and you need it to get something done and don't
> have time to sit down and figure out why it broke (or how to fix it).
>
>
> T

Hear, hear!

Or even when the tooling and environment needed to get it to work are a thing of the past. I'm starting to remember some long hours working with old versions of MS Access on old Windows installations, trying to get them working on newer versions. Argh!

Joseph
March 13, 2014
On 13 March 2014 14:52, Andrei Alexandrescu <SeeWebsiteForEmail@erdani.org> wrote:

> On 3/12/14, 8:35 PM, Manu wrote:
>
>> On 13 March 2014 10:15, Walter Bright <newshound2@digitalmars.com <mailto:newshound2@digitalmars.com>> wrote:
>>
>>     On 3/12/2014 5:02 PM, Chris Williams wrote:
>>
>>         As someone who would like to be able to use D as a language,
>>         professionally,
>>         it's more important to me that D gain future clients than that
>>         it maintains the
>>         ones that it has. Even more important is that it does both of
>>         those things.
>>
>>
>>     The D1 -> D2 transition very nearly destroyed D by sacrificing all
>>     the momentum it had.
>>
>>
>> To draw that as a comparison to the issue on topic is one of the biggest exaggerations I've seen in a while, and you're not usually prone to that sort of thing.
>>
>
> Actually a lot of measurements (post statistics, downloads) and plenty of evidence (D-related posts on reddit) support that hypothesis. The transition was a shock of much higher magnitude than both Walter and I anticipated.


You're seriously comparing a deprecation warning telling you to write 'virtual' in front of virtuals to the migration from D1 to D2?


March 13, 2014
On Wednesday, 12 March 2014 at 22:50:00 UTC, Walter Bright wrote:
> The argument for final by default, as eloquently expressed by Manu, is a good one. Even Andrei agrees with it (!).
>
> The trouble, however, was illuminated most recently by the std.json regression that broke existing code. The breakage wasn't even intentional; it was a mistake. The user fix was also simple, just a tweak here and there to user code, and the compiler pointed out where each change needed to be made.
>
> But we nearly lost a major client over it.

I find this a bit baffling.  Given the investment this customer must have in D, I can't imagine them switching to a new language over something like this.  I hate to say it, but this sounds like the instances you hear of when people call up customer service just to have someone to yell at.  Not that the code breakage is okay, but I do feel like this may be somewhat of an exaggeration.

Regarding this virtual by default issue.  I entirely support Manu's argument and wholeheartedly agree with it.  I even think that I'd be more likely to use D professionally if D worked this way, for many of the same reasons Manu has expressed.  There may even be a window for doing this, but the communication around the change would have to be perfect.
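For readers outside the thread, the default being argued over can be sketched in C++, which already uses the non-virtual-by-default model Manu advocates for D. The class and function names here are invented for the example:

```cpp
#include <string>

// Hypothetical Base/Derived pair. In C++, a method only dispatches
// dynamically if the author opted in with 'virtual'; everything else is
// statically bound (the equivalent of final-by-default).
struct Base {
    std::string plain() const { return "Base"; }         // statically bound
    virtual std::string poly() const { return "Base"; }  // opt-in dynamic dispatch
    virtual ~Base() = default;
};

struct Derived : Base {
    std::string plain() const { return "Derived"; }      // hides Base::plain
    std::string poly() const override { return "Derived"; }
};

// Through a Base reference, only the 'virtual' member resolves to Derived.
inline std::string callPlain(const Base& b) { return b.plain(); }
inline std::string callPoly(const Base& b)  { return b.poly(); }
```

Under D's current virtual-by-default rule, both calls would resolve to Derived unless the methods were explicitly marked `final`; the proposal was to flip that default, which is why existing override code would need a `virtual` annotation.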

Regarding user retention... I've spent the past N months beginning the process of selling D at work.  The language and library are at a point of maturity where I think it might have a chance when evaluated simply on the merits of the language itself.  However, what has me really hesitant to put my shoulder behind D and really push isn't that changes occur sometimes.  Even big changes.  It's how they're handled.  Issues come up in the newsgroup and are discussed back and forth for ages.  Seriously considered.  And then maybe a decision is apparently reached (as with this virtual by default thing) and so I expect that action will be taken.  And then nothing happens.  And other times big changes occur with seemingly little warning.  Personally, I don't really require perfect compatibility between releases, but I do want to see things moving decisively in a clearly communicated direction.  I want to know where we're going and how we're going to get there, and if that means that I have to hold off on moving to a new compiler release for a while while I sort out changes, that's fine.  But I want to be able to prepare for it.  As things stand, I'm worried that if I got a team to move to D we'd have stuff breaking unexpectedly and I'd end up feeling like an ass for recommending it.  I guess that's probably what prompted the "almost lost a major client" issue you mentioned above.  This JSON parser change was more the proverbial straw than a major issue in itself.

As for the !virtual idea... I hate it.  Please don't add yet more ways for people to make their code confusing.
March 13, 2014
On Thursday, 13 March 2014 at 04:56:31 UTC, Sarath Kodali wrote:
>
> That is true if your code is under active development. What if you had a production code that was written 2 years back?
>
> - Sarath

Code that I wrote 2 years ago in GCC 4.7 is still compiled with the same compiler binary that I used to develop and test it.  I don't upgrade my tools until I'm ready to handle the risk of change.  But that doesn't prevent me from using the latest version of the compiler for my latest project.

I also have projects still being built in Embedded Visual C++ 6 for the same reason.

If evolution of the language and tools is a risk, I don't think it's wise to upgrade.

Mike

March 13, 2014
On 13 March 2014 15:14, Joseph Cassman <jc7919@outlook.com> wrote:

> On Thursday, 13 March 2014 at 04:57:49 UTC, H. S. Teoh wrote:
>
>> On Wed, Mar 12, 2014 at 08:01:39PM -0700, Walter Bright wrote:
>>
>>> On 3/12/2014 6:30 PM, Kapps wrote:
>>> >I used to get frustrated when my code would randomly break every compiler update (and it shows how much D has progressed that regressions in my own code are now a rare occurrence), but unexpected regressions such as the std.json regression are much different from intended changes with plenty of time and warning that provide an overall (even if slight in many cases) benefit to the end-user.
>>>
>>> I got caught by breaking changes myself. I even approved the changes. But they unexpectedly broke projects of mine, and I had to go through updating & fixing them, supplying updates, etc.
>>>
>>> It sux.
>>>
>>> And it's much, much, much worse if you've got lots of legacy code with only a vague idea of how it works because the engineers who wrote it have moved on, etc.
>>>
>>
>> Or you wrote that code but it has been so long ago that you don't remember the fine details of it to be able to judge what is the correct way to fix it. This doubly sux when the code is for a workhorse program that you're actually *using* on a daily basis, which has been working just fine for the last 2 years, and now it suddenly doesn't compile / doesn't work anymore, and you need it to get something done and don't have time to sit down and figure out why it broke (or how to fix it).
>>
>>
>> T
>>
>
> Hear, hear!
>
> Or even the tooling and environment needed to get it to work are a thing of the past. Starting to remember some long hours working with old versions of MS Access on old Windows installations and trying to get them working on newer versions. Arg!
>
> Joseph
>

Again, this is conflating random breakage with controlled deprecation.
A clear message with a file:line that says "virtual-by-default is
deprecated, add 'virtual' _right here_." is not comparable to the behaviour
of byLine() silently changing from release to release and creating some
bugs, or std.json breaking unexpectedly with no warning.