March 13, 2014
On 3/12/2014 9:57 PM, deadalnix wrote:
> On Wednesday, 12 March 2014 at 23:00:26 UTC, Namespace wrote:
>> Would it mean that we would deprecate "nothrow" and replace it with "!throw"?
>
> I like that !

Not no-how, not no-way!
March 13, 2014
On Thursday, 13 March 2014 at 05:13:00 UTC, Manu wrote:
>
> That's not the way business works, at least, not in my neck of the woods.
> Having been responsible for rolling out many compiler/toolset/library
> upgrades personally, that's simply not how it's done.

That may be how the gamedev industry works, because you are always on cutting-edge technology. But big companies in traditional industries (like IT or Oil & Gas or Finance or Aviation) always work on a stable tool set. Even undocumented features cannot change in an ad hoc manner. I have worked for some of the top IT companies, and I did not use D for my projects primarily because it was not stable.

>
> It is assumed that infrastructural update may cause disruption. No sane
> business just goes and rolls out updates without an initial testing and
> adaptation period.
> Time is allocated to the upgrade process, and necessary changes to workflow
> are made by an expert that performs the upgrade.

That happens for a D1-to-D2 migration, not from D 2.64 to D 2.65. You cannot expect the team to test and validate the compiler tool set every few months.

> In the case of controlled language feature deprecation (as opposed to the
> std.json example), it should ideally be safe to assume an alternative
> recommendation is in place, and it was designed to minimise disruption.
> In the case we are discussing here, the disruption is small and easily
> addressed.
>
>> Languages are adopted by enterprises only when there is long term stability
>> to it. C code written 30 years back in K&R style still compiles without any
>> problem. Please enhance the language but don't break existing code.
>>
>
> In my experience, C/C++ is wildly unstable.
> I've been responsible for managing C compiler updates on many occasions,
> and they often cause complete catastrophe, with no warning or deprecation
> path given!
> Microsoft are notorious for this. Basically every version of MSC is
> incompatible with the version prior in some annoying way.
>
> I personally feel D has a major advantage here since all 3 compilers share
> the same front-end, and has a proper 'deprecated' concept (only recently
> introduced to C), and better compile time detection and warning
> opportunities.
> Frankly, I think all this complaining about breaking changes in D is
> massively overrated. C is much, much worse!
> The only difference is that D releases are more frequent than C releases.
> That will change as the language matures.
>
>> Also if something has to be deprecated, it should exist in that deprecated
>> state for at least for 5 years. Currently it is one year and for enterprise
>> customers that is a very short period.
>
>
> This is possibly true. It's a tricky balancing act.
>
> I'd rather see D take a more strict approach here, so that we don't end up
> in the position where 30-year-old D code still exists alongside 'modern' D
> code written completely differently, requiring to be compiled with a bunch
> of different options.
> The old codebases should be nudged to update along the way. I would
> consider it a big mistake to retain the ancient-C backwards compatibility.

I do not want D to have 30 years of backward compatibility either. But that is something the C community has got used to. That is why I said we need at least a 5-year deprecation cycle.

Anyway, coming to your original issue: why can't you add "final:" in your class, as suggested by Walter? Doesn't that solve your problem without changing the default behaviour?

- Sarath

March 13, 2014
I'd like to address, in general, the issue of what I term "performance by default", which is part of the argument for final by default.

C, C++, and D are billed as languages for writing high performance apps. And this is true. What is not true, or at least has not been true in my experience, is that an application is high performance merely because it is written in C, C++ or D.

Let me emphasize, code is not fast just because it is written using a high performance language.

High performance code does not happen by accident, it has to be intentionally written that way. Furthermore, I can pretty much guarantee you that if an application has never been profiled, its speed can be doubled by using a profiler. And if you really, really want high performance code, you're going to have to spend time looking at the assembler dumps of the code and tweaking the source code to get things right.

High performance code is not going to emanate from those programmers who are not skilled in the art, it is not going to happen by accident, it is not going to happen by following best practices, it is not going to happen just because you're writing in C/C++/D.

D certainly provides what is necessary to write code that blows away conventional C or C++ code.

It reminds me of when I worked in a machine shop in college. I'd toil for hours cutting parts, and the parts were not round, the holes were off center, heck, the surfaces weren't that smooth. There was an older machinist there who'd take pity on me. He'd look at what I was doing, cluck cluck, he'd touch the bit with a grinder, he'd tweak the feed speed, he'd make arcane adjustments to the machine tool, and out would come perfect parts. I am in awe to this day of his skill - I still don't know how he did it. The point is, he and I were using the same tools. He knew how to make them sing, I didn't.

I still cannot drill a goddam hole and get it where I measured it should be.
March 13, 2014
On 3/12/2014 10:15 PM, Manu wrote:
> You're seriously comparing a deprecation warning telling you to write 'virtual'
> in front of virtuals to the migration from D1 to D2?

You're talking about changing practically every D class in existence.

(The D1 => D2 transition also came with plenty of help from compiler warnings.)
March 13, 2014
On 3/12/2014 9:23 PM, Manu wrote:
> It's not minor, and it's not achievable by other means though.

    class C { final: ... }

does it.
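For readers following along, a slightly fuller sketch of what that `final:` label does (the `Widget`/`Button` class and method names here are invented for illustration; only the `final:` mechanism itself is what Walter is describing):

```d
class Widget
{
    void draw() { }            // virtual by default: D's current behaviour

final:                         // every member function from here on is final
    int size() { return 1; }  // non-virtual: direct call, cannot be overridden
    bool visible() { return true; }
}

class Button : Widget
{
    override void draw() { }   // fine: draw() was declared before the final: label

    // override int size() { return 2; }
    // Error: cannot override final function Widget.size
}
```

So a class author who wants non-virtual-by-default dispatch can already opt in per class today, which is the workaround being suggested in place of changing the language default.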


> You and Andrei are the only resistance in this thread so far. Why don't you ask
> 'temperamental client' what their opinion is? Give them a heads up, perhaps
> they'll be more reasonable than you anticipate?

I didn't even know about this client before the breakage. D has a lot of users who we don't know about.

> Both myself and Don have stated on behalf of industrial clients that we embrace
> breaking changes that move the language forward, or correct clearly identifiable
> mistakes.

Breaking changes have been a huge barrier to Don's company being able to move from D1 to D2. I still support D1 specifically for Don's company.

March 13, 2014
On Thursday, 13 March 2014 at 05:31:17 UTC, Manu wrote:
>
> Again, this is conflating random breakage with controlled deprecation.
> A clear message with a file:line that says "virtual-by-default is
> deprecated, add 'virtual' _right here_." is not comparable to the behaviour
> of byLine() silently changing from release to release and creating some
> bugs, or std.json breaking unexpectedly with no warning.

Hundreds or thousands of "deprecated" warning messages for every D project, versus adding "final:" to a few classes in a few projects. Which is better?

- Sarath
March 13, 2014
On Wed, Mar 12, 2014 at 9:40 PM, Manu <turkeyman@gmail.com> wrote:

> ... you will see a surge of game/realtime/mobile devs, and I don't think
> it's unrealistic, or even unlikely, to imagine that this may be D's largest
> developer audience at some time in the (not too distant?) future.
> It's the largest native-code industry left by far, requirements are not
> changing, and there are no other realistic alternatives I'm aware of on the
> horizon.
>

Rust?


March 13, 2014
On 3/12/2014 8:29 PM, Chris Williams wrote:
>
> If done excessively, I could certainly see that. But outside of new
> languages that haven't gotten to that point yet, I don't know of any
> that don't have compiler/runtime flags of this sort. E.g. Java, Perl, C,
> C++, PHP, etc. I would be curious why you think D can escape this fate?
>

PHP is a perfect example of why language-altering flags are a very bad path to start heading down. (Granted, the problem is *vastly* worse in an interpreted language than in a compiled one, but still.)

March 13, 2014
On 13 March 2014 15:51, Walter Bright <newshound2@digitalmars.com> wrote:

> I'd like to address, in general, the issue of, what I term, "performance by default" which is part of the argument for final by default.
>
> C, C++, and D are billed as languages for writing high performance apps. And this is true. What is not true, or at least has not been true in my experience, is that an application is high performance merely because it is written in C, C++ or D.
>
> Let me emphasize, code is not fast just because it is written using a high performance language.
>
> High performance code does not happen by accident, it has to be intentionally written that way. Furthermore, I can pretty much guarantee you that if an application has never been profiled, its speed can be doubled by using a profiler. And if you really, really want high performance code, you're going to have to spend time looking at the assembler dumps of the code and tweaking the source code to get things right.
>
> High performance code is not going to emanate from those programmers who are not skilled in the art, it is not going to happen by accident, it is not going to happen by following best practices, it is not going to happen just because you're writing in C/C++/D.
>
> D certainly provides what is necessary to write code that blows away conventional C or C++ code.
>

But you understand the danger in creating a situation where experts can't optimise their code even if they want to, whether because it becomes an issue at some later time or because a new customer comes along with more stringent requirements. These are not unrealistic hypothetical scenarios. Libraries exist, and they have customers by definition. Requirements change over time. Defaulting to an inflexible position is dangerous.

There's another programming principle: beware of premature optimisation. Many people swear by it. These two situations are at odds.

> It reminds me of when I worked in a machine shop in college. I'd toil for hours cutting parts, and the parts were not round, the holes were off center, heck, the surfaces weren't that smooth. There was an older machinist there who'd take pity on me. He'd look at what I was doing, cluck cluck, he'd touch the bit with a grinder, he'd tweak the feed speed, he'd make arcane adjustments to the machine tool, and out would come perfect parts. I am in awe to this day of his skill - I still don't know how he did it. The point is, he and I were using the same tools. He knew how to make them sing, I didn't.
>
> I still cannot drill a goddam hole and get it where I measured it should be.
>


March 13, 2014
On 13 March 2014 15:55, Walter Bright <newshound2@digitalmars.com> wrote:

> On 3/12/2014 10:15 PM, Manu wrote:
>
>> You're seriously comparing a deprecation warning telling you to write
>> 'virtual'
>> in front of virtuals to the migration from D1 to D2?
>>
>
> You're talking about changing practically every D class in existence.
>

Only base classes, and only functions that are actually overridden.
I don't want to raise 'override' again, but we know precisely the magnitude
of that change, and this one would affect significantly fewer users.

> (The D1 => D2 transition also came with plenty of help from compiler warnings.)

But the scale of changes required was extremely drastic, and non-trivial in aggregate. That's not comparable to "put this word here", which is what we're talking about.