March 02, 2018
On Friday, 2 March 2018 at 02:35:46 UTC, Meta wrote:
> D1 -> D2 nearly killed D (can't remember which, but it was either Walter or Andrei who has said this on multiple occasions).

This gets repeated over and over again, but I haven't actually seen any evidence for it.

But even if it is true, I'd say that is just because they did it wrong. There never really was a D1 -> D2 transition. There was just an ongoing evolution of D, from which one version was arbitrarily forked off and called D1. Seriously, D 1.00 and D 2.00 came out at about the same time: D 1.001 on Jan 23, 2007; D 2.000 on Jun 17, 2007.

I remember the biggest troubles I had with D2: immutable being introduced and then changing, and a bunch of little library renames... and they weren't really that big of a deal, and btw they occurred over the next ~2 *years*. It wasn't all at once - remember, "D2" was just the evolving D. D1 was a random, arbitrary snapshot.

If I were to do a D3, I'd make it opt-in at the module level, and keep it so all D code can be compiled together, with corresponding features added at each step. For example, a "d3 module" is @safe by default. But the @safe semantics are still there for a "d2 module"; you just annotate it explicitly. Then there's no breakage and you can still change things.
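
To make that concrete, here is roughly what I mean, using today's D2 syntax (the module-wide default already works in D2; an actual "d3 module" marker is hypothetical and would simply imply the @safe: line):

// Today's D2: make @safe the default for a whole module by applying
// the attribute at module scope. A hypothetical "d3 module" would get
// this behaviour without the annotation.
module legacy.code;

@safe: // everything below this point is checked as @safe

int square(int x) { return x * x; }

// Explicit opt-out remains available where it is genuinely needed.
@system void poke(int* p) { *p = 42; }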
March 02, 2018
On Friday, 2 March 2018 at 04:38:24 UTC, psychoticRabbit wrote:
> On Friday, 2 March 2018 at 03:57:25 UTC, barry.harris wrote:
>>
>> Sorry little rabbit, you are misguided in this belief. Back in the day we all used C, and that is the reason most "safer" languages exist today.
>
> You can write pretty safe code in C these days, without too much trouble. We have the tooling and the knowledge to make that happen, developed over decades - and both keep getting better, because the language is not subjected to a constant and frequent release cycle.
>
> Ironically, the demands on programmers to adapt to constant change are actually making applications less safe - at least, that's my thesis ;-)
>
> The real problem with using C these days (in some areas), is more to do with its limited abstraction power, not its lack of safety.
>
> And also, C is frowned upon (and C++ too, for that matter) because most programmers are so lazy these days and don't want to write code - they prefer to just 'link algorithms' that someone else wrote.
>
> I include myself in this - hence my interest in D ;-)
>
> Keep those algorithms coming!

Those tools have existed since 1979, so C programmers have had quite some time to actually use them.

"To encourage people to pay more attention to the official language rules, to detect legal but suspicious constructions, and to help find interface mismatches undetectable with simple mechanisms for separate compilation, Steve Johnson adapted his pcc compiler to produce lint [Johnson 79b], which scanned a set of files and remarked on dubious constructions."

Dennis Ritchie, https://www.bell-labs.com/usr/dmr/www/chist.html

Also, anyone who wasn't using safer systems programming languages before C became widespread outside UNIX can spend some time educating themselves, on BitSavers or Archive, about all the systems outside AT&T that were developed in such languages since 1961.

The first well-known one, the Burroughs B5000, has kept being improved and is sold by Unisys today as ClearPath.

Or PL/8, used by IBM for its RISC research: they created a compiler with a pluggable architecture similar to what many now assume were LLVM's ideas, along with the corresponding OS. They only switched to C when they decided to bet on UNIX to take RISC commercial.

There are only two reasons we are stuck with C until we get to radically change computer architectures: UNIX-like OSes, and embedded developers who won't use anything else even at gunpoint.

All the quantum computing research is using languages that don't have anything to do with C.

March 02, 2018
Whilst we are espousing opinions…

On Fri, 2018-03-02 at 08:02 +0000, Paulo Pinto via Digitalmars-d-announce wrote:
> On Friday, 2 March 2018 at 04:38:24 UTC, psychoticRabbit wrote:
> > […]
> > 
> > You can write pretty safe code in C these days, without too much trouble. We have the tooling and the knowledge to make that happen, developed over decades - and both keep getting better, because the language is not subjected to a constant and frequent release cycle.

You can write safe code in assembly language and even machine code, but do you want to? The same applies to C.
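
To make the class of bug concrete, a minimal sketch: the canonical buffer overrun, written in D, where @safe bounds checking catches it at runtime instead of letting it silently corrupt adjacent memory as the C equivalent may.

@safe void main()
{
    int[4] buf;
    foreach (i; 0 .. 5)
        buf[i] = i; // throws core.exception.RangeError when i reaches 4
}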

> > Ironically, the demands on programmers to adapt to constant change are actually making applications less safe - at least, that's my thesis ;-)
> > 
> > The real problem with using C these days (in some areas), is more to do with its limited abstraction power, not its lack of safety.

The problem with C these days is that people still use it when they really should not. C has its place, and writing applications is not that place.

> > And also, C is frowned upon (and C++ too, for that matter) because most programmers are so lazy these days and don't want to write code - they prefer to just 'link algorithms' that someone else wrote.

Wrong, wrong, wrong. Those people using C these days either have to use it because a modern language can't yet target their platform, or they are too lazy to change their toolchain and continue with C in the face of overwhelming evidence it is the wrong thing to do.

> > […]
> 
> There are only two reasons we are stuck with C until we get to radically change computer architectures: UNIX-like OSes, and embedded developers who won't use anything else even at gunpoint.
> 

C is a portable assembly language; it is not really a high-level language. There are those who will not change and will use C till they drop dead. That is their problem.

There are those who use C because the only other option is assembly language, so they make the right decision. This is an indicator that high-level language toolchain manufacturers have failed to port to their platform. I'll wager there are still a lot of 8051s out there. I'll also wager the C++ compilers for that target do not realise C++, but only a subset that is worse than using C, even after 14 years of improvement.

It is going to be interesting to see what happens when Rust gains the toolchains to deal with microcontrollers. Hopefully, though, ARM cores dominate now, especially given that their silicon area is reputedly smaller than the 8051's. I've been out of the smartcard arena for over a decade now, and yet I bet it is all still very much the same.

> […]
> 
-- 
Russel.
==========================================
Dr Russel Winder      t: +44 20 7585 2200
41 Buckmaster Road    m: +44 7770 465 077
London SW11 1EN, UK   w: www.russel.org.uk

March 02, 2018
On Fri, 2018-03-02 at 02:35 +0000, Meta via Digitalmars-d-announce wrote:
> […]
> D1 -> D2 nearly killed D (can't remember which, but it was either
> Walter or Andrei who has said this on multiple occasions). A D2
> -> D3 transition might generate a lot of publicity if done very
> carefully, but more than likely it would just put the nails in
> the coffin for good and destroy all the momentum D has built up
> over the past 3 years (I feel like 2015 was a big turning point
> where D finally got back on peoples' radars).

And Java 5 nearly killed Java, as did Java 8 and Java 9. OK so there
was more internecine warfare in the D1 → D2 thing, but hopefully the D2
→ D3 thing will not only happen, it will happen relatively soon.

Dx → Dy is the time for important breaking changes. There appear to be an increasing number of things annoying people about D2, ergo the pressure for D3 is building. NOT evolving from D2 to D3 is what will definitely kill D.

-- 
Russel.
==========================================
Dr Russel Winder      t: +44 20 7585 2200
41 Buckmaster Road    m: +44 7770 465 077
London SW11 1EN, UK   w: www.russel.org.uk


March 02, 2018
On Friday, March 02, 2018 10:37:04 Russel Winder via Digitalmars-d-announce wrote:
> On Fri, 2018-03-02 at 02:35 +0000, Meta via Digitalmars-d-announce wrote:
> > […]
> > D1 -> D2 nearly killed D (can't remember which, but it was either
> > Walter or Andrei who has said this on multiple occasions). A D2
> > -> D3 transition might generate a lot of publicity if done very
> > carefully, but more than likely it would just put the nails in
> > the coffin for good and destroy all the momentum D has built up
> > over the past 3 years (I feel like 2015 was a big turning point
> > where D finally got back on peoples' radars).
>
> And Java 5 nearly killed Java, as did Java 8 and Java 9. OK so there
> was more internecine warfare in the D1 → D2 thing, but hopefully the D2
> → D3 thing will not only happen, it will happen relatively soon.
>
> Dx → Dy is the time for important breaking changes. There appear to be an increasing number of things annoying people about D2, ergo the pressure for D3 is building. NOT evolving from D2 to D3 is what will definitely kill D.

Really? The possibility of D3 gets mentioned _way_ less than it used to. It gets mentioned occasionally at this point, but not all that often from what I've seen, and almost always by folks who are new to the newsgroup.

Historically, D3 is what folks like to bring up when there's some particular change that they'd like to see and which clearly isn't going to happen in D2, but the idea has never gained any real traction, and as D has matured and grown, the push to create D3 seems to have diminished considerably. We see far fewer folks trying to push for new features, because it's become clear that D isn't constantly changing everything anymore, whereas when it was younger, we'd make breaking changes all the time. That shift initially resulted in lots of talk about D3, because a number of folks really wanted changes that weren't making it into D2, but that talk has died down over time. And we _have_ still managed to make some significant changes to D without breaking everything or needing D3.

Thus far, we've largely been able to make changes without needing to move to D3, and there really isn't agreement on what would be in a potential D3 anyway. There are some issues which may require D3 to fix (e.g. getting rid of auto-decoding probably would, though maybe someone smart will figure out how within D2) given that we don't want to break tons of D programs when making changes, but overall, things have been going fairly well with regards to evolving D2.
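
For anyone who has not run into it, auto-decoding means that Phobos' range primitives decode a string's UTF-8 code units into dchar code points on the fly, so the array view and the range view of the same string disagree. A minimal illustration:

import std.range : ElementType, walkLength;

// The range view of string is dchar, not the char actually stored.
static assert(is(ElementType!string == dchar));

void main()
{
    string s = "café";
    assert(s.length == 5);     // 5 UTF-8 code units ('é' takes two bytes)
    assert(s.walkLength == 4); // 4 code points, because ranges auto-decode
}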

Regardless, Andrei has been pretty adamant about _not_ doing D3 any time soon, and AFAIK, Walter is in agreement on that. They want D2 to actually grow and become successful, not fork the community between D2 and D3. Yes, D would probably survive it, but it would have a negative impact on D in the short term, and it's not clear that it would even buy us a lot - especially since a lot of the stuff that folks like to suggest for D3 is fairly controversial. Not everything is, but there would almost certainly need to be a pretty significant list of things that we clearly wanted to change with D and couldn't do without bumping the version to D3 for D3 to even be considered, and I really don't see that happening any time soon.

For the most part, I think that proposals of real value that don't break everything stand a decent chance of being accepted as DIPs, and most improvements don't require massive breakage. Some, like making @safe the default, would, and those aren't going to happen in D2, but that sort of thing certainly isn't enough to merit forking the language - not on its own, anyway. And I'm quite sure that even if we all agreed that changing the defaults for attributes was worth it, there would be quite a lot of arguing about what the defaults should be. @safe would almost certainly win, but stuff like pure would be far more debatable, and some folks love to bring up the idea of making variables immutable by default, which doesn't play nicely at all with many D idioms, so I doubt that that sort of change would be accepted even if we definitely were doing D3 - but some folks talk like it's a given that that sort of thing should be in D3. Just discussing what would potentially go into D3 would open up a huge Pandora's box of what should and shouldn't be changed, and I don't expect that it would easily result in much in the way of consensus.
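
To illustrate that friction: a lot of idiomatic D builds its results by mutating locals, so under immutable-by-default even a tiny loop like this sketch would need explicit mutability annotations everywhere (the annotation itself is hypothetical, since no concrete proposal exists):

void main()
{
    int[] squares;        // would need an explicit "mutable" marker
    foreach (i; 0 .. 10)  // as would the loop variable itself
        squares ~= i * i; // appending mutates the local
    assert(squares[3] == 9);
}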

In any case, I expect that anyone who wants D3 is going to have a very hard time convincing Walter and Andrei that such large breaking changes would be worth it at this point.

- Jonathan M Davis


March 02, 2018
On Friday, 2 March 2018 at 10:21:05 UTC, Russel Winder wrote:
> […]
>
> There are those who use C because the only other option is assembly language, so they make the right decision. This is an indicator that high-level language toolchain manufacturers have failed to port to their platform. I'll wager there are still a lot of 8051s out there. I'll also wager the C++ compilers for that target do not realise C++, but only a subset that is worse than using C, even after 14 years of improvement.
>
> It is going to be interesting to see what happens when Rust gains the toolchains to deal with microcontrollers. Hopefully, though, ARM cores dominate now, especially given that their silicon area is reputedly smaller than the 8051's. I've been out of the smartcard arena for over a decade now, and yet I bet it is all still very much the same.
>

There are safer alternatives (Pascal and Basic), but they suffer from the stigma that has pushed them out of this market: they aren't offered in the chip vendors' SDKs, so they require an additional purchase, which only a few bother with.

http://turbo51.com/

https://www.mikroe.com/compilers


March 02, 2018
On Friday, 2 March 2018 at 10:21:05 UTC, Russel Winder wrote:
>
> ...continue with C in the face of overwhelming evidence
> it is the wrong thing to do.

yeah, the health fanatics who promote their crap to governments and insurance agencies use very similar arguments about sugar, salt, alcohol, this and that...

when really, it's all about moderation, not prohibition (or increased taxes on things people say are bad).

and science is so dodgy these days that even scientific evidence requires evidence.

c rules!

March 02, 2018
On Friday, 2 March 2018 at 11:00:09 UTC, Jonathan M Davis wrote:
> In any case, I expect that anyone who wants D3 is going to have a very hard time convincing Walter and Andrei that such large breaking changes would be worth it at this point.
>
> - Jonathan M Davis

I agree. I don't think there is enough to warrant a D3 at this point.

But still, imagine if every time an architect built a house, it had to be built using the same specs as the previous house. You'd end up with garbage piled upon garbage. In essence, you'd get C++.

So exploring ideas around what a new design might look like can be useful too; let's not discourage that by talking about 'forking' concerns.



March 02, 2018
On Fri, 2018-03-02 at 11:16 +0000, psychoticRabbit via Digitalmars-d-announce wrote:
> On Friday, 2 March 2018 at 10:21:05 UTC, Russel Winder wrote:
> > 
> > ...continue with C in the face of overwhelming evidence
> > it is the wrong thing to do.
> 
> yeah, the health fanatics who promote their crap to governments and insurance agencies use very similar arguments about sugar, salt, alcohol, this and that...
> 
> when really, it's all about moderation, not prohibition (or increased taxes on things people say are bad).

You stick with your buffer overruns; I'll do my applications in D and Rust.

> and science is so dodgy these days that even scientific evidence requires evidence.

Bollocks. Just because a certain section of USA society, and sadly some sections of UK society, either can't do science or chooses to report science badly does not make science dodgy. But that strays off topic for this list into the realms of philosophy of science.

> c rules!

If you want buffer overruns certainly.

-- 
Russel.
==========================================
Dr Russel Winder      t: +44 20 7585 2200
41 Buckmaster Road    m: +44 7770 465 077
London SW11 1EN, UK   w: www.russel.org.uk


March 02, 2018
On Fri, 2018-03-02 at 04:00 -0700, Jonathan M Davis via Digitalmars-d-announce wrote:
> […]
> 
> In any case, I expect that anyone who wants D3 is going to have a
> very hard
> time convincing Walter and Andrei that such large breaking changes
> would be
> worth it at this point.

I am happy to accept that now is not the time, but to say there will be no D3 is probably as bad a position as saying "D3 tomorrow please, and D4 the next day".

Of course, the Linux numbering 3 → 4 was fatuous: no architectural or serious breaking change, just a thought that the minor number was getting too big.

So having D 2.999 is fine per se, but it advertises a lack of change and a lack of ambition, since the language name is D, not D2. Fortran, C++, and Java show an obsessive adherence to backward compatibility, and yet they increase their major numbers to give at least the appearance of forward progress.

There is a balance to be had, but I believe keeping D3 as a formal agenda item is a positive thing for the traction of D. Perhaps, of course, we should be talking about D 2.x and D 3.0 and remove D1, D2, etc. from the debate.

-- 
Russel.
==========================================
Dr Russel Winder      t: +44 20 7585 2200
41 Buckmaster Road    m: +44 7770 465 077
London SW11 1EN, UK   w: www.russel.org.uk