March 29, 2020
On Sunday, 29 March 2020 at 00:58:12 UTC, H. S. Teoh wrote:
> On Saturday, 28 March 2020 at 21:38:00 UTC, Denis Feklushkin wrote:
> [...]
>> Just make a survey among your friends/colleagues: what is a byte? Then compare with the Wikipedia/dictionary/RFC/etc. definition. You will be very surprised.
> [...]
>
> The Wikipedia article clearly states that definitions of "byte" other than 8 bits are *historical*, and that practically all modern hardware has standardized on the 8-bit byte.  I don't understand why this is even in dispute in the first place.

That's just because there are so few kinds of general-purpose processors for now.

It's like the idea of making variables thread-local by default. It does not make sense for now, but in the near future it could be an advantage.

And, for example, most newfangled neuroprocessors use small float formats to represent synapses.

Isn't it better to stick with the right names initially?

> Frankly, it smells like just a red herring.

I am surprised that this small proposal caused such a response.
March 29, 2020
On Sunday, 29 March 2020 at 00:19:57 UTC, NaN wrote:
> On Saturday, 28 March 2020 at 21:38:00 UTC, Denis Feklushkin wrote:
>> On Saturday, 28 March 2020 at 19:50:44 UTC, NaN wrote:
>>
>>>
>>> Don't design based on imaginings of the future; you will almost always get it wrong.
>>
>> This is almost already reality, not future.
>
> I was responding to your statement regarding FPGAs. If they become ubiquitous, and if people want to use D to program them, and if someone does the work to make it happen, then maybe different width basic types *might* be needed.

I do not suggest adding or resizing types. I suggest naming them more correctly, to remove a cause of confusion for people without a beard :-)

March 29, 2020
On Friday, 27 March 2020 at 15:56:40 UTC, Steven Schveighoffer wrote:
> Having a new branch of the compiler will provide a way to keep D2 development alive while giving a playground to add new mechanisms, fix long-existing design issues, and provide an opt-in for code breakage.

I think this is doable with some preparation.

1. LTS branches. Something like every 5 years, maintained for 10 years.

Our current system for point-releases is kind of pointless because we NEVER issue a point-release for a non-current major version.

The most important aspect of updating LTS releases is platform support. It seems that macOS versions breaking DMD or DMD-produced binaries is a regular occurrence, with Linux/FreeBSD not far behind.

It would be a good time to reconsider SemVer as well.

2. Integrating D version management into build tools. E.g. dub.sdl would allow declaring which language version a program was written for, and Dub could then download and use that particular compiler version (see the first sketch after this list). (IMHO Digger's library component is in a good place for this, with the bonus of being able to select non-release versions such as master commits, PRs, or forks.)

3. All language changes should be done in such a way that libraries could still be written, with a reasonable amount of effort, to support compilers before and after the change (see the second sketch below). This greatly helped Python's 2/3 transition.
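
A rough sketch of what the declaration in point 2 could look like in dub.sdl. The toolchainRequirements directive already exists in Dub (if I recall its syntax correctly); the languageVersion directive is hypothetical:

name "myapp"
// Existing Dub directive: constrain the compiler front-end allowed to build
// this package.
toolchainRequirements frontend=">=2.090"
// Hypothetical directive for the idea above: declare the language version the
// code targets, so the build tool could fetch a matching compiler by itself.
languageVersion "2.090"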
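
And for point 3, a minimal sketch of how a library can straddle a language change with one code base, by branching on the compiler front-end version (the version number and the aliases are made up for illustration):

// Compile-time branch on the front-end version, so the same sources build
// with compilers released before and after a breaking change.
static if (__VERSION__ >= 2100)
{
    alias LibString = const(char)[]; // hypothetical post-change behaviour
}
else
{
    alias LibString = string;        // what older compilers expect
}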

March 29, 2020
On Sunday, 29 March 2020 at 02:09:37 UTC, krzaq wrote:
> On Sunday, 29 March 2020 at 01:21:25 UTC, NaN wrote:
>> Firstly either way you have to remember something, u16 or short. So there's memorization whatever way you slice it.
>
> But you don't have to remember anything other than what you want to use. When you want a 16 bit unsigned integer you don't have to mentally lookup the type you want, because you already spelled it. And if you see a function accepting a long you don't have to think "Is this C? If so, is this Windows or not(==is this LP64)? Or maybe it's D? But what was the size of a long in D? oh, 64"

If you're sitting there thinking "wait, what language am I using?" you have bigger problems. I've used maybe 10 different languages over 30 years, and it's never been a problem for me to remember what language I'm using or what the basic types are.


>> Secondly those processors from the last millennium are still the dominant processors of this millennium.
>
> Are they really? I have more ARMs around me than I do x86's. Anyway, they're compatible, but not the same. "double precision" doesn't really mean much outside of hardcore number crunching, and short is (almost?) never used as an optimization on integer, but a limitation of its domain. And, at least for C and C++, any style guide will tell you to use a type with a meaningful name instead.

ARMs were outselling x86 by the end of the '90s; nobody took any notice until the smartphone boom (in units shipped, at least).


>>> Programming languages should aim to lower the cognitive load of their programmers, not the opposite.
>>
>> I agree, but this is so irrelevant it's laughable.
>>
>
> It is very relevant. Expecting the programmer to remember that some words mean completely different things than anywhere else is not good, and the more of those differences you have, the more difficult it is to use the language. And it's not just the type names: learning that you have to use enum instead of immutable or const for true constants was just as mind-boggling to me as learning that inline means anything but inline in C++.

I'm 100% with you on the enum thing. I don't struggle to remember it, but it's awful. It's the language equivalent of a "leaky abstraction": from an implementation point of view, enum members and manifest constants are pretty much the same thing, so why not use the same keyword? It's like saying that a single int is actually just an array with one member, so from now on you have to declare ints as arrays:

int[1] oh_really;
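
For anyone who hasn't run into it yet, a minimal sketch of the behaviour in question (names made up):

enum bufSize = 1024;      // manifest constant: no storage, the literal is
                          // substituted wherever bufSize is used
immutable bufLen = 1024;  // a real symbol with an address; &bufLen works
static assert(bufSize == bufLen);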

My other pet hate is nothrow. It actually means no Exceptions, not that it won't throw; it can still throw Errors.

Oh yeah and

assert(0)

I hate that too.
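
To spell the surprise out, a minimal sketch (the function name is made up):

// nothrow only promises that no Exception escapes; Errors are not covered,
// so assert(0), which throws an AssertError, compiles here just fine.
void boom() nothrow
{
    assert(0, "still 'throws', just an Error rather than an Exception");
}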


>>> To paraphrase your argument:
>>> A mile is 1760 yards
>>> A yard is 3 feet
>>> A foot is 12 inches
>>> What's so hard to understand? If that is causing you problems then you probably need to reconsider your career path.
>>
>> If your job requires you to work in inches, feet and yards every single day then yes you should know that off the top of your head and you shouldn't even have to try.
>>
>> And if you find it difficult then yes you should reconsider your career path. If you struggle with basic arithmetic then you shouldn't really be looking at a career in engineering.
>
> That's circular reasoning. The whole argument is that your day job shouldn't require rote memorization of silly incantations. As for "basic arithmetic" - there is a reason why the whole world, bar one country, moved to a sane unit system.

The reason was that the actual math was easier, not that it was hard to remember what a foot was. That doesn't apply here; we're just talking about names, not about whether the system makes actually working with the units easier.

March 29, 2020
On Sunday, 29 March 2020 at 04:04:21 UTC, Denis Feklushkin wrote:
> On Sunday, 29 March 2020 at 00:19:57 UTC, NaN wrote:
>> On Saturday, 28 March 2020 at 21:38:00 UTC, Denis Feklushkin wrote:
>>> On Saturday, 28 March 2020 at 19:50:44 UTC, NaN wrote:
>>>
>>>>
>>>> Don't design based on imaginings of the future; you will almost always get it wrong.
>>>
>>> This is almost already reality, not future.
>>
>> I was responding to your statement regarding FPGAs. If they become ubiquitous, and if people want to use D to program them, and if someone does the work to make it happen, then maybe different width basic types *might* be needed.
>
> I do not suggest adding or resizing types. I suggest naming them more correctly, to remove a cause of confusion for people without a beard :-)

I think you said something along the lines of... the FPGAs are coming, they might have 6-bit bytes, so we should be prepared and start using a naming system that can accommodate that.

I'm saying that you shouldn't base decisions on predictions like that, because they are almost always wrong.

March 29, 2020
On Sat, 2020-03-28 at 01:42 -0700, Walter Bright via Digitalmars-d wrote:
> On 3/27/2020 10:22 AM, Russel Winder wrote:
> > Clearly D remaining at v2 for ever more would, I feel, be a Very Bad Idea™ since it advertises no changes to the language, i.e. a language with a stalled evolution.
> 
> If this happens it still seems like a marketing failure. After all, C++ gets a year appended and yet has large changes.

It is always a delicate balance between keeping a language vibrant and alive in the minds of those *not* already committed to it, and having it seem a niche language, dead to the mainstream.

Switching D to a proper semantic versioning system would, in my view, help keep D in the former category and out of the latter one. If we get to D 2.999, I would suggest D has moved into the latter category.

Yes I know Torvalds did the major version hack on Linux simply to avoid seeming stuck and stalled, not for any actual technical reasons, but it worked.

-- 
Russel.
===========================================
Dr Russel Winder      t: +44 20 7585 2200
41 Buckmaster Road    m: +44 7770 465 077
London SW11 1EN, UK   w: www.russel.org.uk



March 29, 2020
On Sat, 2020-03-28 at 11:01 +0000, Paulo Pinto via Digitalmars-d wrote:
> […]
> 
> Groovy isn't really a good example.

I see no reason why it isn't; it is an evolving language following the semantic versioning model.

> If it wasn't for Gradle and its use in Android, it would be long gone and forgotten.

In your opinion. The evidence I see is that Groovy has more traction at Java sites than is immediately apparent. Clearly Kotlin is challenging the role of Groovy in many respects, but Groovy is still used by many organisations for dynamic programming. The analogy is with C++ codebases that use Python or Lua.

> And even there, there is big pressure to replace it with Kotlin, as far as the Android build infrastructure is concerned.

Kotlin rather than Groovy is certainly the language of choice on the Android platform these days, but there are a lot of JVM installations out there using Java, Kotlin, and Groovy – not to mention Scala, Clojure, etc. – all getting along happily. Yes, a lot of those installations will only use Java.

> Such is the fate of any guest language until the main platform language catches up.

Java can never catch up with Groovy, whereas it can catch up with Kotlin. Kotlin, not Groovy, is the guest language you are talking of for most Java installations. Static Groovy may be a dead thing, but Dynamic Groovy is far from dead.

-- 
Russel.
===========================================
Dr Russel Winder      t: +44 20 7585 2200
41 Buckmaster Road    m: +44 7770 465 077
London SW11 1EN, UK   w: www.russel.org.uk



March 29, 2020
On Sunday, 29 March 2020 at 09:47:15 UTC, Russel Winder wrote:
> On Sat, 2020-03-28 at 11:01 +0000, Paulo Pinto via Digitalmars-d wrote:
>> […]
>> 
>> Groovy isn't really a good example.
>
> I see no reason why it isn't; it is an evolving language following the semantic versioning model.
>
>> If it wasn't for Gradle and its use in Android, it would be long gone and forgotten.
>
> In your opinion. The evidence I see is that Groovy has more traction at Java sites than is immediately apparent. Clearly Kotlin is challenging the role of Groovy in many respects, but Groovy is still used by many organisations for dynamic programming. The analogy is with C++ codebases that use Python or Lua.
>
>> And even there, there is big pressure to replace it with Kotlin, as far as the Android build infrastructure is concerned.
>
> Kotlin rather than Groovy is certainly the language of choice on the Android platform these days, but there are a lot of JVM installations out there using Java, Kotlin, and Groovy – not to mention Scala, Clojure, etc. – all getting along happily. Yes, a lot of those installations will only use Java.
>
>> Such is the fate of any guest language until the main platform language catches up.
>
> Java can never catch up with Groovy, whereas it can catch up with Kotlin. Kotlin, not Groovy, is the guest language you are talking of for most Java installations. Static Groovy may be a dead thing, but Dynamic Groovy is far from dead.

The times when Groovy made any headlines at German Java conferences or local JUGs are long gone; I wonder where Groovy is being used with more than a single-digit market share on the Java platform.

I was quite surprised that Groovy actually managed to release version 3.0.

It is not just my opinion, but rather what any Java market analysis report will easily confirm.


March 29, 2020
On 3/28/20 1:09 PM, Denis Feklushkin wrote:
> On Friday, 27 March 2020 at 15:56:40 UTC, Steven Schveighoffer wrote:
>> There have been a lot of this pattern happening:
>>
>> 1. We need to add feature X, to fix problem Y.
>> 2. This will break ALL CODE IN EXISTENCE
>> 3. OK, cancel the fix, we'll just live with it.
>>
>> Having a new branch of the compiler will provide a way to keep D2 development alive while giving a playground to add new mechanisms, fix long-existing design issues, and provide an opt-in for code breakage.
>>
>> Some issues I can think of:
> 
> I have long wanted to propose this, but there was no suitable place. I would like to propose trivially renaming the standard type names this way:
> 
> int -> int32
> ulong -> uint64
> float -> float32
> double -> float64
> byte -> octet

I would say no, for 2 reasons. One, this is basically renaming without benefit. All those types are well defined, and there is no problem with sizing. Two, you can already do this with aliases if this is what you wish for.

> Reason:
> 
> Most developers no longer remember where these names came from and why they are so called. In the future this number will be close to 100%. And soon we will have access to all sorts of non-standard FPGA-implemented CPUs with a different byte size, for example.

Again, alias can already solve this problem. I would recommend for those implementations, if D were to support them, that byte not be changed to the native byte size, but rather that a new type be introduced to cover it.
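
For instance, a small module along these lines (module and alias names chosen to mirror the proposal) gives the spelled-out names today without touching the language:

module fixedwidth;       // hypothetical module name

alias int32   = int;
alias uint64  = ulong;
alias float32 = float;
alias float64 = double;
alias octet   = byte;    // note: D's byte is signed; ubyte may be the better match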

I think D 3.0 doesn't mean "let's break everything"; it should be an incremental release, but one that is *allowed* to have fixes we have been wishing for that break things we cannot break in 2.x.

-Steve
March 29, 2020
On Friday, 27 March 2020 at 15:56:40 UTC, Steven Schveighoffer wrote:
> There have been a lot of this pattern happening:
>
> 1. We need to add feature X, to fix problem Y.
> 2. This will break ALL CODE IN EXISTENCE
> 3. OK, cancel the fix, we'll just live with it.
>
> Having a new branch of the compiler will provide a way to keep D2 development alive while giving a playground to add new mechanisms, fix long-existing design issues, and provide an opt-in for code breakage.
>
> Some issues I can think of:
>
> 1. The safe by default debate
> 2. pure by default
> 3. nothrow by default
> 4. String interpolation DIP
> 5. auto-decoding
> 6. range.save
> 7. virtual by default
> 8. ProtoObject
>
> Other languages evolve much quicker than D, but break things only in major updates. D seems to "sort of" break things, there's always a risk in every release. We try to be conservative, but we have this horrible mix of deciding some features can break things, while others are not allowed to, and there's no clear guide as to which breakage fits in which category.
>
> If we went to a more regular major release schedule, and decided for a roadmap for each major release what features would be included, it would allow much better planning, and much more defensible breakage of code. If you know that your code will only compile with D2.x, and you're fine with that, then great, don't upgrade to D3.x. If you desperately want a feature, you may have to upgrade to D3.x, but once you get there, you know your code is going to build for a while.
>
> We could also not plan for many major releases, but at least move to D3 for some major TLC to the language that is held back to prevent breakage.
>
> I work occasionally with Swift, and they move very fast, and break a lot of stuff, but only in major versions. It's a bit fast for my taste, but it seems to work for them. And they get to fix issues that languages like C++ might have been stuck with forever.
>
> The biggest drawback is that we aren't a huge language, with lots of manpower to keep x branches going at once.
>
> I just wanted to throw it out as a discussion point. We spend an awful lot of newsgroup server bytes debating things that to me seem obvious, but have legitimate downsides for not breaking them in a "stable" language.
>
> -Steve

What about Phobos and druntime, then?
If we switch to D3, it makes sense to shrink druntime to a smaller one that doesn't bring all the weight of unused features.

I don't know the Rust model, and I only have experience with Python libraries working with both py2 and py3.
What about keeping D2 and D3 code interoperable with each other? That would mean offering the possibility to use a D2 library in a D3 codebase and a D3 library in a D2 codebase, in the same way we do with C right now.
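
If "the same way we do with C" is taken literally, a rough sketch of what consuming a D2-built library could look like; the function name and signature are invented:

// Declare the D2 library's C-compatible entry point, exactly as one would for
// a C library, then link against the D2-built binary.
extern(C) nothrow @nogc int d2lib_process(const(ubyte)* data, size_t len);

void main()
{
    ubyte[4] buf = [1, 2, 3, 4];
    auto rc = d2lib_process(buf.ptr, buf.length);
}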