April 28, 2022

On Thursday, 28 April 2022 at 09:12:07 UTC, Ola Fosheim Grøstad wrote:

> So it is all about the ecosystem and no advantages tied to the language itself (except avoiding memory pointers and such)?

I can't speak for anyone else, but I started using D about a decade ago and back then I always wished other languages had features D had, but now it's the opposite. I'm always missing features in D.

Here are some of the things I like better about C# than D and some things C# has that D doesn't.

  • async/await
  • Properties that actually work, with a really good syntax IMHO (getters/setters)
  • string interpolation
  • Shortened methods (and properties) - much better than the version proposed for D
  • Built-in nullability, e.g. object?.method(), as well as null-coalescing
  • pattern matching / switch expressions
  • out parameters that can be declared directly in the method call
  • Built-in tuples, as well as auto-expanding them into variables etc.
April 28, 2022

On Thursday, 28 April 2022 at 12:04:11 UTC, bauss wrote:

> Here are some of the things I like better about C# than D and some things C# has that D doesn't.
> ...

+100 to all of those things.

Regrettably this community doesn't see much value in syntax sugar, and is fine with bulkier library solutions instead.

It's not even about the number of keystrokes, as many here claim; it's about readability. All these little things add up as the code grows, and they make C# a joy for me to read and write.

April 28, 2022

On Thursday, 28 April 2022 at 12:33:48 UTC, SealabJaster wrote:

> Regrettably this community doesn't see much value in syntax sugar, and is fine with bulkier library solutions instead.

It is possible to have syntactic sugar for library solutions if you have clearly defined protocols, like D and C++ do with for-loops over containers.

In many ways that's the best approach, but the question is how to define the right protocol mechanism.

April 28, 2022

On Thursday, 28 April 2022 at 07:54:44 UTC, Ola Fosheim Grøstad wrote:

> On Wednesday, 27 April 2022 at 22:43:25 UTC, Adrian Matoga wrote:
>
>> [...] of like it) at work. I've recently returned to tinkering with electronics and programming at home so let me share my view.
>
> Do you use or plan to use microcontrollers? If so, with what language?

I do, mostly STM32s. Based on what Adam and Mike had shared, it wasn't hard to get started, but the lack of a readily usable HAL or RTOS was a heavy distraction from the actual work, pulling me toward constructing some basic framework instead. Still, all the compile-time stuff D has is very useful in that environment, as is e.g. the scope(exit)/scope(failure) feature, which makes resource cleanup much less confusing without the need to write any wrappers.

Currently I'm working on an RPi Hat where I had to add some drivers to Linux, where anything other than C won't work, but I have the userspace app on top of it, which is all written in D and it's way more convenient for me to develop it than e.g. in C++, even though I had to manually add a lot of bindings. Plus, I have very pleasant experience with GDC on RPi since the very moment I got my first Pi around 2013. LDC works very well too, but GDC is easier to get OOTB both in native Raspbian and in buildroot. Iain's work is a true game changer.

>> [...] technology or even non-technology related idea too. Python became the default language for ML, because it was easy enough for people whose main focus wasn't programming, and who didn't require system-level performance because available bindings to C libraries were good enough.
>
> Yes, but I think this also has something to do with Python replacing Matlab in academic research institutions. Python is becoming the default platform for analysis and experimentation.

Right! I studied CS at a physics department, and many teachers were either nuclear or solid-state physicists, so we did a lot of Matlab, and Python was only about to enter the field. ROOT was also used in some projects, but I could never wrap my head around its weirdness.

>> What D tried to do was to be a "better C++" or "better C", but in 2022 it's about 40 years too late to be successful in that. There are millions of programs in C and C++ that have been good enough to make revenue for many companies and thus convinced others to invest more money, effort and time in more stuff that depends on C and C++.
>
> Yes, and they are ISO standards, so nobody "owns" C or C++. That creates a more "open" evolution that is industry-oriented (the main purpose of ISO is to make industrial tools and interfaces interoperable).

Yup, standardization may play a big role in adoption too. We've worked with customers who strongly insisted on sticking to OpenCL C with no extensions, rather than switching to CUDA or any vendor-specific extensions to OpenCL, both to have clean hands in terms of safety and to avoid vendor lock-in, even though that meant worse performance and fatter hardware bills.

>> [...] do something beyond those. I recall I had some good experience with C# in terms of how quickly I was able to reuse existing libraries and implement any new code, especially with pretty convenient tooling from MS, but that was a long time ago, when it wasn't seriously usable outside Windows, and I didn't have much interest in developing for Windows later.
>
> What made C# easy to use? Was it auto-completions and suggestions in the IDE, or was it something that has to do with the language itself?

The language itself, in the times of D1, was very similar to D in look, feel and expressiveness, and the two felt equally convenient to me. Coming from a {} mindset, both were easy to pick up. However, the VS support for C#, both in auto-completion and interactive debugging, was mind-blowing; it made coding pretty effortless. Nowadays a properly configured IDE for C, C++ or D can be just as good in terms of completions.

As for debugging, I've been working for some time in projects where languages from Python to assembly are mixed with JSON and YAML files that control various stages of code generation, with compilers and even HW specs still under development, so I've learned not to expect anything more than print-, memory-dump-, logic-analyzer- or even guess-based debugging.

>> What I've missed the most so far in D was a zero-effort reuse of C libraries, because there's a lot of useful libs in C I already know.
>
> Yes, has the new ImportC feature been helpful for you in that regard?

Not yet, as it's not in GDC yet. I expect it to be a huge win for system-level headers, like all the ioctls, V4L2, alsa etc. I'd really feel safer ABI-wise if they were parsed right from the same sysroot as the rest of the system.

>> Of course it's much less tedious to interface C in D than in Python, but I bet if D had a fully-blown ImportC from the very beginning, it could be where C++ is today.
>
> When compared to C++, I'd say that D still needs to get its memory management story right and fix some language shortcomings (incomplete features), but memory management is at least being looked at actively now. (People expect something more than malloc/free and roll-your-own ref-counting.)

Right, C++ has been developing faster for the last decade and has got much more support from industry, and there are a lot of clever freaks working on it. Still, many things that should be easy and straightforward are made overly complex, for genericity or low-level control or whatever reasons. In that regard I prefer D, where I can prototype fast and gradually add lower-level optimizations as needed. The latter is also where I find Python very cumbersome compared to D.

April 28, 2022

On Thursday, 28 April 2022 at 15:28:40 UTC, Adrian Matoga wrote:

> On Thursday, 28 April 2022 at 07:54:44 UTC, Ola Fosheim Grøstad wrote:
>
>> On Wednesday, 27 April 2022 at 22:43:25 UTC, Adrian Matoga wrote:
>>
>>> What I've missed the most so far in D was a zero-effort reuse of C libraries, because there's a lot of useful libs in C I already know.
>>
>> Yes, has the new ImportC feature been helpful for you in that regard?
>
> Not yet, as it's not in GDC yet. I expect it to be a huge win for system-level headers, like all the ioctls, V4L2, alsa etc. I'd really feel safer ABI-wise if they were parsed right from the same sysroot as the rest of the system.

GCC-12 has been branched and is currently in prerelease phase, with an RC to be made on Monday if all is going to schedule.

GDC in there is currently in sync with the current DMD HEAD on the stable branch (tag: v2.100.0-beta.1, dmd 313d28b3d, druntime e361d200, phobos ac296f80c).

If you can build it, feel free to send bug reports. :-)

April 28, 2022

On Thursday, 28 April 2022 at 19:02:39 UTC, Iain Buclaw wrote:

> On Thursday, 28 April 2022 at 15:28:40 UTC, Adrian Matoga wrote:
>
>> On Thursday, 28 April 2022 at 07:54:44 UTC, Ola Fosheim Grøstad wrote:
>>
>>> [...]
>>
>> Not yet, as it's not in GDC yet. I expect it to be a huge win for system-level headers, like all the ioctls, V4L2, alsa etc. I'd really feel safer ABI-wise if they were parsed right from the same sysroot as the rest of the system.
>
> GCC-12 has been branched and is currently in prerelease phase, with an RC to be made on Monday if all is going to schedule.
>
> GDC in there is currently in sync with the current DMD HEAD on the stable branch (tag: v2.100.0-beta.1, dmd 313d28b3d, druntime e361d200, phobos ac296f80c).
>
> If you can build it, feel free to send bug reports. :-)

wow wow

April 28, 2022
On 11/2/2021 11:48 AM, Paulo Pinto wrote:
> Then on the PC and Mac it quickly got the love from Apple (replacing Object Pascal with C++), IBM, Microsoft, Borland, Watcom,  SGI, Sun, HP, among others, and naturally Digital Mars/Symantec as well.

Um, Zortech C++ was the first native C++ compiler on DOS, in 1987. (The existing ones were all cfront-based, and were terribly slow.) ZTC++ produced the first boom in use of C++, accounting for perhaps 90% of C++ use.

This popularity led to Borland dumping their own OOP C and going with C++, which then led to Microsoft getting on the bandwagon.

This popularity then fed back into the Unix systems.

No, you won't find this account in the D&E of C++ histories, but it's what actually happened.
April 29, 2022

On Friday, 29 April 2022 at 01:33:36 UTC, Walter Bright wrote:

> On 11/2/2021 11:48 AM, Paulo Pinto wrote:
>
>> Then on the PC and Mac it quickly got the love from Apple (replacing Object Pascal with C++), IBM, Microsoft, Borland, Watcom, SGI, Sun, HP, among others, and naturally Digital Mars/Symantec as well.
>
> Um, Zortech C++ was the first native C++ compiler on DOS, in 1987. (The existing ones were all cfront-based, and were terribly slow.) ZTC++ produced the first boom in use of C++, accounting for perhaps 90% of C++ use.
>
> This popularity led to Borland dumping their own OOP C and going with C++, which then led to Microsoft getting on the bandwagon.
>
> This popularity then fed back into the Unix systems.
>
> No, you won't find this account in the D&E of C++ histories, but it's what actually happened.

What is D&E? Internet suggests "Design & Engineering"
Is that it?

April 28, 2022
On 4/28/2022 7:56 PM, Tejas wrote:
> What is D&E? Internet suggests "Design & Engineering"
> Is that it?

Design and Evolution
April 29, 2022
On Friday, 29 April 2022 at 01:33:36 UTC, Walter Bright wrote:
> On 11/2/2021 11:48 AM, Paulo Pinto wrote:
>> Then on the PC and Mac it quickly got the love from Apple (replacing Object Pascal with C++), IBM, Microsoft, Borland, Watcom,  SGI, Sun, HP, among others, and naturally Digital Mars/Symantec as well.
>
> Um, Zortech C++ was the first native C++ compiler on DOS in 1987. (The existing ones were all cfront based, and were terribly slow.)

From D&E:

"The size of this overhead depends critically on the time needed to read and write the intermediate C representation and that primarily depends on the disc read/write strategy of a system. Over the years I have measured this overhead on various systems and found it to be between 25% and 100% of the "necessary" parts of a compilation. I have also seen C++ compilers that didn't use intermediate C yet were slower than Cfront plus a C compiler."

That's not "terribly slow". And before you bring up "templates are slow to compile", in 1987 cfront did not have templates.

"The earliest implementation of templates that was integrated into a compiler was a version of Cfront that supported class templates (only) written by Sam Haradhvala at Object Design Inc. in 1989."


> ZTC++ produced the first boom in use of C++, accounting for perhaps 90% of C++ use.
>
> This popularity lead to Borland dumping their own OOP C and going with C++, which then led to Microsoft getting on the bandwagon.
>
> This popularity then fed back into the Unix systems.
>
> No, you won't find this account in the D&E of C++ histories, but it's what actually happened.

Well, that's the history as you remember it, and Stroustrup does list "1st Zortech C++ release" in June 1988. I cannot say whether your "90%" figure is correct or not.