On Thursday, 28 April 2022 at 07:54:44 UTC, Ola Fosheim Grøstad wrote:
> On Wednesday, 27 April 2022 at 22:43:25 UTC, Adrian Matoga wrote:
> > of like it) at work. I've recently returned to tinkering with electronics and programming at home so let me share my view.
> Do you use or plan to use microcontrollers? If so, with what language?
I do, mostly STM32s. Based on what Adam and Mike had shared, it wasn't hard to get started, but the lack of a readily usable HAL or RTOS kept pulling me away from the actual work towards building some basic framework instead. Still, all the compile-time machinery D has is very useful in that environment, as is e.g. the scope(exit)/scope(failure) feature, which makes resource cleanup much less confusing without the need to write any wrappers.
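To give a feel for it, here's a minimal sketch of that cleanup idiom (not my actual firmware code; it just uses the C stdio bindings from druntime for illustration):

```d
import core.stdc.stdio : FILE, fclose, fopen, fputs;

void appendLog(const(char)* path, const(char)* msg)
{
    FILE* f = fopen(path, "a");
    if (f is null)
        return;
    scope (exit) fclose(f);                // runs on every exit path, even a throw
    scope (failure) fputs("aborted: ", f); // runs only when we leave via a throw
    fputs(msg, f);                         // no wrapper struct, no goto-cleanup chains
}
```

The cleanup sits right next to the acquisition, so it's hard to forget and easy to review.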
Currently I'm working on an RPi HAT for which I had to add some drivers to Linux, where anything other than C won't work, but the userspace app on top of them is written entirely in D, and it's way more convenient for me to develop than it would be in e.g. C++, even though I had to add a lot of bindings manually. Plus, I've had a very pleasant experience with GDC on RPi since the very moment I got my first Pi around 2013. LDC works very well too, but GDC is easier to get OOTB, both in native Raspbian and in buildroot. Iain's work is a true game changer.
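For the curious, "manually adding bindings" mostly means restating the C API on the D side. A representative sketch (the libfoo names are invented; the real ones come straight from the C header):

```d
// Hand-written binding: mirror the C header's prototypes in D.
extern (C) nothrow @nogc
{
    struct foo_ctx;                 // opaque handle, just like in the C header
    foo_ctx* foo_open(const(char)* dev);
    int  foo_read(foo_ctx* ctx, void* buf, size_t len);
    void foo_close(foo_ctx* ctx);
}

void poll(const(char)* dev)
{
    auto ctx = foo_open(dev);
    if (ctx is null)
        return;
    scope (exit) foo_close(ctx);
    ubyte[64] buf;
    foo_read(ctx, buf.ptr, buf.length);
}
```

It's mechanical work, which is exactly why ImportC is so appealing.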
> > technology or even non-technology related idea too. Python became the default language for ML because it was easy enough for people whose main focus wasn't programming, and who didn't require system-level performance because the available bindings to C libraries were good enough.
> Yes, but I think this also has something to do with Python replacing Matlab in academic research institutions. Python is becoming the default platform for analysis and experimentation.
Right! I studied CS at a physics department and many of the teachers were either nuclear or solid-state physicists, so we did a lot of Matlab, and Python was only about to enter the field. ROOT was also used in some projects, but I could never wrap my head around its weirdness.
> > What D tried to do was to be a "better C++" or a "better C", but in 2022 it's about 40 years too late to be successful in that. There are millions of programs in C and C++ that have been good enough to make revenue for many companies and thus convinced others to invest more money, effort and time in more stuff that depends on C and C++.
> Yes, and they are ISO standards, so nobody "owns" C or C++. That creates a more "open" evolution that is industry-oriented (the main purpose of ISO is to make industrial tools and interfaces interoperable).
Yup, standardization may play a big role in adoption too. We've worked with customers who strongly insisted on sticking to OpenCL C with no extensions, rather than switching to CUDA or any vendor-specific extensions to OCL, both to keep their hands clean safety-wise and to avoid vendor lock-in, even though that meant worse performance and fatter hardware bills.
> > do something beyond those. I recall I had some good experience with C# in terms of how quickly I was able to reuse existing libraries and implement any new code, especially with the pretty convenient tooling from MS, but that was a long time ago, when it wasn't seriously usable outside Windows, and I didn't have much interest in developing for Windows later.
> What made C# easy to use? Was it auto-completions and suggestions in the IDE, or was it something that has to do with the language itself?
The language itself, back in the D1 days, was very similar to D in look, feel and expressiveness, and the two felt equally convenient to me. Coming from a curly-brace mindset, both were easy to pick up. However, the VS support for C#, both in auto-completion and in interactive debugging, was mind-blowing; it made coding pretty effortless. Nowadays a properly configured IDE for C, C++ or D can be just as good in terms of completions.
As for debugging, I've been working for some time on projects where languages from Python to assembly are mixed with JSON and YAML files that control various stages of code generation, with compilers and even HW specs still under development, so I've learned not to expect anything more than print-, memory-dump-, logic-analyzer- or even guess-based debugging.
> > What I've missed the most so far in D is zero-effort reuse of C libraries, because there are a lot of useful C libs I already know.
> Yes, has the new import-C feature been helpful for you in that regard?
Not yet, as it's not in GDC yet. I expect it to be a huge win for system-level headers, like all the ioctls, V4L2, ALSA, etc. I'd really feel safer ABI-wise if they were parsed right from the same sysroot as the rest of the system.
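Roughly the workflow I'm hoping for (a sketch assuming dmd/LDC-style ImportC; the file names are invented, and big system headers can still trip over unsupported macros):

```d
// alsa_c.c -- a one-line C file passed to the compiler next to the D sources:
//     #include <alsa/asoundlib.h>
//
// app.d -- importing the C file exposes declarations parsed from the very
// same sysroot headers the rest of the system was built against:
import alsa_c;

void beep()
{
    snd_pcm_t* pcm;
    if (snd_pcm_open(&pcm, "default", SND_PCM_STREAM_PLAYBACK, 0) < 0)
        return;
    scope (exit) snd_pcm_close(pcm);
    // ... configure the device and write samples
}
```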
> > Of course it's much less tedious to interface C in D than in Python, but I bet if D had had a full-blown ImportC from the very beginning, it could be where C++ is today.
> When compared to C++, I'd say that D still needs to get its memory management story right and fix some language shortcomings (incomplete features), but memory management is at least being looked at actively now. (People expect something more than malloc/free and roll-your-own ref-counting.)
Right, C++ has been developing faster for the last decade and has gotten much more support from industry, and there are a lot of clever freaks working on it. Still, many things that should be easy and straightforward are being made overly complex, for genericity or low-level control or whatever other reasons. In that regard I prefer D, where I can prototype fast and gradually add lower-level optimizations as needed. The latter is also where I find Python very cumbersome compared to D.
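A trivial example of that gradual path (a made-up function, but the progression is typical): start with the quick GC-backed version, then, only where profiling says so, swap in an allocation-free one:

```d
import std.algorithm : map;
import std.array : array;

// First pass: short, GC-backed, quick to write.
int[] doubled(const int[] xs)
{
    return xs.map!(x => x * 2).array;
}

// Later, on a hot path: same job, no allocation.
void doubledInto(const int[] xs, int[] dst) @nogc nothrow
{
    foreach (i, x; xs)
        dst[i] = x * 2;
}
```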