February 16

On Friday, 16 February 2024 at 16:15:30 UTC, Paul Backus wrote:

> I think for DIP1000 to succeed...

For me, dip1000 has already succeeded. It aligns very well with structured concurrency. I use it heavily to move almost every async state to the stack.

> Yes, we need to do as much as we can to reduce the burden of adoption, but even if we do, DIP1000 is always going to be a high-impact breaking change.

Most of the code out there that isn't @safe is effectively @trusted. There is no shame in marking it as such. That generally provides a good transition path.

Yes, it won't get any benefits until code is @safe, but it can be done gradually.

There are some pain points, of course: sometimes it is difficult to convince the compiler that what the code does is safe, or you have to decipher which of the many attributes to use. There is also sometimes a cascading need to sprinkle scope on member functions.
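As a minimal sketch of that cascading `scope` requirement (assuming `-preview=dip1000`; `Buffer` and the exact diagnostics are invented for illustration, and the details vary by compiler version):

```d
@safe:

struct Buffer
{
    int[] data;

    // Without `scope` on this member function, calling it on a
    // stack-referencing `scope` instance below may be rejected under
    // -preview=dip1000 -- the "cascading" annotation burden.
    int first() scope { return data.length ? data[0] : 0; }
}

int use()
{
    int[4] storage = [1, 2, 3, 4];
    scope b = Buffer(storage[]); // slice of stack memory: must not escape
    return b.first();
}
```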

Generally, though, I find that code that has adopted dip1000 has a better API and fewer surprises on the inside. It's subtle, but it's there.

So yeah, it can sometimes be very difficult to get everything @safe. That said, application code mostly doesn't have to care. Let all the tricky dip1000 concepts be constrained to libraries and just write applications using the GC.

February 18

On Friday, 16 February 2024 at 04:38:03 UTC, H. S. Teoh wrote:

> On Fri, Feb 16, 2024 at 01:44:51AM +0000, Paul Backus via Digitalmars-d wrote: [...]
>
> What can we learn from this for D?
>
> First, that the success and popularity of a programming language is mostly determined by factors other than stability and backward compatibility (or lack thereof).

+100. Over the past 5-10 years or so, I've been finding myself wishing that D would introduce some breaking changes so that it could clean up some of its dark, ugly corners that have been like flies in the ointment for a long time.

At the same time, I've had deprecations and breaking changes that were really rather minor, yet extremely frustrating.

> Second, that even without an edition bump, small-scale breaking changes with easy migration paths aren't a big deal.

The worst feeling is when you upgrade your compiler, and suddenly you find yourself having to do major code surgery in order to make previously-fine code work again. Having an easy migration path for breaking changes is very important.

I'd also add that the migration path should be easy: it shouldn't take too much thought to upgrade the code, and should not involve tricky decisions based on subtle semantic differences that require deep understanding of the code to make the right choice.

The std.math.approxEqual deprecation is a major example that I keep running into. It's intended to be replaced by isClose, with all the right intentions. But it was frustrating because (1) it didn't feel necessary -- previous code worked fine even if there were some pathological cases that weren't being handled correctly. (2) The deprecation message didn't give a clear migration path -- isClose has different parameters with subtly different semantics from approxEqual, and it wasn't obvious how you should replace calls to approxEqual with equivalent calls to isClose. There were also no easy defaults that you could use that replicated the previous behaviour; you had to sit down and think about each call, then look up the docs to be sure. (3) The choice of name felt like a serious blunder, even if it was made for all the right reasons.

All of these added up to a very frustrating experience, even if the intentions were right.

If we could do this over again, I'd have proposed keeping the old semantics of approxEqual and perhaps adding another parameter that opts into the new semantics, with a deprecation message that's clear about exactly how to decide what to pass for that parameter. I.e., lazy authors could change nothing and their code would keep working as before; anyone who wanted the new semantics would have to explicitly opt in.
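To make the tolerance mismatch concrete, here is a sketch of the migration, assuming the documented defaults (approxEqual used maxRelDiff = 1e-2 and maxAbsDiff = 1e-5; isClose defaults to a far tighter relative tolerance and an absolute tolerance of zero):

```d
import std.math : isClose;

void main()
{
    double a = 1.0, b = 1.005;

    // approxEqual(a, b) used to pass, because its default tolerances
    // were loose (maxRelDiff = 1e-2, maxAbsDiff = 1e-5). isClose with
    // its tight defaults rejects the same pair, so a mechanical rename
    // silently changes test results.
    assert(!isClose(a, b));

    // To replicate the old behaviour you must spell the old defaults out:
    assert(isClose(a, b, 1e-2, 1e-5));
}
```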

> Third, that even with an edition bump, large-scale breaking changes that make migration difficult should probably be avoided.

Yes, large breaking changes are a no-no, unless my old code can continue compiling as before and I can opt in to the new stuff. Editions would help with this, but it still depends on the execution. There should always be a good migration path that doesn't require you to rewrite 5-10 year old code whose details you no longer remember and can no longer confidently reimplement without spending a disproportionate amount of time re-learning its ins and outs.

> Fourth, that breaking changes should be used to give D programmers more of what they already like about D, not to take the D language in new directions.

TBH, @nogc, dip1000, @live, etc., feel a lot like D trying to go in entirely new directions. The fact that it's been years and still practically nobody understands exactly how they work and what they do is not a good sign. And all this while things like shared and static initialization of AA's are stagnating. Built-in AA's are one of my major reasons for choosing D, and seeing them languish for years with elementary features like static initialization unfixed is quite disheartening. Worse, it feels like D wants to move to newer pastures while its current features are still half-done and have problematic corner cases. I.e., what I like about D is stagnating, while new features that I have little interest in are being pushed on me.

> To Walter, Atila, and the rest of D's leadership, I hope this post provides some helpful data points for you to take into account when designing D's language editions and planning future language changes.
>
> To everyone else reading this, I'd like to leave you with one last question: what do you like about D? What strengths does D have, as a language, that you'd like to see become even stronger?
>
> [...]

What I like about D:

  • Meta-programming power.

    • CTFE should be improved. By a lot. It was a big disappointment
      that Stefan's newCTFE never materialized. IMO we should be
      improving this story instead of trying to chase rainbows like ARC
      with @live and dip1000 and what-not. We should make this so good
      that I'll never need to use an external codegen utility again. And
      it should not introduce crazy compile times. This is a primary D
      strength, its story should be maximally optimized.

    • The template story should be improved. There should be a way of
      working with templates that cuts down on needless bloat. Lots of
      room for exploration here. We shouldn't be confined by C++
      limitations here. This is one of D's primary strengths and where we
      can pioneer even more. One area is improving IFTI to make it work
      for even more common cases. Another is recognizing common patterns
      like chains of ranges, and optimizing symbol generation so that you
      don't end up with unreasonably huge symbols. Esp. when it's a
      one-of-a-kind UFCS chain (it's unlikely you're ever going to have
      exactly the same chain twice with exactly the same template
      arguments -- no point encoding every argument in the symbol, just
      an ID that gets incremented per instantiation is good enough).

    • Compile-time introspection and DbI. This is another huge D
      strength, and we should be working on streamlining it even more.

      • Clean up __traits(), make std.traits more sensible.
      • Fix things like scoping issues with static foreach. Introduce
        local aliases so that static foreach doesn't need crazy hacks
        with {{...}} and temporary templates just for injecting new
        identifiers per iteration without running into multiple
        declaration errors.
      • Improve the syntax for retrieving members of some symbol.
        Something prettier than __traits(allMembers, ...). This is a
        primary D strength; it deserves better syntax than this.
      • Maybe first-class types to make the metaprogramming story even
        more powerful.
  • GC. Instead of bending over backwards trying to woo the @nogc crowd
    who are mass migrating to Rust anyway, what about introducing write
    barriers that would allow us existing D users to use a much more
    competitive GC algorithm? Stop-the-world GC in 2024 shouldn't even be
    a thing anymore. We aren't in 1998 anymore. D should embrace the GC,
    not sacrifice it for the sake of wooing a crowd that isn't likely to
    adopt D regardless. Instead of trying to get away from the GC, what
    about making the GC experience better for existing D users?

  • Built-in AA's. It's been at least a decade. Why is static
    initialization support still sketchy?

  • Built-in unittests. The default experience should be top-of-the-line.
    We shouldn't need to import a dub package for something beyond the
    current dumb built-in test runner. Named unittests, the ability to
    select which tests to run, the ability to run all tests regardless of
    failure and show stats afterwards -- these are all basic
    functionalities that ought to work out-of-the-box.

  • Automatic type & attribute inference. auto was revolutionary when I
    first joined D (C++ got it only years later). We should improve type
    and attribute inference to the max (e.g., in default parameters for
    enums: there should be no need to repeat the enum name). Nobody likes
    spelling out attribute soup, just like nobody likes spelling out
    explicit types when it's already obvious from context. The compiler
    should automate this to the max. A little bit of breakage here IMO is
    acceptable as long as it gets us to an even better place. Have
    negated attributes be a thing as well. In fact, make attributes
    first-class citizens so that we can use DbI / metaprogramming to
    manipulate them. This is what we should be focusing our efforts on
    instead of trying to woo an amorphous group of hypothetical potential
    users somewhere out there who aren't particularly likely to adopt D to
    begin with.
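The static foreach pain point mentioned above can be shown concretely. Inside a function, the conventional workaround is an extra brace pair per iteration, since static foreach itself introduces no scope:

```d
void main()
{
    import std.stdio : writeln;

    static foreach (i; 0 .. 3)
    {{ // double braces: the inner pair opens a real scope, so `sq` can
       // be declared afresh in each unrolled iteration without hitting
       // a "multiple declaration" error
        enum sq = i * i;
        writeln(sq);
    }}
}
```

At declaration scope (module or aggregate level) even this trick is unavailable, which is where the temporary-template hacks come in.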

+100
And the suggestion above of cleaning up dark, ugly corners with breaking changes gets +110. Here are a couple of bottom-level ones that jumped out at me in the last few days.

--- Fix void to be a type like any other and not an unconsidered edge case to trip over.

--- Remove automatic conversions between signed and unsigned so that unconsidered, bizarre semantics are excluded from D.

Here's a sane take on GC, not more ideology.
https://bitbashing.io/gc-for-systems-programmers.html
--- Make the GC a 21st century GC! And acknowledge that using it by default makes sense as per the short article.

February 19

On Friday, 16 February 2024 at 16:15:30 UTC, Paul Backus wrote:

> Which means that in addition to providing a migration path, we need to have strong buy-in from the community that @safe is one of D's core strengths, and improvements to @safe are something they're willing to suffer some breakage for.

For code still in active development, sure. There's no point enabling @safe in the first place if there's no willingness to use it with the rules that are actually safe, which means turning DIP1000 on. If people refuse to keep the structs/arrays their @safe code refers to on the heap, and also refuse to take the time to learn DIP1000, then they essentially don't want real memory safety.

But for code in maintenance-only mode, it's different. Whatever bugs it may have had because of the lack of DIP1000 have usually already been caught the hard way, or, if they remain, they manifest only very rarely. That people maintaining such code don't want to update it doesn't really mean they don't want @safe or DIP1000; it's just that the work-to-benefit ratio is much worse than for new code.

> If we don't have that buy-in, then, as much as it pains me to say it, the correct move is probably to give up on @safe altogether, and focus our efforts on improving D's other strengths.

Fortunately we don't have to make a hard yes-or-no decision on that one since the language as-is allows keeping @system on for everything.

February 19

On Friday, 16 February 2024 at 01:44:51 UTC, Paul Backus wrote:

> To everyone else reading this, I'd like to leave you with one last question: what do you like about D? What strengths does D have, as a language, that you'd like to see become even stronger?

It revolves around the fact that it's truly general purpose - scripting, application development and systems programming are all equally supported. And also that the language doesn't impose restrictions just for the sake of some stylistic discipline.

If D ever were to "pick its camp" and either force me to use the GC and the standard runtime whether I want to or not, or alternatively ditch the GC and impose RAII/ref counting instead, that would drive me away.

February 19

On Friday, 16 February 2024 at 01:44:51 UTC, Paul Backus wrote:

> Why might that be? Glancing over some of the Rust release notes myself, I notice a couple of trends.
>
> 1. Many of the changes are low-impact and provide a clear migration path for existing code.
> 2. Many of the changes involve fixing holes in Rust's memory-safety checks.

So here is my first impression of Rust: I had to build a measuring tool today, and the only workable one happened to be written in Rust.

Remarks:

  • "rustup" can be fetched with a curl | sh command line... it's nice; it's a few fewer clicks, and you can't really get a bad rustup version anyway.
  • I had to install various nightlies using "rustup toolchain", which is an "edition" + a target. A bit like Flutter does it.
  • cargo is essentially like dub; I saw no big difference here. The binaries are annoyingly deep in sub-directories to find. The one-letter command shortcuts are useful.
  • The code didn't build (1000+ packages!); I had to fix the dependencies in ~/.cargo to proceed. I must say it all built pretty fast.
    As for build errors, there was an enormous wall of text (almost too much) explaining how to fix the issues, so despite my ignorance of the language it was possible to continue.
  • Libraries don't look as friendly without the GC; compare CLAP with a D argument parser.

Pros:
I think rustup is a big win here; a principled approach to installing this or that version of D for this or that platform would perhaps be a good idea. It is similar to DVM, but DVM isn't much used.

Cons:
A bit of a wordy experience, plus the necessity of using nightlies for features that have been sitting in nightly for 3 years (will they ever get merged?). An unnaturally high package count.

February 20

On Monday, 19 February 2024 at 17:14:21 UTC, Dukc wrote:

> It revolves around the fact that it's truly general purpose - scripting, application development and systems programming are all equally supported. And also that the language doesn't impose restrictions just for the sake of some stylistic discipline.

+1

> If D ever were to "pick its camp" and either force me to use the GC and the standard runtime whether I want to or not, or alternatively ditch the GC and impose RAII/ref counting instead, that would drive me away.

To be clear, I am not suggesting that D should force using the GC. Quoting the article I mentioned (https://bitbashing.io/gc-for-systems-programmers.html):

> Many developers opposed to garbage collection are building “soft” real-time systems. They want to go as fast as possible—more FPS in my video game! Better compression in my streaming codec! But they don’t have hard latency requirements. Nothing will break and nobody will die if the system occasionally takes an extra millisecond.
>
> [...]
>
> Modern garbage collection offers optimizations that alternatives can not. A moving, generational GC periodically recompacts the heap. This provides insane throughput, since allocation is little more than a pointer bump! It also gives sequential allocations great locality, helping cache performance.
>
> [...]
>
> I’m not suggesting that all software would benefit from garbage collection. Some certainly won’t. But it’s almost 2024, and any mention of GC—especially in my milieu of systems programmers—still drowns in false dichotomies and FUD. GC is for dum dums, too lazy or incompetent to write an "obviously" faster version in a language with manual memory management.
>
> It’s just not true. It’s ideology. And I bought it for over a decade until I joined a team that builds systems—systems people bet their lives on—that provide sub-microsecond latency, using a garbage-collected language that allocates on nearly every line. It turns out modern GCs provide amazing throughput, and you don’t need to throw that out for manual memory management just because some of your system absolutely needs to run in n clock cycles. (Those specific parts can be relegated to non-GC code, or even hardware!)

What I am suggesting is that a modern GC for D would be a game-changer, with the default of using the GC being the best answer most of the time. People who didn't believe this could find it out experimentally.
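The article's "allocation is little more than a pointer bump" claim can be sketched as follows. This is purely illustrative of what a moving/compacting collector makes possible, not how D's current GC allocates:

```d
// Illustrative only: the fast path of a bump allocator over a
// pre-reserved, compacted region.
struct BumpRegion
{
    ubyte[] buffer; // region the collector keeps compacted
    size_t top;     // next free offset

    void* alloc(size_t size)
    {
        if (top + size > buffer.length)
            return null; // slow path: this is where a collection would run
        void* p = &buffer[top];
        top += size; // the whole fast path: one bounds check and one add
        return p;
    }
}
```

Sequential allocations come out adjacent in memory, which is where the cache-locality benefit the article mentions comes from.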

February 20

On Friday, 16 February 2024 at 01:44:51 UTC, Paul Backus wrote:

> In [a 2019 blog post][1], which I found today on the front page of /r/programming, Michael Orlitzky complains that modern languages have too many breaking changes. He contrasts "grownup" languages like C and Ada, which have remained stable and largely backwards compatible for decades, with languages like Rust, which [regularly breaks compatibility][2] despite having promised otherwise in its 1.0 release.
>
> [...]

Thanks for writing this, some very good points here. I think that making migration easier is something we need to focus on, but that probably needs dmd as a library to be easier to use.

In the case of DIP1000 specifically, I think maybe Robert's idea of moving its checks to @trusted may be the way forward, and making @safe regular GC D. Once I'm done with editions I'm going to write a DIP for this.

February 20

On Tuesday, 20 February 2024 at 09:03:15 UTC, Atila Neves wrote:

> In the case of DIP1000 specifically, I think maybe Robert's idea of moving its checks to @trusted may be the way forward, and making @safe regular GC D. Once I'm done with editions I'm going to write a DIP for this.

Please remember that it won't be any better than DIP1000 for backwards compatibility - in fact it's even worse, since:

  • With DIP1000, your slice of a static array or pointer to a struct field will still compile if you didn't happen to escape it. With Robert's proposal it will always need fixing.

  • With Robert's proposal, your choices for code that breaks are either removing @safe or moving your variables to the heap. With DIP1000 you can also do either, but you have a third choice as well: adding scope annotations.
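The first point can be illustrated with a sketch (assuming `-preview=dip1000`; exact behaviour varies by compiler version):

```d
@safe int sum(scope int[] xs)
{
    int total;
    foreach (x; xs) total += x;
    return total;
}

@safe int ok()
{
    int[4] buf = [1, 2, 3, 4];
    // Compiles under DIP1000: the slice of stack memory is only passed
    // to a `scope` parameter and never escapes the function.
    return sum(buf[]);
}

/*
@safe int[] bad()
{
    int[4] buf = [1, 2, 3, 4];
    return buf[]; // rejected under DIP1000: escaping a stack slice
}
*/
```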

February 20

On Tuesday, 20 February 2024 at 00:48:42 UTC, Carl Sturtivant wrote:

> On Monday, 19 February 2024 at 17:14:21 UTC, Dukc wrote:
>
> > If D ever were to "pick its camp" and either force me to use the GC and the standard runtime whether I want to or not, or alternatively ditch the GC and impose RAII/ref counting instead, that would drive me away.
>
> To be clear, I am not suggesting that D should force using the GC.

No worries, I wasn't thinking of you - rather of the numerous doomsayers here over the years who say D can't succeed "if it can't decide what it wants to be". If they are right, it means my criterion for an ideal language is fundamentally backwards.

> What I am suggesting is that a modern GC for D would be a game-changer, with the default of using the GC being the best answer most of the time. People who didn't believe this could find it out experimentally.

D's GC is more modern than you might think. Yes, it stops the world and isn't as fast as many of the alternatives, but that's because it has a unique feature: GC-collected memory can be referred to with raw pointers. Any GC that can pause its collection and resume it later must use only special references (ones that tell the GC when they are assigned to), meaning you can't let regular C functions handle your GC-collected memory.

Maybe D would still benefit from a GC that requires pointer write barriers, but then all D pointers (and other references, save for function pointers) in the program would need that write barrier. That is not an option if, say, you're writing a library for a foreign language that must work without DRuntime being initialised. Hence the pointer-assignment semantics would have to be behind a compiler switch. Or maybe pointer assignment could call a symbol somewhere in DRuntime that can be redefined by the user.
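A conceptual sketch of what such a lowering might look like. All names here are invented for illustration; this is not an actual DRuntime interface:

```d
// Hypothetical: the kind of hook a write-barrier GC needs on every
// pointer store. A real collector would use a card table; a bounded
// array stands in for the remembered set here.
enum cap = 1024;
__gshared void**[cap] rememberedSet; // stand-in for the GC's remembered set
__gshared size_t nRemembered;

void gcWriteBarrier(void** slot, void* newValue)
{
    // Record the mutated slot so an incremental or generational
    // collector can rescan just these slots instead of the whole heap.
    if (nRemembered < cap)
        rememberedSet[nRemembered++] = slot;
    *slot = newValue;
}

// The compiler would then lower a store like `obj.next = p;` to
// `gcWriteBarrier(cast(void**) &obj.next, p);`, presumably behind the
// compiler switch suggested above.
```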

February 20

On Friday, 16 February 2024 at 04:38:03 UTC, H. S. Teoh wrote:

> GC. Instead of bending over backwards trying to woo the @nogc crowd

I just want to chime in here: I'm a GC minimalist who avoids it whenever reasonably possible and only uses it when it makes the most sense. And I really don't care about @nogc. Feel free to dump it and work on cooler stuff, you have my support!

> trying to woo an amorphous group of hypothetical potential users

I feel like this describes a lot of the general philosophy I observe on these forums lately. A perhaps overly cynical interpretation: (some of?) the D community is so insecure about losing any more of its already famously small userbase that it's terrified to make any serious, meaningful, positive change, in case some unknown person somewhere, with a dub package that hasn't been touched in 7 years, gets annoyed and decides to move on (hey, I get annoyed typing dub build every time). Or some vague future new user to whom this will happen 7 years hence. Every time I come to General I see another thread with a deep, introspective, heavily passionate argument about why we can't have nice things because of some astronomically remote edge case, and everything gets frozen into a Möbius loop of trying to account for every possible combinatorial way a feature might be used or misused. I'll grant that lack of foresight has long been the C Family Curse, but there is such a thing as too much navel-gazing as well.

Every time I see one of these hot topics and start drawing up a response, I watch it spiral deeper into interdependent debates, dredging up every flaw that D, Phobos, and C++ have ever experienced in their lifetimes, about why it needs to be absolutely perfect to some exacting cosmological standard so that the nebulous supercorporation that may or may not use it in some unspecified future will be sufficiently satisfied with the implementors' fidelity, and I sigh and delete the post. I've discarded more drafts to this forum than I've ever submitted. Why argue with the heavyweights? They've got all the scientific proof that doing anything that may need to change someday is simply impermissible, and all I have is a fondness for nice things. Nice things are what drew me to D in the first place, but now nice things are anathema, because we might be required to take responsibility for them someday. Why get a dog if you have to walk it?