Thoughts on Backward Compatibility
February 16

In a 2019 blog post, which I found today on the front page of /r/programming, Michael Orlitzky complains that modern languages have too many breaking changes. He contrasts "grownup" languages like C and Ada, which have remained stable and largely backwards compatible for decades, with languages like Rust, which regularly breaks compatibility despite having promised otherwise in its 1.0 release.

This is a reasonable criticism, and yet what strikes me most about it is how seemingly irrelevant it ultimately is. Rust handily outranks Ada on any measure of language popularity you care to name (TIOBE Index, Stack Overflow Survey, GitHub statistics), and its adoption is still trending upward. Clearly, there is a sizeable contingent of programmers who do not view its breaking changes as deal-breakers.

Why might that be? Glancing over some of the Rust release notes myself, I notice a couple of trends.

  1. Many of the changes are low-impact and provide a clear migration path for existing code.
  2. Many of the changes involve fixing holes in Rust's memory-safety checks.

For breaking changes that fall under (1), it's easy to understand why Rust programmers put up with them. Switching languages is a much bigger hassle than doing a simple find-and-replace to update a renamed library function.

For breaking changes that fall under (2), it's a little less obvious. If your code has unknowingly been taking advantage of a safety hole, it can often require a fair amount of work to fix (as any D programmer who's tried to test their code with -preview=dip1000 can attest). Why are Rust programmers willing to subject themselves to that kind of aggravation?

The answer is: because that's the reason they chose Rust in the first place! Rust's memory-safety checks are its biggest value-add compared to other languages, and the main driver of its adoption. Making those checks more accurate and effective is giving Rust programmers more of something that they've already demonstrated they want.

It is worth noting that the vast majority of these breaking changes occurred within a single language edition. For example, Rust 1.5.0, which has thirteen breaking changes listed in its release notes, was an update to the 2015 edition that started with Rust 1.0.0 and continued until Rust 1.30.0. Again, this fact does not seem to have had any serious impact on Rust's adoption.

Are there languages where breaking changes have hurt adoption? The most notable example I can think of is Python 3, which has struggled for years to win over Python 2 programmers. And if we look at the changes introduced in Python 3, we can see that they follow very different trends than the Rust changes discussed above:

  1. Many of the changes are high-impact, requiring widespread changes to existing code, and lack a clear migration path.
  2. Many of the changes are focused on performance, correctness, and type safety.

Trend (1) is straightforwardly bad because it increases the cost of migration. Trend (2) may at first seem like a good thing, but the problem is that these qualities are not what most programmers chose Python 2 for. They chose it because it was easy to learn, convenient, and fun, and forcing them to rewrite all of their code to handle Unicode correctly or use generators instead of lists is the exact opposite of that.

What can we learn from this for D?

First, that the success and popularity of a programming language is mostly determined by factors other than stability and backward compatibility (or lack thereof).

Second, that even without an edition bump, small-scale breaking changes with easy migration paths aren't a big deal.

Third, that even with an edition bump, large-scale breaking changes that make migration difficult should probably be avoided.

Fourth, that breaking changes should be used to give D programmers more of what they already like about D, not to take the D language in new directions.

To Walter, Atila, and the rest of D's leadership, I hope this post provides some helpful data points for you to take into account when designing D's language editions and planning future language changes.

To everyone else reading this, I'd like to leave you with one last question: what do you like about D? What strengths does D have, as a language, that you'd like to see become even stronger?

February 15
On Fri, Feb 16, 2024 at 01:44:51AM +0000, Paul Backus via Digitalmars-d wrote: [...]
> ### What can we learn from this for D?
> 
> First, that the success and popularity of a programming language is mostly determined by factors other than stability and backward compatibility (or lack thereof).

+100.  Over the past 5-10 years or so, I've been finding myself wishing that D would introduce some breaking changes so that it could clean up some of its dark, ugly corners that have been like flies in the ointment for a long time.

At the same time, I've run into deprecations and breaking changes that are really rather minor, but extremely frustrating, because:


> Second, that even without an edition bump, small-scale breaking changes with easy migration paths aren't a big deal.

The worst feeling is when you upgrade your compiler, and suddenly you find yourself having to do major code surgery in order to make previously-fine code work again.  Having an easy migration path for breaking changes is very important.

I'd also add that the migration path should be *easy*: it shouldn't take too much thought to upgrade the code, and should not involve tricky decisions based on subtle semantic differences that require deep understanding of the code to make the right choice.

The std.math.approxEqual deprecation is a major example that I keep running into.  It's intended to be replaced by isClose, with all the right intentions. But it was frustrating because (1) it didn't feel necessary -- previous code worked fine even if there were some pathological cases that weren't being handled correctly.  (2) The deprecation message didn't give a clear migration path -- isClose has different parameters with subtly different semantics from approxEqual, and it wasn't obvious how you should replace calls to approxEqual with equivalent calls to isClose.  There were also no easy defaults that you could use that replicated the previous behaviour; you had to sit down and think about each call, then look up the docs to be sure.  (3) The choice of name felt like a serious blunder, even if it was made for all the right reasons.

All of these added up to a very frustrating experience, even if the intentions were right.

If we could do this over again, I'd have proposed keeping the old semantics of approxEqual, and perhaps adding another parameter to opt into the new semantics. And make sure the deprecation message is clear about exactly how to decide what to put in that new parameter. I.e., lazy authors could change nothing and their code would continue to work as before; if they wanted the new semantics, they'd have to explicitly opt in.
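To make the migration concrete, here's a sketch of the kind of call-site change involved (tolerance values per the published documentation; treat the exact defaults as something to double-check):

```d
import std.math : isClose;

void main()
{
    double x = 1.0;
    double y = 1.004;

    // Old code (deprecated):
    //     assert(approxEqual(x, y));
    // approxEqual's documented defaults were maxRelDiff = 1e-2 and
    // maxAbsDiff = 1e-5.

    // isClose's defaults are different (a tighter relative tolerance,
    // and maxAbsDiff = 0.0), so a faithful translation has to spell
    // the old tolerances out explicitly:
    assert(isClose(x, y, 1e-2, 1e-5));
}
```

This is exactly the problem described above: there is no drop-in default that replicates the old behaviour, so every call site needs individual attention.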


> Third, that even with an edition bump, large-scale breaking changes that make migration difficult should probably be avoided.

Yes, large breaking changes are a no-no, unless my old code can continue compiling as before and I only have to opt in to the new stuff. Editions would help with this, but it still depends on the execution. There should always be a good migration path that doesn't require you to rewrite 5-10 year old code that you no longer remember the details of and can no longer confidently reimplement without spending a disproportionate amount of time re-learning its ins and outs.


> Fourth, that breaking changes should be used to give D programmers more of what they already like about D, not to take the D language in new directions.

TBH, @nogc, dip1000, @live, etc., feel a lot like D trying to go in entirely new directions.  The fact that it's been years and still practically nobody understands exactly how they work and what they do is not a good sign. And all this while things like `shared` and static initialization of AAs are stagnating.  Built-in AAs are one of my major reasons for choosing D, and seeing them languish for years with elementary features like static initialization not fixed is quite disheartening.  It's worse when it feels like D wants to move to newer pastures while its current features are still half-done and have problematic corner cases.  I.e., what I like about D is stagnating, while new features that I have little interest in are being pushed on me.


> To Walter, Atila, and the rest of D's leadership, I hope this post provides some helpful data points for you to take into account when designing D's language editions and planning future language changes.
> 
> To everyone else reading this, I'd like to leave you with one last question: what do **you** like about D? What strengths does D have, as a language, that you'd like to see become even stronger?
[...]

What I like about D:

- Meta-programming power.
   - CTFE should be improved.  By a lot.  It was a big disappointment
     that Stefan's newCTFE never materialized.  IMO we should be
     improving this story instead of trying to chase rainbows like ARC
     with @live and dip1000 and what-not.  We should make this so good
     that I'll never need to use an external codegen utility again. And
     it should not introduce crazy compile times. This is a primary D
     strength; its story should be maximally optimized.

   - The template story should be improved.  There should be a way of
     working with templates that cuts down on needless bloat.  Lots of
     room for exploration here; we shouldn't be confined by C++
     limitations. This is one of D's primary strengths and where we
     can pioneer even more.  One area is improving IFTI to make it work
     for even more common cases.  Another is recognizing common patterns
     like chains of ranges, and optimizing symbol generation so that you
     don't end up with unreasonably huge symbols. Esp. when it's a
     one-of-a-kind UFCS chain (it's unlikely you're ever going to have
     exactly the same chain twice with exactly the same template
     arguments -- no point encoding every argument in the symbol; just
     an ID that gets incremented per instantiation is good enough).

   - Compile-time introspection and DbI. This is another huge D
     strength, and we should be working on streamlining it even more.
      - Clean up __traits(), make std.traits more sensible.
      - Fix things like scoping issues with static foreach.  Introduce
        local aliases so that static foreach doesn't need crazy hacks
        with {{...}} and temporary templates just for injecting new
        identifiers per iteration without running into multiple
        declaration errors.
      - Improve the syntax for retrieving the members of a symbol.
        Something prettier than __traits(allMembers, ...). This is a
        primary D strength; it should be dressed in better syntax than
        this.
      - Maybe first-class types to make the metaprogramming story even
        more powerful.
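The static foreach scoping hack mentioned above looks like this in practice (a minimal sketch; the double-brace trick only works inside function bodies, while declaration scope needs the temporary-template workaround instead):

```d
void example()
{
    static foreach (i; 0 .. 3)
    {{
        // Each iteration needs its own scope: without the extra inner
        // braces, `tmp` would be declared three times in the same scope
        // and trigger a multiple-declaration error. The double braces
        // nest a plain block statement inside the unrolled loop body,
        // giving every iteration a fresh scope.
        enum tmp = i * 10;
        pragma(msg, tmp);  // emitted at compile time for each iteration
    }}
}
```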

- GC.  Instead of bending over backwards trying to woo the @nogc crowd
  who are mass migrating to Rust anyway, what about introducing write
  barriers that would allow us existing D users to use a much more
  competitive GC algorithm?  Stop-the-world GC shouldn't even be a
  thing in 2024; we aren't in 1998 anymore.  D should embrace the GC,
  not sacrifice it for the sake of wooing a crowd that isn't likely to
  adopt D regardless.  Instead of trying to get away from the GC, what
  about making the GC experience better for existing D users?

- Built-in AA's.  It's been at least a decade.  Why is static
  initialization support still sketchy?

- Built-in unittests. The default experience should be top-of-the-line.
  We shouldn't need to import a dub package for something beyond the
  current dumb built-in test runner. Named unittests, the ability to
  select which tests to run, the ability to run all tests regardless of
  failure and show stats afterwards -- these are all basic
  functionalities that ought to work out-of-the-box.

- Automatic type & attribute inference. `auto` was revolutionary when I
  first joined D (C++ got it only years later).  We should improve type
  and attribute inference to the max (e.g., in default parameters for
  enums: there should be no need to repeat the enum name). Nobody likes
  spelling out attribute soup, just like nobody likes spelling out
  explicit types when they're already obvious from context. The compiler
  should automate this to the max.  A little bit of breakage here IMO is
  acceptable as long as it gets us to an even better place.  Have
  negated attributes be a thing as well.  In fact, make attributes
  first-class citizens so that we can use DbI / metaprogramming to
  manipulate them.  This is what we should be focusing our efforts on
  instead of trying to woo an amorphous group of hypothetical potential
  users somewhere out there who aren't particularly likely to adopt D to
  begin with.


T

-- 
People tell me that I'm skeptical, but I don't believe them.
February 16
On Friday, 16 February 2024 at 04:38:03 UTC, H. S. Teoh wrote:
> TBH, @nogc, dip1000, @live, etc., feel a lot like D trying to go in entirely new directions.

I don't think so. Ok, @nogc is somewhat superfluous (as you can better control the GC with explicit commands in the few places where it's necessary), but the rest plays well with @safe and pure and so on.

I would like D to be a better Rust (as I hate the Rust syntax and the struggle with the borrow checker in code that has really noching to do with memory safety).

What I miss most is @safe by default and working @properties (and type-properties: it always bugs me that with user defined properties you need to use myProperty!T instead of T.myProperty - mostly because I tend to always use the wrong systax. I even started to re-define the buildin properties with templates, just so that I can use them all the same way).

February 16

On Friday, 16 February 2024 at 10:10:07 UTC, Dom DiSc wrote:

> I would like D to be a better Rust (as I hate the Rust syntax and the struggle with the borrow checker in code that has really noching to do with memory safety).

nothing

> What I miss most is @safe by default and working @properties (and type-properties: it always bugs me that with user defined properties you need to use myProperty!T instead of T.myProperty - mostly because I tend to always use the wrong systax.

syntax

> I even started to re-define the buildin properties with templates, just so that I can use them all the same way).

what I mean is: if I declare a template with a single compile-time parameter as @property:

static @property template myProperty(T) {}

it should automatically be callable with "int.myProperty" or "myType.myProperty". Should be very easy to implement.

February 16

Fast compilation, modules, and compile-time features should have been enough for D to rule the systems language area.. but there was too much distraction (Java OOP stuff), time has passed, and there's still no way to do better than C in some areas

And I disagree with you on Python 3: the breaking changes didn't hurt its adoption, they gave it a new birth. What hurt its adoption was distro maintainers

I do a lot of Python these days, and it saddens me whenever I go back to D and can't do a simple pattern match; I have to use the old-ass, verbose C-style switch

It's similar with returning multiple values: you can't do it in D unless you import a template everywhere, and even then, using it is not as smooth and is too verbose
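For context, the "template you import everywhere" is std.typecons.Tuple; a minimal sketch of what the workaround looks like:

```d
import std.typecons : tuple;

// Returns both the minimum and maximum of a slice in one call.
auto minMax(int[] xs)
{
    import std.algorithm.searching : minElement, maxElement;
    return tuple(xs.minElement, xs.maxElement);
}

void main()
{
    auto r = minMax([3, 1, 4, 1, 5]);
    assert(r[0] == 1 && r[1] == 5);
    // No language-level destructuring: you index the tuple (or reach
    // for .expand tricks) instead of writing something like
    // `auto (lo, hi) = minMax(...)`.
}
```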

It's hard to win people over on these matters, because everyone has their own idea of what's needed, what's important, and what's useful

At some point the leadership has to brute-force it and implement the things it thinks will be needed, important, and useful

Expecting users to contribute is wrong; it's your project, and we are only using it to power our own projects

Anyways.. i don't mind breaking changes, as long as:

  • it is properly documented

  • the upgrade path is automated with tooling as much as possible (I wish dmd had a built-in `dmd fmt`)

  • it was done to make the language future proof

Backward compatibility should only extend back to some minimum version; otherwise there is no way to fix past mistakes or to adapt

For me, the strengths of D:

  • fast compilation
  • close to C performance/control wise
  • very easy access to C code / libraries, great to kickstart a project
  • compile time type introspection
  • modules
  • not ruled by a big corp

However, compared to the competition, D shows its lack of adaptiveness. Its weaknesses:

  • verbosity/repetition in the wrong areas
  • no switch as expression
  • no native tuple
  • no tagged union
  • no static array length inference
  • can't have anonymous struct within structs
  • C++ style of "the solution is a template in phobos"
  • hard to predict what's next
February 16

Small addition to my post for a little prediction

I predict that people using high level languages will be replaced by generative AI tools

Python developers doing generative AI stuff turn back to C whenever they need performance

I want them to turn back to D

If you just have 1 D library that one of these devs depends on, then it's a win for D, because that'll attract a ton of developers with an interest in maintaining, extending, and funding it. That's imo what we should all strive for: enabling people to write pragmatic libraries with high impact and high performance (no GC, no exceptions, no OOP), so it's easy for these people to consume them

D has to compete with what exist today, and what's to come, it needs to be future proof

https://docs.modular.com/mojo/why-mojo.html

February 16

On Friday, 16 February 2024 at 01:44:51 UTC, Paul Backus wrote:

> To everyone else reading this, I'd like to leave you with one last question: what do you like about D? What strengths does D have, as a language, that you'd like to see become even stronger?

You managed to make me look at DIP1000 in a new light. If it's going to be made the default, the choices are:

  1. Put in a lot of thought to make the code compile.
  2. Move stack-allocated stuff to the heap so there's no need to fight with scope.
  3. Mark the code where there are problems @system.

As you said, 1. can be a real problem. If the code has already been battle-tested, it's far too much effort to be worth it when the code in question will not undergo any major overhauls anymore. 2. is okay in most cases - fortunately - but it can't be a general answer, especially not for a systems language like D.

And we really don't want people doing option 3. It's less of a problem than combining @system and stack references for new code, but the code is still probably going to see some changes. scope is an incredible footgun outside @safe.

Before, I would have said that if the code doesn't compile with DIP1000 it isn't verifiable as safe anyway, so nothing to be done. In practice though, even pre-DIP1000 safety checks are usually much better than @system or @trusted, and especially better than non-@safe code with scope references.
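To make the stakes concrete, here's a minimal sketch (my own example, not from the thread) of the kind of code that pre-DIP1000 @safe accepts but -preview=dip1000 rejects:

```d
int[] leaked;  // global that outlives any stack frame

@safe void store(int[] arr)
{
    leaked = arr;  // arr is not scope, so the compiler lets it escape
}

@safe void caller()
{
    int[3] buf = [1, 2, 3];
    // Without -preview=dip1000 this compiles, silently leaking a
    // reference to stack memory. With -preview=dip1000, buf[] is a
    // scope value that cannot be passed to a non-scope parameter,
    // so the call is rejected at compile time.
    store(buf[]);
}
```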

I didn't think much of the language foundation's decision to keep pre-DIP1000 semantics the default until we have editions, but considering this it starts to make sense. We need a way to selectively enable pre-DIP1000 semantics for old functions before we move on; otherwise the choices for those who have old code are just too stark.

February 16
On Friday, 16 February 2024 at 04:38:03 UTC, H. S. Teoh wrote:
>
> Built-in AA's are one of my major reasons for choosing D, and seeing it languish for years with elementary features like static initialization not fixed is quite disheartening.  Worse when it feels like D wants to move to newer pastures when its current features are still half-done and has problematic corner cases.  I.e., what I like about D is stagnating, while new features that I have little interest in are being pushed on me.

Static AA initialization is in the language now.

https://dlang.org/changelog/2.106.0.html#dmd.static-assoc-array
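For reference, the linked changelog entry allows initializers like this at module scope (a minimal sketch):

```d
// Allowed since DMD 2.106: an associative array literal used to
// initialize a module-level immutable variable.
immutable string[int] names = [1: "one", 2: "two", 3: "three"];

void main()
{
    assert(names[2] == "two");
}
```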

-Steve
February 16

On Friday, 16 February 2024 at 15:20:34 UTC, Dukc wrote:

> Before, I would have said that if the code doesn't compile with DIP1000 it isn't verifiable as safe anyway, so nothing to be done. In practice though, even pre-DIP1000 safety checks are usually much better than @system or @trusted, and especially better than non-@safe code with scope references.
>
> I didn't think much of the language foundations decision to keep pre-DIP1000 semantics the default until we have editions, but considering this it starts to make sense. We need a way to selectively enable pre-DIP1000 semantics for old functions before we move on, otherwise the choices for those who have old code are just too stark.

I think for DIP1000 to succeed, we're going to have to come at this from both directions. Yes, we need to do as much as we can to reduce the burden of adoption, but even if we do, DIP1000 is always going to be a high-impact breaking change. Which means that in addition to providing a migration path, we need to have strong buy-in from the community that @safe is one of D's core strengths, and improvements to @safe are something they're willing to suffer some breakage for.

If we don't have that buy-in, then, as much as it pains me to say it, the correct move is probably to give up on @safe altogether, and focus our efforts on improving D's other strengths.

February 16

On Friday, 16 February 2024 at 01:44:51 UTC, Paul Backus wrote:

> The answer is: because that's the reason they chose Rust in the first place!

You should at least consider other hypotheses:

Rust programmers are anti-sanity; much like their willingness to design a C++-like syntax hell from scratch, maybe they masochistically enjoy breaking changes

Rust's success is political, not meritocratic; Firefox is on the short list of charities that are way overfunded, beg for money, and let their main project rot, and Firefox funded Rust

Bad, verbose solutions make people feel smart for handling them; breaking changes for security are gratifying for Rust users

While everyone wants good code, there's nothing in the world actually delivering that consistently with what we get; so Rust's success isn't permissive

etc etc etc
