On Tuesday, 1 April 2025 at 16:21:59 UTC, Atila Neves wrote:
> https://github.com/atilaneves/DIPs/blob/editions/editions.md
TL;DR: Guarantees made by attributes will differ between editions, and the DIP proposes nothing to mitigate that when the information about where (and therefore which) guarantees are being made is erased.
A big issue is attributes. A simple example is the `in` parameter storage class. There are, practically speaking, two variants: the preview one (which is `const scope`, and maybe `ref`) and the original one (which is just `const`). They look the same syntactically, but they're not equal. Seeing a function with an `in` parameter, the programmer and the compiler know which one it is, because the function is defined in a module and that module is compiled with DIP 1000 or without it. However, what about a function pointer that has an `in` parameter? Preview switches are take-it-or-leave-it: different choices for `-preview=in` make binaries incompatible.
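To make the two readings concrete, here is a minimal sketch of the same declaration under both compilation modes (the equivalences are approximate; with the preview, `in` may also pass by `ref` where that is cheaper):

```d
// Compiled without -preview=in: `in` is merely `const`.
void f(in int[] xs);   // effectively: void f(const int[] xs)

// Compiled with -preview=in: `in` is `const scope` (and possibly ref).
void f(in int[] xs);   // effectively: void f(scope const int[] xs)
```

The declaration is character-for-character identical, yet the two compilations disagree about whether `xs` may escape.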
For illustrative purposes, assume the pre-editions (PE) edition supports `in` as `const` and does not enable DIP 1000, whereas a later edition has `-preview=in` and `-preview=dip1000` enabled. In a PE module, the type `void function(in T[])` is functionally equivalent to `void function(const T[])`, whereas in the later edition, it's functionally equivalent to `void function(scope const T[])`, since `T[]` is passed by value with preview `in`. With DIP 1000, the `scope` storage class makes the promise that the parameter will not escape, enabling otherwise invalid calls.
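As a reminder of what that promise buys, here is a minimal sketch (assuming `-preview=dip1000`; `consume` and `caller` are hypothetical names): because `scope` guarantees the slice does not escape, a `@nogc @safe` caller may pass a slice of a stack-allocated array, which would otherwise be rejected:

```d
void consume(scope const int[] xs) @nogc @safe
{
    // may read xs, must not store it anywhere that outlives the call
}

void caller() @nogc @safe
{
    int[3] buf = [1, 2, 3];
    consume(buf[]); // OK: scope promises the slice won't outlive the call
}
```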
The function pointer does not know where the function it points to is defined. There is neither type information nor run-time information on that. Thus, what `in` means and what promises it makes is indeterminate and does not depend on the edition of the module the function pointer is lexically in: on a function call in the later edition, `scope` allows the array argument to be stack-allocated and thus the enclosing function to be `@nogc`, but if the function pointer actually points to a function defined in a PE module, no such guarantee is made (be it by the programmer for a `@system` function or the compiler for a `@safe` one).
The language can't even be conservative about it and assume the weakest set of guarantees, because that would invalidate all progress made by the later edition.
```d
module old; // pre-editions module

const(int)* global;

void f(in int[] xs) @nogc @safe { global = &xs[0]; }
```

```d
module latest 2026; // later edition: -preview=in and -preview=dip1000

bool runtimeCondition();

void g(in int[] xs) @nogc @safe { }

void main() @nogc @safe
{
    static import old;
    void function(in int[]) @nogc @safe fp = runtimeCondition() ? &old.f : &g;
    fp([1, 2, 3]); // stack allocation of the array required: main is @nogc
    int x = *old.global; // memory corruption if runtimeCondition() was true
}
```
You may argue that the old module has bad code. It does.
You may argue that `&old.f` could simply have the type `void function(const int[]) @…` in the later edition, and you'd be correct, but it's happenstance that this is easily possible here.
This is an illustration that's easy to follow. The issue is the general pattern: semantic differences are specified to be type-erased when crossing edition boundaries. It's an innate problem.
There are two solutions I see:
- Encode edition information into function types, function pointer types, and delegate types. Then `&old.f` has type `void function(in int[]) @… pre-edition`, and `&latest.g` and `fp` have type `void function(in int[]) @… 2026-edition`. Those would be incompatible in general; in this specific instance, only the conversion from `2026-edition` to `pre-edition` would be valid.
- Only let stuff cross the edition boundary when the crossing weakens the guarantees. One can forget the `scope` storage class and be fine, but it can't be added.
Foreseeing all of these interactions is not easy. The illustrative example does not depend on DIP 1000 or preview `in` alone, but on the interaction between the two.
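To spell out the first option, here is a sketch in invented notation (the `@edition(…)` tag is hypothetical; neither the language nor the DIP proposes such syntax) of how edition-tagged function pointer types could behave:

```d
// Hypothetical notation: the edition is part of the function pointer type.
alias OldFp = void function(in int[]) @nogc @safe @edition(pre);
alias NewFp = void function(in int[]) @nogc @safe @edition(2026);

OldFp p = &old.f;     // OK: old.f lives in a pre-editions module
NewFp q = &latest.g;  // OK: latest.g lives in a 2026 module
// NewFp r = &old.f;  // error: pre-edition `in` makes no scope promise
OldFp s = &latest.g;  // OK: dropping the scope promise only weakens guarantees
```

The only conversion allowed here is the guarantee-weakening one, which is exactly what the second option enforces without tagging the types.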