July 13, 2019
On the dlang.org homepage I saw a code sample that looks a little strange to me:

    int[char[2]] aa;
    auto arr = "ABBBA";

    foreach (i; 0 .. arr.length - 1)
        aa[arr[i .. $][0 .. 2]]++;

which I changed to:

        aa[arr[i .. i+2][0 .. 2]]++;

in the hopes of only doing:

        aa[arr[i .. i+2]]++;

The question is: Why is the [0..2] still necessary?

Without it, dmd (on that homepage) complains:

 cannot implicitly convert expression arr[i..i + 2LU] of type string to char[2]
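For reference, here is the workaround as a complete program. The snippet is wrapped in a hypothetical helper function (pairCounts, my name, not from the homepage) so the counting logic stands alone; the extra [0 .. 2], whose bounds are compile-time constants, is what lets the slice convert to the char[2] key type:

```d
import std.stdio;

// Count adjacent character pairs of a string, keyed by a char[2] static array.
int[char[2]] pairCounts(string arr)
{
    int[char[2]] aa;
    foreach (i; 0 .. arr.length - 1)
        aa[arr[i .. i + 2][0 .. 2]]++;  // extra [0 .. 2] forces the static-array conversion
    return aa;
}

void main()
{
    // "ABBBA" has the pairs AB, BB, BB, BA
    writeln(pairCounts("ABBBA"));
}
```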

The spec says that:

10.22 Slices
...
6. If the slice bounds can be known at compile time, the slice expression is implicitly convertible to an lvalue of static array.

7. The following forms of slice expression can be convertible to a static array type:

e     An expression that contains no side effects.
a, b  Integers (that may be constant-folded).

Form	       The length calculated at compile time
...
arr[e .. e+b]  b

So why isn't arr[i .. i+2] convertible? Is dmd behind the spec here?

July 13, 2019
On 13.07.19 22:35, Kevin Bailey wrote:
>      int[char[2]] aa;
>      auto arr = "ABBBA";
> 
>      foreach (i; 0 .. arr.length - 1)
>          aa[arr[i .. $][0 .. 2]]++;
> 
> which I changed to:
> 
>          aa[arr[i .. i+2][0 .. 2]]++;
> 
> in the hopes of only doing:
> 
>          aa[arr[i .. i+2]]++;
> 
> The question is: Why is the [0..2] still necessary?
> 
[...]
> 
> So why isn't arr[i .. i+2] convertible? Is dmd behind the spec here?

Looks like it. It works in the most basic context:

    char[2] sa = arr[i .. i + 2];

And it works in the AA context when you use the most basic kind of expression:

    aa[arr[0 .. 2]]++;

So as far as I can tell, it's just a bug that more complex expressions fail in more complex contexts. It should work the same.
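Until that's fixed, going through a named static array (the "most basic context" above) sidesteps the extra [0 .. 2] at the lookup site; a sketch, again using a hypothetical pairCounts wrapper:

```d
// Count adjacent character pairs; the slice-to-static-array conversion
// happens in a plain initialization, where dmd accepts it.
int[char[2]] pairCounts(string arr)
{
    int[char[2]] aa;
    foreach (i; 0 .. arr.length - 1)
    {
        char[2] key = arr[i .. i + 2];  // conversion works in this basic context
        aa[key]++;
    }
    return aa;
}
```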