The code below works fine without optimizations, but with optimizations (the -O flag) turned on it segfaults. The exact behavior with optimizations differs depending on which version of DMD I use and whether I compile for 32 or 64 bit:
DMD 2.062 64bit: Segfault
DMD 2.062 32bit: Prints a huge array then segfault
DMD 2.061 64bit: Segfault
DMD 2.061 32bit: Prints a fairly small array (10 elements) with random numbers
DMD head (7dcc72a997) 32bit: Bus error: 10
I'm using Mac OS X 10.8.2.
import std.stdio;

int[]* getDeserializedSlice ()
{
    void[] a = [1, 2, 3, 4, 5].dup;
    auto b = &a;

    if (auto c = b)
        auto d = &(cast(int[]) *c)[1 .. 1 + 2];

    return null;
}

void main ()
{
    writeln(*getDeserializedSlice());
}
Is the above code supposed to work? The test case might look a bit strange; the full source code it was reduced from is here:
https://github.com/jacob-carlborg/orange/blob/master/orange/serialization/Serializer.d#L1672
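
For what it's worth, my guess is that the suspicious part is taking the address of the temporary slice produced by (cast(int[]) *c)[1 .. 1 + 2]. A variant that first stores the slice in a named variable and then takes its address would look like this (an untested sketch, just to illustrate the guess):

int[]* getDeserializedSlice ()
{
    void[] a = [1, 2, 3, 4, 5].dup;
    auto b = &a;

    if (auto c = b)
    {
        // Keep the slice in a named lvalue instead of taking the
        // address of the temporary produced by the slice expression.
        auto slice = (cast(int[]) *c)[1 .. 1 + 2];
        auto d = &slice;
    }

    return null;
}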
--
/Jacob Carlborg