Hi,
I discovered a bug in DMD whereby, during optimisation, a loop index variable is removed even though it is used after the loop. I reproduced the bug in this issue.
Using run.dlang.io, I tried to find a regression, but all supported DMD versions (2.060 and newer) print the same incorrect output (Exception: i = 0; n = 1). My experiment can be found here. This seems to be a backend issue, as both LDC and LDC-beta work fine.
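For illustration, here is a minimal, hypothetical sketch of the kind of pattern involved, assuming a repro along these lines (the actual test case is in the linked issue): a loop index is read after the loop, and an optimised DMD build reportedly observes a stale value there.

```d
// Hypothetical sketch, not the actual repro from the linked issue.
import std.format : format;

__gshared size_t sink; // side effect so the loop body isn't trivially removable

void check(size_t n)
{
    size_t i;
    for (i = 0; i < n; ++i)
        sink += i;

    // After the loop, i should equal n. With the miscompilation described
    // above, an optimised build supposedly sees a stale i here.
    if (i != n)
        throw new Exception(format("i = %s; n = %s", i, n));
}

void main()
{
    check(1); // with the bug: throws "i = 0; n = 1" when built with -O
}
```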
However, when analysing the CI output from the PR that led me to this bug, I noticed that all three of DMD, LDC and GDC fail the same test, which I've also been able to reproduce myself:
- LDC: https://cirrus-ci.com/task/6291197929979904?logs=test_druntime#L1449
- DMD: https://cirrus-ci.com/task/5728247976558592?logs=test_druntime#L1390
- GDC: https://cirrus-ci.com/task/4883823046426624?logs=test_druntime#L1411
This test fails because of the issue described above, and the code with which I reproduced the bug is based on it.
I am unsure what to make of this, since the logs seem to contradict my experiment with LDC. Has anyone else encountered this? If so, how did you proceed?
Thanks,
Teodor