[Issue 18290] std.conv.parse throws ConvOverflowException for negative values in hex
February 03, 2018
https://issues.dlang.org/show_bug.cgi?id=18290

Seb <greensunny12@gmail.com> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
           Keywords|                            |bootcamp
                 CC|                            |greensunny12@gmail.com

--
October 21, 2018

Tiberiu Lepadatu <tiberiulepadatu14@gmail.com> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |tiberiulepadatu14@gmail.com

--- Comment #1 from Tiberiu Lepadatu <tiberiulepadatu14@gmail.com> ---
(In reply to Răzvan Ștefănescu from comment #0)
> string input = "80000000";
> auto x = parse!int(input, 16);
> //ConvOverflowException

I feel that this is not a bug. To show that you
are using a hex string, you must put "0x" in front.
If you write "0x80000000" instead of "80000000",
there is no bug.

--
October 22, 2018

--- Comment #2 from Răzvan Ștefănescu <rumbu@rumbu.ro> ---
(In reply to Tiberiu Lepadatu from comment #1)
> (In reply to Răzvan Ștefănescu from comment #0)
> > string input = "80000000";
> > auto x = parse!int(input, 16);
> > //ConvOverflowException
> 
> I feel that this is not a bug. To show that you
> are using a hex string, you must put "0x" in front.
> If you write "0x80000000" instead of "80000000",
> there is no bug.

Parsing "0x80000000" reads only the "0" before the "x", which is not the intended behavior; x therefore becomes 0, not -2147483648.

In order to show that I am using a hex string, I specify the radix (16). To go further, how do I parse a string with radix 7? How do I prefix the string?
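The truncation described above can be demonstrated directly. A minimal sketch, assuming parse's usual behavior of consuming digits from the front of the string and stopping at the first non-digit:

---
void main()
{
    import std.conv : parse;

    string s = "0x80000000";
    auto x = parse!int(s, 16);

    // parse stops at the 'x', so only the leading "0" is consumed
    assert(x == 0);
    assert(s == "x80000000"); // the rest of the input is left unread
}
---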

--
April 27, 2019

Stefan <kdevel@vogtner.de> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |kdevel@vogtner.de

--
December 14, 2019

berni44 <bugzilla@d-ecke.de> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
             Status|NEW                         |RESOLVED
                 CC|                            |bugzilla@d-ecke.de
         Resolution|---                         |INVALID

--- Comment #3 from berni44 <bugzilla@d-ecke.de> ---
80000000 (base 16) = 2147483648 (base 10), which is larger than int.max. Therefore
the exception is correct. If you want -2147483648, just add a minus sign in front
of the string.
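The arithmetic behind this: the hex digits alone denote a value one past int.max, which is why an unsigned reading overflows int. A sketch (in D, a hex literal that does not fit in int is typed uint):

---
void main()
{
    // 0x80000000 = 8 * 16^7 = 2^31 = 2,147,483,648
    assert(0x80000000 == 2_147_483_648);
    assert(int.max == 2_147_483_647); // 2^31 - 1, one less

    // the literal itself is already uint, not int
    static assert(is(typeof(0x80000000) == uint));
}
---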

--
December 16, 2019

Răzvan Ștefănescu <rumbu@rumbu.ro> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
             Status|RESOLVED                    |REOPENED
         Resolution|INVALID                     |---

--- Comment #4 from Răzvan Ștefănescu <rumbu@rumbu.ro> ---
to!string(int.min, 16) returns "80000000"

So, please tell me, which is the hex representation of int.min (-2147483648)?

Where do I put the "-" sign? Because, you know, +2147483648 does not fit in an int; it's a long.

Just to play dumb, taking your advice to put a "-" before:
parse!int("-80000000", 16) returns 0 :)
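One way to reconcile the round trip in user code (a workaround sketch, not official guidance): parse the digits as uint and reinterpret the bit pattern:

---
void main()
{
    import std.conv : parse, to;

    assert(to!string(int.min, 16) == "80000000"); // the bit pattern of int.min

    string s = "80000000";
    int x = cast(int) parse!uint(s, 16); // read the bits, then reinterpret
    assert(x == int.min);
}
---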

--
December 16, 2019

--- Comment #5 from berni44 <bugzilla@d-ecke.de> ---
(In reply to Răzvan Ștefănescu from comment #4)
> Just to play dumb, taking your advice to put a "-" before:
> parse!int("-80000000", 16) returns 0 :)

OK, thanks. This makes for a much better test:

---
void main()
{
    import std.conv;

    string s = "-80000000";
    assert(parse!int(s,16) == int.min);
}
---

Currently the assertion fails.

A similar problem arises with to!int("-80000000", 16), which throws an exception. With radix 10 this works; all other radices fail.

--
December 16, 2019

--- Comment #6 from Răzvan Ștefănescu <rumbu@rumbu.ro> ---
In my opinion, you are mixing decimal math with two's-complement math. Putting a sign before a hexadecimal representation that already has the sign bit set is pure nonsense.

Conversion between strings and integral types (and vice versa) must render exact
results.

As long as to!string(int.min, 16) returns "80000000", I expect
parse!int("80000000", 16) to return int.min.

In conclusion, please keep my original issue as is; if you want negative hex numbers, open another issue instead of adjusting this one.

Thanks.
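The asymmetry at the heart of the report, as a self-contained test (a sketch; it assumes the behavior described in this thread):

---
void main()
{
    import std.conv : to, parse, ConvOverflowException;
    import std.exception : assertThrown;

    // rendering int.min in hex yields the raw bit pattern...
    assert(to!string(int.min, 16) == "80000000");

    // ...but parsing that rendering back does not round-trip;
    // it overflows instead
    string s = "80000000";
    assertThrown!ConvOverflowException(parse!int(s, 16));
}
---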

--
December 16, 2019

--- Comment #7 from berni44 <bugzilla@d-ecke.de> ---
(In reply to Răzvan Ștefănescu from comment #6)
> As long as to!string(int.min, 16) returns "80000000", I expect
> parse!int("80000000", 16) to return int.min.

Looks like another bug.

> In conclusion, please keep my original issue as is; if you want negative hex numbers, open another issue instead of adjusting this one.

OK. But IMHO it's invalid and should be closed then.

--
December 17, 2022

Iain Buclaw <ibuclaw@gdcproject.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
           Priority|P1                          |P3

--