6 days ago Re: Deprecate implicit conversion between signed and unsigned integers
Posted in reply to Olivier Pisano

On Tuesday, 11 March 2025 at 10:23:59 UTC, Olivier Pisano wrote:
>Wouldn't it be nice to deprecate unary minus operator for unsigned types?

It's also useful for rounding: n&-16
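A minimal D sketch of the rounding idiom under discussion, assuming a uint variable n (the name and the value 100 are illustrative only):

    void main()
    {
        uint n = 100;
        // Masking with -16 clears the low four bits, rounding n down to a
        // multiple of 16. The int literal -16 is implicitly converted to
        // uint (0xFFFF_FFF0) before the bitwise AND.
        uint rounded = n & -16;
        assert(rounded == 96);
    }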
6 days ago Re: Deprecate implicit conversion between signed and unsigned integers
Posted in reply to Kagamin

On Thursday, 13 March 2025 at 07:29:29 UTC, Kagamin wrote:
>On Tuesday, 11 March 2025 at 10:23:59 UTC, Olivier Pisano wrote:
>>Wouldn't it be nice to deprecate unary minus operator for unsigned types?
>It's also useful for rounding: n&-16

Again: Please use n&~15 instead. It produces the same result, but without relying on ugly and confusing implicit signed/unsigned conversions.
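A quick D sketch checking that the two masks agree, under the same illustrative assumptions as above:

    void main()
    {
        uint n = 100;
        // ~15 flips all bits of the int literal 15, yielding the same bit
        // pattern as -16, so both expressions clear the low four bits.
        assert((n & ~15) == (n & -16));
        assert((n & ~15) == 96);
    }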
5 days ago Re: Deprecate implicit conversion between signed and unsigned integers
Posted in reply to Kagamin

On Thursday, 13 March 2025 at 07:29:29 UTC, Kagamin wrote:
>On Tuesday, 11 March 2025 at 10:23:59 UTC, Olivier Pisano wrote:
>>Wouldn't it be nice to deprecate unary minus operator for unsigned types?
>It's also useful for rounding: n&-16

typeof(16) is int, signed unary minus would not be deprecated.
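The point can be checked at compile time; a minimal sketch:

    // 16 and -16 are both typed as int, so the negation here is the signed
    // operation and would be unaffected by a deprecation that only targets
    // unsigned operands.
    static assert(is(typeof(16) == int));
    static assert(is(typeof(-16) == int));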
5 days ago Re: Deprecate implicit conversion between signed and unsigned integers
Posted in reply to Nick Treleaven

On Friday, 14 March 2025 at 12:17:59 UTC, Nick Treleaven wrote:
>On Thursday, 13 March 2025 at 07:29:29 UTC, Kagamin wrote:
>>On Tuesday, 11 March 2025 at 10:23:59 UTC, Olivier Pisano wrote:
>>>Wouldn't it be nice to deprecate unary minus operator for unsigned types?
>>It's also useful for rounding: n&-16
>typeof(16) is int, signed unary minus would not be deprecated.

Exactly, I proposed to deprecate unary minus for 16U (uint), not for 16 (int).
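For illustration, negating the uint literal wraps modulo 2^32; this is the case the proposal targets (a sketch, not code from the thread):

    // -16U negates an unsigned operand, which wraps around rather than
    // producing a negative value.
    static assert(is(typeof(-16U) == uint));
    static assert(-16U == 4_294_967_280U);  // 2^32 - 16, i.e. 0xFFFF_FFF0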
5 days ago Re: Deprecate implicit conversion between signed and unsigned integers
Posted in reply to Olivier Pisano

On Friday, 14 March 2025 at 16:09:55 UTC, Olivier Pisano wrote:
>On Friday, 14 March 2025 at 12:17:59 UTC, Nick Treleaven wrote:
>>On Thursday, 13 March 2025 at 07:29:29 UTC, Kagamin wrote:
>>>On Tuesday, 11 March 2025 at 10:23:59 UTC, Olivier Pisano wrote:
>>>>Wouldn't it be nice to deprecate unary minus operator for unsigned types?
>>>It's also useful for rounding: n&-16
>>typeof(16) is int, signed unary minus would not be deprecated.
>Exactly, I proposed to deprecate unary minus for 16U (uint), not for 16 (int).

I have just discovered today an LLVM bug on one platform I'm developing for: it fails to convert long to double correctly:

I'm generally against disabling implicit conversions. But maybe it could be useful for that situation. Could that be put under a compilation flag?
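A minimal D sketch of the conversion in question (the value is illustrative; it fits within double's 53-bit mantissa, so a correct backend converts it exactly):

    void main()
    {
        long x = 123_456_789_012_345;
        // long converts implicitly to double; for values representable in
        // 53 bits the conversion should round-trip exactly.
        double d = x;
        assert(cast(long) d == x);
    }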
5 days ago Re: Deprecate implicit conversion between signed and unsigned integers
Posted in reply to Atila Neves

This is a continuation of this thread: https://www.digitalmars.com/d/archives/digitalmars/dip/ideas/Deprecate_implicit_conversion_between_signed_and_unsigned_integers_334.html