December 16, 2008 - convert char[4] to uint at compile time
Hi,

I'm having problems converting a char[4] to a uint at compile time. All the variations I've tried using an enum crash dmd:

union pp { char[4] str; uint num; }
const uint x = pp("abcd").num

This also doesn't work:

const uint x = cast(uint) x"aa aa aa aa";

Any ideas?
December 16, 2008 - Re: convert char[4] to uint at compile time
Posted in reply to Moritz Warning

Reply to Moritz,
> Hi,
>
> I have problems to convert a char[4] to an uint at compile time. All
> variations (I've tried) of using an enum crashes dmd:
>
> union pp { char[4] str; uint num; }
> const uint x = pp("abcd").num
> This does also doesn't work:
>
> const uint x = cast(uint) x"aa aa aa aa";
>
> Any ideas?
>
template Go (char[4] arg)
{
    const uint Go = (arg[0] << 24) | (arg[1] << 16) | (arg[2] << 8) | arg[3];
}

import std.stdio;
void main()
{
    writef("%x\n", Go!("Good"));
}
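For comparison, the same packing can also be written as an ordinary function and evaluated at compile time through CTFE; a minimal sketch in present-day D, assuming a compiler whose CTFE handles string indexing (the name pack is only illustrative):

import std.stdio;

// Packs a 4-character string into a uint with a fixed, big-endian-style layout,
// independent of the host's byte order.
uint pack(string s)
{
    assert(s.length == 4);
    return (cast(uint) s[0] << 24) | (cast(uint) s[1] << 16) |
           (cast(uint) s[2] <<  8) |  cast(uint) s[3];
}

// The enum initializer forces compile-time evaluation.
enum uint good = pack("Good");

void main()
{
    writefln("%x", good); // prints 476f6f64
}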
December 16, 2008 - Re: convert char[4] to uint at compile time
Posted in reply to BCS

On Tue, 16 Dec 2008 19:54:11 +0000, BCS wrote:
> Reply to Moritz,
>
>> Hi,
>>
>> I have problems to convert a char[4] to an uint at compile time. All variations (I've tried) of using an enum crashes dmd:
>>
>> union pp { char[4] str; uint num; }
>> const uint x = pp("abcd").num
>> This does also doesn't work:
>>
>> const uint x = cast(uint) x"aa aa aa aa";
>>
>> Any ideas?
>>
>>
>
> template Go (char[4] arg)
> {
> const uint Go = (arg[0] << 24) | (arg[1] << 16) | (arg[2] << 8) |
> arg[3];
> }
>
> import std.stdio;
> void main()
> {
> writef("%x\n", Go!("Good"));
> }
Thanks!
That workaround should do it.
Maybe it will be possible to just do cast(uint) "abcd" in the future. :>
December 23, 2008 - Re: convert char[4] to uint at compile time
Posted in reply to Moritz Warning

Moritz Warning wrote:
> On Tue, 16 Dec 2008 19:54:11 +0000, BCS wrote:
>
>> Reply to Moritz,
>>
>>> Hi,
>>>
>>> I have problems to convert a char[4] to an uint at compile time. All
>>> variations (I've tried) of using an enum crashes dmd:
>>>
>>> union pp { char[4] str; uint num; }
>>> const uint x = pp("abcd").num
>>> This does also doesn't work:
>>>
>>> const uint x = cast(uint) x"aa aa aa aa";
>>>
>>> Any ideas?
>>>
>>>
>> template Go (char[4] arg)
>> {
>> const uint Go = (arg[0] << 24) | (arg[1] << 16) | (arg[2] << 8) |
>> arg[3];
>> }
>>
>> import std.stdio;
>> void main()
>> {
>> writef("%x\n", Go!("Good"));
>> }
>
> Thanks!
> That workaround should do it.
>
> Maybe it will be possible to just do cast(uint) "abcd" in the future. :>
That would only cast the pointer. It should be something like cast(uint)(*"abcd") or *cast(uint*) "abcd".
-Joel
December 23, 2008 - Re: convert char[4] to uint at compile time
Posted in reply to Janderson

On Tue, 23 Dec 2008 11:07:08 +0300, Janderson <ask@me.com> wrote:
> Moritz Warning wrote:
>> On Tue, 16 Dec 2008 19:54:11 +0000, BCS wrote:
>>
>>> Reply to Moritz,
>>>
>>>> Hi,
>>>>
>>>> I have problems to convert a char[4] to an uint at compile time. All
>>>> variations (I've tried) of using an enum crashes dmd:
>>>>
>>>> union pp { char[4] str; uint num; }
>>>> const uint x = pp("abcd").num
>>>> This does also doesn't work:
>>>>
>>>> const uint x = cast(uint) x"aa aa aa aa";
>>>>
>>>> Any ideas?
>>>>
>>>>
>>> template Go (char[4] arg)
>>> {
>>> const uint Go = (arg[0] << 24) | (arg[1] << 16) | (arg[2] << 8) |
>>> arg[3];
>>> }
>>>
>>> import std.stdio;
>>> void main()
>>> {
>>> writef("%x\n", Go!("Good"));
>>> }
>> Thanks!
>> That workaround should do it.
>> Maybe it will be possible to just do cast(uint) "abcd" in the future. :>
>
> That would only cast the pointer. It should be something like : cast(uint)(*"abcs") or *cast(uint*) "abcs".
>
> -Joel
And what about endianness? You can't have a feature in a language that gives different results in different environments.
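A small sketch of the concern, in present-day D (the variable names are only illustrative): reinterpreting the bytes in memory depends on the host's byte order, while shifting the characters pins down a single layout:

import std.stdio;

void main()
{
    char[4] s = "abcd";

    // Reinterpreting the bytes in place: the result depends on host byte order.
    // A little-endian machine prints 64636261, a big-endian one 61626364.
    uint reinterpreted = *cast(uint*) s.ptr;
    writefln("%x", reinterpreted);

    // Shifting the characters explicitly always yields the same value.
    uint shifted = (s[0] << 24) | (s[1] << 16) | (s[2] << 8) | s[3];
    writefln("%x", shifted); // 61626364 everywhere
}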
December 23, 2008 - Re: convert char[4] to uint at compile time
Posted in reply to Janderson

On Tue, 23 Dec 2008 00:07:08 -0800, Janderson wrote:
> Moritz Warning wrote:
>> On Tue, 16 Dec 2008 19:54:11 +0000, BCS wrote:
>>
>>> Reply to Moritz,
>>>
>>>> Hi,
>>>>
>>>> I have problems to convert a char[4] to an uint at compile time. All variations (I've tried) of using an enum crashes dmd:
>>>>
>>>> union pp { char[4] str; uint num; }
>>>> const uint x = pp("abcd").num
>>>> This does also doesn't work:
>>>>
>>>> const uint x = cast(uint) x"aa aa aa aa";
>>>>
>>>> Any ideas?
>>>>
>>>>
>>> template Go (char[4] arg)
>>> {
>>> const uint Go = (arg[0] << 24) | (arg[1] << 16) | (arg[2] << 8) |
>>> arg[3];
>>> }
>>>
>>> import std.stdio;
>>> void main()
>>> {
>>> writef("%x\n", Go!("Good"));
>>> }
>>
>> Thanks!
>> That workaround should do it.
>>
>> Maybe it will be possible to just do cast(uint) "abcd" in the future.
>> :>
>
> That would only cast the pointer. It should be something like :
> cast(uint)(*"abcs") or *cast(uint*) "abcs".
>
> -Joel
I'd like to see "abcd" being a value type, like a decimal or hex literal. A cast(uint) would be possible and nice in that case.
December 23, 2008 - Re: convert char[4] to uint at compile time
Posted in reply to Denis Koroskin

On Tue, 23 Dec 2008 13:16:28 +0300, Denis Koroskin wrote:
> On Tue, 23 Dec 2008 11:07:08 +0300, Janderson <ask@me.com> wrote:
>
>> Moritz Warning wrote:
>>> On Tue, 16 Dec 2008 19:54:11 +0000, BCS wrote:
>>>
>>>> Reply to Moritz,
>>>>
>>>>> Hi,
>>>>>
>>>>> I have problems to convert a char[4] to an uint at compile time. All variations (I've tried) of using an enum crashes dmd:
>>>>>
>>>>> union pp { char[4] str; uint num; }
>>>>> const uint x = pp("abcd").num
>>>>> This does also doesn't work:
>>>>>
>>>>> const uint x = cast(uint) x"aa aa aa aa";
>>>>>
>>>>> Any ideas?
>>>>>
>>>>>
>>>> template Go (char[4] arg)
>>>> {
>>>> const uint Go = (arg[0] << 24) | (arg[1] << 16) | (arg[2] << 8) |
>>>> arg[3];
>>>> }
>>>>
>>>> import std.stdio;
>>>> void main()
>>>> {
>>>> writef("%x\n", Go!("Good"));
>>>> }
>>> Thanks!
>>> That workaround should do it.
>>> Maybe it will be possible to just do cast(uint) "abcd" in the future.
>>> :>
>>
>> That would only cast the pointer. It should be something like :
>> cast(uint)(*"abcs") or *cast(uint*) "abcs".
>>
>> -Joel
>
> And what about endianness? You can't have a feature in a language that gives different results in different environment.
The use of uint in my example might be confusing.
I only needed an environment-independent bit pattern of 4 bytes.
An integer is used because comparing it is faster than comparing a char[4] with DMD. :/
(GDC doesn't show such behavior.)
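Roughly, the packed integer serves as an opaque 4-byte tag that can be compared in a single integer operation; a sketch of that usage in present-day D (the names tag, FORM and LIST are only illustrative):

import std.stdio;

// Pack a 4-character tag into a uint with a fixed layout, usable in CTFE and at run time.
uint tag(string s)
{
    assert(s.length == 4);
    return (cast(uint) s[0] << 24) | (cast(uint) s[1] << 16) |
           (cast(uint) s[2] <<  8) |  cast(uint) s[3];
}

enum uint FORM = tag("FORM");
enum uint LIST = tag("LIST");

// Comparing one uint is a single integer comparison, instead of an
// element-wise comparison of a char[4].
bool isForm(uint chunk) { return chunk == FORM; }

void main()
{
    writeln(isForm(tag("FORM"))); // true
    writeln(isForm(LIST));        // false
}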