May 16, 2019
1 - 17 ms, 553 ╬╝s, and 1 hnsec

1 - 17 ms, 553 ╬╝s, and 1 hnsec

WTH!! Is there any way to just get a normal u rather than some fancy useless ASCII hieroglyphic? Why don't we have a fancy M? And an h? What's an hnsec anyways?
May 16, 2019
Re: 1 - 17 ms, 553 ╬╝s, and 1 hnsec
Posted in reply to Alex

On Thursday, 16 May 2019 at 15:19:03 UTC, Alex wrote:
> 1 - 17 ms, 553 ╬╝s, and 1 hnsec
>
> WTH!! Is there any way to just get a normal u rather than some fancy useless ASCII hieroglyphic? Why don't we have a fancy M? And an h?

It's outputting UTF-8, but your console is not configured to display UTF-8.

On Windows, you can do so (before running your program) by running:

    chcp 65001

Or, within your program, by calling:

    SetConsoleOutputCP(CP_UTF8);

Note that this has some negative side effects, which is why D doesn't do it automatically. (Blame Windows.)

> What's an hnsec anyways?

Hecto-nano-second, the smallest representable unit of time in SysTime and Duration.
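Vladimir's in-program fix can be sketched as a small D helper. This is a minimal sketch, assuming the druntime Windows bindings (`core.sys.windows`) are available; the helper name `enableUtf8Console` is illustrative, and on non-Windows platforms the function is a no-op:

```d
// Minimal sketch: switch the Windows console to the UTF-8 code page
// before printing, so output such as "╬╝s" renders as "μs" instead.
void enableUtf8Console()
{
    version (Windows)
    {
        import core.sys.windows.wincon : SetConsoleOutputCP;
        enum CP_UTF8 = 65001; // the same code page number chcp uses
        SetConsoleOutputCP(CP_UTF8);
    }
}

void main()
{
    import std.stdio : writeln;
    enableUtf8Console();
    writeln("553 μs"); // UTF-8 bytes; now displayed correctly by the console
}
```

Note the caveat from the post still applies: changing the console code page has side effects, which is why the runtime does not do this for you.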
May 16, 2019
Re: 1 - 17 ms, 553 ╬╝s, and 1 hnsec
Posted in reply to Vladimir Panteleev

On 5/16/19 4:27 PM, Vladimir Panteleev wrote:
> On Thursday, 16 May 2019 at 15:19:03 UTC, Alex wrote:
>> What's an hnsec anyways?
>
> Hecto-nano-second, the smallest representable unit of time in SysTime and Duration.

The output shouldn't involve the inner workings of the type. It should be changed to say 10 ns.

-Steve
May 16, 2019
Re: 1 - 17 ms, 553 ╬╝s, and 1 hnsec
Posted in reply to Steven Schveighoffer

On Thursday, 16 May 2019 at 15:52:05 UTC, Steven Schveighoffer wrote:
>> Hecto-nano-second, the smallest representable unit of time in SysTime and Duration.
>
> The output shouldn't involve the inner workings of the type. It should be changed to say 10 ns.
If the output is meant for the developer, then I disagree subjectively, as that creates the impression that the lowest resolution or representable unit of time is the nanosecond.
If the output is meant for the user, then hectonanoseconds or nanoseconds are going to be almost always irrelevant. The duration should be formatted appropriately to the use case.
May 16, 2019
Re: 1 - 17 ms, 553 ╬╝s, and 1 hnsec
Posted in reply to Vladimir Panteleev

On Thursday, 16 May 2019 at 15:27:33 UTC, Vladimir Panteleev wrote:
> On Thursday, 16 May 2019 at 15:19:03 UTC, Alex wrote:
>> 1 - 17 ms, 553 ╬╝s, and 1 hnsec
>>
>> WTH!! Is there any way to just get a normal u rather than some fancy useless ASCII hieroglyphic? Why don't we have a fancy M? And an h?
>
> It's outputting UTF-8, but your console is not configured to display UTF-8.
>
> On Windows, you can do so (before running your program), by running: chcp 65001
>
> Or, within your program, by calling: SetConsoleOutputCP(CP_UTF8);
>
> Note that this has some negative side effects, which is why D doesn't do it automatically. (Blame Windows.)
>
>> What's an hnsec anyways?
>
> Hecto-nano-second, the smallest representable unit of time in SysTime and Duration.
Thanks...
Why not just use u? If that is too much trouble, then detect the code page and use u rather than the extended ASCII, which looks very out of place?
May 16, 2019
Re: 1 - 17 ms, 553 ╬╝s, and 1 hnsec
Posted in reply to Alex

On Thursday, 16 May 2019 at 16:49:35 UTC, Alex wrote:
> Why not just use u?

It generally works fine on all the other filesystems, which today have mostly standardized on UTF-8.

> If that is too much trouble, then detect the code page and use u rather than the extended ASCII, which looks very out of place?

Well, a more correct solution would be to check if we're printing to the Windows console, and use Unicode APIs, which would allow this to work regardless of the current 8-bit codepage. However, this (and your suggestion) are complicated to implement for reasons related to how tightly Phobos is tied to C's FILE* for file input and output.
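The "more correct solution" Vladimir describes, writing to the console through the wide-character Windows API, might look roughly like the following. This is a hypothetical illustration only, assuming the druntime Windows bindings; it is not how Phobos implements writeln, and the helper name `consoleWrite` is invented for the sketch:

```d
version (Windows)
{
    import core.sys.windows.winbase : GetStdHandle, STD_OUTPUT_HANDLE;
    import core.sys.windows.wincon : GetConsoleMode, WriteConsoleW;
    import core.sys.windows.windef : DWORD, HANDLE;
    import std.utf : toUTF16;

    // Sketch: if stdout is an actual console, write UTF-16 directly with
    // WriteConsoleW, bypassing the 8-bit code page entirely. Otherwise
    // (redirected to a file or pipe), fall back to plain UTF-8 bytes.
    void consoleWrite(string s)
    {
        HANDLE h = GetStdHandle(STD_OUTPUT_HANDLE);
        DWORD mode;
        if (GetConsoleMode(h, &mode)) // succeeds only for a real console
        {
            auto w = s.toUTF16;
            DWORD written;
            WriteConsoleW(h, w.ptr, cast(DWORD) w.length, &written, null);
        }
        else
        {
            import std.stdio : write;
            write(s); // redirected output: UTF-8 bytes pass through unchanged
        }
    }
}
```

The console-detection step matters: WriteConsoleW fails on redirected handles, which is one of the complications that makes retrofitting this onto FILE*-based I/O awkward.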
May 16, 2019
Re: 1 - 17 ms, 553 ╬╝s, and 1 hnsec
Posted in reply to Vladimir Panteleev

On Thursday, 16 May 2019 at 16:52:22 UTC, Vladimir Panteleev wrote:
> On Thursday, 16 May 2019 at 16:49:35 UTC, Alex wrote:
>> Why not just use u?
>
> It generally works fine on all the other filesystems
* operating systems
May 16, 2019
Re: 1 - 17 ms, 553 ╬╝s, and 1 hnsec
Posted in reply to Vladimir Panteleev

On 5/16/19 4:55 PM, Vladimir Panteleev wrote:
> On Thursday, 16 May 2019 at 15:52:05 UTC, Steven Schveighoffer wrote:
>>> Hecto-nano-second, the smallest representable unit of time in SysTime and Duration.
>>
>> The output shouldn't involve the inner workings of the type. It should be changed to say 10 ns.
>
> If the output is meant for the developer, then I disagree subjectively, as that creates the impression that the lowest resolution or representable unit of time is the nanosecond.

It is what it is. The reason hnsecs is used instead of nsecs is because it gives a time range of 20,000 years instead of 2,000 years.

We do have a nanosecond resolution, and it's just rounded down to the nearest 10. For example:

    auto d = 15.nsecs;
    assert(d == 10.nsecs);

You shouldn't be relying on what a string says to know what the tick resolution is. For example, if I do writefln("%f", 1.0), I get 1.000000. That doesn't mean I should assume floating point precision only goes down to 1/1_000_000.

hnsecs is more confusing than nanoseconds. People know what a nanosecond is, a hecto-nano-second is not as familiar a term.

> If the output is meant for the user, then hectonanoseconds or nanoseconds are going to be almost always irrelevant. The duration should be formatted appropriately to the use case.

Depends on the user and the application.

-Steve
May 16, 2019
Re: 1 - 17 ms, 553 ╬╝s, and 1 hnsec
Posted in reply to Steven Schveighoffer

On 5/16/19 9:17 PM, Steven Schveighoffer wrote:
> On 5/16/19 4:55 PM, Vladimir Panteleev wrote:
>> If the output is meant for the developer, then I disagree subjectively, as that creates the impression that the lowest resolution or representable unit of time is the nanosecond.
>
> It is what it is. The reason hnsecs is used instead of nsecs is because it gives a time range of 20,000 years instead of 2,000 years.
>
> We do have a nanosecond resolution, and it's just rounded down to the nearest 10.

And to prove my point about it being an obscure term, I forgot it's not 10 nanoseconds, but 100 nanoseconds. Oops!

> For example:
>
>     auto d = 15.nsecs;
>     assert(d == 10.nsecs);

This is not what I was trying to say, even though it's true (both are Duration.zero). I meant:

    auto d = 150.nsecs;
    assert(d == 100.nsecs);

-Steve
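Steve's corrected example can be checked directly against core.time: Duration counts 100 ns ticks (hnsecs), so nanosecond values truncate toward zero when a Duration is constructed. A small runnable sketch:

```d
import core.time : nsecs, hnsecs, Duration;

void main()
{
    // Duration's tick is the hnsec (100 ns); finer values truncate.
    assert(150.nsecs == 100.nsecs); // both become 1 hnsec
    assert(150.nsecs == 1.hnsecs);

    // Steve's first example is also true, but only because both
    // values round down to zero ticks:
    assert(15.nsecs == 10.nsecs);
    assert(15.nsecs == Duration.zero);

    // Converting back out exposes the truncation:
    assert(150.nsecs.total!"nsecs" == 100); // not 150
}
```

So the string output ("1 hnsec") does reflect the actual tick size, which is the crux of the disagreement in this thread.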
May 16, 2019
Re: 1 - 17 ms, 553 ╬╝s, and 1 hnsec
Posted in reply to Steven Schveighoffer

On Thursday, 16 May 2019 at 20:17:37 UTC, Steven Schveighoffer wrote:
> We do have a nanosecond resolution, and it's just rounded down to the nearest 10.
>
> For example:
>
>     auto d = 15.nsecs;
>     assert(d == 10.nsecs);

I'm not sure how to feel about this. Maybe there was a better way to handle nanoseconds here.

> You shouldn't be relying on what a string says to know what the tick resolution is.

I don't like that with your proposal, it seems to add data that's not there. The 0 is entirely fictional. It might as well be part of the format string.

> For example, if I do writefln("%f", 1.0), I get 1.000000.

%f is a C-ism, %s does not do that.

> hnsecs is more confusing than nanoseconds. People know what a nanosecond is, a hecto-nano-second is not as familiar a term.

Agreed, which is why Duration.toString shouldn't be used to present durations to users. Developers, however, are expected to know what a hectonanosecond is, same as with all the other technical terms.

>> If the output is meant for the user, then hectonanoseconds or nanoseconds are going to be almost always irrelevant. The duration should be formatted appropriately to the use case.
>
> Depends on the user and the application.

If the durations are so small or so precise that it makes sense to display them with such precision, then yes, applications would do better to use nanoseconds instead of hectonanoseconds.
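Vladimir's closing point, formatting a Duration to the use case instead of echoing Duration.toString, might look like the following hypothetical helper. The name `forUsers` and the choice of millisecond precision are illustrative, not from the thread; `Duration.total` is the standard way to extract a count in a chosen unit:

```d
import core.time : Duration, msecs, usecs, nsecs;
import std.format : format;

// Hypothetical helper: present a Duration to end users at millisecond
// precision instead of Duration.toString's "... and 1 hnsec" output.
string forUsers(Duration d)
{
    return format("%s ms", d.total!"msecs"); // total!"msecs" truncates
}

void main()
{
    // The value from the thread title: 17 ms, 553 μs, and 1 hnsec.
    auto d = 17.msecs + 553.usecs + 100.nsecs;
    assert(forUsers(d) == "17 ms");
}
```

The same pattern scales to whatever unit fits the application: total!"seconds" for a progress display, total!"usecs" for a profiler, and so on.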
Copyright © 1999-2021 by the D Language Foundation