Measuring Execution time

Thread overview:
  Jul 22, 2015  Clayton
  Jul 22, 2015  John Colvin
  Jul 22, 2015  Clayton
  Jul 23, 2015  Clayton
  Jul 24, 2015  Yazan D
  Jul 24, 2015  Yazan D
  Jul 24, 2015  Jonathan M Davis
July 22, 2015
How does one represent a Duration in only microseconds or milliseconds? Trying to measure the execution time of an algorithm, I get "4 ms, 619 μs, and 8 hnsecs", and I want to sum all of these to get the total in hnsecs or μs.

I would also appreciate advice on whether this is the best way to measure the execution time of an algorithm.



import std.datetime;
import std.stdio;

void algorithm() {
    writeln("Hello!");
}

void main() {
    // wall-clock timestamps taken before and after the call
    auto starttime = Clock.currTime();
    algorithm();
    auto endtime = Clock.currTime();

    auto duration = endtime - starttime;

    writeln("Hello Duration ==> ", duration);
}
July 22, 2015
On Wednesday, 22 July 2015 at 09:23:36 UTC, Clayton wrote:
> [...]

The normal way of doing this would be using std.datetime.StopWatch:

    StopWatch sw;
    sw.start();
    algorithm();
    long exec_ms = sw.peek().msecs;
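
For reference, a complete, runnable version of that approach might look like this (a sketch assuming the std.datetime.StopWatch API quoted above, where peek() returns a TickDuration with usecs/msecs properties):

    import std.datetime : StopWatch;
    import std.stdio;

    void algorithm() {
        writeln("Hello!");
    }

    void main() {
        StopWatch sw;
        sw.start();
        algorithm();
        sw.stop();

        // one unit at a time, instead of the mixed "x ms, y μs, z hnsecs" form
        writeln("elapsed: ", sw.peek().usecs, " μs");
        writeln("elapsed: ", sw.peek().msecs, " ms");
    }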
July 22, 2015
On Wednesday, 22 July 2015 at 09:32:15 UTC, John Colvin wrote:
> On Wednesday, 22 July 2015 at 09:23:36 UTC, Clayton wrote:
>> [...]
>
> The normal way of doing this would be using std.datetime.StopWatch:
>
>     StopWatch sw;
>     sw.start();
>     algorithm();
>     long exec_ms = sw.peek().msecs;

Much appreciated, that works well, John. Learning goes on... thanks again.
July 23, 2015
On Wednesday, 22 July 2015 at 09:32:15 UTC, John Colvin wrote:
> On Wednesday, 22 July 2015 at 09:23:36 UTC, Clayton wrote:
>> [...]
>
> The normal way of doing this would be using std.datetime.StopWatch:
>
>     StopWatch sw;
>     sw.start();
>     algorithm();
>     long exec_ms = sw.peek().msecs;

I am wondering whether it is possible to restrict all the algorithms to run on a specific core (e.g. CPU 0), since I want my tests to run in the same environment.
July 23, 2015
On 7/22/15 5:23 AM, Clayton wrote:
> [...]

I know John identified StopWatch, but just an FYI, Duration has the method total: http://dlang.org/phobos/core_time.html#.Duration.total

I think doing:

writeln("Hello Duration ==> ", duration.total!"usecs");

would also work.
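
For example, a self-contained variant of the original program along those lines (a sketch built from the snippets above) could be:

    import std.datetime : Clock;
    import std.stdio;

    void algorithm() {
        writeln("Hello!");
    }

    void main() {
        auto starttime = Clock.currTime();
        algorithm();
        auto endtime = Clock.currTime();

        auto duration = endtime - starttime;

        // a single figure in microseconds ("msecs", "hnsecs", etc. also work)
        writeln("Hello Duration ==> ", duration.total!"usecs", " μs");
    }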

-Steve
July 24, 2015
On Thursday, July 23, 2015 13:59:11 Steven Schveighoffer via Digitalmars-d-learn wrote:
> On 7/22/15 5:23 AM, Clayton wrote:
> > [...]
>
> I know John identified StopWatch, but just an FYI, Duration has the method total: http://dlang.org/phobos/core_time.html#.Duration.total
>
> I think doing:
>
> writeln("Hello Duration ==> ", duration.total!"usecs");
>
> would also work.

Yes, you could do that, but doing timing with the realtime clock is fundamentally wrong, because the clock can change on you while you're timing. That's why using a monotonic clock is better, since it's guaranteed to never move backwards. Unfortunately, while StopWatch does use a monotonic clock, it currently does so via TickDuration rather than MonoTime, so its result is a TickDuration rather than a Duration, which makes it a bit harder to use than would be nice. Still, it is more correct to use StopWatch than to subtract SysTimes. Alternatively, you could just use MonoTime directly, e.g.

auto startTime = MonoTime.currTime;
// do stuff
auto endTime = MonoTime.currTime;

auto duration = endTime - startTime;
writeln("Hello Duration ==> ", duration.total!"usecs");

in which case you get a Duration just like when subtracting SysTimes, and the suggestion of using total works just fine.
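
Fleshed out into a complete program (a sketch assuming only MonoTime.currTime and Duration.total as used above), that might look like:

    import core.time : MonoTime;
    import std.stdio;

    void algorithm() {
        writeln("Hello!");
    }

    void main() {
        auto startTime = MonoTime.currTime;
        algorithm();
        auto endTime = MonoTime.currTime;

        auto duration = endTime - startTime;   // a core.time.Duration
        writeln("Hello Duration ==> ", duration.total!"usecs", " μs");
    }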

I need to put together replacements for the benchmarking functions in std.datetime (probably in std.benchmark) which use MonoTime and Duration rather than TickDuration so that we can deprecate the ones in std.datetime which use TickDuration (and deprecate TickDuration itself).

- Jonathan M Davis


July 24, 2015
On Thu, 23 Jul 2015 16:43:01 +0000, Clayton wrote:

> On Wednesday, 22 July 2015 at 09:32:15 UTC, John Colvin wrote:
>> On Wednesday, 22 July 2015 at 09:23:36 UTC, Clayton wrote:
>>> [...]
>>
>> The normal way of doing this would be using std.datetime.StopWatch:
>>
>>     StopWatch sw;
>>     sw.start();
>>     algorithm();
>>     long exec_ms = sw.peek().msecs;
> 
> I am wondering whether it is possible to restrict all the algorithms to run on a specific core (e.g. CPU 0), since I want my tests to run in the same environment.

If you are using Linux, you can use `taskset`.
Example: `taskset -c 0 ./program`. This will run your program on the
first CPU only.
July 24, 2015
There is also http://linux.die.net/man/2/sched_setaffinity if you want to do it programmatically.
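
For a programmatic version in D, a minimal Linux-only sketch could declare the C call by hand; the prototype and the 128-byte mask below mirror glibc's cpu_set_t and are assumptions, not part of druntime's documented API:

    import std.stdio;

    // hand-declared glibc call; pid 0 means "the calling process"
    extern (C) int sched_setaffinity(int pid, size_t cpusetsize, const(void)* mask);

    void main() {
        // 128 bytes = 1024 bits, matching glibc's cpu_set_t layout
        ulong[16] mask;
        mask[0] = 1;    // bit 0 set: allow CPU 0 only

        if (sched_setaffinity(0, mask.sizeof, mask.ptr) != 0)
            writeln("sched_setaffinity failed");

        // ... run and time the algorithm here, pinned to CPU 0 ...
    }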