July 05, 2017
Hi,

I have a couple of different machines with MySQL servers running on them.
Now I want to execute queries against all of the databases at the same time and collect the results for processing.

I am new to parallelism, so maybe I am getting something completely wrong.
What I am trying is something like this:

<code>
{
 auto tPool = new TaskPool();
 foreach (server; servers)
 {
  auto t = task!queryWorker(query);
  tPool.put(t);
 }

 tPool.finish(true);
//--------> how to collect the results now? <-------

}

row[] queryWorker(string query) {

 //rows = result of the query

 return rows;
}
</code>
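
From the std.parallelism docs it looks like the task objects keep their return value, so maybe the handles can just be kept around and forced after the pool has finished? A rough sketch of what I have in mind; Row, queryWorker and the host names are only placeholders, the actual connection handling is left out:

<code>
import std.parallelism;

struct Row { string[] columns; }   // placeholder for whatever one result row looks like

Row[] queryWorker(string server, string query)
{
    Row[] rows;
    // rows = result of running `query` against the MySQL server at `server`
    return rows;
}

void runQueries(string[] servers, string query)
{
    auto tPool = new TaskPool();
    alias QueryTask = typeof(task!queryWorker("", ""));
    QueryTask[] tasks;

    foreach (server; servers)
    {
        auto t = task!queryWorker(server, query);
        tasks ~= t;              // keep the handle so the result can be read later
        tPool.put(t);
    }

    tPool.finish(true);          // block until every queued task has run

    Row[][] results;
    foreach (t; tasks)
        results ~= t.yieldForce; // yieldForce hands back the Row[] each task returned
}

void main()
{
    auto servers = ["db1.example", "db2.example"]; // made-up host names
    runQueries(servers, "SELECT 1");
}
</code>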

Btw, how do I mark up code in this forum?
July 05, 2017
On Wednesday, 5 July 2017 at 13:55:22 UTC, Martin wrote:
> [...]
I tested a much simpler approach with the following structure:

// a shared array of results, where each result is itself an array of rows;
// it is allocated up front so every iteration only writes its own slot
auto results = new Rows[](servers.length);

// using a parallel foreach over the servers
foreach (i, server; servers.parallel) {
    results[i] = request(server).array;
}
Now every array of rows is accessible in results[].
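
Spelled out for the database case it would look roughly like this; Row, fetchRows and runAll are just stand-ins for whatever your MySQL driver actually provides:

import std.parallelism : parallel;

struct Row { string[] columns; }                // placeholder row type

// hypothetical stand-in for running one query against one server
Row[] fetchRows(string server, string query)
{
    return [Row([server, query])];
}

Row[][] runAll(string[] servers, string query)
{
    auto results = new Row[][](servers.length); // one slot per server
    foreach (i, server; parallel(servers))      // iterations run on the default task pool
    {
        results[i] = fetchRows(server, query);
    }
    return results;
}

Since every iteration writes a different index, no locking is needed for the result array.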

I tested this parallel-foreach construct with curl requests:

time ./parallel_curl
Site www.dlang.org. Page has length:31607
Site forum.dlang.org. Page has length:24358
Site code.dlang.org. Page has length:36477
Site www.google.com. Page has length:10628

real	0m0.836s
user	0m0.137s
sys	0m0.034s

Without parallel:

real	0m2.424s
user	0m0.722s
sys	0m0.209s

This is the code:

import std.stdio;
import std.net.curl;
import std.parallelism;

void main()
{
    enum string[] tospider = ["www.dlang.org", "forum.dlang.org", "code.dlang.org", "www.google.com"];
    char[][tospider.length] results;

    // fetch all pages in parallel; each iteration writes only its own slot
    foreach (i, site; tospider.parallel) {
        results[i] = get(site);
    }

    // the pages are collected in order, independent of which request finished first
    foreach (i, e; results) {
        writeln("Site ", tospider[i], ". Page has length:", e.length);
    }
}


I will try to use this approach to collect some Elasticsearch results and see whether it gives a speedup on an 8-core machine.