Thread overview
Stackless resumable functions
Oct 24, 2014  Martin Nowak
Oct 24, 2014  Sean Kelly
Oct 24, 2014  ROOAR
Oct 28, 2014  David Nadlinger
Oct 28, 2014  ROOAR
Oct 28, 2014  Martin Nowak
Oct 24, 2014  Sean Kelly
Oct 28, 2014  Kagamin
Oct 28, 2014  Paulo Pinto
Oct 28, 2014  Kagamin
Oct 29, 2014  Kagamin
Feb 20, 2015  bitwise
Feb 20, 2015  CraigDillabaugh
Feb 21, 2015  bitwise
Feb 22, 2015  ketmar
Feb 23, 2015  bitwise
Feb 23, 2015  ketmar
Feb 23, 2015  bitwise
Feb 24, 2015  ketmar
Feb 24, 2015  bitwise
Feb 24, 2015  ketmar
Feb 24, 2015  bitwise
Feb 24, 2015  bitwise
Feb 24, 2015  ketmar
Feb 22, 2015  ketmar
October 24, 2014
This is so much better than Fibers.
http://youtu.be/KUhSjfSbINE

What I like most about the proposal is that you can adapt await by specializing template functions, similar to how range based foreach works.
It also isn't tied to a particular scheduling mechanism and of course consumes much less memory than stack based suspension.
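For illustration, a minimal D sketch of the foreach analogy (the Countdown type below is made up, not from the talk or the proposal): foreach accepts any type that provides empty/front/popFront, and the idea is that await would likewise be resolved against hooks supplied by the awaited type rather than being tied to one library type.

import std.stdio : writeln;

// foreach doesn't care about the concrete type, only that
// empty/front/popFront exist; the loop is lowered onto them.
struct Countdown
{
    int n;
    @property bool empty() const { return n == 0; }
    @property int front() const { return n; }
    void popFront() { --n; }
}

void main()
{
    foreach (i; Countdown(3))  // prints 3, 2, 1
        writeln(i);
}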
October 24, 2014
On Friday, 24 October 2014 at 10:33:40 UTC, Martin Nowak wrote:
> What I like most about the proposal is that you can adapt await by specializing template functions, similar to how range based foreach works.
> It also isn't tied to a particular scheduling mechanism and of course consumes much less memory than stack based suspension.

This is how all truly object-oriented languages with concurrency work. Block activation records are conceptually on the heap and there is no difference between an object and a function: Simula67, Beta, Self…

It is slower than using a stack, but if done as in Beta you get a back pointer to the caller (which instantiated the function/object), which can be handy for modelling.
October 24, 2014
On Friday, 24 October 2014 at 10:33:40 UTC, Martin Nowak wrote:
> This is so much better than Fibers.
> http://youtu.be/KUhSjfSbINE
>
> What I like most about the proposal is that you can adapt await by specializing template functions, similar to how range based foreach works.
> It also isn't tied to a particular scheduling mechanism and of course consumes much less memory than stack based suspension.

I'm about halfway through the talk and it's a bit confusing so far, because all of what I'd consider the interesting part seems to be implemented as a compiler extension and so is invisible when looking at the code.  He's talking about suspend points in the function, but there's no indication in the code that they are present where he says.  It seems a bit like these functions are closures and the compiler is figuring this out from the return type or something.  So it's potentially interesting, but difficult to see how it directly compares to classic coroutines.  I'm hoping all will be clear by the end of the talk.
October 24, 2014
I really liked this proposal for resumable lambdas:

http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2014/n4244.pdf
October 24, 2014
Alright, done.  It's a pretty interesting proposal.  They are
effectively closures with coroutine-like semantics.  It seems
like the overhead for a complex system might actually be greater
than with classic coroutines, as closure data allocations could
be happening all over the place, but this is pure speculation.

I think a direct comparison could be drawn between their API and
ours, as std.concurrency now has a Generator object and one of
his early examples is a generator as well.  From a use
perspective, the two are really pretty similar, though our
Generator allocates an entire stack while theirs allocates N
function-level context blocks (one per contained awaitable).
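For reference, a minimal usage sketch of the std.concurrency Generator mentioned above (the squares example is just an illustration); the stack in question is the fiber each Generator runs its delegate on.

import std.concurrency : Generator, yield;
import std.stdio : writeln;

void main()
{
    // The delegate runs on the Generator's own fiber, i.e. its own
    // stack; yield suspends it and hands the value to the consumer.
    auto squares = new Generator!int({
        foreach (i; 1 .. 5)
            yield(i * i);
    });

    foreach (s; squares)  // Generator!T is an input range
        writeln(s);       // 1, 4, 9, 16
}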

Overall I see this proposal as being complementary to actors as
per std.concurrency.  Theirs provides a fairly simple and
lightweight model for composing code that doesn't normally
compose well (like recursive iterators), which is one traditional
use of coroutines.  But for high levels of concurrency to be
achieved, a scheduler needs to sit behind the await mechanism so
other things can happen when execution is suspended waiting for a
result.  This could integrate well with the Scheduler that is now
a part of std.concurrency, as it would be fairly trivial for a
context switch to occur whenever an awaitable suspend occurs.
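As a rough sketch of that direction using only what std.concurrency already provides (a FiberScheduler rather than any await mechanism; the worker function is just an example): with a fiber-backed scheduler installed, blocking primitives like receive already act as suspension points that let other fibers run, which is where an awaitable suspend could hook in as well.

import std.concurrency;
import std.stdio : writeln;

void worker()
{
    // receive suspends this fiber until a message arrives,
    // letting other fibers run in the meantime.
    receive((int x) { writeln("got ", x); });
}

void main()
{
    // With a FiberScheduler installed, spawn creates fibers
    // instead of kernel threads.
    scheduler = new FiberScheduler;
    scheduler.start({
        auto tid = spawn(&worker);
        tid.send(42);
    });
}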
October 25, 2014
On Friday, 24 October 2014 at 14:50:53 UTC, Ola Fosheim Grøstad wrote:
> On Friday, 24 October 2014 at 10:33:40 UTC, Martin Nowak wrote:
>> What I like most about the proposal is that you can adapt await by specializing template functions, similar to how range based foreach works.
>> It also isn't tied to a particular scheduling mechanism and of course consumes much less memory than stack based suspension.
>
> This is how all truly object-oriented languages with concurrency work. Block activation records are conceptually on the heap and there is no difference between an object and a function: Simula67, Beta, Self…
>
> It is slower than using a stack, but if done as in Beta you get a back pointer to the caller (which instantiated the function/object), which can be handy for modelling.

It is worth pointing out that one advantage of taking this uniform view is that you can more easily define a system to persist/migrate a transitive closure of objects/fibers and transfer them to other servers.

However, it does not have to be stackless in terms of implementation. A stack is then an optimization: the compiled code can put things on the stack until it hits a yield at runtime (at which point you have to pick them up off the stack).
October 28, 2014
On 10/24/14 10:51 AM, ROOAR wrote:
>   I really liked this proposal for resumable lambda:
>
> http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2014/n4244.pdf

Is this related to the video? -- Andrei
October 28, 2014
On Tuesday, 28 October 2014 at 02:10:47 UTC, Andrei Alexandrescu wrote:
> On 10/24/14 10:51 AM, ROOAR wrote:
>>  I really liked this proposal for resumable lambda:
>>
>> http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2014/n4244.pdf
>
> Is this related to the video? -- Andrei

I haven't been following the developments too closely, but I think the talk essentially describes N4134, which supersedes N3977/N3858. Chris Kohlhoff references the latter in his introduction in N4244, but I haven't read that paper in any detail.

David
October 28, 2014
On Tuesday, 28 October 2014 at 02:10:47 UTC, Andrei Alexandrescu wrote:
> On 10/24/14 10:51 AM, ROOAR wrote:
>>  I really liked this proposal for resumable lambda:
>>
>> http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2014/n4244.pdf
>
> Is this related to the video? -- Andrei

It is a separate proposal, not the one shown in the video.
October 28, 2014
On Tuesday, 28 October 2014 at 02:10:47 UTC, Andrei Alexandrescu wrote:
> On 10/24/14 10:51 AM, ROOAR wrote:
>>  I really liked this proposal for resumable lambda:
>>
>> http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2014/n4244.pdf
>
> Is this related to the video? -- Andrei

There is a good summary of the current state in
http://www.open-std.org/JTC1/SC22/WG21/docs/papers/2014/n4232.pdf.