June 22, 2021
On Tue, Jun 22, 2021 at 03:56:03PM -0400, Steven Schveighoffer via Digitalmars-d wrote: [...]
> This again, is a misunderstanding of my proposal. I'm not talking about altering delegates in any way.
> 
> There are no delegates involved in `filter`. It is, instead, an inner struct with a hidden context pointer. Could just as well be a struct with context carried with it instead of allocated on the heap.
[...]

Ohhh I see what you mean now. So essentially what you're saying is, whenever there's an alias parameter bound to a lambda, the compiler should, in effect, insert additional parameters into the template function that capture the context the lambda is bound to?

IOW, this:

	auto filter(alias fun, R)(R range) {
		struct Voldemort {
			...
			if (fun(range.front)) ...
			...
		}
		return Voldemort(range);
	}

	int x = 2, y = 3;
	auto r = [1, 2, 3].filter!(e => e == x || e == y);

gets lowered into something like this:

	struct __lambda_context {
		int x, y;
	}
	bool __lambda_fun(__lambda_context ctxt, int e) {
		return e == ctxt.x || e == ctxt.y;
	}
	auto __filter_instance(R)(__lambda_context ctxt, R range) {
		struct Voldemort {
			__lambda_context secretCtxt;
			...
			if (__lambda_fun(secretCtxt, range.front)) ...
			...
		}
		return Voldemort(ctxt, range);
	}

	int x = 2, y = 3;
	auto r = __filter_instance(__lambda_context(x, y), [1, 2, 3]);

Am I right?

__lambda_context may not necessarily carry a copy of the captured variables; possibly it could use pointers instead. Though care would have to be taken not to leave dangling pointers to out-of-scope local variables -- probably you'd have to make the references `scope` unless you copy them by value into the Voldemort struct.
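
A minimal sketch of that by-reference flavour (names hypothetical, this is not an actual compiler lowering) might look like:

	struct LambdaContextByRef {   // stand-in for the __lambda_context above
		int* x;
		int* y;
	}
	bool lambdaFunByRef(scope ref LambdaContextByRef ctxt, int e) @nogc {
		return e == *ctxt.x || e == *ctxt.y;
	}

	unittest {
		int x = 2, y = 3;
		auto ctxt = LambdaContextByRef(&x, &y); // ok: ctxt dies before x and y do
		assert(lambdaFunByRef(ctxt, 2));
		assert(!lambdaFunByRef(ctxt, 4));
	}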


T

-- 
Не дорог подарок, дорога любовь. [It's not the gift that's dear, it's the love.]
June 22, 2021
On Tuesday, 22 June 2021 at 16:13:06 UTC, H. S. Teoh wrote:
>
> The context must be stored *somewhere*, usually on the GC heap, which is why delegates depend on the GC. Except for scope delegates where it's guaranteed that the parent scope won't go out of scope before the delegate does, so the context pointer can just point to the stack where the parent's scope is stored.
>
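
As a small illustration of the scope-delegate case described above (a sketch; countMatches is just an example name), a capturing lambda passed to a scope delegate parameter compiles under @nogc because no closure is allocated:

int countMatches(int[] arr, scope bool delegate(int) @nogc pred) @nogc
{
    int n;
    foreach (e; arr)
        if (pred(e)) ++n;
    return n;
}

@nogc unittest
{
    int x = 2, y = 3;
    int[3] data = [1, 2, 3];
    // the lambda captures x and y, but since the parameter is scope,
    // the context just points at this stack frame -- no GC allocation
    assert(countMatches(data[], e => e == x || e == y) == 2);
}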

The way C++ works is that a lambda (which is essentially a compiler-generated class) gets converted to a std::function when you pass it to a function/method that takes one. Internally, std::function stores the capture data on the heap. However, it has some "SSO" capability, meaning that if the captures are few and small they might fit into std::function's own storage. Essentially this is the same SSO (small string optimization) strategy used by std::string.

Not sure if this helps, but for small lambdas that do simple things, like the one in the initial example, the context could easily be stored in the struct.

Now the problem is that in D the delegate is very small, while in C++ there is probably more going on (like reference counting), making std::function bigger than a D delegate. Making structs bigger isn't always that popular.
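
For what it's worth, a rough D sketch of that idea (hypothetical -- InlineClosure is not a real library type, it just mimics the small-buffer trick): a callable that keeps a couple of words of capture storage inline instead of pointing at a GC-allocated context:

struct InlineClosure(size_t bufSize = 2 * size_t.sizeof)
{
    // thunk that knows how the captures are laid out in buf
    bool function(const(void)* ctxt, int e) @nogc fn;
    ubyte[bufSize] buf;  // inline storage for small captures, no heap needed

    bool opCall(int e) const @nogc { return fn(buf.ptr, e); }
}

@nogc unittest
{
    static struct Captures { int x, y; }

    InlineClosure!() pred;
    auto c = Captures(2, 3);
    pred.buf[0 .. Captures.sizeof] = (cast(const ubyte*) &c)[0 .. Captures.sizeof];
    pred.fn = function (const(void)* ctxt, int e) {
        auto caps = cast(const(Captures)*) ctxt;
        return e == caps.x || e == caps.y;
    };
    assert(pred(2) && pred(3) && !pred(4));
}
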
June 23, 2021

On Monday, 21 June 2021 at 20:01:27 UTC, Steven Schveighoffer wrote:

I have some code in my codebase that does stuff like this:

struct S
{
   int id;
   int field1;
}

auto foo(S[] arr, int someval) // @nogc not allowed :(
{
   // keep only the elements of arr whose field1 equals someval
   import std.algorithm;
   return arr.filter!(v => v.field1 == someval);
}

While this works, and is nice, it's using a lambda with local references to filter data. This means a GC closure allocation is involved.

However, the value that is referenced (someval) is the only thing needed, and it seems a huge waste to allocate the enclosing stack frame on the GC heap just for that.

Let's look at another way to do this. filter does not have an overload which accepts a value to compare to, but find does. So let's build a new filter that uses it:

auto nogcfilter(alias pred, T, V)(T[] rng, V val)
{
    import std.range;
    import std.algorithm : find;
    static struct Filter {
        T[] src;
        V val;
        auto front() { return src.front; }
        void popFront() {src.popFront; src = src.find!pred(val); }
        auto empty() { return src.empty; }
    }
    return Filter(rng.find!pred(val), val); // start at the first match so front is valid
}

auto foo(S[] arr, int someval) @nogc // :)
{
   // note: the explicit lambda types are needed for some reason
   return arr.nogcfilter!((S v, int a) => v.field1 == a)(someval);
}

Now we get filter capability, but without the GC.

This works great because the context items used do NOT require any mutability. It also works great because the filter function is building a custom type ANYWAY.
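
For example, a quick check like this (reusing the S and nogcfilter definitions above; illustrative only) compiles and passes under @nogc:

@nogc unittest
{
    S[3] arr = [S(1, 10), S(2, 20), S(3, 10)];
    auto r = arr[].nogcfilter!((S v, int a) => v.field1 == a)(10);
    assert(r.front.id == 1);
    r.popFront();
    assert(r.front.id == 3);
    r.popFront();
    assert(r.empty);
}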

Now, we could add this kind of functionality to filter, but being the lazy programmer that I am, what if the compiler could do this for me? In most cases, I don't want to sleuth out what context is needed (which might change as the code evolves). To force the user to both invent the context type (here, it's just an int, but it may require other values) and explicitly pass it is a bit verbose (I already don't like having to pass the int and declare the parameter).

What if a function that accepts a lambda could also accept a context which the compiler made up and passed in? Here's how it would possibly look (with a new magic type __context):

auto nogcfilter(alias pred, T, __context...)(T[] rng, __context ctxt)
{
    import std.range;
    import std.algorithm : find;
    static struct Filter {
        T[] src;
        __context val;
        auto front() { return src.front; }
        void popFront() {src.popFront; src = src.find!pred(val); }
        auto empty() { return src.empty; }
    }
    return Filter(rng.find!pred(ctxt), ctxt); // start at the first match, as above
}

auto foo(S[] arr, int someval) @nogc
{
   return arr.nogcfilter!((v) => v.field1 == someval);
}

The compiler would see there's a lambda involved, automatically pass someval as the context parameter into the function itself, and everything magically works. This would only kick in when a closure would otherwise be allocated and the function being called declares the context parameter.

Disclaimer: I am not a language developer.

Does this make sense? Is there anything feasible that might be similar? Would it be more feasible with different syntax?

I realize there are implications if the context parameter is mutable, so perhaps we would have to require only const contexts (though the compiler might be able to figure out that something isn't modified anywhere, and allow it to go in).
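
To make the mutability concern concrete, this is how today's closures behave -- the capture is shared with the enclosing scope, which a copied value context would not reproduce:

unittest
{
    int count = 0;
    auto bump = () { ++count; };  // the delegate shares count by reference
    bump();
    bump();
    assert(count == 2);  // with a by-value context this would still be 0
}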

-Steve

Why can't it be implemented by using templates?

  • Alex
June 23, 2021

On Tuesday, 22 June 2021 at 14:49:34 UTC, Steven Schveighoffer wrote:

> I was kinda hoping for the compiler to assist in the generation and management of the context data since it's already doing it. The receiving function just needs a variable to receive and pass the context data to the lambda.

Off the top of my head:

I think it should just be an ordinary template parameter. Then you can let the compiler do the optimization if it is possible without changing the semantics.

That way it isn't a language change.

So you could instead just introduce a UDA that signals that a compiler should issue a warning if it cannot do the optimization?

Or am I wrong?

June 23, 2021

On 6/22/21 8:38 PM, 12345swordy wrote:

> Why can't it be implemented by using templates?

Oh it can, I did it in the original message! It just doesn't look as nice.

-Steve

June 23, 2021

On 6/22/21 5:31 PM, H. S. Teoh wrote:

> On Tue, Jun 22, 2021 at 03:56:03PM -0400, Steven Schveighoffer via Digitalmars-d wrote:
> [...]
>> This again, is a misunderstanding of my proposal. I'm not talking
>> about altering delegates in any way.
>>
>> There are no delegates involved in filter. It is, instead, an inner
>> struct with a hidden context pointer. Could just as well be a struct
>> with context carried with it instead of allocated on the heap.
> [...]

[snip]

> Am I right?

Yes, exactly! The only tricky part is that the filter function has to explicitly accept the context so it can tell the compiler where to put the context.

Essentially, the function provides a place to accept the context, and the compiler gives it a way to forward that context to the lambda function(s).

It may be trickier than I originally wrote, but the goal is to avoid having to do any work as a user (the library can be instrumented as much as we want), and the compiler just does the right thing.

-Steve

June 23, 2021

On 6/23/21 7:04 AM, Ola Fosheim Grøstad wrote:

> On Tuesday, 22 June 2021 at 14:49:34 UTC, Steven Schveighoffer wrote:
>> I was kinda hoping for the compiler to assist in the generation and management of the context data since it's already doing it. The receiving function just needs a variable to receive and pass the context data to the lambda.
>
> Off the top of my head:
>
> I think it should just be an ordinary template parameter. Then you can let the compiler do the optimization if it is possible without changing the semantics.
>
> That way it isn't a language change.

Well, the library function needs to signify to the compiler that the parameter is meant to accept the value context. Otherwise, really odd things might happen with existing template parameters.

> So you could instead just introduce a UDA that signals that a compiler should issue a warning if it cannot do the optimization?
>
> Or am I wrong?

A UDA might work.

-Steve

June 23, 2021

On Wednesday, 23 June 2021 at 13:43:21 UTC, Steven Schveighoffer wrote:

> A UDA might work.

Ok, so you use a templated filter function with a regular delegate for the parameter, but the delegate parameter is adorned with @__nogc_lambda. Then the compiler turns the @__nogc_lambda delegate into a context.

Compilers that don't support @__nogc_lambda would simply GC-allocate the lambda? So it is a non-breaking extension.

Is that good enough? If so, then we can do it without a DIP.
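
A rough sketch of what that could look like (every name here is hypothetical: nogc_lambda stands in for the proposed @__nogc_lambda, and no compiler performs the rewrite today). The library keeps a plain delegate parameter; a compiler that recognizes the UDA could store the captures inside the returned struct, while other compilers just GC-allocate the closure as they do now:

struct nogc_lambda {}  // marker attribute, illustration only

auto filterDg(T)(T[] rng, @nogc_lambda bool delegate(T) pred)
{
    static struct Filter
    {
        T[] src;
        bool delegate(T) pred;
        bool empty() { return src.length == 0; }
        T front() { return src[0]; }
        void popFront()
        {
            src = src[1 .. $];
            while (src.length && !pred(src[0]))
                src = src[1 .. $];
        }
    }
    // skip leading non-matches so front is valid from the start
    while (rng.length && !pred(rng[0]))
        rng = rng[1 .. $];
    return Filter(rng, pred);
}

unittest
{
    int threshold = 2;
    auto r = [1, 2, 3, 4].filterDg((int e) => e > threshold); // allocates a closure today
    assert(r.front == 3);
}

Call sites look just like filter with a lambda; whether the closure actually lands on the GC heap then becomes a per-compiler lowering that the UDA merely signals, so older compilers stay correct and it remains a non-breaking extension.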
