Standardize base types for metaprogramming

February 29

Inside 5 layers of metaprogramming, a mild inconsistency can break everything and require nested trait-hell static ifs to fix. The base types just don't offer a consistent interface.

int.init != float.init, but what else are you supposed to use when you write a function header such as foo(T)(T a, T b=T.init)?
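
For concreteness, the inconsistency being described (these are just D's documented defaults):

```
static assert(int.init == 0);            // a usable "zero"
static assert(float.init != float.init); // NaN compares unequal even to itself
static assert(char.init == '\xFF');      // a deliberately invalid code unit
static assert(string.init.length == 0);  // an empty (null) string
```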

Also, user types need to use op overloads where there's no drop-in replacement for base types; I can't just write opCmp()=>child.opCmp(...), I have to detect whether it's a base type and handle it separately.
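
A minimal sketch of that detect-and-handle dance (Wrapper is a made-up name, not something from the thread):

```
struct Wrapper(T){
	T child;
	int opCmp(Wrapper rhs){
		static if(is(T == struct) || is(T == class))
			return child.opCmp(rhs.child);                    // user type: forward to its opCmp (assumes it defines one)
		else
			return (child > rhs.child) - (child < rhs.child); // base type: nothing to forward to
	}
}
```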

In practice there are three cases for even a conceptually simple composite type: user types, the usual base-type behavior, and some weird edge-case base type where the answer is "don't do that".

I suggest extending all base types with at least 5 values (with examples for int, float, string):

  1. zero: 0, 0.0, ""
  2. invalid: max-1, nan, "ERROR"
  3. onevalue: 1, 1.0, "1"
  4. negative_value: -1 (signed) or max (unsigned), -1.0, "WARNING"
  5. min: .min, .min_normal, ""

And then define the element op overloads (such as opCmp), toString, and toHash for each base type.
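
To make the shape concrete, here is a rough library-side approximation of two of the proposed values as templates. The names and the per-type choices simply mirror the list above; this is a sketch, not a settled design:

```
template zero(T){
	static if(is(T : string))                enum zero = "";
	else static if(__traits(isFloating, T))  enum zero = cast(T) 0.0;
	else                                     enum zero = cast(T) 0;
}
template invalid(T){
	static if(is(T : string))                enum invalid = "ERROR";
	else static if(__traits(isFloating, T))  enum invalid = T.nan;
	else                                     enum invalid = cast(T)(T.max - 1);
}
static assert(zero!float == 0.0 && zero!string == "");
static assert(invalid!int == int.max - 1);
```

The point of doing it in the language instead of a library is exactly the consistency argument above: every generic function gets the same names without each author reinventing them.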

March 07

On Thursday, 29 February 2024 at 21:16:32 UTC, monkyyy wrote:

> [...]

Some example code where I'd prefer some stable values to be part of a type interface:

```
auto counter(A...)(A args){
	import std.typecons: tuple;
	struct Counter(T){
		T end=T.max;
		T front=T(0);
		T step=T(1);
		auto pop()=>Counter!T(end,front+step,step);
		bool empty()=>end<=front;
		auto cap()=>(end-front)/step;
	}
	//auto args_=args.totuple;
	// normalize the arguments into (end, front, step...) order, defaulting to int
	static if(A.length>1){
		auto args_=tuple(args[1],args[0],args[2..$]);
	} else {
		auto args_=tuple(args,int.init);
	}
	// the slice of the tuple expands into Counter's constructor arguments
	return Counter!(typeof(args_[0]))(args_[0..A.length]);
}
unittest{
	auto foo=counter();
	assert(foo.front==0);
	foo=foo.pop;
	assert(foo.front==1);
}
unittest{
	auto foo=counter(10);
	assert(foo.front==0);
	foo=foo.pop;
	assert(foo.front==1);
}
unittest{
	auto foo=counter(0,10);
	assert(foo.front==0);
	foo=foo.pop;
	assert(foo.front==1);
}
unittest{
	auto foo=counter(0,10,2);
	assert(foo.front==0);
	foo=foo.pop;
	assert(foo.front==2);
}
```
March 09
On 2/29/2024 1:16 PM, monkyyy wrote:
> I suggest extending all base types with at least 5 values(with example int,float,string):
> 
> 1. zero: 0, 0.0., ""
> 2. invalid: max-1, nan, "ERROR"
> 3. onevalue: 1, 1.0, "1"
> 4. negative_value: -1 (signed) max (unsigned), -1, "WARNING"
> 5. min: .min, .min_normal, ""
> 
> And then define element op overloads(such as opCmp), toString, toHash for each base type

I'm not so sure about this. Is "" really a zero? max-1 is not an invalid value for ints.

As for the op overloads, CTFE to the rescue!

```
hash_t toHash(int); hash_t toHash(float);
```
etc. when you need them.
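
A minimal sketch of that suggestion as it reads here: plain free functions (which CTFE can run) plus UFCS already give the built-in types a method-like surface. The bodies below are placeholders, not druntime's real hashing:

```
size_t toHash(int x)   { return cast(size_t) x; }
size_t toHash(float x) { return *cast(const uint*) &x; }    // hash the bit pattern
string toString(int x) { import std.conv: to; return x.to!string; }

unittest{
	assert(42.toHash == 42);      // UFCS: reads like a member
	assert(42.toString == "42");
	enum h = toHash(7);           // and it runs in CTFE
	static assert(h == 7);
}
```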
March 10
On Saturday, 9 March 2024 at 20:13:42 UTC, Walter Bright wrote:
> 
> I'm not so sure about this. Is "" really a zero? max-1 are not invalid values for ints.

I believe an empty string is the smallest string possible.

max-1 is so that negative != invalid for unsigned ints.

Are you worried about the details or the core idea? Right now I have .init and maybe .max; mild reminder that float.init's behavior is slightly contentious, and zero vs. invalid is trying to compromise with that debate.

> As for the op overloads, CTFE to the rescue!

I don't wish to reimplement them, and for all I know the hash that AAs use may change, what object.d's toString returns may change in syntax, etc.

Easy programming is built on APIs.


March 10
On Sunday, 10 March 2024 at 02:24:31 UTC, monkyyy wrote:
> On Saturday, 9 March 2024 at 20:13:42 UTC, Walter Bright wrote:
>> 
>> I'm not so sure about this. Is "" really a zero? max-1 are not invalid values for ints.
>
> I believe an empty string is the smallest string possible
>
> max-1 is so negative != invalid for unsigned ints
>

There is some precedent for this, e.g. memset_s:
https://en.cppreference.com/w/c/string/byte/memset

which uses rsize_t and RSIZE_MAX, defined as (SIZE_MAX >> 1).

Sadly, the industry adoption rate of this feature has been quite poor.


April 03

On Thursday, 29 February 2024 at 21:16:32 UTC, monkyyy wrote:

> I suggest extending all base types with at least 5 values (with examples for int, float, string):
>
>   1. zero: 0, 0.0, ""
>   2. invalid: max-1, nan, "ERROR"
>   3. onevalue: 1, 1.0, "1"
>   4. negative_value: -1 (signed) or max (unsigned), -1.0, "WARNING"
>   5. min: .min, .min_normal, ""
>
> And then define the element op overloads (such as opCmp), toString, and toHash for each base type.

I have some gripes with this list. For example, "" is not a good "zero" value. If we define one, it should be defined only for types that have a literal zero value. That means ints and floats and other numeric types, nothing else.

I'm more sympathetic to the general notion though. We could well have a property called something like .validDefault that would be 0 for floats, '\0' or ' ' (as opposed to '\xff') for chars, and so on. Maybe it should be undefined for classes though. There's not really a good .validDefault value for Object, as a null reference is hardly any more valid than NaNs are. Same for pointers.
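
As a sketch of what that could look like if prototyped as a library template today (the name validDefault and the per-type choices are taken from this paragraph, nothing more):

```
template validDefault(T){
	static if(is(T == char))                 enum T validDefault = ' ';  // rather than '\xff'
	else static if(__traits(isFloating, T))  enum T validDefault = 0;
	else static if(__traits(isIntegral, T))  enum T validDefault = 0;
	// deliberately left undefined for classes and pointers:
	// a null reference is hardly more valid than a NaN
}
static assert(validDefault!float == 0);
static assert(validDefault!char == ' ');
static assert(!__traits(compiles, { auto x = validDefault!Object; }));
```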

The thing with metaprogramming is, you really can't be fully generic no matter the language rules. Even if you had a nice valid value for every built-in type, people will want to define types for which there can be no good values. An extreme example is the bottom type. It can only ever be:

  1. A type system breaking unsafe value.
  2. Never instantiated in actual execution flow.
April 04

On Wednesday, 3 April 2024 at 19:14:01 UTC, Dukc wrote:

> On Thursday, 29 February 2024 at 21:16:32 UTC, monkyyy wrote:
>> [...]
>
> I have some gripes with this list. For example, "" is not a good "zero" value. If we define one, it should be defined only for types that have a literal zero value. That means ints and floats and other numeric types, nothing else.
>
> I'm more sympathetic to the general notion though. We could well have a property called something like .validDefault that would be 0 for floats, '\0' or ' ' (as opposed to '\xff') for chars, and so on. Maybe it should be undefined for classes though. There's not really a good .validDefault value for Object, as a null reference is hardly any more valid than NaNs are. Same for pointers.
>
> The thing with metaprogramming is, you really can't be fully generic no matter the language rules. Even if you had a nice valid value for every built-in type, people will want to define types for which there can be no good values. An extreme example is the bottom type. It can only ever be:
>
>   1. A type system breaking unsafe value.
>   2. Never instantiated in actual execution flow.

I kinda want to drop all the debates about details; I just want more options than .init and the very questionable set where float and int don't even match.

If it ended up being just a bunch of polls, and what string.zero output was different from what I expect, whatever.

April 05

On Thursday, 4 April 2024 at 20:14:54 UTC, Monkyyy wrote:
> On Wednesday, 3 April 2024 at 19:14:01 UTC, Dukc wrote:
>> On Thursday, 29 February 2024 at 21:16:32 UTC, monkyyy wrote:
>>> [...]
>>
>> [...]
>
> I kinda want to drop all the debates about details; I just want more options than .init and the very questionable set where float and int don't even match.

It's okay at this phase, but if you write a DIP you will have to propose a set.

I think we need to discuss one thing here though. Do you propose that there are some sorts of "alternative .init"s that are universal to all D types? Or is it okay that all the new properties would work for only a subset of D types?

If the former, one option would be that when there is no sensible value for the type in question, the value would be bottom instead. For example,

```
// at object.d

class Object {
  //
  // implementation...
  //

  enum Object zero = assert(0); // no sensible zero value for this type
}
```

This would mean we could still use Object.zero to satisfy the typechecker or to infer the Object type for a variable, but trying to actually instantiate it would be a runtime crash.
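
For comparison, a rough approximation that already works in today's D (the class Thing and its member are made up for illustration): a member that typechecks, and can drive inference, but crashes if it is ever actually evaluated.

```
class Thing{
	// usable for type-level purposes; evaluating it is a runtime assert failure
	static Thing zero() @property { assert(0, "no sensible zero value for this type"); }
}

void example(){
	typeof(Thing.zero) t = null; // typeof never evaluates, so this just infers Thing
	// auto boom = Thing.zero;   // this line would assert at run time
}
```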

April 05

On Friday, 5 April 2024 at 08:57:50 UTC, Dukc wrote:

> Do you propose that there are some sorts of "alternative .init"s that are universal to all D types? Or is it okay that all the new properties would work for only a subset of D types?

I think string.zero should be defined even if it's potentially nonsense, so you can write:

```
// .invalid and .zero are the proposed properties; today this only compiles for
// types that themselves define members with those names
auto replaceinvalid(alias F=a=>a==typeof(a).invalid,T)(T validate,T replacement=T.zero){
  if(F(validate)){return replacement;}
  return validate;
}
```

Base types can be changed by the user and should be pushed to extremes; if users want a function to work, they can define the requirements with a local enum, much like to!string can call toString.

If there are some that only work for numbers because the mathy people make a demand, fine, but I think a zero, one, and invalid that apply to all base types would massively help with metaprogramming: zero so there's a throw-away literal value, one so you can unittest that 0 != 1 in data structures, and invalid for nullables, unittests, missing indexes, etc.
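
For illustration only (not from the thread): a hypothetical user type opting in with local enum members, so that the replaceinvalid sketch above treats it exactly like it would treat a base type. Temperature and its values are made-up names:

```
struct Temperature{
	float kelvin;
	enum Temperature zero    = Temperature(0);
	enum Temperature invalid = Temperature(-1); // below absolute zero: impossible
}

unittest{
	// assumes replaceinvalid from above is in scope
	assert(replaceinvalid(Temperature.invalid) == Temperature.zero);
	assert(replaceinvalid(Temperature(300))    == Temperature(300));
}
```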