September 01, 2017 Re: Bug in D!!!
Posted in reply to EntangledQuanta

I'd love to be able to inherit and override generic functions in C#. Unfortunately, C# doesn't use templates, and I've hit many other issues where generics just suck.

I don't think it is appropriate to dismiss the need for the compiler to generate a virtual function for every instantiated T; after all, the compiler can't know you have a finite, known set of T unless you tell it.

But let's assume we've told the compiler that it is compiling all the source code and does not need to compile for future linking.

First, the compiler will need to make sure all virtual functions can be generated for the derived classes. In this case the compiler must note the template function and validate that all derived classes include it. That was easy.

Next up, each instantiation of the function needs a new v-table entry in all derived classes. Current compiler implementations compile each module independently of each other, so this feature could be specified to work within the same module, or new semantics could be written up for how the compiler modifies already-compiled modules and those which reference them (the object sizes would be changing due to the v-table modifications).

With those three simple changes to the language, I think this feature will work for every T.
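The issue under discussion can be sketched in a few lines of D (an illustrative reconstruction; the original poster's exact code does not appear in this post): member function templates are never virtual, because each instantiation would need its own v-table slot.

```d
// Illustrative sketch, not the original poster's code: a templated
// member in an interface is implicitly non-virtual, since the compiler
// cannot reserve a v-table slot for every possible T.
interface I
{
    void foo(T)(T t);
}

class A : I
{
    // This does NOT override I.foo!T; each instantiation of a
    // template is an independent, non-virtual function.
    void foo(T)(T t) { }
}
```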
September 02, 2017 Re: Bug in D!!!
Posted in reply to Jesse Phillips

On Friday, 1 September 2017 at 23:25:04 UTC, Jesse Phillips wrote:
> I'd love to be able to inherit and override generic functions in C#. Unfortunately, C# doesn't use templates, and I've hit many other issues where generics just suck.
>
> I don't think it is appropriate to dismiss the need for the compiler to generate a virtual function for every instantiated T; after all, the compiler can't know you have a finite, known set of T unless you tell it.
>
> But let's assume we've told the compiler that it is compiling all the source code and does not need to compile for future linking.
>
> First, the compiler will need to make sure all virtual functions can be generated for the derived classes. In this case the compiler must note the template function and validate that all derived classes include it. That was easy.
>
> Next up, each instantiation of the function needs a new v-table entry in all derived classes. Current compiler implementations compile each module independently of each other, so this feature could be specified to work within the same module, or new semantics could be written up for how the compiler modifies already-compiled modules and those which reference them (the object sizes would be changing due to the v-table modifications).
>
> With those three simple changes to the language, I think this feature will work for every T.
Specifying that there will be no further linkage is the same as making T finite. T must be finite.

C# uses generics/IR/CLR so it can do things at run time that are effectively compile time for D.

By simply extending the grammar slightly in an intuitive way, we can get the explicit finite case, which is easy:

foo(T in [A,B,C])()

and possibly for your case

foo(T in <module>)() would work

or

foo(T in <program>)()

The `in` keyword makes sense here and is not used nor ambiguous, I believe.

Regardless of the implementation, the idea that we should throw the baby out with the bathwater is simply wrong. At least there are a few who get that. By looking into it in a serious manner, an even better solution might be found. Not looking at all results in no solutions and no progress.
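The `foo(T in [A,B,C])()` syntax above is hypothetical, but its intended semantics for a finite type set can already be approximated by hand. A sketch of what such a declaration might lower to, using `static foreach` (available since dmd 2.076) to stamp out one ordinary virtual overload per allowed type; all names here are illustrative:

```d
import std.meta : AliasSeq;
import std.stdio : writeln;

// Stand-in for the proposed finite set [A,B,C].
alias Allowed = AliasSeq!(int, string);

interface I
{
    // One non-template virtual overload per allowed type, i.e. what
    // `void foo(T in [int, string])(T x)` might be lowered to.
    static foreach (T; Allowed)
    {
        void foo(T x);
    }
}

class C : I
{
    static foreach (T; Allowed)
    {
        void foo(T x) { writeln(T.stringof, ": ", x); }
    }
}

void main()
{
    I i = new C;
    i.foo(1);     // virtual dispatch to the int overload
    i.foo("hi");  // virtual dispatch to the string overload
}
```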
September 02, 2017 Re: Bug in D!!!
Posted in reply to EntangledQuanta

On Saturday, 2 September 2017 at 00:00:43 UTC, EntangledQuanta wrote:
> Regardless of the implementation, the idea that we should throw the baby out with the bathwater is simply wrong. At least there are a few who get that. By looking into it in a serious manner, an even better solution might be found. Not looking at all results in no solutions and no progress.

Problem is that you didn't define the problem. You showed some code the compiler rejected and expressed that the compiler needed to figure it out. You did change it to having the compiler instantiate specified types, but that isn't defining the problem.

You didn't like the code needed which would generate the functions and you hit a Visual D with the new static foreach.

All of these are problems you could define, and you could have evaluated static foreach as a solution, but instead you stopped at problems with the tooling.

You also don't appear to care about the complexity of the language. I expressed three required changes, some of which may not play nicely with the principle of least surprise. You went straight to "we just need to define a syntax for that" instead of expressing concern that the compiler will also need to handle errors to the user, such that the user understands that a feature they use is limited to very specific situations.

Consider if you have a module-defined interface: is that interface only available for use in that module? If not, how does a different module inherit the interface, and does it need a different syntax?

There is a lot more to a feature than having a way to express your desires. If you're going to stick to a stance that it must exist and aren't going to accept that there are problems with the request, why expect others to work through the request?
September 02, 2017 Re: Bug in D!!!
Posted in reply to Jesse Phillips

On Saturday, 2 September 2017 at 16:20:10 UTC, Jesse Phillips wrote:
> On Saturday, 2 September 2017 at 00:00:43 UTC, EntangledQuanta wrote:
>> Regardless of the implementation, the idea that we should throw the baby out with the bathwater is simply wrong. At least there are a few who get that. By looking into it in a serious manner, an even better solution might be found. Not looking at all results in no solutions and no progress.
>
> Problem is that you didn't define the problem. You showed some code the compiler rejected and expressed that the compiler needed to figure it out. You did change it to having the compiler instantiate specified types, but that isn't defining the problem.

I think the problem is clearly defined; it's not my job to be a D compiler researcher and spell everything out for everyone else. Do I get paid for solving D's problems?

> You didn't like the code needed which would generate the functions and you hit a Visual D with the new static foreach.

This sentence makes no sense. "Hit a Visual D" what? Do you mean bug? If that is the case, how is that my fault? Am I supposed to know off the bat that an access violation is caused by Visual D and not dmd when there is no info about the violation? Is it my fault that someone didn't code one of those tools well enough to express enough information for one to figure it out immediately?

> All of these are problems you could define, and you could have evaluated static foreach as a solution but instead stopped at problems with the tooling.

Huh? I think you fail to understand the real problem. The problem has nothing to do with tooling and I never said it did. The static foreach "solution" came after the fact, when SEVERAL people (ok, 2) said it was an impossible task to do. That is where all this mess started.

I then came up with a solution which proved that it is possible to do on some level; that is a solution to a problem that was defined, else the solution wouldn't exist.

> You also don't appear to care about the complexity of the language. I expressed three required changes, some of which may not play nicely with the principle of least surprise. You went straight to "we just need to define a syntax for that" instead of expressing concern that the compiler will also need to handle errors to the user, such that the user understands that a feature they use is limited to very specific situations.

Do you not understand that if a library solution exists, then there is no real complexity added? It is called "lowering" by some. The compiler simply "rewrites" whatever new syntax is added into a form that the library solution realized. You are pretending (why?) that what I have proposed will somehow potentially affect every square micron of the D language and compiler, when it won't. Not all additions to a compiler add *real* complexity. That is a failing of you and many on the D forums who resist change.

> Consider if you have a module-defined interface: is that interface only available for use in that module? If not, how does a different module inherit the interface, and does it need a different syntax?

What does that have to do with this problem? We are not talking about interfaces. We are talking about something inside interfaces, so the problem about interfaces is irrelevant to this discussion, because it applies to interfaces in general... interfaces that already exist, and the problem exists regardless of what I propose.

> There is a lot more to a feature than having a way to express your desires. If you're going to stick to a stance that it must exist and aren't going to accept that there are problems with the request, why expect others to work through the request?

No, your problem is your ego and your inability to interpret things outside of your own mental box.
You should always keep in mind that you are interpreting someone else's mental wordage in your own way, and it is not a perfect translation; in fact, we are lucky if 50% is interpreted properly. Now, if I do not have a right to express my desires, then at least state that, but I do have a right not to express any more than that.

As far as motivating other people, that isn't my job. I couldn't care less, actually. D is a hobby for me, and I do it because I like the power D has, but D is the most frustrating language I have ever used. It's the most (hyperbole) buggy, most incomplete (good docs system: regardless of what the biased want to claim, tooling, etc.), most uninformative (errors that just toss the whole kitchen sink at you), etc. But I do have hope... which is the only reason I use it. Maybe I'm just an idiot and should go with the crowd; it would at least save me some frustration.

C#, since you are familiar with it: you should know there is a huge difference. If D were like C# as far as the organizational structure (I do not mean MS, I mean the docs, library, etc.), you would surely agree that D would most likely be the #1 language on this planet? C# has its shit together. It is, for the most part, an elegant language that is well put together in almost every regard. It was thought out well and not hacked together the way D feels. The problem is that the D community doesn't seem to want to go in a similar direction, but goes in circles. I think D will not progress much further in the next 10 years, if at all, as far as improving itself. The attitude of D programmers tends to be quite lame (it's a ragtag collection of individuals working by disparate means that only come together when there is a common need, rather than a team working together for a higher, focused purpose).

First, you make stuff up, as I never said anything about it *must exist* in D. Search the thread and you will see that you are the first one to use that phrase.

Second, you fail to understand the difference between a theoretical discussion about what is possible and the practical question of what is possible. I am talking about the theoretical aspects of the *ability* to use virtual template functions in D. I was told it is impossible, at least at first. Jonathan then came up with a hand-written method where one uses a kludge to sort of do it. I then came up with a library solution that shows it can be implemented and used with a few lines of code that enable such a feature (the two mixins). I also clarified the problem by stating that it is not an issue about virtual templated functions but about the "size" of T (in which I do not mean the byte size but the space). With such a solution, it shows that a compiler can internally "add those lines" (effectively, which means that it will do whatever similar work it needs to do, and so we can get similar behavior without having to explicitly use the library solution to provide such functionality).

Third, knowing that it is feasible opens the door and at least should pacify those who claim it is impossible. That is actually quite a lot on my part. I could have just shut up, let things be what they are, and let the ignorance continue being ignorant. I put the foot in the door. But by doing that, it opens up things for discussion about progression, which is what happened next PRECISELY because I pushed through the ignorance and put in the time to get the discussion going. Sure, I could have silently written up a DIP, put 4 weeks of effort into it, forked dmd and implemented the code to show how it could be done, etc. But that is not my job, and considering how appreciative people around here are of compiler changes, DIPs, and advanced features, I'd expect it to be a total waste of time. I have better things to do with my life than that.
Given also the nature of the dmd community and the level of the tooling, docs, and such, I'm not going to invest my life in it beyond planting seeds that maybe one day will sprout, though likely not, because no one cares to water them.

So, now we are at the "static foreach" solution that Adam added. I tried it; it looks nice but crashed as I did it. You seem to think I'm supposed to realize immediately that the only error, "Access violation: Object(0x34234)", is supposed to be a tooling problem. I guess I'm just not that smart. But eventually I did figure it out on my own and realized it was with Visual D. Even that should be irrelevant, as we are talking about a D feature.

You then come along and add your 2c, suggesting a few specific issues and your thoughts about them. I then respond, essentially agreeing with you but stating that I still don't think it can be done for every T, and offer a few syntaxes that might work in limiting T to being finite, which is fundamentally the problem, regardless of whether you think it is or not; and your statements are contradictory where you say "limit the linkage" and "all T". To spell it out: first you say "But lets assume we've told the compiler that it is compiling all the source code and it does not need to compile for future linking." Then at the end you say "With those three simple changes to the language I think that this feature will work for every T." Which are contradictory. Assuming we've told the compiler that no future linking is going to occur IS limiting T, which means it won't work for EVERY T. By every T I mean every T in existence ever, regardless of any assumptions, rules, etc. If you meant every T in the source code, then yes, but you should have made that explicit, since the problem innately depends on T being finite regardless of any implementation.

You then basically attack me, saying I should have done this and that, and that it's my fault for not stating the problem (which I did, clearly or not, or we wouldn't be at this point). I should think about the ramifications, etc. But I guess every day is different, right?

Anyways, any library solution or kludge is not a solution in my book. The foreach method is no different than the mixin solution as far as adding additional lines of code to a project that make the code less clear, less elegant, and less robust. You can make claims all day long that everything that can be implemented in a library should be. If that is the case, many/all compiler features should be eliminated; in fact, maybe we should write in binary, as we can add everything to a library that we need?

For some reason the DMD compiler and D language are treated like a golden calf that can't be changed. So much worry about adding complexity. If the design is so fragile that additional complexity or changes will potentially cause it to collapse, then it's not the feature's problem but D/dmd's. At least state that case if it is so. In that case it will collapse on its own in due time regardless of what new stuff is added; patches can only take one so far.

Anyways, I'm done with this conversation. I've shone light on a problem with D and shown that it has the potential to be solved; I am not going to be the one to solve it. If you want to spend many hours of your life trying to find a proper solution and get it accepted, by all means. I will use kludges as they get me down the road... it doesn't make me happy, but who cares about happiness?
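The mixin-based library solution this post keeps referring to is not reproduced in the thread; a minimal reconstruction of the general technique (all names hypothetical, not the poster's actual code) is a string mixin that generates the non-template virtual overloads from a type list:

```d
import std.stdio : writeln;

// Hypothetical reconstruction of the "mixin kludge": build the
// overload bodies as a string at compile time and mix them in.
enum typeNames = ["int", "double"];

string genOverloads()
{
    string code;
    foreach (t; typeNames)
        code ~= "void foo(" ~ t ~ " x) { writeln(x); }\n";
    return code;
}

interface I
{
    void foo(int x);
    void foo(double x);
}

class C : I
{
    // Mixes in one virtual overload per entry in typeNames.
    mixin(genOverloads());
}

void main()
{
    I i = new C;
    i.foo(3);    // int overload
    i.foo(2.5);  // double overload
}
```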
September 02, 2017 Re: Bug in D!!!
Posted in reply to EntangledQuanta

On Saturday, 2 September 2017 at 00:00:43 UTC, EntangledQuanta wrote:
> On Friday, 1 September 2017 at 23:25:04 UTC, Jesse Phillips wrote:
>> [...]
>
> Specifying that there will be no further linkage is the same as making T finite. T must be finite.
>
> C# uses generics/IR/CLR so it can do things at run time that are effectively compile time for D.
>
> By simply extending the grammar slightly in an intuitive way, we can get the explicit finite case, which is easy:
>
> foo(T in [A,B,C])()
>
> and possibly for your case
>
> foo(T in <module>)() would work
>
> or
>
> foo(T in <program>)()
>
> the `in` keyword makes sense here and is not used nor ambiguous, I believe.

While I agree that `in` does make sense for the semantics involved, it is already used to do a failable key lookup (returning a pointer to the value, or null if not present) into an associative array [1] and in input contracts. It wouldn't be ambiguous AFAICT, but having a keyword mean three different things depending on context would make the language even more complex (to read).

W.r.t. the idea in general: I think something like that could be valuable to have in the language, but since this essentially amounts to syntactic sugar (AFAICT), I'm not (yet) convinced that with `static foreach` being included it's worth the cost.

[1] https://dlang.org/spec/expression.html#InExpression
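For reference, the two existing meanings of `in` referred to above look like this (a small self-contained sketch):

```d
import std.stdio : writeln;

// Meaning 1: the `in` contract block (input contract) of a function.
int div(int a, int b)
in { assert(b != 0); }
do { return a / b; }

void main()
{
    // Meaning 2: failable key lookup into an associative array;
    // yields a pointer to the value, or null if the key is absent.
    int[string] aa = ["one": 1];
    if (auto p = "one" in aa)
        writeln(*p);                 // key exists; *p is 1
    assert(("two" in aa) is null);   // key absent

    writeln(div(10, 2));
}
```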
September 02, 2017 Re: Bug in D!!!
Posted in reply to Moritz Maxeiner

On Saturday, 2 September 2017 at 21:19:31 UTC, Moritz Maxeiner wrote:
> On Saturday, 2 September 2017 at 00:00:43 UTC, EntangledQuanta wrote:
>> On Friday, 1 September 2017 at 23:25:04 UTC, Jesse Phillips wrote:
>>> [...]
>>
>> Specifying that there will be no further linkage is the same as making T finite. T must be finite.
>>
>> C# uses generics/IR/CLR so it can do things at run time that are effectively compile time for D.
>>
>> By simply extending the grammar slightly in an intuitive way, we can get the explicit finite case, which is easy:
>>
>> foo(T in [A,B,C])()
>>
>> and possibly for your case
>>
>> foo(T in <module>)() would work
>>
>> or
>>
>> foo(T in <program>)()
>>
>> the `in` keyword makes sense here and is not used nor ambiguous, I believe.
>
> While I agree that `in` does make sense for the semantics involved, it is already used to do a failable key lookup (return pointer to value or null if not present) into an associative array [1] and input contracts. It wouldn't be ambiguous AFAICT, but having a keyword mean three different things depending on context would make the language even more complex (to read).

Yes, but they are independent, are they not? Maybe not.

foo(T in Typelist)()

`in`, as used here, is not an input contract and is completely independent. I suppose for arrays it could be ambiguous.

For me, and this is just me, I do not find it ambiguous. I don't find different meanings ambiguous unless the contexts overlap. Perceived ambiguity is not ambiguity; it's just ignorance... which can be overcome through learning. Hell, D has many cases where there are perceived ambiguities... as do most things.

But in any case, I couldn't care less about the exact syntax. It's just a suggestion that makes the most logical sense with regard to the standard usage of `in`. If it is truly unambiguous then it can be used. Another alternative is

foo(T of Typelist)

which, AFAIK, `of` is not used in D or even most programming languages. Another could be

foo(T -> Typelist)

or even

foo(T from Typelist)

or whatever. Doesn't really matter. They all mean the same to me once the definition has been written in stone. Could use `foo(T eifjasldj Typelist)` for all I care. The important thing for me is that such a simple syntax exists, rather than the "complex syntaxes" that have already been given (which are ultimately syntaxes, as everything is at the end of the day).

> W.r.t. the idea in general: I think something like that could be valuable to have in the language, but since this essentially amounts to syntactic sugar (AFAICT), I'm not (yet) convinced that with `static foreach` being included it's worth the cost.

Everything is syntactic sugar. So it isn't about if, but how much. We are all coding in 0's and 1's whether we realize it or not. The point of syntax (or syntactic sugar) is to reduce the amount of 0's and 1's that we have to *effectively* code by grouping common patterns into symbolic equivalents (by definition). This is all programming is. We define certain symbols to mean certain bit patterns, or generic bit patterns (an `if` keyword/symbol is a generic bit pattern: a set of machine instructions (0's and 1's) and substitution placeholders that are eventually filled with 0's and 1's).

No one can judge the usefulness of syntax until it has been created, because what determines how useful something is is its use. But you can't use something if it doesn't exist. I think many fail to get that.

The initial questions should be: Is there a gap in the language? (Yes, in this case.) Can the gap be filled? (This is a theoretical/mathematical question that has to be answered. Most people jump the gun here and make assumptions.) Does the gap need to be filled? (Yes in this case, because all gaps ultimately need to be filled.) But this then leads to the practical issues: Is the gap "large"? How much work will it take to fill it? Will filling it have utility? Etc.

These practical questions can only be dealt with once the theoretical question of "is it possible" is dealt with. I have shown it is possible (well, Jonathan gave a proof of concept first; I just implemented an automation for it). I think at least several of us should now be convinced that it is theoretically possible, since several ways have been shown to be fruitful. We are now at the point where you have said you are not convinced a new, simpler syntax is warranted.
The only real way to know is to implement that syntax experimentally, use it, then compare it with the other methods. But of course this is real work that most people are not willing to invest, and so they approximate an answer, as you have. I do not know, as you don't. We have our guesses derived from our experiences and our extrapolations. I can say that, in my case, it would only simplify my code by a few lines (and, of course, remove a library dependency, which I do not like anyways). What it mainly does is reduce kludges, and being the type of person that does not like kludges, it makes me "happier". If you are ok with kludges, then it won't affect you as much. The only things I can say are theoretical assertions, and it is up to you to decide if they are worth your time to implement (assuming you were the person).

1. Library solutions are always less desirable in the theoretical world. Ideally we would want a compiler that does everything and does it perfectly. Such an ideal may not be possible, but obviously compiler and language designers feel there is some amorphous ideal, and history shows compilers tend to move towards that ideal. Libraries create dependencies on external code which have versioning issues, upkeep, etc. They are a middle-ground solution between the practical and the theoretical. But they are not something that should be "striven" for. Else, again, we should just write in binary and have everything implemented as a library solution (which, once we do, we will realize we have a compiler).

Library solutions also add complexity to the code itself. It is a trade-off of compiler complexity vs. user code complexity. The D community seems to love to push the complexity onto the user. I feel this is partly due to those that deal with the compiler not really being coders (in the common sense of writing practical business applications for making $$$). For example, what has Walter actually coded as far as "practical stuff"? A video game? Did he even use D? This is not a jab at him, but my guess is that he is more of a mathematician than an engineer. You can't really do both and be great at them, because there is only so much time in the day... even though they overlap greatly. When you get into writing massive real-world applications that span hundreds of developers, I'd bet D starts to fail miserably... of course, unless you are writing the next clone of Pac-Man or some ad software. It's not that it can't be done, or that it can't be done well, but D starts showing its weaknesses (and its strengths) the more difficult the problem becomes. You can write a simple command-line utility in just about any language... it's not a test of a language's strengths and weaknesses.

2. Given the nature of the topic, which is virtual templated functions, a parallel of virtual functions, it seems IMO that it is more of a core concept that fits nicely with the other pieces. Those pieces are not implemented as a library solution (they could be, but then we are back to 1). Hence, it is not too much of a leap to think that adding this feature as a compiler solution is warranted. Since these are a simple extension of a compiler solution, it seems natural that the compiler should deal with it. If it were a library solution, then it would be natural to extend the library... not mix and match, which is what is generally being suggested.

Now, it's true that the suggested solutions are relatively straightforward. So the issue is somewhat moot now. It wasn't, at least for me, when I asked... and the fact that several people quickly denied that any such solution existed is what made this thread much longer than it needed to be. I'd prefer a compiler solution... that is my opinion. Do what you will with it. It means nothing at the end of the day. If I had my own compiler I would have already implemented it in the compiler. If my compiler were so fragile that I could not add such a simple rewrite rule (which should be a very simple extension that introduces minimal complexity to the language or compiler), I'd either rewrite the compiler (fix it like it should be fixed) or move on to greener fields.

Also keep in mind that what is complex to one person is not necessarily so to another. I just don't like to be *told* (not proven) that something is impossible when I very well know it is possible... it's really not about "liking", but the fact that those same people go and perpetuate their ignorance on other people. I can deal with it because I know better, but many people fall victim to such ignorance, and it's one of the reasons why the world has as many problems as it does.
September 03, 2017 Re: Bug in D!!!
Posted in reply to EntangledQuanta

On Saturday, 2 September 2017 at 23:12:35 UTC, EntangledQuanta wrote:
> On Saturday, 2 September 2017 at 21:19:31 UTC, Moritz Maxeiner wrote:
>> On Saturday, 2 September 2017 at 00:00:43 UTC, EntangledQuanta wrote:
>>> On Friday, 1 September 2017 at 23:25:04 UTC, Jesse Phillips wrote:
>>>> [...]
>>>
>>> Specifying that there will be no further linkage is the same as making T finite. T must be finite.
>>>
>>> C# uses generics/IR/CLR so it can do things at run time that are effectively compile time for D.
>>>
>>> By simply extending the grammar slightly in an intuitive way, we can get the explicit finite case, which is easy:
>>>
>>> foo(T in [A,B,C])()
>>>
>>> and possibly for your case
>>>
>>> foo(T in <module>)() would work
>>>
>>> or
>>>
>>> foo(T in <program>)()
>>>
>>> the `in` keyword makes sense here and is not used nor ambiguous, I believe.
>>
>> While I agree that `in` does make sense for the semantics involved, it is already used to do a failable key lookup (return pointer to value or null if not present) into an associative array [1] and input contracts. It wouldn't be ambiguous AFAICT, but having a keyword mean three different things depending on context would make the language even more complex (to read).
>
> Yes, but they are independent, are they not? Maybe not.
>
> foo(T in Typelist)()
>
> in, as used here, is not an input contract and is completely independent. I suppose for arrays it could be ambiguous.

The contexts being independent of each other doesn't change that we would still be overloading the same keyword with three vastly different meanings. Two is already bad enough imho (and if I had a good idea of what to replace the "in" for AAs with, I'd propose removing that meaning).

> For me, and this is just me, I do not find it ambiguous. I don't find different meanings ambiguous unless the contexts overlap. Perceived ambiguity is not ambiguity; it's just ignorance... which can be overcome through learning. Hell, D has many cases where there are perceived ambiguities... as do most things.

It's not about ambiguity for me, it's about readability. The more significantly different meanings you overload some keyword - or symbol, for that matter - with, the harder it becomes to read.

> But in any case, I couldn't care less about the exact syntax. It's just a suggestion that makes the most logical sense with regard to the standard usage of `in`. If it is truly unambiguous then it can be used.

Well, yes, as I wrote, I think it is unambiguous (and can thus be used); I just think it shouldn't be used.

> Another alternative is
>
> foo(T of Typelist)
>
> which, AFAIK, `of` is not used in D or even most programming languages. Another could be
>
> foo(T -> Typelist)
>
> or even
>
> foo(T from Typelist)

I would much rather see it as a generalization of the existing template specialization syntax [1], which this t.b.h. is just a superset of (current syntax allows limiting to exactly one type; you propose limiting to one of n):

---
foo(T: char)      // Existing syntax: Limit T to the single type `char`
foo(T: (A, B, C)) // New syntax: Limit T to one of A, B, or C
---

Strictly speaking, this is exactly what template specialization is for; it's just that the current one only supports a single type instead of a set of types. Looking at the grammar rules, upgrading it like this is a fairly small change, so the cost there should be minimal.

> or whatever. Doesn't really matter. They all mean the same to me once the definition has been written in stone. Could use `foo(T eifjasldj Typelist)` for all I care.

That's okay, but it does matter to me.

> The important thing for me is that such a simple syntax exists, rather than the "complex syntaxes" that have already been given (which are ultimately syntaxes, as everything is at the end of the day).

Quoting a certain person (you know who you are) from DConf 2017: "Write a DIP". I'm quite happy to discuss this idea, but at the end of the day, as it's not an insignificant change to the language, someone will have to do the work and write a proposal.

>> W.r.t. the idea in general: I think something like that could be valuable to have in the language, but since this essentially amounts to syntactic sugar (AFAICT), I'm not (yet) convinced that with `static foreach` being included it's worth the cost.
>
> Everything is syntactic sugar. So it isn't about if, but how much. We are all coding in 0's and 1's whether we realize it or not.
The point if syntax(or syntactic sugar) is to reduce the amount of 0's and 1's that we have to *effectively* code by grouping common patterns in to symbolic equivalents(by definition). AFAIK the difference between syntax sugar and enabling syntax in PLs usually comes down to the former allowing you to express concepts already representable by other constructs in the PL; when encountered, the syntax sugar could be lowered by the compiler to the more verbose syntax and still be both valid in the PL and recognizable as the concept (while this is vague, a prominent example would be lambdas in Java 8). > > No one can judge the usefulness of syntax until it has been created because what determines how useful something is is its use. But you can't use something if it doesn't exist. I think many fail to get that. Why do you think that? Less than ten people have participated in this thread so far. > The initial questions should be: Is there a gap in the language? (Yes in this case). Can the gap be filled? (this is a theoretical/mathematical question that has to be answered. > Most people jump the gun here and make assumptions) Why do you assume that? I've not seen anyone here claiming template parameter specialization to one of n types (which is the idea I replied to) couldn't be done in theory, only that it can't be done right now (the only claim as to that it can't be done I noticed was w.r.t. (unspecialized) templates and virtual functions, which is correct due to D supporting separate compilation; specialized templates, however, should work in theory). > Does the gap need to be filled? Yes in this case, because all gaps ultimately need to be filled, but this then leads the practical issues: Actually, I disagree here. It only *needs* filling if enough users of the language actually care about it not being there. Otherwise, it's a *nice to have* (like generics and Go, or memory safety and C :p ). [1] https://dlang.org/spec/template.html#parameters_specialization |
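For reference, the existing specialization syntax discussed above compiles today, and the proposed set restriction `foo(T: (A, B, C))` can at least be approximated with a template constraint. This is only a sketch; the proposed syntax itself does not compile, and the names (`isOneOf`, `bar`, the chosen type set) are illustrative:

```d
import std.meta : staticIndexOf;

// Existing specialization syntax: limit T to exactly one type.
void foo(T : char)(T value) {}

// Approximation of the proposed foo(T: (A, B, C)) using a
// template constraint over a compile-time type list.
enum bool isOneOf(T, Types...) = staticIndexOf!(T, Types) != -1;

void bar(T)(T value) if (isOneOf!(T, int, long, double))
{
    // Instantiable only for int, long, or double.
}

void main()
{
    foo('c');   // OK: matches the char specialization
    bar(42);    // OK: int is in the set
    bar(3.14);  // OK: double is in the set
    // bar("hi"); // Error: constraint not satisfied.
}
```

Note that this only restricts which instantiations are allowed; it does nothing to make `bar` virtual, which is the part the thread is actually arguing about.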
September 03, 2017 Re: Bug in D!!!
Posted in reply to Moritz Maxeiner | On Sunday, 3 September 2017 at 02:39:19 UTC, Moritz Maxeiner wrote: > On Saturday, 2 September 2017 at 23:12:35 UTC, EntangledQuanta wrote: >> On Saturday, 2 September 2017 at 21:19:31 UTC, Moritz Maxeiner wrote: >>> On Saturday, 2 September 2017 at 00:00:43 UTC, EntangledQuanta wrote: >>>> On Friday, 1 September 2017 at 23:25:04 UTC, Jesse Phillips wrote: >>>>> I've love being able to inherit and override generic functions in C#. Unfortunately C# doesn't use templates and I hit so many other issues where Generics just suck. >>>>> >>>>> I don't think it is appropriate to dismiss the need for the compiler to generate a virtual function for every instantiated T, after all, the compiler can't know you have a finite known set of T unless you tell it. >>>>> >>>>> But lets assume we've told the compiler that it is compiling all the source code and it does not need to compile for future linking. >>>>> >>>>> First the compiler will need to make sure all virtual functions can be generated for the derived classes. In this case the compiler must note the template function and validate all derived classes include it. That was easy. >>>>> >>>>> Next up each instantiation of the function needs a new v-table entry in all derived classes. Current compiler implementation will compile each module independently of each other; so this feature could be specified to work within the same module or new semantics can be written up of how the compiler modifies already compiled modules and those which reference the compiled modules (the object sizes would be changing due to the v-table modifications) >>>>> >>>>> With those three simple changes to the language I think that this feature will work for every T. >>>> >>>> Specifying that there will be no further linkage is the same as making T finite. T must be finite. >>>> >>>> C# uses generics/IR/CLR so it can do things at run time that is effectively compile time for D. 
>>>> >>>> By simply extending the grammar slightly in an intuitive way, we can get the explicit finite case, which is easy: >>>> >>>> foo(T in [A,B,C])() >>>> >>>> and possibly for your case >>>> >>>> foo(T in <module>)() would work >>>> >>>> or >>>> >>>> foo(T in <program>)() >>>> >>>> the `in` keyword makes sense here and is not used nor ambiguous, I believe. >>> >>> While I agree that `in` does make sense for the semantics involved, it is already used to do a failable key lookup (return pointer to value or null if not present) into an associative array [1] and input contracts. It wouldn't be ambiguous AFAICT, but having a keyword mean three different things depending on context would make the language even more complex (to read). >> >> Yes, but they are independent, are they not? Maybe not. >> >> foo(T in Typelist)() >> >> in, as used here is not a input contract and completely independent. I suppose for arrays it could be ambiguous. > > The contexts being independent of each other doesn't change that we would still be overloading the same keyword with three vastly different meanings. Two is already bad enough imho (and if I had a good idea with what to replace the "in" for AA's I'd propose removing that meaning). Why? Don't you realize that the contexts matters and it's what separates the meaning? In truly unambiguous contexts, it shouldn't matter. It may require one to decipher the context, which takes time, but there is nothing inherently wrong with it and we are limited to how many symbols we use(unfortunately we are generally stuck with the querty keyboard design, else we could use symbols out the ying yang and make things much clearer, but even mathematics, which is a near perfect language, "overloads" symbols meanings). You have to do this sort of thing when you limit the number of keywords you use. Again, ultimately it doesn't matter. A symbol is just a symbol. For me, as long as the context is clear, I don't see what kind of harm it can cause. 
You say it is bad, but you don't give the reasons why it is bad. If you like to think of `in` as having only one definition then the question is why? You are limiting yourself. Natural languages abound with such multi-definitions, usually in an ambiguous way, and that can cause a lot of problems, but for computer languages it can't (else we couldn't actually compile the programs). Context-sensitive grammars are provably more expressive than context-free ones. https://en.wikipedia.org/wiki/Context-sensitive_grammar Again, I'm not necessarily arguing for them, just saying that one shouldn't avoid them just to avoid them. > >> >> For me, and this is just me, I do not find it ambiguous. I don't find different meanings ambiguous unless the context overlaps. Perceived ambiguity is not ambiguity, it's just ignorance... which can be overcome through learning. Hell, D has many cases where there are perceived ambiguities... as do most things. > > It's not about ambiguity for me, it's about readability. The more significantly different meanings you overload some keyword - or symbol, for that matter - with, the harder it becomes to read. I don't think that is true. Everything is hard to read. It's about experience. The more you experience something the clearer it becomes. Only with true ambiguity is something impossible. I realize that one can design a language to be hard to parse due to apparent ambiguities, but I am talking about cases where they can be resolved immediately (at most a few milliseconds). You are making general statements, and it is not that I disagree, but it depends on context (everything does). In this specific case, I think it is extremely clear what `in` means, so it is effectively like using a different token. Again, everyone is different and has different experiences that help them parse things more naturally. I'm sure there are things that you might find easy that I would find hard. But that shouldn't stop me from learning about them. 
It makes me "smarter", to simplify the discussion. >> >> But in any case, I could care less about the exact syntax. It's just a suggestion that makes the most logical sense with regard to the standard usage of in. If it is truly unambiguous then it can be used. > > Well, yes, as I wrote, I think it is unambiguous (and can thus be used), I just think it shouldn't be used. Yes, but you have only given the reason that it shouldn't be used because you believe that one shouldn't overload keywords because it makes it harder to parse the meaning. My rebuttal, as I have said, is that it is not harder, so your argument is not valid. All you could do is claim that it is hard and we would have to find out who is more right. I have a logical argument against your absolute restriction though... in that it causes one to have to use more symbols. I would imagine you are against stuff like using "in1", "in2", etc because they visibly are to close to each other. If you want "maximum" readability you are going to have to mathematically define that in a precise way then could up with a grammar that expresses it. I think you'll find that the grammar will depend on each individual person. At best you could then take an average which satisfies most people up to some threshold... in which case, at some point in time later, that average will shift and your grammar will no longer be valid(it will no longer satisfy the average). Again, it's not that I completely disagree with you on a practical level. Lines have to be drawn, but it's about where to precisely draw that line. Drawing it in the wrong place leads to certain solutions that are generally problematic. That's how we know they are wrong, we draw a line and later realize it cause a bunch of problems then we say "oh that was the wrong way to do it". Only with drawing a bunch of wrong lines can we determine which ones are the best and use that info to predict better locations. 
>> >> Another alternative is >> >> foo(T of Typelist) >> >> which, AFAIK, of is not used in D and even most programming languages. Another could be >> >> foo(T -> Typelist) >> >> or even >> >> foo(T from Typelist) > > I would much rather see it as a generalization of existing template specialization syntax [1], which this is t.b.h. just a superset of (current syntax allows limiting to exactly one, you propose limiting to 'n'): > > --- > foo(T: char) // Existing syntax: Limit T to the single type `char` > foo(T: (A, B, C)) // New syntax: Limit T to one of A, B, or C > --- Yes, if this worked, I'd be fine with it. Again, I could care less. `:` == `in` for me as long as `:` has the correct meaning of "can be one of the following" or whatever. But AFAIK, : is not "can be one of the following"(which is "in" or "element of" in the mathematical sense) but can also mean "is a derived type of". All I'm after is the capability to do something elegantly, and when it doesn't exist, I "crave" that it does. I don't really care how it is done(but remember, it must be done elegantly). I am not "confused"(or whatever you want to call it) by symbolic notation. As long as it's clearly defined so I can learn the definition and it is not ambiguous. There are all kinds of symbols that can be used, again, we are limited by querty(for speed, no one wants to have to use alt-codes in programming, there is a way around this but would scare most people). e.g., T ∈ X is another expression(more mathematical, I assume that ∈ will be displayed correctly, it is Alt +2208) that could work, but ∈ is not ascii an so can't be used(not because it can't but because of peoples lack of will to progress out of the dark ages). > Strictly speaking, this is exactly what template specialization is for, it's just that the current one only supports a single type instead of a set of types. > Looking at the grammar rules, upgrading it like this is a fairly small change, so the cost there should be minimal. 
> If that is the case then go for it ;) It is not a concern of mine. You tell me the syntax and I will use it. (I'd have no choice, of course, but if it's short and sweet then I won't have any problem). The main reason I suggest syntax is because none exist and I assume, maybe wrongly, that people will get what I am saying easier than writing up some example library solution and demonstrating that. if I say something like class/struct { foo(T ∈ X)(); } defines a virtual template function for all T in X. Which is equivalent to class/struct { foo(X1)(); ... foo(Xn)(); } I assume that most people will understand, more or less the notation I used to be able to interpret what am trying to get at. It is a mix of psuedo-programming and mathematics, but it is not complex. ∈ might be a bit confusing but looking it up and learning about it will educate those that want to be educated and expand everyones ability to communicate better. I could, of course, be more precise, but I try to be precise only when it suits me(which may be fault, but, again, I only have so many hours in the day to do stuff). >> >> or whatever. Doesn't really matter. They all mean the same to me once the definition has been written in stone. Could use `foo(T eifjasldj Typelist)` for all I care. > > That's okay, but it does matter to me. That's fine. I am willing to compromise. Lucky for you, symbols/tokens and context are not a big deal to me. Of course, I do like short and sweet, so I am biased too, but I have much more leeway it seems. >> The import thing for me is that such a simple syntax exists rather than the "complex syntax's" that have already been given(which are ultimately syntax's as everything is at the end of the day). > > Quoting a certain person (you know who you are) from DConf 2017: "Write a DIP". > I'm quite happy to discuss this idea, but at the end of the day, as it's not an insignificant change to the language someone will to do the work and write a proposal. 
> My main issues with going through the trouble is that basically I have more important things to do. If I were going to try to get D to do all the changes I actually wanted, I'd be better off writing my own language the way I envision it and want it... but I don't have 10+ years to invest in such a beast and to do it right would require my full attention, which I'm not willing to give, because again, I have better things to do(things I really enjoy). So, all I can do is hopefully stoke the fire enough to get someone else interested in the feature and have them do the work. If they don't, then they don't, that is fine. But I feel like I've done something to try to right a wrong. >> >> >>> W.r.t. to the idea in general: I think something like that could be valuable to have in the language, but since this essentially amounts to syntactic sugar (AFAICT), but I'm not (yet) convinced that with `static foreach` being included it's worth the cost. >>> >> >> Everything is syntactic sugar. So it isn't about if but how much. We are all coding in 0's and 1's whether we realize it or not. The point if syntax(or syntactic sugar) is to reduce the amount of 0's and 1's that we have to *effectively* code by grouping common patterns in to symbolic equivalents(by definition). > > AFAIK the difference between syntax sugar and enabling syntax in PLs usually comes down to the former allowing you to express concepts already representable by other constructs in the PL; when encountered, the syntax sugar could be lowered by the compiler to the more verbose syntax and still be both valid in the PL and recognizable as the concept (while this is vague, a prominent example would be lambdas in Java 8). Yes, but everything is "lowered" it's just how you define it. It is all lowering to 0's and 1's. Syntactic sugar is colloquially used like you have defined it, but in the limit(the most general sense), it's just stuff. Why? 
Because what is sugar to one person is salt to another(this is hyperbole, of course, but you should be able to get my point). e.g., You could define syntactic sugar to be enhancement that can be directly rewritten in to a currently expressible syntax in the language. That is fine. But then what if that expressible syntax was also syntactic sugar? You end up with something like L(L(L(L(x)))) where L is a "lowering" and x is something that is not "lowered". But if you actually were able to trace the evolution of the compiler, You'd surely notice that x is just L(...L(y)...) for some y. A programming language is simply something that takes a set of bits and transforms them to another set of bits. No more and no less. Everything else is "syntactic sugar". The definition may be so general as to be useless, but it is what a programming language is(mathematically at least). Think about it a bit. How did programmers program before modern compilers came along? They used punch cards or levers, which are basically setting "bits" or various "function"(behaviors) that the machine would carry out. Certain functions and combinations of functions were deemed more useful and were combined in to "meta-functions" and given special bits to represent them. This process has been carried out ad-nauseam and we are were we are today because of this process(fundamentally) But the point is, at each step, someone can claim that the current "simplifying" of complex functions in to a "meta-function" just "syntactic sugar". This process though is actually what creates the "power" in things. Same thing happens at the hardware level... same thing happens with atoms and molecules(except we are not in control of the rules of how those things combine). >> >> No one can judge the usefulness of syntax until it has been created because what determines how useful something is is its use. But you can't use something if it doesn't exist. I think many fail to get that. > > Why do you think that? 
Less than ten people have participated in this thread so far. I am not talking about just this thread, I am talking about all threads and all the things in which humans attempt to determine the use of something. E.g., the use of computers (they used to be completely useless for most people because they failed to see the use in them; they weren't useful to them). The use of medicine... the use of a newborn baby, the use of life. The use of a turtle. People judge use in terms of what it does for them on a "personal" level, and my point is that this inability to see the use of something in an absolute sense (how useful is it to the whole, be it the whole of the D programming community, the whole of humanity, the whole of life, or whatever) is a severe shortcoming of almost all humans. It didn't creep up too much in this thread but I have definitely seen it in other threads. Most first say "Well, hell, that won't help me, that is useless". They forget that it may be useless to them at that moment, but might be useful to them later and might be useful to other people. Why something is useless to someone, though, almost entirely depends on their use of it. You can't know how useful something is until you use it... and this is why so many people judge the use of something the way they do (they can't help it; it's sort of a law of the universe). Let me explain, as it might not be clear: many people many years ago used to think X was useless. Today, those same people cannot live without X. Replace X with just about anything (computers, music, OOP, etc.). But if you had asked those people back then they would have told you those things were useless. But through whatever means (the way life is) things change and things that were previously useless become useful. They didn't know that at first because they didn't use those things to find out if they were useful. The same logic SHOULD be applied to everything. We don't know how useful something is until we use it *enough* to determine if it is useful. 
But this is not the logic most people use, including many people in the D community. They first judge, almost exclusively (it depends on the person), how it relates to their own personal self. This is fundamentally wrong IMO and, while I don't have mathematical proof, I do have a lot of experience that tells me so (history being a good friend). > >> The initial questions should be: Is there a gap in the language? (Yes in this case). Can the gap be filled? (this is a theoretical/mathematical question that has to be answered. > >> Most people jump the gun here and make assumptions) > > Why do you assume that? I've not seen anyone here claiming template parameter specialization to one of n types (which is the idea I replied to) couldn't be done in theory, only that it can't be done right now (the only claim as to that it can't be done I noticed was w.r.t. (unspecialized) templates and virtual functions, which is correct due to D supporting separate compilation; specialized templates, however, should work in theory). Let me quote the first two responses: "It can't work this way. You can try std.variant." and "It is not possible to have a function be both virtual and templated. A function template generates a new function definition every time that it's a called with a new set of template arguments. So, the actual functions are not known up front, and that fundamentally does not work with virtual functions, where the functions need to be known up front, and you get a different function by a look-up for occurring in the virtual function call table for the class. Templates and virtual functions simply don't mix. You're going to have to come up with a solution that does not try and mix templates and virtual functions." Now, I realize I might not have been clear about things and maybe there is confusion/ambiguity in what I meant, how they interpreted it, or how I interpreted their response... but there is definitely no sense of a "Yes, we can make this work in some way" 
type of mentality. E.g., "Templates and virtual functions simply don't mix." That is an absolute statement. It isn't even qualified with "in D". >> Does the gap need to be filled? Yes in this case, because all gaps ultimately need to be filled, but this then leads the practical issues: > > Actually, I disagree here. It only *needs* filling if enough users of the language actually care about it not being there. Otherwise, it's a *nice to have* (like generics and Go, or memory safety and C :p ). Yes, on some level you are right... but again, who's to judge? The current users or the future users? You have to take into account the future users if you care about the future of D, because those will be its users, and so the current users actually carry only a certain percentage of the weight. Also, who will be more informed about the capabilities and useful features of D? The current users or the future users? Surely when you first started using D, you were ignorant of many of the pros and cons of D. Your future self (in regard to that time period when you first started using D) knew a lot more about it; i.e., you know more now than you did, and you will know more in the future than you do now. The great thing about knowledge is that it grows with time when watered. You stuck around with D, learned it each "day" and became more knowledgeable about it. At the time, there were people making decisions about the future of D's features, and now you get to experience them and determine their usefulness PRECISELY because of those people in the past filling in the gaps. EVERYTHING that D currently has it didn't have in the past. Hence, someone had to create it (DIP or no DIP)... thank god they did, or D would just be a pimple on Walter's brain. But D can't progress any further unless the same principles are applied. Sure it is more bulky (complex), and sure, not everything has to be implemented in the compiler to make progress... 
But the only way we can truly know what we should do is first to do things we think are correct (and not do things we know are wrong). So, when people say "this can't be done" and I know it damn well can, I will throw a little tantrum... maybe they will give me a cookie, who knows? Sure, I could be wrong... but I could also be right (just as much as they could be wrong or right). This is why we talk about things, to combine our experiences and ideas to figure out how well something will work. The main problem I see in the D community is that very little cooperation happens in those regards unless it's initiated by the core team (that isn't a bad thing in some sense, but it isn't a good thing in another sense). I guess some people just haven't learned the old proverb "Where there's a will, there's a way". > [1] https://dlang.org/spec/template.html#parameters_specialization As I mentioned, I'm unclear whether `:` behaves exactly that way or not, but `:` seems to do more than be inclusive. If its current meaning can still work with virtual templated functions, then I think it would be even better. But ultimately all this would have to be fleshed out properly before any real work could be done. |
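To make the `foo(X1)(); ... foo(Xn)();` expansion described above concrete: for a finite, known type list, current D can already generate the non-template virtual overload set mechanically with the recently added `static foreach`. This is a hedged sketch of that equivalence, not the proposed syntax; `Base`, `Derived`, `Types`, and `handle` are illustrative names:

```d
import std.meta : AliasSeq;

// The finite set X from the foo(T ∈ X)() example.
alias Types = AliasSeq!(int, double, string);

class Base
{
    // Expands to one ordinary (non-template) virtual overload
    // per type in Types, i.e. foo(X1)(); ... foo(Xn)();
    static foreach (T; Types)
    {
        void handle(T value) { /* default behaviour */ }
    }
}

class Derived : Base
{
    static foreach (T; Types)
    {
        override void handle(T value) { /* specialised behaviour */ }
    }
}

void main()
{
    Base b = new Derived;
    b.handle(42);     // virtual dispatch through the int overload
    b.handle("text"); // virtual dispatch through the string overload
}
```

The catch, as noted earlier in the thread, is that `Types` must be fixed when `Base` is compiled: the set really does have to be finite and known up front.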
September 03, 2017 Re: Bug in D!!!
Posted in reply to EntangledQuanta | On Sunday, 3 September 2017 at 04:18:03 UTC, EntangledQuanta wrote: > On Sunday, 3 September 2017 at 02:39:19 UTC, Moritz Maxeiner wrote: >> On Saturday, 2 September 2017 at 23:12:35 UTC, EntangledQuanta wrote: >>> [...] >> >> The contexts being independent of each other doesn't change that we would still be overloading the same keyword with three vastly different meanings. Two is already bad enough imho (and if I had a good idea with what to replace the "in" for AA's I'd propose removing that meaning). > > Why? Don't you realize that the contexts matters and [...] Because instead of seeing the keyword and knowing its one meaning you also have to consider the context it appears in. That is intrinsically more work (though the difference may be very small) and thus harder. > > Again, I'm not necessarily arguing for them, just saying that one shouldn't avoid them just to avoid them. > > >> >>> [...] >> >> It's not about ambiguity for me, it's about readability. The more significantly different meanings you overload some keyword - or symbol, for that matter - with, the harder it becomes to read. > > I don't think that is true. Everything is hard to read. It's about experience. The more you experience something the more clear it becomes. Only with true ambiguity is something impossible. I realize that in one can design a language to be hard to parse due to apparent ambiguities, but am I am talking about cases where they can be resolved immediately(at most a few milliseconds). Experience helps, of course, but it doesn't change that it's still just that little bit slower. And everytime we encourage such overloading encourages more, which in the end sums up. > > You are making general statements, and it is not that I disagree, but it depends on context(everything does). In this specific case, I think it is extremely clear what in means, so it is effectively like using a different token. 
Again, everyone is different though and have different experiences that help them parse things more naturally. I'm sure there are things that you might find easy that I would find hard. But that shouldn't stop me from learning about them. It makes me "smarter", to simplify the discussion. I am, because I believe it to be generally true for "1 keyword |-> 1 meaning" to be easier to read than "1 keyword and 1 context |-> 1 meaning" as the former inherently takes less time. > > >>> [...] >> >> Well, yes, as I wrote, I think it is unambiguous (and can thus be used), I just think it shouldn't be used. > > Yes, but you have only given the reason that it shouldn't be used because you believe that one shouldn't overload keywords because it makes it harder to parse the meaning. My rebuttal, as I have said, is that it is not harder, so your argument is not valid. All you could do is claim that it is hard and we would have to find out who is more right. As I countered that in the above, I don't think your rebuttal is valid. > > I have a logical argument against your absolute restriction though... in that it causes one to have to use more symbols. I would imagine you are against stuff like using "in1", "in2", etc because they visibly are to close to each other. It's not an absolute restriction, it's an absolute position from which I argue against including such overloading on principle. If it can be overcome by demonstrating that it can't sensibly be done without more overloading and that it adds enough value to be worth the increases overloading, I'd be fine with inclusion. > >>> [...] >> >> I would much rather see it as a generalization of existing template specialization syntax [1], which this is t.b.h. 
just a superset of (current syntax allows limiting to exactly one, you propose limiting to 'n'): >> >> --- >> foo(T: char) // Existing syntax: Limit T to the single type `char` >> foo(T: (A, B, C)) // New syntax: Limit T to one of A, B, or C >> --- > > Yes, if this worked, I'd be fine with it. Again, I could care less. `:` == `in` for me as long as `:` has the correct meaning of "can be one of the following" or whatever. > > But AFAIK, : is not "can be one of the following"(which is "in" or "element of" in the mathematical sense) but can also mean "is a derived type of". Right, ":" is indeed an overloaded symbol in D (and ironically, instead of with "in", I think all its meanings are valuable enough to be worth the cost). I don't see how that would interfere in this context, though, as we don't actually overload a new meaning (it's still "restrict this type to the thing to the right"). > > > If that is the case then go for it ;) It is not a concern of mine. You tell me the syntax and I will use it. (I'd have no choice, of course, but if it's short and sweet then I won't have any problem). I'm discussing this as a matter of theory, I don't have a use for it. > >>> [...] >> >> Quoting a certain person (you know who you are) from DConf 2017: "Write a DIP". >> I'm quite happy to discuss this idea, but at the end of the day, as it's not an insignificant change to the language someone will to do the work and write a proposal. >> > > My main issues with going through the trouble is that basically I have more important things to do. If I were going to try to get D to do all the changes I actually wanted, I'd be better off writing my own language the way I envision it and want it... but I don't have 10+ years to invest in such a beast and to do it right would require my full attention, which I'm not willing to give, because again, I have better things to do(things I really enjoy). 
> So, all I can do is hopefully stoke the fire enough to get someone else interested in the feature and have them do the work. If they don't, then they don't, that is fine. But I feel like I've done something to try to right a wrong.

That could happen, though historically speaking, things have usually gotten included in D only when the major proponent of something like this does the hard work (otherwise such proposals seem to just fizzle out).

>>> [...]
>>
>> AFAIK the difference between syntax sugar and enabling syntax in PLs usually comes down to the former allowing you to express concepts already representable by other constructs in the PL; when encountered, the syntax sugar could be lowered by the compiler to the more verbose syntax and still be both valid in the PL and recognizable as the concept (while this is vague, a prominent example would be lambdas in Java 8).
>
> Yes, but everything is "lowered", it's just how you define it.

Yes, and w.r.t. my initial point, I did define it as "within the PL itself, preserving the concept".

>>> [...]
>>
>> Why do you think that? Less than ten people have participated in this thread so far.
>
> I am not talking about just this thread, I am talking about all threads and all things in which humans attempt to determine the use of something. [...]

Fair enough, though personally I'd need to see empirical proof of those general claims about human behaviour before I could share that position.

>>> [...]
>>
>> Why do you assume that? I've not seen anyone here claiming template parameter specialization to one of n types (which is the idea I replied to) couldn't be done in theory, only that it can't be done right now (the only claim that it can't be done I noticed was w.r.t. (unspecialized) templates and virtual functions, which is correct due to D supporting separate compilation; specialized templates, however, should work in theory).
>
> Let me quote the first two responses:
>
> "It can't work this way. You can try std.variant."

That is a reply to your mixing (unspecialized) templates and virtual functions, not to your idea of generalizing specialized templates.

> and
>
> "It is not possible to have a function be both virtual and templated. A function template generates a new function definition every time that it's called with a new set of template arguments. [...]"

Same here.

> Now, I realize I might have not been clear about things, and maybe there is confusion/ambiguity in what I meant, how they interpreted it, or how I interpreted their response... but there is definitely no sense of a "Yes, we can make this work in some way..." type of mentality.
>
> e.g., "Templates and virtual functions simply don't mix."
>
> That is an absolute statement. It isn't even qualified with "in D".

>>> [...]
>>
>> Actually, I disagree here. It only *needs* filling if enough users of the language actually care about it not being there. Otherwise, it's a *nice to have* (like generics and Go, or memory safety and C :p ).
>
> Yes, on some level you are right... but again, who's to judge? [...]

Ultimately, Walter and Andrei, as AFAIK they decide what gets into the language.
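To illustrate the "finite set" case the thread keeps returning to: since D cannot make a template member function virtual, a known finite set of types can instead be expanded into ordinary virtual overloads at compile time. A sketch under that assumption; `Base`, `Derived`, `handle`, and `SupportedTypes` are all hypothetical names:

```d
import std.meta : AliasSeq;

// The finite set of types the "template" is allowed to take.
alias SupportedTypes = AliasSeq!(int, double, string);

class Base
{
    // Unroll one ordinary (and therefore virtual) overload per type.
    static foreach (T; SupportedTypes)
        void handle(T value) { /* default behaviour */ }
}

class Derived : Base
{
    // Each generated overload is a normal virtual function, so it
    // can be overridden like any other.
    static foreach (T; SupportedTypes)
        override void handle(T value) { /* specialized behaviour */ }
}
```

This is only workable because the set is finite and known where the class is compiled, which is exactly the qualification the absolute statements quoted above leave out.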
September 03, 2017 Re: Bug in D!!!
Posted in reply to Moritz Maxeiner

On Sunday, 3 September 2017 at 11:48:38 UTC, Moritz Maxeiner wrote:
> On Sunday, 3 September 2017 at 04:18:03 UTC, EntangledQuanta wrote:
>> On Sunday, 3 September 2017 at 02:39:19 UTC, Moritz Maxeiner wrote:
>>> On Saturday, 2 September 2017 at 23:12:35 UTC, EntangledQuanta wrote:
>>>> [...]
>>>
>>> The contexts being independent of each other doesn't change that we would still be overloading the same keyword with three vastly different meanings. Two is already bad enough imho (and if I had a good idea with what to replace the "in" for AA's I'd propose removing that meaning).
>>
>> Why? Don't you realize that the contexts matter and [...]
>
> Because instead of seeing the keyword and knowing its one meaning you also have to consider the context it appears in. That is intrinsically more work (though the difference may be very small) and thus harder.
>
> ...

Yes, in an absolute sense it will take more time to have to parse the context. But that sounds like a case of "pre-optimization". If we are worried about saving time, then what about the tooling? Compiler speed? IDE startup time? etc. All these take time too, and optimizing one single aspect, as you know, won't necessarily save much time. Maybe the language itself should be designed so there are no ambiguities at all? A single simple for each function? A new keyboard design should be implemented (ultimately a direct brain-to-editor interface for the fastest time, excluding the time for development and learning)? So, in this case I have to take the practical view: it may be theoretically slower, but it is such an insignificant cost that avoiding it is an over-optimization. I think you would agree, at least in this case.

Again, the exact syntax is not important to me. If you really think it matters that much to you, and it does (you are not tricking yourself), then use a different keyword.

When I see something I try to see it at once rather than reading it left to right. It is how music is read properly, for example. One can't read left to right and process the notes in real time fast enough. You must "see at once" a large chunk.

When I see foo(A in B)() I see it at once, not in parts or sub-symbols (subconsciously that may be what happens, but it is either so quick or my brain has learned to see differently that I do not feel it to be any slower). That is, I do not read it like f, o, o, (, A, ... but just like how one sees an image. Sure, there are clusterings such as foo and (...), and I do sub-parse those at some point, but the context is derived very quickly. Now, of course, I do make assumptions to be able to do that. Obviously I have to sort of assume I'm reading D code and that the expression is a templated function, etc. But that is required regardless.

It's like seeing a picture of an ocean. You can see the global characteristics immediately without getting bogged down in the details until you need them. You can determine the approximate time of day (morning, noon, evening, night) relatively instantaneously without even knowing much else.

To really counter your argument: What about parentheses? They too have the same problem as "in". They have perceived ambiguity... but they are not ambiguous. So your argument should apply to them too and you should be against them also, but are you? [To be clear here: foo()() and (3+4) have 3 different use cases of ()'s... the first is templated arguments, the second is function arguments, and the third is expression grouping.] If you are, then you are being logical and consistent. If you are not, then you are not being logical nor consistent. If you fall in the latter case, I suggest you re-evaluate the way you think about such things, because you are picking and choosing.
Now, if you are just stating the mathematical fact that it takes longer, then I can't really deny that, although I can't technically prove it either, as you can't, because we would require knowing exactly how the brain processes the information.

>>>> [...]
>>>
>>> Well, yes, as I wrote, I think it is unambiguous (and can thus be used), I just think it shouldn't be used.
>>
>> Yes, but you have only given the reason that it shouldn't be used because you believe that one shouldn't overload keywords because it makes it harder to parse the meaning. My rebuttal, as I have said, is that it is not harder, so your argument is not valid. All you could do is claim that it is hard and we would have to find out who is more right.
>
> As I countered that in the above, I don't think your rebuttal is valid.

Well, hopefully I countered that in my rebuttal of your rebuttal of my rebuttal ;) Again, you don't actually know how the brain processes information (no one does, it is all educated guesses). You use the concept that the more information one has to process, the more time it takes... which seems logical, but it is not necessarily applicable directly to the interpretation of written symbols. Think of an image. We can process a ton of information nearly instantly, and if the logic applied, we would expect images to take much longer to "read" than the written word, yet it is exactly the opposite... and yet, symbols are just images (with a specific order we must follow to make sense of them). Have you ever thought of a programming language that was based on images? Maybe that would be a much quicker and much faster way to "read" the source? Of course, some might claim that all life is is source code and "real life" is just the most natural representation of code.

>> I have a logical argument against your absolute restriction though... in that it causes one to have to use more symbols. I would imagine you are against stuff like using "in1", "in2", etc. because they visibly are too close to each other.
>
> It's not an absolute restriction, it's an absolute position from which I argue against including such overloading on principle.
> If it can be overcome by demonstrating that it can't sensibly be done without more overloading and that it adds enough value to be worth the increased overloading, I'd be fine with inclusion.

My feeling, though, is that you are actually just making principles based on whim rather than a true logical basis; I could be wrong. How you answer my questions above will let me know better. To simplify it down: do you have the same problems with all the ambiguities that already exist in almost all programming languages, which everyone is OK with on a practical level on a daily basis?

>>>> [...]
>>
>> If that is the case then go for it ;) It is not a concern of mine. You tell me the syntax and I will use it. (I'd have no choice, of course, but if it's short and sweet then I won't have any problem.)
>
> I'm discussing this as a matter of theory, I don't have a use for it.

Ok, I do, which is what led me to the problem, as all my "enhancements" do. I try something I think is an "elegant" way to simplify complexity in my program (from the user of the code's perspective, which will generally be me)... I run into a wall, I post a message, and I usually get shot down immediately with "It can't be done"... then I have to find a way to do it. I find the way [usually using string mixins, thank god for them]. I post it... someone else then usually comes along with a better or simpler way.

Usually when I say something like "This should be in the compiler", I immediately get shot down again with "It adds complexity to the compiler". In which case I try to explain that everything adds complexity, and this solution would add very little complexity since one can already do it in the library in a simple way... Usually the library solution is not robust and hence not good (I only worked it out enough for my use cases). ...and so the wheel goes around and around. But the logic is usually the same: "we can't do that"... which I eventually just interpret as "we don't wanna do that because we have better things to do", which is fine if at least that is admitted in the first place instead of wasting my time trying to explain that it can be done, coming up with a solution, etc. (Of course, it's ultimately my fault since I am the one in control of my time; I mainly do it because it could help others in the same position that I was in.)

>>>> [...]
>>>
>>> Quoting a certain person (you know who you are) from DConf 2017: "Write a DIP".
>>> I'm quite happy to discuss this idea, but at the end of the day, as it's not an insignificant change to the language, someone will have to do the work and write a proposal.
>>
>> My main issue with going through the trouble is that basically I have more important things to do. If I were going to try to get D to do all the changes I actually wanted, I'd be better off writing my own language the way I envision it and want it... but I don't have 10+ years to invest in such a beast, and to do it right would require my full attention, which I'm not willing to give, because again, I have better things to do (things I really enjoy).
>>
>> So, all I can do is hopefully stoke the fire enough to get someone else interested in the feature and have them do the work. If they don't, then they don't, that is fine. But I feel like I've done something to try to right a wrong.
>
> That could happen, though historically speaking, usually things have gotten included in D only when the major proponent of something like this does the hard work (otherwise they seem to just fizzle out).

Yes. Because things take time and we only have so much. I am fine with that. I'm fine with a great idea going nowhere because no one has the time to invest in it.
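As an aside, for readers unfamiliar with the string mixins mentioned above, here is a minimal, self-contained sketch of the mechanism: a compile-time string is compiled as if its contents had been written in place.

```d
import std.stdio : writeln;

// q{...} is a token string; mixin() compiles it in the enclosing scope.
enum code = q{ int answer = 40 + 2; };

void main()
{
    mixin(code);     // declares `answer` right here
    writeln(answer); // prints 42
}
```

Because the string can be built at compile time (e.g. concatenated from type names), this is the usual escape hatch for generating families of declarations the language has no direct syntax for.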
It's unfortunate, but life is life... it's only when people are ultimately trying to deceive, or are just truly ignorant, that I start to have a problem with them.

>>>> [...]
>>>
>>> AFAIK the difference between syntax sugar and enabling syntax in PLs usually comes down to the former allowing you to express concepts already representable by other constructs in the PL; when encountered, the syntax sugar could be lowered by the compiler to the more verbose syntax and still be both valid in the PL and recognizable as the concept (while this is vague, a prominent example would be lambdas in Java 8).
>>
>> Yes, but everything is "lowered", it's just how you define it.
>
> Yes, and w.r.t. my initial point, I did define it as "within the PL itself, preserving the concept".

>>>> [...]
>>>
>>> Why do you think that? Less than ten people have participated in this thread so far.
>>
>> I am not talking about just this thread, I am talking about all threads and all things in which humans attempt to determine the use of something. [...]
>
> Fair enough, though personally I'd need to see empirical proof of those general claims about human behaviour before I could share that position.

Lol, you should have plenty of proof. Just look around. Just look at your own experiences in your life. I don't know much about you, but I imagine that you have all the proof you need. Look how businesses are run. Look how people "solve" problems. Look at the state of the world. You can make claims that it's this and that, as I can... but there is a common denominator among it all.

Also just think about how humans are able to judge things. Surely they can only judge things based on what they know? How can we judge things based on what we don't know? Seems impossible, right? Take someone you know who constantly makes bad decisions... why? Are they geniuses or idiots? I think it's pretty provable that the more intelligent a person is, the better they are able to make decisions about something... and this is general. A programmer is surely able to make better decisions about coding than a non-programmer. Look at all the business people in the world who know absolutely nothing about technological factors but make decisions about them on a daily basis... and the ramifications of those decisions are easily seen. I'm not saying it's a simple problem, but there are relatively simple overarching rules involved. The more a person knows about life, the better the decisions they can make about life. (But the life thing is the complex part, I don't disagree.)

To make this tie in to what we are talking about: if someone has never used templated functions in D, how can they make decisions on whether templated functions are useful or not? Should be obvious. The complexity comes in when they actually have used them... but then we have to know "How much do they use them?", "How do they use them?", "What other things do they know about that influence their usage of them?", etc. Most people are satisfied with just stopping at some arbitrary point when they get tired and have to go to bed... I'm not one of those people (for better or worse).

>>>> [...]
>>>
>>> Why do you assume that? I've not seen anyone here claiming template parameter specialization to one of n types (which is the idea I replied to) couldn't be done in theory, only that it can't be done right now (the only claim that it can't be done I noticed was w.r.t. (unspecialized) templates and virtual functions, which is correct due to D supporting separate compilation; specialized templates, however, should work in theory).
>>
>> Let me quote the first two responses:
>>
>> "It can't work this way. You can try std.variant."
>
> That is a reply to your mixing (unspecialized) templates and virtual functions, not to your idea of generalizing specialized templates.
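For context, the std.variant suggestion quoted above would look roughly like the following: one ordinary virtual function taking a Variant replaces the (impossible) virtual template, moving the type dispatch to runtime. The class and method names here are made up:

```d
import std.variant : Variant;

class Handler
{
    // A single virtual function; the "template parameter" becomes
    // a runtime type check on the Variant.
    void handle(Variant v)
    {
        if (auto p = v.peek!int)
        {
            // *p is the stored int
        }
        else if (auto p = v.peek!string)
        {
            // *p is the stored string
        }
    }
}

void main()
{
    auto h = new Handler;
    h.handle(Variant(42));
    h.handle(Variant("hello"));
}
```

The trade-off is the one the thread is arguing about: this works for an open set of types, but gives up compile-time checking that only the intended types are passed.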
That might have been the reply, and it may be valid in a certain context, and may actually be the correct reply in the context I gave (I could have done a better job, I admit). BUT, if D already implemented such a specialization feature, a different response would have occurred, such as "You need to limit T to be in a finite set", with which I would have merrily moved along. Instead it tries to force me into a solution that is not acceptable.

In fact, I was using specialization as if `T` could only be from a finite set... but, again, D does not allow me any way to specify that, so how could I properly formulate a solution that would make sense without going into a lot of detail... a lot of details that I actually don't know, because I'm not a full-time D aficionado. The code I posted was a simplification, possibly an oversimplification, of my real code, in which I tried to express something I wanted to do, knew that there should be no real technical limitations (in what I wanted, not in how D does it), and thought that D should be able to do it in some way (mainly because it can do just about anything in some way due to its rich feature set).

>> and
>>
>> "It is not possible to have a function be both virtual and templated. A function template generates a new function definition every time that it's called with a new set of template arguments. [...]"
>
> Same here.

But it's not true... unless you mean that it is not possible *currently in D* to do this. Neither of those statements is logically valid, because it is possible (only with a restricted number of template parameter values). It is only true for an infinite number, which didn't apply to me since I had a finite number.

Basically an absolute statement is made: something like "All numbers are odd", which is absolutely false even if it is partially true. "All odd numbers are odd" is obviously true. One should clarify, if the context isn't clear, so no confusion arises.

"It is not possible to have a function be both virtual and templated." Surely you disagree with that statement? While there is some ambiguity, since templated functions are actually syntactic sugar while virtual functions are actually coded, we can obviously have a virtual templated function. (Not in D currently, but there is no theoretical reason why it can't exist; we've already discussed that.)

"It is not possible to have a function be both virtual and [arbitrarily] templated." would, I believe, be a true statement, while "It is not possible to have a function be both virtual and [finitely] templated." would be a false statement.

In fact, I bet that if you asked Jonathan what he believed when he wrote that, he believed it to be true for all cases (finite or not, as he probably never even thought about the finite case enough to realize it matters).

Anyways, we've beaten this horse to death! I think we basically agree on the bulk of things, so it's not a big deal. Most of the issue with communication is the lack of clarity and the ambiguity in things (wars have been started and millions of people have died over such things, as have many personal relationships been destroyed). I'd like to see such a feature implemented in D one day, but I doubt it will be, for whatever reasons. Luckily D is powerful enough to still get to a solid solution... unlike some languages, and I think that is what most of us here realize about D and why we even bother with it.
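To make the "virtual and [finitely] templated" case concrete, here is one sketch of how it can be emulated in today's D: a non-virtual template shim forwards to per-type virtual overloads, which derived classes may override. All names here are hypothetical, not from the thread:

```d
class Base
{
    // The template itself is never virtual; it only forwards.
    void apply(T)(T value) { applyImpl(value); }

    // One virtual overload per allowed type does the real work.
    protected void applyImpl(int value)    { }
    protected void applyImpl(string value) { }
}

class Derived : Base
{
    protected override void applyImpl(int value)    { /* specialized */ }
    protected override void applyImpl(string value) { /* specialized */ }
}

void main()
{
    Base b = new Derived;
    b.apply(42);     // template instantiates for int, dispatches virtually
    b.apply("text"); // same for string
}
```

This gives exactly the finite-set behaviour argued for above; attempting `b.apply(3.14)` fails at compile time because no `applyImpl(double)` overload exists.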
Copyright © 1999-2021 by the D Language Foundation