March 24, 2018
On 03/23/2018 02:09 PM, Manu wrote:
> If I have to continue to generate
> tables offline and paste big tables of data in the source code (and
> then re-generate them manually when I change something), then the
> situation is identical to C++; therefore, stick with C++.
> 

WAT?

When in the world would anyone ever need to do any of that? And I mean *regardless* of language? What kind of build system are you even using, punch cards and sneakernet?

Just run your custom generator tool from your buildscript (it's a trivial one-liner), and have it generate a full-fledged .d file ready for import (or a data file which is then string-imported by your main program).
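For example (a minimal sketch; every name here is made up), the generator can be as simple as:

  // gentables.d -- run by the buildscript before the main compile
  import std.stdio;

  void main()
  {
      auto f = File("tables.d", "w");
      f.writeln("module tables;");
      f.writeln("immutable int[10] squares = [");
      foreach (i; 0 .. 10)
          f.writefln("    %s,", i * i);
      f.writeln("];");
  }

And then the buildscript grows all of two lines:

  rdmd gentables.d
  dmd app.d tables.d

(For the string-import variant, write out a data file instead and read it with enum data = import("tables.dat");, compiling with -J to point at its directory.)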

It's quick and trivial. I've been doing it for a project written in Haxe (just converted to C# the other day as part of a big Flash -> Unity3D conversion), and it works quickly and easily even there. (Of course, the Haxe/C#-generating tool itself is written in D; it was just easier to write that way.)

DMD itself used to do that too when it was still C-based. Nobody ever needed to manually regenerate the tables (the generator was automatically invoked when necessary by the makefile), and *certainly* nobody ever needed to go copy-pasting any generated data.
March 24, 2018
On 3/23/2018 11:09 AM, Manu wrote:
> Like, in this particular project, being able to generate all tables at
> compile time is the thing that distinguishes the D code from the C++
> code; it's the *whole point*... If I have to continue to generate
> tables offline and paste big tables of data in the source code (and
> then re-generate them manually when I change something), then the
> situation is identical to C++; therefore, stick with C++.

This file:

  https://github.com/dlang/dmd/blob/master/src/dmd/backend/optabgen.c

computes tables and writes them out to several .c files, which are then #include'd into the main build. It all happens automagically using the makefile:

  https://github.com/dlang/dmd/blob/master/src/win32.mak#L420
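Boiled down (with invented names; the real rules are at the link above), the makefile side is just:

  # first build the generator itself...
  optabgen: optabgen.c
          $(CC) optabgen.c -o optabgen

  # ...then run it to produce the table sources...
  optab.c: optabgen
          ./optabgen

  # ...which the rest of the build depends on as usual.
  main.o: main.c optab.c
          $(CC) -c main.c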

I've been using this technique continually since the early 1980's :-)

Some IDEs have problems with it, because they cannot support layered builds like this, but good old make does it just fine.

I can't recall ever seeing anyone else use this technique (other than Nick!), but it works and isn't that bad.

The dmd front end used to do this as well, but that was replaced with CTFE when the front end was converted to D.
March 24, 2018
On Sat, Mar 24, 2018 at 01:42:56AM -0700, Walter Bright via Digitalmars-d wrote: [...]
> This file:
> 
>   https://github.com/dlang/dmd/blob/master/src/dmd/backend/optabgen.c
> 
> computes tables and writes them out to several .c files, which are then #include'd into the main build. It all happens automagically using the makefile:
> 
>   https://github.com/dlang/dmd/blob/master/src/win32.mak#L420
> 
> I've been using this technique continually since the early 1980's :-)
> 
> Some IDEs have problems with it, because they cannot support layered builds like this, but good old make does it just fine.

Thus proving that IDEs suck. ;-)


> I can't recall ever seeing anyone else use this technique (other than Nick!), but it works and isn't that bad.

It's not all that uncommon.  I've worked with projects (and still do)
where code is generated by a tool at build time, and then #include'd by
other source code.  Any project that uses lex/yacc (or their clones
flex/bison) does this. One of my own recent projects involved a clever
(IMO) trick of suitably (re)defining certain macros and then running
the C preprocessor over a C header file (truetype, to be precise) to
generate D code, which then gets compiled by a D compiler.
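The gist of it (names invented here; the real header is hairier) is
the classic X-macro trick:

  /* glyphs.h -- the C header: a list of macro invocations */
  GLYPH(notdef, 0)
  GLYPH(space, 32)

  /* gen.h -- redefine the macro so the output is D rather than C */
  #define GLYPH(name, code) enum ushort glyph_##name = code;
  #include "glyphs.h"

Running "cpp -P gen.h > glyphs.d" then yields valid D declarations
(cpp strips the comments and directives, leaving only the expanded
lines).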

Being able to do all this in CTFE instead is nice, but hardly a *necessity*.  And to be frank, the slowness of CTFE hampers serious use cases like generating parsers: it's definitely doable, as proven by Pegged, but it comes with an unattractive increase in compilation time. (When will we see newCTFE materialize... *hint hint* :-P)


T

-- 
Life begins when you can spend your spare time programming instead of watching television. -- Cal Keegan
March 24, 2018
On 24 March 2018 at 01:42, Walter Bright via Digitalmars-d <digitalmars-d@puremagic.com> wrote:
> On 3/23/2018 11:09 AM, Manu wrote:
>>
>> Like, in this particular project, being able to generate all tables at compile time is the thing that distinguishes the D code from the C++ code; it's the *whole point*... If I have to continue to generate tables offline and paste big tables of data in the source code (and then re-generate them manually when I change something), then the situation is identical to C++; therefore, stick with C++.
>
>
> This file:
>
>   https://github.com/dlang/dmd/blob/master/src/dmd/backend/optabgen.c
>
> computes tables and writes them out to several .c files, which are then #include'd into the main build. It all happens automagically using the makefile:
>
>   https://github.com/dlang/dmd/blob/master/src/win32.mak#L420
>
> I've been using this technique continually since the early 1980's :-)
>
> Some IDEs have problems with it, because they cannot support layered builds like this, but good old make does it just fine.
>
> I can't recall ever seeing anyone else use this technique (other than Nick!), but it works and isn't that bad.
>
> The dmd front end used to do this as well, but that was replaced with CTFE when the front end was converted to D.

I understand table generation; that is the standard approach. It's
made awkward by the fact that build systems are hard, and numerous,
and a user of a lib with such a build requirement inherits that
baggage into their project.
I'm not sure why I seem to have to defend the idea that it's a *great
thing* that D (in theory; according to the advertising brochure) does
away with these requirements.
It just occurred to me too that it's not even that simple. The
instantiation sites (which are in user code) dictate what tables need
to be emitted. It's not feasible to generate all possible tables;
there's a combinatorial explosion of possible inputs. I instantiate
only the tables the user needs, on demand. It's impossible to
pre-generate 'all' tables; there's no such quantity.
March 24, 2018
On 2018-03-23 13:34, bauss wrote:

> What do you mean?
> 
> https://gcc.gnu.org/bugzilla/buglist.cgi?component=c%2B%2B&product=gcc&resolution=--- 

That one is limited to 500 results; here's a list of 10,000:

https://gcc.gnu.org/bugzilla/buglist.cgi?component=c%2B%2B&limit=0&order=bug_status%2Cpriority%2Cassigned_to%2Cbug_id&product=gcc&query_format=advanced

-- 
/Jacob Carlborg
March 24, 2018
On 2018-03-23 20:25, Jonathan M Davis wrote:

> Really? I've dealt with relatively few projects that use github as a bug
> tracker, and it's been my experience that most anything that's really
> serious has its own bugtracker (usually some form of bugzilla) - though most
> such projects predate github by a long shot. I'd think that signing up for a
> bugtracker would be par for the course and that if anything, the fact that a
> project was using github issues instead of its own bugtracker would imply
> that it was small, which doesn't necessarily give a good impression -
> especially for a compiler.

I think it's related to the culture of the language. Have a look at Ruby on Rails [1], for example: basically the biggest Ruby project there is, and it uses GitHub for issue tracking.

[1] https://github.com/rails/rails/issues

-- 
/Jacob Carlborg
March 24, 2018
On 2018-03-24 00:37, Nick Sabalausky wrote:

> OAuth is a phisher's paradise.
> 
> But that aside, it's never made any sense to me for projects to self-impose a policy of "If you've found a bug, and you're non-registered, we don't want to hear about it."
> 
> I would think any self-respecting project would WANT to lower the barrier to being notified of problems, not put roadblocks in the way: That's what outsourced call centers are for!

And even worse are the projects that use a bot to automatically close issues that are old and haven't seen any activity. Just because an issue is old and the maintainers didn't get around to working on it doesn't mean the bug is gone.

-- 
/Jacob Carlborg
March 24, 2018
On 03/24/2018 09:53 AM, H. S. Teoh wrote:
> On Sat, Mar 24, 2018 at 01:42:56AM -0700, Walter Bright via Digitalmars-d wrote:
> [...]
>> I can't recall ever seeing anyone else use this technique (other than
>> Nick!), but it works and isn't that bad.
> 
> It's not all that uncommon.  I've worked with projects (and still do)
> where code is generated by a tool at build time, and then #include'd by
> other source code.  Any project that uses lex/yacc (or their clones
> flex/bison) does this. One of my own recent projects involved a clever
> (IMO) trick of suitably (re)defining certain macros and then running
> the C preprocessor over a C header file (truetype, to be precise) to
> generate D code, which then gets compiled by a D compiler.
> 

And the excellent, classic book "The Pragmatic Programmer" promoted it as a technique worth having in one's toolbelt. (That book, along with "Writing Solid Code", left a big, lasting impression on me.)

IIRC, in the earlier days of Game Boy Advance homebrew (back when I still had time for that sort of thing!), it was also the first common technique for including images/audio in a ROM image, until other tools were developed to handle the task better.
March 24, 2018
On 03/24/2018 12:37 PM, Manu wrote:
> 
> I understand table generation; that is the standard approach. It's

Huh? Then I guess I don't understand why you implied that the alternative to CTFE was manually regenerating and copy-pasting tables:

>> On 3/23/2018 11:09 AM, Manu wrote:
>>> Like, in this particular project, being able to generate all tables at
>>> compile time is the thing that distinguishes the D code from the C++
>>> code; it's the *whole point*... If I have to continue to generate
>>> tables offline and paste big tables of data in the source code (and
>>> then re-generate them manually when I change something);


> I'm not sure why I seem to have to defend the idea that it's a *great
> thing* that D (in theory; according to the advertising brochure) does
> away with these requirements.

No need to defend it, we're all sold on it already. To clarify: I wasn't saying "no need for CTFE", I was saying: "If you *have* to work around an unfortunate CTFE limitation, like the missing x^^y, then it's not hard to do so *without* all that manual work you suggested".
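For instance, a CTFE-friendly stand-in for the integer-exponent case is all of a few lines (just a sketch, assuming an integer exponent is what you need):

  // Works fine in CTFE, unlike the troublesome x ^^ y.
  T powi(T)(T x, uint y)
  {
      T result = 1;
      foreach (i; 0 .. y)
          result *= x;
      return result;
  }

  static assert(powi(2.0, 10) == 1024.0); // forces compile-time evaluation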


> made awkward by the fact that build systems are hard,

The decent ones aren't. (And if you happen to be stuck with MSBuild, well, then my sincere condolences. I periodically use Unity3D and I wish soooo much it wasn't tied to MSBuild...or CLR for that matter, but I digress...frequently ;))

Frankly, if a buildsystem makes doing XYZ (ex: "executing a CLI command upon build") harder than it would be in a shellscript, then the given buildsystem sucks and you may as well replace it with a plain old script.


> and a user of a lib with such a build requirement inherits that baggage
> into their project.

Meh, it's about half an ounce of baggage for the user. (But again, yes, CTFE is still better...at least when it doesn't slow down their build too much.) And it's zero baggage if the generated files aren't dependent on the user's code, though I see now (below) that's not the case for your project.


> It just occurred to me too that it's not even that simple. The
> instantiation sites (which are in user code) dictate what tables need
> to be emitted. It's not feasible to generate all possible tables;
> there's a combinatorial explosion of possible inputs. I instantiate
> only the tables the user needs, on demand. It's impossible to
> pre-generate 'all' tables; there's no such quantity.

I guess that does complicate it somewhat (and again, to be clear, being able to just do CTFE would obviously be far better than this) but FWIW, that still might not be difficult to overcome, depending on the exact nature of the problem: Whatever inputs are necessary for table generation, let the user specify them as cmdline args to your generator tool. Again, not ideal, but a perfectly feasible workaround in a pinch, and doesn't require abandoning all the *other* benefits of D.
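Something along these lines (entirely hypothetical names, and trivially simplified):

  // gentables.d -- the user's buildscript passes in whatever
  // parameters their instantiations need, e.g.:
  //   rdmd gentables.d double 512 > tables.d
  import std.conv;
  import std.stdio;

  void main(string[] args)
  {
      immutable type = args[1];
      immutable size = args[2].to!size_t;

      writefln("immutable %s[%s] table = [", type, size);
      foreach (i; 0 .. size)
          writefln("    %s,", entry(i)); // your real table logic here
      writeln("];");
  }

  double entry(size_t i) { return i * 0.5; } // stand-in computation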
March 24, 2018
On 3/24/2018 9:37 AM, Manu wrote:
> I'm not sure why I seem to have to defend the idea that it's a *great
> thing* that D (in theory; according to the advertising brochure) does
> away with these requirements.

It is indeed a great idea. I'm just making the case that not having it isn't a blocker; it's an inconvenience.

It's like how you can code anything in C, even OOP. It's just inconvenient, tedious, and error-prone to do so.