June 14, 2011
"Andrei Alexandrescu" <SeeWebsiteForEmail@erdani.org> wrote in message news:it7pd2$2m07$1@digitalmars.com...
> http://www.wikiservice.at/d/wiki.cgi?LanguageDevel/DIPs/DIP11
>
> Destroy.
>

After all that talk about how we need to be very cautious about adding new features to the compiler and work with the existing language whenever possible, only a few days later we're seriously considering adding an entire *build system* to the compiler? And let's not fool ourselves: in order for this not to be half-baked, it would have to completely take over all the roles handled by a full-featured build-and-package-management system.

Just off the top of my head:

- Putting it in the compiler forces it all to be written in C++. As an external tool, we could use D.

- By default, it ends up downloading an entire library one inferred source file at a time. Why? Libraries are a packaged whole; standard behavior should be to treat them as such.

- Are we abandoning zdmd now? (Or is it "dmdz"?)

- Does it automatically *compile* the files it downloads or merely use them to satisfy imports? If the latter, then the whole proposal becomes pointless - you'll just need to tie it in with RDMD anyway, so you may as well just keep it outside the compiler. If the former, then you're implicitly having DMD creep into RDMD's territory - So either be explicit about it and take it all the way putting all of rdmd into there, or get rid of it and let the build tools handle package-management matters.

- Does every project that uses libX have to download it separately? If not (or really even if so), how does the compiler handle different versions of the lib and prevent "dll hell"? Versioning seems to be an afterthought in this DIP - and that's a guaranteed way to eventually find yourself in dll hell.

- How do you tell it to "update libX"? Not by expecting the user to manually clear the cache, I hope.

- With a *real* package management tool, you'd have a built-in (and configurable) list of central data sources. If you want to use something you don't have installed, and it exists in one of the stores (maybe even one of the built-in ones), you don't have to edit *ANYTHING AT ALL*. It'll just grab it, no changes to your source needed at all, and any custom steps needed would be automatically handled. And if it was only in a data store that you didn't already have in your list, all you have to do is add *one* line. Which is just as easy as the DIP, but that *one* step will also suffice for any other project that needs libX - no need to add the line for *each* of your libX-using projects. Heck, you wouldn't even need to edit a file, just do "package-tool addsource http://...". The DIP doesn't even remotely compare.
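For concreteness, the "one command, zero per-project edits" workflow described above could look something like the following sketch. Everything here is hypothetical (the sources file location, the built-in store URL, the `addsource` behavior); it only illustrates why a globally configurable source list makes per-project editing unnecessary.

```python
# Minimal sketch of a configurable list of central data sources.
# All names (sources file, built-in store, addsource) are hypothetical.
import json
import os

SOURCES_FILE = os.path.expanduser("~/.package-tool/sources.json")

def load_sources():
    """Return the configured data sources, falling back to built-in defaults."""
    defaults = ["http://central.example.org/packages"]  # hypothetical built-in store
    if os.path.exists(SOURCES_FILE):
        with open(SOURCES_FILE) as f:
            return json.load(f)
    return defaults

def addsource(url):
    """Equivalent of 'package-tool addsource http://...': one global change,
    shared by every project that needs a lib from that store."""
    sources = load_sources()
    if url not in sources:
        sources.append(url)
        os.makedirs(os.path.dirname(SOURCES_FILE), exist_ok=True)
        with open(SOURCES_FILE, "w") as f:
            json.dump(sources, f)
    return sources
```

The point is that `addsource` runs once per data store, not once per project; every libX-using project then resolves against the same list.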

- I think you're severely overestimating the number of extra dmd invocations that would be needed by using an external build tool. I believe this is because your idea centers around discovering one file at a time instead of properly handling packages at the *package* level. Consider this:

You tell BuildToolX to build MyApp. It looks at MyApp.config to see what libs it needs. It discovers LibX is needed. It fetches LibX.config, and finds its dependencies. Etc., building up a dependency graph. It checks for any problems with the dependency graph before doing any real work (something the DIP can't do). Then it downloads the libs, and *maybe* runs some custom setup on each one. If the libs don't have any custom setup, you only have *one* DMD invocation (two if you use RDMD). If the libs do have any custom setup, and it involves running dmd, then that *only* happens the first time you build MyApp (until you update one of the libs, causing its one-time setup to run once more).
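The flow above can be sketched in a few lines. This only illustrates the "check the whole dependency graph before doing any real work" point; the config format and the `fetch_config` stand-in are hypothetical, not part of any actual tool.

```python
# Sketch of BuildToolX's resolution step: build the full dependency graph
# from per-library config metadata first, and detect problems (cycles,
# missing libs) before downloading or compiling anything.

def fetch_config(name, registry):
    """Stand-in for downloading LibX.config; here it reads from a dict."""
    if name not in registry:
        raise KeyError("unknown library: " + name)
    return registry[name]

def resolve(root, registry):
    """Return a topologically ordered build list, or raise on a cycle."""
    order, visiting, done = [], set(), set()

    def visit(name):
        if name in done:
            return
        if name in visiting:
            raise ValueError("dependency cycle at " + name)
        visiting.add(name)
        for dep in fetch_config(name, registry)["deps"]:
            visit(dep)
        visiting.discard(name)
        done.add(name)
        order.append(name)

    visit(root)
    return order  # deps first, root last: now download/build in one pass
```

Only after `resolve` succeeds does the tool do any downloading, so a broken graph costs zero network traffic and zero compiler runs.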

I think this proposal is a hasty idea that just amounts to chasing after "the easy way out".



June 14, 2011
"Nick Sabalausky" <a@a.a> wrote in message news:it8kkv$20hr$1@digitalmars.com...
> "Andrei Alexandrescu" <SeeWebsiteForEmail@erdani.org> wrote in message news:it7pd2$2m07$1@digitalmars.com...
>> http://www.wikiservice.at/d/wiki.cgi?LanguageDevel/DIPs/DIP11
>>
>> Destroy.
>>
>
> After all that talk about how we need to be very cautious about adding new features to the compiler and work with the existing language whenever possible, only a few days later we're seriously considering adding an entire *build system* to the compiler? And let's not fool ourselves: in order for this not to be half-baked, it would have to completely take over all the roles handled by a full-featured build-and-package-management system.
>
> Just off the top of my head:
>
> - Putting it in the compiler forces it all to be written in C++. As an external tool, we could use D.
>
> - By default, it ends up downloading an entire library one inferred source file at a time. Why? Libraries are a packaged whole; standard behavior should be to treat them as such.
>
> - Are we abandoning zdmd now? (Or is it "dmdz"?)
>
> - Does it automatically *compile* the files it downloads or merely use them to satisfy imports? If the latter, then the whole proposal becomes pointless - you'll just need to tie it in with RDMD anyway, so you may as well just keep it outside the compiler. If the former, then you're implicitly having DMD creep into RDMD's territory - So either be explicit about it and take it all the way putting all of rdmd into there, or get rid of it and let the build tools handle package-management matters.
>
> - Does every project that uses libX have to download it separately? If not (or really even if so), how does the compiler handle different versions of the lib and prevent "dll hell"? Versioning seems to be an afterthought in this DIP - and that's a guaranteed way to eventually find yourself in dll hell.
>
> - How do you tell it to "update libX"? Not by expecting the user to manually clear the cache, I hope.
>
> - With a *real* package management tool, you'd have a built-in (and configurable) list of central data sources. If you want to use something you don't have installed, and it exists in one of the stores (maybe even one of the built-in ones), you don't have to edit *ANYTHING AT ALL*. It'll just grab it, no changes to your source needed at all, and any custom steps needed would be automatically handled. And if it was only in a data store that you didn't already have in your list, all you have to do is add *one* line. Which is just as easy as the DIP, but that *one* step will also suffice for any other project that needs libX - no need to add the line for *each* of your libX-using projects. Heck, you wouldn't even need to edit a file, just do "package-tool addsource http://...". The DIP doesn't even remotely compare.
>
> - I think you're severely overestimating the number of extra dmd invocations that would be needed by using an external build tool. I believe this is because your idea centers around discovering one file at a time instead of properly handling packages at the *package* level. Consider this:
>
> You tell BuildToolX to build MyApp. It looks at MyApp.config to see what libs it needs. It discovers LibX is needed. It fetches LibX.config, and finds its dependencies. Etc., building up a dependency graph. It checks for any problems with the dependency graph before doing any real work (something the DIP can't do). Then it downloads the libs, and *maybe* runs some custom setup on each one. If the libs don't have any custom setup, you only have *one* DMD invocation (two if you use RDMD). If the libs do have any custom setup, and it involves running dmd, then that *only* happens the first time you build MyApp (until you update one of the libs, causing its one-time setup to run once more).
>

Also, if you do want to throw away the "*.config" file (which might not be a good idea) and truly have "no editing needed" by inferring library dependencies from dmd's deps output, you still don't need a lot of extra dmd invocations: just one extra deps-gathering invocation each time a deps-gathering invocation finds unsatisfied dependencies, and *only* the first time you build.
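As a rough sketch of that loop (the `gather_deps` and `fetch_lib` callbacks stand in for a `dmd` deps-output pass and the download step; both are hypothetical):

```python
# Inference loop: run a deps-gathering pass, map any unresolved imports to
# libraries, fetch them, and repeat until a pass finds nothing missing.

def build_with_inference(gather_deps, fetch_lib):
    """Returns the number of deps-gathering passes: one extra pass per round
    of newly discovered unsatisfied dependencies, and only on the first build."""
    passes = 0
    while True:
        passes += 1
        missing = gather_deps()  # imports that no local module satisfies
        if not missing:
            return passes
        for lib in missing:
            fetch_lib(lib)
```

Once the cache is warm, `gather_deps` finds nothing missing on the first pass, so subsequent builds pay no extra invocations at all.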

> I think this proposal is a hasty idea that just amounts to chasing after "the easy way out".
>
>
> 


June 14, 2011
On 6/14/11 4:38 PM, Nick Sabalausky wrote:
> - Putting it in the compiler forces it all to be written in C++. As an
> external tool, we could use D.

Having the compiler communicate with a download tool supplied with the distribution seems to be a very promising approach that would address this concern.

> - By default, it ends up downloading an entire library one inferred source
> file at a time. Why? Libraries are a packaged whole. Standard behavior
> should be for libraries should be treated as such.

Fair point, though in fact the effect is that one ends up downloading exactly the modules used from that library (and potentially from others).

Although it may seem that libraries are packaged as a whole, that view ignores the interdependencies across them. This proposal solves the interdependencies organically.

> - Are we abandoning zdmd now? (Or is it "dmdz"?)

It is a related topic. That project, although it has been implemented, unfortunately hasn't captured people's interest.

> - Does it automatically *compile* the files it downloads or merely use them
> to satisfy imports?

We need to arrange things such that the downloaded files are also compiled and linked together with the project.

> - Does every project that uses libX have to download it separately? If not
> (or really even if so), how does the compiler handle different versions of
> the lib and prevent "dll hell"? Versioning seems to be an afterthought in
> this DIP - and that's a guaranteed way to eventually find yourself in dll
> hell.

Versioning is a policy matter that can, I think, be addressed within the URL structure. This proposal tries to support versioning without explicitly imposing it or standing in its way.

> - How do you tell it to "update libX"? Not by expecting the user to manually
> clear the cache, I hope.

The external tool that would work in conjunction with dmd could have such a flag.

> - With a *real* package management tool, you'd have a built-in (and
> configurable) list of central data sources.

I don't see why you can't have that with this approach too.

> If you want to use something you
> don't have installed, and it exists in one of the stores (maybe even one of
> the built-in ones), you don't have to edit *ANYTHING AT ALL*. It'll just
> grab it, no changes to your source needed at all, and any custom steps
> needed would be automatically handled. And if it was only in a data store
> that you didn't already have in your list, all you have to do is add *one*
> line. Which is just as easy as the DIP, but that *one* step will also
> suffice for any other project that needs libX - no need to add the line for
> *each* of your libX-using projects. Heck, you wouldn't even need to edit a
> file, just do "package-tool addsource http://...". The DIP doesn't even
> remotely compare.

I think it does. Clearly a command-line equivalent for the pragma needs to exist, and the appropriate pragmas can be added to dmd.conf. With the appropriate setup, a program would just issue:

import dsource.libX;

and get everything automatically.

> - I think you're severely overestimating the amount of extra dmd-invokations
> that would be needed by using an external build tool.

I'm not estimating much. It's Adam who shared impressions from actual use.

> I beleive this is
> because your idea centers around discovering one file at a time instead of
> properly handling packages at the *package* level.

The issue with package-level handling is that HTTP does not provide a way to list the files in a directory. However, if we arrange to support zip files, the tool could detect that a zip file is at the location of the package and download it entirely.
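That detection could be as simple as a HEAD request probing for an archive at the package's URL. A minimal sketch, assuming a hypothetical URL layout where dotted package names map to directories (no real server is implied here):

```python
# Probe whether a zip archive sits at the package URL, so the whole package
# can be fetched in one request instead of file-by-file over plain HTTP.
import urllib.error
import urllib.request

def package_url(base, package):
    """Map a dotted package name onto the hypothetical URL layout."""
    return base.rstrip("/") + "/" + package.replace(".", "/")

def find_archive(base, package):
    """Return the zip URL if the server offers one for this package, else None."""
    url = package_url(base, package) + ".zip"
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            if resp.status == 200:
                return url  # grab the whole package in one request
    except (urllib.error.HTTPError, urllib.error.URLError):
        pass  # no archive there: fall back to per-module downloads
    return None
```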

> Consider this:
>
> You tell BuildToolX to build MyApp. It looks at MyApp.config to see what
> libs it needs. It discovers LibX is needed. It fetches LibX.config, and
> finds it's dependencies. Etc, building up a dependency graph. It checks for
> any problems with the dependency graph before doing any real work (something
> the DIP can't do). Then it downloads the libs, and *maybe* runs some custom
> setup on each one. If the libs don't have any custom setup, you only have
> *one* DMD invokation (two if you use RDMD). If the libs do have any custom
> setup, and it involves running dmd, then that *only* happens the first time
> you build MyApp (until you update one of the libs, causing it's one-time
> setup to run once more).
>
> I think this proposal is a hasty idea that just amounts to chasing after
> "the easy way out".

I'm just trying to define a simple backend that facilitates sharing code and the use of shared code, without arrogating the role and merits of a more sophisticated package management tool and without standing in the way of one. Ideally, the backend should be useful to such a tool - e.g. I imagine a tool could take a plain file format and transform it into a series of pragmas directing library locations.

As always, criticism is appreciated, particularly of the kind that prompts pushing things forward - as was the case with the idea of a download tool that's a separate executable, companion to dmd.


Thanks,

Andrei
June 14, 2011
Am 14.06.2011 22:22, schrieb Andrei Alexandrescu:
> It's not hard, in fact that's almost how we want to implement it: a straight function that wraps a call to wget. The difference is that you'd migrate the function into a separate utility, and I think that's a good idea. (Walter prefers it inside the compiler.)

It even seems possible that this 'plug-in' tool could become a source code provider which not only downloads the files for dmd to read, but pipes their content directly back into the compiler. I can only begin to imagine what possibilities this may offer.

June 14, 2011
What is dmdz?
June 14, 2011
On 14/06/2011 14:53, Andrei Alexandrescu wrote:
> http://www.wikiservice.at/d/wiki.cgi?LanguageDevel/DIPs/DIP11
>
> Destroy.
>
>
> Andrei

More thoughts:

* The compiler should be a compiler
* Adding this makes the compiler a downloader and a compiler
* If the compiler is a downloader, it should also be a builder and a
  package manager
* Compiler now contains more C++ code that isn't to do with compiling.

The DIP mentions speed as a reason to integrate it into the compiler rather than have it separate. How about making dmd a bit more modular if you want the speed of having it in the compiler? Make a dmd library, the compiler can just be a main wrapper around it. This way:

* The downloader/builder/package manager can be separate
* The tool can be written in D, and still have the speed you want
* It paves the way for other tools that could use dmd as a library

The way I see this proposal is as a response to cpan/gem/pecl etc, a much needed package manager for D. I don't believe integrating it into the compiler is the right way to go, nor do I believe that a pragma is the right way to do it - I even refuse to use pragma(lib) as it doesn't work with incremental compilation - this wouldn't either.

-- 
Robert
http://octarineparrot.com/
June 14, 2011
On Jun 14, 2011, at 2:56 PM, Andrei Alexandrescu wrote:
> 
> Versioning is a policy matter that can, I think, be addressed within the URL structure. This proposal tries to support versioning without explicitly imposing it or standing in its way.

For now anyway.  If we want to support using multiple versions of the same lib in an app then some more thought will have to go into how this will work.
June 14, 2011
On 6/14/11 6:21 PM, Robert Clipsham wrote:
> On 14/06/2011 14:53, Andrei Alexandrescu wrote:
>> http://www.wikiservice.at/d/wiki.cgi?LanguageDevel/DIPs/DIP11
>>
>> Destroy.
>>
>>
>> Andrei
>
> More thoughts:
>
> * The compiler should be a compiler
> * Adding this makes the compiler a downloader and a compiler
> * If the compiler is a downloader, it should also be a builder and a
> package manager
> * Compiler now contains more C++ code that isn't to do with compiling.

All of these issues seem to be addressed by the emerging idea that dmd should cooperate with a companion binary that effects the downloading.

> The DIP mentions speed as a reason to integrate it into the compiler
> rather than have it separate. How about making dmd a bit more modular if
> you want the speed of having it in the compiler? Make a dmd library, the
> compiler can just be a main wrapper around it. This way:
>
> * The downloader/builder/package manager can be separate
> * The tool can be written in D, and still have the speed you want
> * It paves the way for other tools that could use dmd as a library
>
> The way I see this proposal is as a response to cpan/gem/pecl etc, a
> much needed package manager for D. I don't believe integrating it into
> the compiler is the right way to go, nor do I believe that a pragma is
> the right way to do it - I even refuse to use pragma(lib) as it doesn't
> work with incremental compilation - this wouldn't either.

The notion that the compiler communicates pragmas to its separate package manager during compilation - would that float your boat?


Andrei
June 14, 2011
On Tue, Jun 14, 2011 at 5:25 PM, Andrej Mitrovic <andrej.mitrovich@gmail.com
> wrote:

> What is dmdz?
>

http://www.digitalmars.com/d/archives/digitalmars/D/dmdz_107472.html http://www.digitalmars.com/d/archives/digitalmars/D/dmdz_take_2_110937.html


June 15, 2011
Nick Sabalausky wrote:
> Just one extra deps-gathering invocation each time a deps-gathering invocation finds unsatisfied dependencies, and *only* the first time you build.

It could probably cache the last successful command...