June 15, 2011
I left this thought half-finished. Where would it cache? What if there are several different projects or configurations in one folder?

If redoing the process just works each time, it really simplifies those scenarios.
June 15, 2011
Sean Kelly wrote:
> If we want to support using multiple versions of the
> same lib in an app then some more thought will have to go into how
> this will work.

Versioning is absolutely trivial if the version is part of the module name.

This is a better way to do it long term too; it's future proof. Download any module, no matter how old, and it will still compile.

To update your code to the new version, grep for "import foo_1", replace each hit with "import foo_2", and recompile. This is probably less effort than actually updating your code to use version 2!
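
To make that concrete, here's a minimal sketch of what version-in-the-name imports could look like (only the foo_1/foo_2 names come from above; the module contents are made up):

// foo_1/bar.d - the major version is part of the package name
module foo_1.bar;

// app.d - pinned to version 1; old code keeps compiling forever
import foo_1.bar;

// Upgrading is a mechanical, whole-project rename:
// import foo_2.bar;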
June 15, 2011
Nick Sabalausky:
> - By default, it ends up downloading an entire library one inferred source file at a time. Why? Libraries are a packaged whole. Standard behavior should be for libraries to be treated as such.

I don't agree. You don't import a library - you import a module. It's natural to just download that module and get what you need that way.

> Does every project that uses libX have to download it separately?

My approach is to download the libraries to a local subdirectory.

$ cd foo
$ dir
   app.d  # btw, app.d uses "import foo.bar;"
$ build app
$ dir
   app  app.o  app.d  foo/


If you want to share a library, you can link the local subdir to a central lib dir using your operating system's features. (symlinks, junctions, whatever)

I'm not sure what Andrei had in mind, but I like my approach because it's easy to implement, clear to see what it is actually doing, and packing your application for distribution is as simple as zipping up the directory. Dependencies included automatically.

> It'll just grab it, no changes to your source needed at all, and any custom steps needed would be automatically handled

My approach again allows a central repo, which may direct you elsewhere using standard http.

It builds the default URL as:

http://centraldomain.com/repository/package/module.d


I think the DIP should do this too if a liburl is not specified.
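
For illustration, the mapping could be as simple as this sketch (the centraldomain.com host is the placeholder from above; the function itself is hypothetical):

// Hypothetical sketch: derive the default download URL from a module name.
import std.array : replace;

string defaultUrl(string moduleName)
{
    // "foo.bar" -> "http://centraldomain.com/repository/foo/bar.d"
    return "http://centraldomain.com/repository/"
        ~ moduleName.replace(".", "/") ~ ".d";
}

unittest
{
    assert(defaultUrl("foo.bar")
        == "http://centraldomain.com/repository/foo/bar.d");
}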
June 15, 2011
"Adam D. Ruppe" <destructionator@gmail.com> wrote in message news:it91b0$aa0$1@digitalmars.com...
> Nick Sabalausky wrote:
>> Just one extra deps-gathering invocation each time a deps-gathering invocation finds unsatisfied dependencies, and *only* the first time you build.
>
> It could probably cache the last successful command...

Nothing would need to be cached. After the initial "gather everything and build" build, all it would ever have to do is exactly what RDMD already does right now: Run DMD once to find the deps, check them to see if anything needs rebuilding, and if so, run DMD the second time to build. There'd never be any need for more than those two invocations (and the first one tends to be much faster anyway) until a new library dependency is introduced.
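
For reference, here's a rough sketch of that two-invocation scheme, assuming a Phobos with std.process.executeShell and dmd's -deps= switch (the parsing is deliberately crude; a real tool would dedupe the list and filter out druntime/Phobos modules):

import std.algorithm.searching : any;
import std.array : join;
import std.file : exists, readText, timeLastModified;
import std.process : executeShell;
import std.string : lastIndexOf, splitLines;

void main()
{
    // Pass 1: have DMD emit the *entire* dependency graph, no codegen.
    executeShell("dmd -deps=deps.txt -o- app.d");

    // Each deps line looks roughly like:
    //   app (app.d) : private : foo (foo.d)
    // Crudely grab the last parenthesized file name on each line.
    string[] deps;
    foreach (line; readText("deps.txt").splitLines)
    {
        auto open = line.lastIndexOf('(');
        auto close = line.lastIndexOf(')');
        if (open >= 0 && close > open)
            deps ~= line[open + 1 .. close];
    }

    // Pass 2: rebuild only if some dependency is newer than the binary.
    if (!exists("app")
        || deps.any!(f => timeLastModified(f) > timeLastModified("app")))
        executeShell("dmd app.d " ~ deps.join(" "));
}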


June 15, 2011
> After the initial "gather everything and
> build" build, all it would ever have to do is exactly what RDMD
> already does right now: Run DMD once to find the deps, check them
> to see if anything needs rebuilding, and if so, run DMD the second
> time to build.

Does rdmd handle cases where the dependencies have dependencies?

Suppose app.d imports foo.d which imports bar.d

dmd app.d
can't find module foo.d

retry:

dmd app.d foo.d
can't find module bar.d

try again:

dmd app.d foo.d bar.d

success.


Is it possible to cut out any one of those steps without caching that third dmd line? Until you try to compile foo.d, it can't know bar.d is required...
June 15, 2011
"Andrei Alexandrescu" <SeeWebsiteForEmail@erdani.org> wrote in message news:4DF7D92A.8050606@erdani.org...
> On 6/14/11 4:38 PM, Nick Sabalausky wrote:
>> - Putting it in the compiler forces it all to be written in C++. As an external tool, we could use D.
>
> Having the compiler communicate with a download tool supplied with the distribution seems to be a very promising approach that would address this concern.
>

A two-way "compiler <-> build tool" channel is messier than "build tool invokes compiler", and I don't really see much benefit.

>> - By default, it ends up downloading an entire library one inferred
>> source file at a time. Why? Libraries are a packaged whole. Standard
>> behavior should be for libraries to be treated as such.
>
> Fair point, though in fact the effect is that one ends up downloading exactly the used modules from that library and potentially others.
>

I really don't see a problem with that. And you'll typically end up needing most, if not all, anyway. It's very difficult to see this as an actual drawback.

> Although it may seem that libraries are packaged as a whole, that view ignores the interdependencies across them. This proposal solves the interdependencies organically.
>

How does my proposal not handle that? I think it does.

>> - Are we abandoning zdmd now? (Or is it "dmdz"?)
>
> It is a related topic. That project, although it has been implemented, hasn't unfortunately captured the interest of people.
>

Not surprising since there's been very little mention of it. In fact, I've been under the impression that it wasn't even finished. Is this not so? If it is done, I bet I'm not the only one that didn't know. Plus, I bet most people aren't even aware of it at all. RDMD gets trotted out and promoted *far* more often and I come across a lot of D users (usually newbies) who aren't even aware of *it*.

>> - Does it automatically *compile* the files it downloads or merely use
>> them to satisfy imports?
>
> We need to arrange things such that the downloaded files are also compiled and linked together with the project.
>

And that's awkward under the model you're proposing. But by handling package management in a separate tool, it's a non-issue.

>> - Does every project that uses libX have to download it separately? If
>> not (or really even if so), how does the compiler handle different
>> versions of the lib and prevent "dll hell"? Versioning seems to be an
>> afterthought in this DIP - and that's a guaranteed way to eventually
>> find yourself in dll hell.
>
> Versioning is a policy matter that can, I think, be addressed within the URL structure. This proposal tries to support versioning without explicitly imposing it or standing in its way.
>

That's exactly my point. If you leave it open like that, everyone will come up with their own way to do it, many will not even give it any attention at all, and most of those approaches will end up being wrong WRT avoiding dll hell. Hence, dll hell will get in and library users will end up having to deal with it. The only way to avoid it is to design it out of the system up front *by explicitly imposing it*.

>> - How do you tell it to "update libX"? Not by expecting the user to
>> manually clear the cache, I hope.
>
> The external tool that would work in conjunction with dmd could have such a flag.
>

That's a messier solution than what I outlined.

>> - With a *real* package management tool, you'd have a built-in (and configurable) list of central data sources.
>
> I don't see why you can't have that with this approach too.
>

The problem is you end up having both. One of them, the default one, is a mess and shouldn't really be used, and the other is what you'd already get anyway with a real package management tool.

>> If you want to use something you don't have installed, and it exists in
>> one of the stores (maybe even one of the built-in ones), you don't have
>> to edit *ANYTHING AT ALL*. It'll just grab it, no changes to your
>> source needed at all, and any custom steps needed would be
>> automatically handled. And if it was only in a data store that you
>> didn't already have in your list, all you have to do is add *one*
>> line. Which is just as easy as the DIP, but that *one* step will also
>> suffice for any other project that needs libX - no need to add the
>> line for *each* of your libX-using projects. Heck, you wouldn't even
>> need to edit a file, just do "package-tool addsource http://...". The
>> DIP doesn't even remotely compare.
>
> I think it does. Clearly a command-line equivalent for the pragma needs to exist, and the appropriate pragmas can be added to dmd.conf. With the appropriate setup, a program would just issue:
>
> using dsource.libX;
>
> and get everything automatically.
>

The approach in the DIP encourages such things to not be used and leaves them as afterthoughts. I think this is backwards.

>> - I think you're severely overestimating the number of extra DMD
>> invocations that would be needed by using an external build tool.
>
> I'm not estimating much. It's Adam who shared impressions from actual use.
>
>> I believe this is because your idea centers around discovering one
>> file at a time instead of properly handling packages at the *package*
>> level.
>
> The issue with package-level is that http does not have a protocol for listing files in a directory. However, if we arrange to support zip files, the tool could detect that a zip file is at the location of the package and download it entirely.
>

There is no need to deal with individual files. Like I've said, that's the wrong level to be dealing with this anyway.

>> Consider this:
>>
>> You tell BuildToolX to build MyApp. It looks at MyApp.config to see
>> what libs it needs. It discovers LibX is needed. It fetches
>> LibX.config, and finds its dependencies. Etc, building up a dependency
>> graph. It checks for any problems with the dependency graph before
>> doing any real work (something the DIP can't do). Then it downloads
>> the libs, and *maybe* runs some custom setup on each one. If the libs
>> don't have any custom setup, you only have *one* DMD invocation (two
>> if you use RDMD). If the libs do have any custom setup, and it
>> involves running dmd, then that *only* happens the first time you
>> build MyApp (until you update one of the libs, causing its one-time
>> setup to run once more).
>>
>> I think this proposal is a hasty idea that just amounts to chasing after "the easy way out".
>
> I'm just trying to define a simple backend that facilitates sharing code and using of shared code, without arrogating the role and merits of a more sophisticated package management tool and without standing in the way of one. Ideally, the backend should be useful to such a tool - e.g. I imagine a tool could take a plain file format and transform it into a series of pragmas directing library locations.
>

I appreciate the motivation behind it, but I see the whole approach as:

1. Not really helping a package management tool, and likely even getting in its way.

2. Encouraging people to use a dangerously ad-hoc "package management" instead of a proper fully-thought-out one.

I see this as adding more to the language/compiler in order to make the wrong things easier.

> As always, criticism is appreciated, particularly of the kind that prompts pushing things forward - as was the case with the idea of a download tool that's a separate executable, companion to dmd.
>

Maybe I'm tired, or maybe it's just the unfortunate nature of text, but I can't tell if you're saying you appreciate the criticism I've given here or implying that you want better criticism than what I've given...?


June 15, 2011
"Adam D. Ruppe" <destructionator@gmail.com> wrote in message news:it93ah$ekb$1@digitalmars.com...
>> After the initial "gather everything and
>> build" build, all it would ever have to do is exactly what RDMD
>> already does right now: Run DMD once to find the deps, check them
>> to see if anything needs rebuilding, and if so, run DMD the second
>> time to build.
>
> Does rdmd handle cases where the dependencies have dependencies?
>
> Suppose app.d imports foo.d which imports bar.d
>
> dmd app.d
> can't find module foo.d
>
> retry:
>
> dmd app.d foo.d
> can't find module bar.d
>
> try again:
>
> dmd app.d foo.d bar.d
>
> success.
>
>
> Is it possible to cut out any one of those steps without caching that third dmd line? Until you try to compile foo.d, it can't know bar.d is required...

RDMD never needs to invoke DMD more than twice. Once to find "all" the dependencies, and once to do the actual compile. When DMD is run to find a file's dependencies, it finds the *entire* dependency graph, not just one step of it.
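
To make that concrete with Adam's example: assuming foo.d and bar.d are already somewhere DMD can read them (i.e. on the import path), a single dependency-gathering pass surfaces the transitive import too. One way to gather them is dmd's -deps= switch:

dmd -deps=deps.txt -o- app.d

deps.txt then contains lines roughly like this (format abridged):

   app (app.d) : private : foo (foo.d)
   foo (foo.d) : private : bar (bar.d)

So bar.d shows up even though app.d never imports it directly.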



June 15, 2011
"Adam D. Ruppe" <destructionator@gmail.com> wrote in message news:it91me$b2g$1@digitalmars.com...
> Sean Kelly wrote:
>> If we want to support using multiple versions of the
>> same lib in an app then some more thought will have to go into how
>> this will work.
>
> Versioning is absolutely trivial if the version is part of the module name.
>
> This is a better way to do it long term too; it's future proof. Download any module, no matter how old, and it will still compile.
>
> To update your code to the new version, grep for "import foo_1", replace each hit with "import foo_2", and recompile. This is probably less effort than actually updating your code to use version 2!

First of all, that can't be automated reliably (at least not within reason). Suppose there are two spaces. Or (heaven forbid!) a tab. Or anything else non-standard that you just happened not to take into account in your grep line. Such as mixin-generated imports, which are only going to be more common now that they're going to be usable inside functions. Plus, not everyone's a command-line whiz, which makes that a lot of manual tedium for them.

Second, you shouldn't need to edit code to compile against a different version of a lib.

I'm not *necessarily* opposed to the idea of versions being part of the name (although it does prevent you from reliably doing an ordered comparison of versions unless you have a standardized naming scheme - in which case it's effectively not really part of the name anyway). But it shouldn't seep into the user code unless the user of the library wants it to.
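
For what it's worth, under a standardized suffix scheme the ordered comparison becomes easy - which is exactly why the version then stops being "part of the name" in any meaningful sense. A hypothetical sketch (the name_MAJOR convention is assumed, not established anywhere):

// Compare packages by numeric suffix, assuming a name_MAJOR convention.
import std.conv : to;
import std.string : lastIndexOf;

int versionOf(string pkg)
{
    auto i = pkg.lastIndexOf('_');
    return (i < 0) ? 0 : pkg[i + 1 .. $].to!int;
}

unittest
{
    assert(versionOf("foo_10") > versionOf("foo_2"));
    // Plain lexical comparison gets this wrong:
    assert("foo_10" < "foo_2");
}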



June 15, 2011
Nick Sabalausky wrote:
> RDMD never needs to invoke DMD more than twice.

rdmd also doesn't attempt to download libraries.
June 15, 2011
"Adam D. Ruppe" <destructionator@gmail.com> wrote in message news:it927g$c7c$1@digitalmars.com...
> Nick Sabalausky:
>> - By default, it ends up downloading an entire library one inferred source file at a time. Why? Libraries are a packaged whole. Standard behavior should be for libraries to be treated as such.
>
> I don't agree. You don't import a library - you import a module. It's natural to just download that module and get what you need that way.
>

You import a module *from* a library. Even if you only import one module, that module is likely going to import others from the same lib, which may import others still, and chances are you'll end up needing most of the modules anyway.

Also, if a library needs any special "setup" step, then this won't even work anyway.

Plus I see no real benefit to being able to have a "partial" library installation.

>> Does every project that uses libX have to download it separately?
>
> My approach is to download the libraries to a local subdirectory.
>
> $ cd foo
> $ dir
>   app.d  # btw, app.d uses "import foo.bar;"
> $ build app
> $ dir
>   app  app.o  app.d  foo/
>
>
> If you want to share a library, you can link the local subdir to a central lib dir using your operating system's features. (symlinks, junctions, whatever)
>

I think a substantial number of people (*especially* Windows users - it's unrealistic to expect Windows users to use anything like junctions) would expect to be able to use an already-installed library without special setup for every single project that uses it.

And here's a real killer: If someone downloads your lib, or the source for your app, should they *really* be expected to wire up all your lib's/app's dependencies manually? This works against the whole point of easy package management.

> [...] packing your application for distribution is as simple as zipping up the directory. Dependencies included automatically.
>

That can't always be done, shouldn't always be done, and not everyone wants to. There *are* benefits to packages being independent, but this throws them away. Yes, there are downsides to not having dependencies included automatically, but those are already solved by a good package management system.

>> It'll just grab it, no changes to your source needed at all, and any custom steps needed would be automatically handled
>
> My approach again allows a central repo, which may direct you elsewhere using standard http.
>
> It builds the default URL as:
>
> http://centraldomain.com/repository/package/module.d
>
>
> I think the DIP should do this too if a liburl is not specified.

A central repo, per se, isn't really a good idea. What there should be is a standard built-in list of official repos (even if there's initially only one). Then others can be added. The system shouldn't have a single point of failure built in.

I think things like apt-get and 0install are very good models for us to follow. In fact, we should probably think about whether we want to actually just *use* 0install, either outright or behind-the-scenes.