June 15, 2011 Re: DIP11: Automatic downloading of libraries
Posted in reply to Adam D. Ruppe | On 6/15/11 10:52 AM, Adam D. Ruppe wrote:
> It occurs to me that you ought to treat network things identically
> to local modules in every way...
>
> dmd app.d ../library.d
>
> just works. I propose:
>
> dmd app.d http://whatever.com/library.d
>
> should just work too - dmd would need only to recognize module name
> starts with xxxx:// and pass it to the dget program to translate.
Thought of that, too, and also of the idea from two posters in this thread of having dget pipe the module to stdout.
The main issue we need to address is what __FILE__ is for such modules and how we point users to the location of possible compilation and runtime errors.
Andrei
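
A minimal sketch of the handoff Adam describes, assuming a helper named dget and a calling convention in which it prints the local path of the fetched file - both the helper and that convention are assumptions, not an existing interface:

import std.algorithm.searching : canFind;
import std.process : execute;
import std.string : strip;

// Resolve one command-line argument: anything that looks like a url
// ("xxxx://...") is handed to dget; everything else passes through.
string resolveArg(string arg)
{
    if (!arg.canFind("://"))
        return arg;                      // ordinary local file or switch
    auto r = execute(["dget", arg]);     // assumed helper program
    if (r.status != 0)
        throw new Exception("dget failed for " ~ arg);
    return r.output.strip;               // assume dget prints the fetched file's local path
}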
June 15, 2011 Re: DIP11: Automatic downloading of libraries
Posted in reply to Andrei Alexandrescu | On Wed, 15 Jun 2011 11:56:25 -0400, Andrei Alexandrescu <SeeWebsiteForEmail@erdani.org> wrote:
> On 6/15/11 10:52 AM, Adam D. Ruppe wrote:
>> It occurs to me that you ought to treat network things identically
>> to local modules in every way...
>>
>> dmd app.d ../library.d
>>
>> just works. I propose:
>>
>> dmd app.d http://whatever.com/library.d
>>
>> should just work too - dmd would need only to recognize module name
>> starts with xxxx:// and pass it to the dget program to translate.
>
> Thought of that, too, and also of the idea from two posters in this thread of having dget pipe the module to stdout.
>
> The main issue we need to address is what __FILE__ is for such modules and how we point users to the location of possible compilation and runtime errors.

Change as little as possible internally, IMO: __FILE__ should be the url that dmd receives from the command line.

That also brings up a good point -- dget has to tell dmd where it got the file from, so dmd can fill in __FILE__ in the case where the url is not in a pragma.

-Steve
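
A small illustration of the convention Steve suggests, with a purely hypothetical url: if dmd records the command-line url as the module's file name, __FILE__ inside that module points straight back at the remote source.

// Imagine dmd was invoked as: dmd app.d http://whatever.com/library.d
module library;

// __FILE__ expands where it is written, i.e. inside the remote module,
// so under the proposed convention `where` would be
// "http://whatever.com/library.d" rather than a local path.
enum where = __FILE__;

void trace()
{
    import std.stdio : writeln;
    writeln("module compiled from ", where);
}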
June 15, 2011 Re: DIP11: Automatic downloading of libraries
Posted in reply to Adam D. Ruppe | On 6/15/11 11:00 AM, Adam D. Ruppe wrote:
>> I understand the appeal, but I also understand the inherent
>> limitations of HTTP versus a file system. I don't think it's wise
>> pounding the former into the shape of the other.
>
>
> AIUI, you can't search for a module on a local filesystem either,
> aside from guessing a name. In D, a module name doesn't necessarily
> have to match the file name.
>
> If you import foo.bar;, you have two options:
>
> call it foo/bar.d
>
> pass whatever.d that has "module foo.bar;" in it on the command
> line explicitly.
>
>
> Being able to do a list directory entries feature doesn't really
> help locate a given named D module anyway.
OK, good point. Still, searching goes on across all -I paths, and I don't think we should extend that to URLs.
Andrei
June 15, 2011 Re: DIP11: Automatic downloading of libraries
Posted in reply to Andrei Alexandrescu | On Wed, 15 Jun 2011 12:35:07 -0400, Andrei Alexandrescu <SeeWebsiteForEmail@erdani.org> wrote:
> On 6/15/11 11:00 AM, Adam D. Ruppe wrote:
>>> I understand the appeal, but I also understand the inherent
>>> limitations of HTTP versus a file system. I don't think it's wise
>>> pounding the former into the shape of the other.
>>
>> AIUI, you can't search for a module on a local filesystem either,
>> aside from guessing a name. In D, a module name doesn't necessarily
>> have to match the file name.
>>
>> If you import foo.bar;, you have two options:
>>
>> call it foo/bar.d
>>
>> pass whatever.d that has "module foo.bar;" in it on the command
>> line explicitly.
>>
>> Being able to do a list directory entries feature doesn't really
>> help locate a given named D module anyway.
>
> OK, good point. Still, searching goes on across all -I paths, and I don't think we should extend that to URLs.

I thought of a good reason why -Iurl is likely to cause problems:

-Ihttp://xxx.yyy.zzz/package1 -Ihttp://aaa.bbb.ccc/package2

import a;

If xxx.yyy.zzz/package1 and aaa.bbb.ccc/package2 both contain a module called a, then which one is used depends on the conditions of the network. For example, xxx.yyy.zzz could go offline momentarily, and so package2 is used. Or xxx.yyy.zzz/package1 could be moved to a different url. I know this is a remote possibility, but like you said, internet urls are not the same things as files.

I propose the following:

1. The parameter to -I (called an import path) can have an optional equals sign in it. If the equals sign is present, then the form is:

   full.module.path=<path_to_module>

   In this case, failure to find a full.module.path.x module inside the given path is recorded as an error (i.e. no other paths are searched).

2. We have a pragma(importpath, "<valid_import_path>"); which makes dmd treat the imports from that specific file as if -I<valid_import_path> were first in the path list. If the imported file is indirect (i.e. foo.d imports bar.d, which imports baz.d), then the pragma'd path does not count when parsing the indirect file.

3. <path_to_module> can be either a file path or a url.

4. Remote paths are treated as I've specified elsewhere: the -I parameters of dmd (plus the pragmas) would be passed to dget.

I think this should give us maximum flexibility and fit within the current system quite well -- as well as giving us more specific import control even for local paths.

-Steve
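
A hypothetical sketch of points 1 and 2 above - neither the pinned -I form nor pragma(importpath) exists in dmd today, and the urls and module names are made up:

// Proposed command line (point 1): dcollections.* must resolve under
// the given url, otherwise it is an error; no other -I path is tried.
//
//   dmd app.d -Idcollections=http://example.org/dcollections/import
//
// Proposed pragma (point 2): acts as if the path were first on the -I
// list, but only for imports written directly in this file.
pragma(importpath, "http://example.org/mylib");

import dcollections.TreeMap; // found via the pinned -I mapping above
import mylib.util;           // found via the pragma'd path (made-up module)

void main() {}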
June 15, 2011 Re: DIP11: Automatic downloading of libraries
Posted in reply to Andrei Alexandrescu | On 15/06/2011 16:15, Andrei Alexandrescu wrote:
>> pragma(lib) doesn't (and can't) work as it is, why do you want to add
>> more useless pragmas?
>
> Then we should yank it or change it. That pragma was defined in a
> completely different context from today's, and right now we have a much
> larger user base to draw experience and insight from.

Note that rebuild had pragma(link), which got around this problem - it was the build tool, so it could keep track of all of these without modifying object files or other such hackery. So I guess pragma(lib) could be fixed in the hypothetical tool.

>> Command line arguments are the correct way to go
>> here.
>
> Why? At this point enough time has been collectively spent on this that
> I'm genuinely curious to find a reason that would have me go "huh, haven't
> thought about it that way. Fine, no need for the dip."

I'm assuming you hadn't read my reasoning for being against pragma(lib) at this point; let me know if that isn't the case.

>> Not to mention that paths won't be standardized across machines
>> most likely, so the latter would be useless.
>
> version() for the win.

version() isn't much use when it isn't completely standardized - take C/C++, where the places headers/libraries live vary greatly between distros: /usr/lib, /usr/lib32, /usr/lib64, /usr/local/lib, etc. version() is of no use here - the path would need to be defined on the command line.

>>>> Also, I don't see the major difference in someone who's making a piece
>>>> of software from adding the include path to their source file vs. adding
>>>> it to their build script.
>>>
>>> Because in the former case the whole need for a build script may be
>>> obviated. That's where I'm trying to be.
>>
>> This can't happen in a lot of cases, eg if you're interfacing with a
>> scripting language, you need certain files automatically generating
>> during build etc.
>
> Sure. For those cases, use tools. For everything else, there's liburl.

This is where you have me confused. What is the scope of this tool? If it's not destined to become a full D package manager, an equivalent of gem/cpan/pecl etc, what's the point?

>> Admittedly, for the most part, you'll just want to be
>> able to build libraries given a directory, or an executable given a file
>> with _Dmain() in it.
>
> That's the spirit. This is what the proposal aims at: you have the root
> file and the process takes care of everything - no configs, no metadata,
> no XML info, no command-line switches, no fuss, no muss.

I believe for these cases it should be zero effort - but the tool should support custom builds, unless it's not eventually gonna become a package manager.

> With such a feature, "hello world" equivalents demoing dcollections, qt,
> mysql (some day), etc. etc. will be as simple as few-liners that anyone
> can download and compile flag-free. I find it difficult to understand
> how only a few find that appealing.
>
>> There'll still be a lot of cases where you want to
>> specify some things to be dynamic libs, other static libs, and what if
>> any of it you want in a resulting binary.
>
> Sure. But won't you think it's okay to have the DIP leave such cases to
> other tools without impeding them in any way?

Again, see above. I really don't see the point in this tool if it's not eventually going to become a complete package manager. It just seems like a half-baked solution to the problem if it can't handle it.

>>> Sounds good. I actually had the same notion, just forgot to mention it
>>> in the dip (fixed).
>>
>> I'd agree with Steven that we need command line arguments for it; I
>> completely disagree about pragmas though, given that they don't work (as
>> mentioned above). Just because I know you're going to ask:
>>
>> # a.d has a pragma(lib) in it
>> $ dmd -c a.d
>> $ dmd -c b.d
>> $ dmd a.o b.o
>> <Linker errors>
>>
>> This is unavoidable unless you put metadata in the object files, and
>> even then you leave clutter in the resulting binary, unless you specify
>> that the linker should remove it (I don't know if it can).
>
> I now understand, thanks. So I take it a compile-and-link command would
> succeed, whereas a compile-separately succession of commands wouldn't?
> That wouldn't mean the pragma doesn't work, just that it only works
> under certain build scenarios.

Correct. This is why I don't like pragma(lib) and the new things you are proposing. As nice as it is, if it doesn't work with incremental building and one-at-a-time building, it's not much use.

>>> This assumes the URL contains the package prefix. That would work, but
>>> imposes too much on the URL structure. I find the notation -Upackage=url
>>> more general.
>>
>> I personally think there should be a central repository listing packages
>> and their URLs etc, which massively simplifies what needs passing on a
>> command line. Eg -RmyPackage would cause myPackage to be looked up on
>> the central server, which will have the relevant URL etc.
>>
>> Of course, there should be some sort of override method for private
>> remote servers.
>
> That is tantamount to planting a flag in the distributed dmd.conf.
> Sounds fine.

Indeed, the central repository can be listed in dmd.conf rather than hard coded.

>>>> As I said in another post, you could also specify a zip file or tarball
>>>> as a base path, and the whole package is downloaded instead. We may need
>>>> some sort of manifest instead in order to verify the import will be
>>>> found instead of downloading the entire package to find out.
>>>
>>> Sounds cool.
>>
>> I don't believe this tool should exist without compression being default.
>
> Hm. Well fine.

It just seems silly not to use compression, given how fast it is to compress/decompress and how much bandwidth it saves.

--
Robert
http://octarineparrot.com/
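
For reference, a sketch of the version()-based answer Robert is pushing back on, with an illustrative path and library name: the per-OS blocks still hard-code one distro's directory layout, which is why he argues the path has to come from the command line instead.

version (linux)
{
    // Right for some distros only; others use /usr/lib32, /usr/lib64,
    // /usr/local/lib, and so on.
    pragma(lib, "/usr/lib/libexample.so");
}
else version (Windows)
{
    pragma(lib, "example.lib");
}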
June 15, 2011 Re: DIP11: Automatic downloading of libraries
Posted in reply to Steven Schveighoffer | On 6/15/11 12:35 PM, Steven Schveighoffer wrote:
> I propose the following:
Excellent. I'm on board with everything. Could you please update the DIP reflecting these ideas?
Thanks,
Andrei
June 15, 2011 Re: DIP11: Automatic downloading of libraries
Posted in reply to Robert Clipsham | On Wed, 15 Jun 2011 15:56:54 +0100, Robert Clipsham wrote:
> On 15/06/2011 15:33, Andrei Alexandrescu wrote:
>> On 6/15/11 9:13 AM, Steven Schveighoffer wrote:
>>> We have been getting along swimmingly without pragmas for adding local include paths. Why do we need to add them using pragmas for network include paths?
>>
>> That doesn't mean the situation is beyond improvement. If I had my way
>> I'd add pragma(liburl) AND pragma(libpath).
>
> pragma(lib) doesn't (and can't) work as it is, why do you want to add
> more useless pragmas?
In what sense doesn't pragma(lib) work? This is news to me.
Graham
June 15, 2011 Re: DIP11: Automatic downloading of libraries
Posted in reply to Graham Fawcett | On 15/06/2011 20:02, Graham Fawcett wrote:
> In what sense doesn't pragma(lib) work? This is news to me.
>
> Graham

Scroll down in my post; I explain it there:

> On Wed, 15 Jun 2011 15:56:54 +0100, Robert Clipsham wrote:
> Just because I know you're going to ask:
>
> # a.d has a pragma(lib) in it
> $ dmd -c a.d
> $ dmd -c b.d
> $ dmd a.o b.o
> <Linker errors>

--
Robert
http://octarineparrot.com/
June 15, 2011 Re: DIP11: Automatic downloading of libraries
Posted in reply to Andrei Alexandrescu | "Andrei Alexandrescu" <SeeWebsiteForEmail@erdani.org> wrote in message news:itagdr$29mt$1@digitalmars.com...
> On 6/15/11 8:33 AM, Steven Schveighoffer wrote:
>> I can't really think of any other issues.
>
> Allow me to repeat: the scheme as you mention it is unable to figure and load dependent remote libraries for remote libraries. It's essentially a flat scheme in which you know only the top remote library but nothing about the rest.
>
> The dip takes care of that by using transitivity and by relying on the presence of dependency information exactly where it belongs - in the dependent source files.

Dependency information is already in the source: the "import" statement. The actual path to the dependencies does not belong in the source file - that *is* a configuration matter, and cramming it into the source only makes configuring harder.

> Separating that information from source files has two liabilities. First, it breaks the whole transitivity thing.

I think that's solvable.
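
A sketch of the separation being argued for here, reusing the -Upackage=url notation quoted elsewhere in the thread; the package name and url are made up:

// Build side (hypothetical command line), kept out of the source:
//
//   dmd app.d -Ufoolib=http://example.org/foolib
//
// Source side: the dependency is already stated by the import itself,
// with no url or path embedded in the file.
import foolib.parser;

void main() {}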
June 15, 2011 Re: DIP11: Automatic downloading of libraries
Posted in reply to Andrei Alexandrescu | First, I didn't read all of the posts in this thread, so some of these points might already be answered.

In the first paragraph the DIP talks about automatic downloading of *libraries*, while all the posts here talk about downloading files. This is also reflected in the "Package case" paragraph, since the compiler / separate tool will first try to download a .di file. That is generally a D import (header) file, which doesn't need to include the implementation, so the compiled library should also be downloaded or linking would fail, right?

Also, the proposal doesn't do anything about versioning. While larger updates will probably get a different url, bug fixes might still introduce regressions that silently break an application that uses the library. And now you'll have to track down which library introduced the bug - and, more importantly, your app broke overnight while you didn't change anything (other than recompiling).

To find out how downloading the files would work, I did some tests with GtkD:

- Building GtkD itself takes 1m56.
- Building a HelloWorld app that uses the prebuilt library takes 0m01.
- The HelloWorld app needs 133 files from GtkD; building the app and the files it needs takes 0m24.

The source of the HelloWorld application can be found here: http://www.dsource.org/projects/gtkd/browser/trunk/demos/gtk/HelloWorld.d

--
Mike Wey