June 14, 2011
On 6/14/11 12:21 PM, Adam D. Ruppe wrote:
> bearophile:
>> Isn't this also an argument for the (ancient request of) inclusion
>> of a normal build feature into the D compiler?
>
> I think if the compiler is going to be downloading files, it is
> necessarily looking for those files... thus it'd become a bit of
> a smarter build tool as a side effect of this change.
>
> It'd be silly if it said "this module is missing, and I'll download
> it, but I refuse to actually look at it once it's downloaded!"

Agreed. I think we should have a pragma that means "add this file to the build" (possibly by adapting the existing pragma(lib, ...)), and pragma(liburl, ...) should imply that other pragma.
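
For concreteness, a hypothetical use might look like this (liburl does not exist today, and the module name and URL are purely illustrative):

    // Sketch only: pragma(liburl, ...) would fetch the module when it is
    // missing and would imply "add the downloaded file to the build", the
    // way pragma(lib, ...) adds a library to the link step.
    pragma(liburl, "acme.geometry", "http://example.org/d/acme/geometry.d");
    import acme.geometry;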

Andrei
June 14, 2011
I like it (specifically because of its simplicity).  It's not going to work for projects that require a more complex build process driven by a build tool, but for simple modules it's a rather elegant solution.  Projects that need a build tool don't have to use it; they can keep using their build tool and manually managing their external packages (hopefully eventually via whatever gem/CPAN-style package proposal is finally adopted).

I think it's a great stopgap until the D community has the manpower to create (and more importantly, maintain) something like gem.  There are certainly some details to work out but I like the overall idea.

For people new to any language, the most confusing (and usually most poorly documented) part is the build environment.  "Where do I get this package? Where do I have to put it to use it? How do I even build it?"  Having to learn that for every external package you want to use is a big roadblock for newcomers.  This proposal doesn't eliminate that entirely, but it does take care of the simpler cases for those who choose to use it.

Regards,
Brad Anderson

On Tue, Jun 14, 2011 at 7:53 AM, Andrei Alexandrescu <SeeWebsiteForEmail@erdani.org> wrote:

> http://www.wikiservice.at/d/wiki.cgi?LanguageDevel/DIPs/DIP11
>
> Destroy.
>
>
> Andrei
>


June 14, 2011
On Tue, 14 Jun 2011 12:20:33 -0500, Andrei Alexandrescu wrote:

> On 6/14/11 12:21 PM, Adam D. Ruppe wrote:
>> bearophile:
>>> Isn't this also an argument for the (ancient request of) inclusion of a normal build feature into the D compiler?
>>
>> I think if the compiler is going to be downloading files, it is necessarily looking for those files... thus it'd become a bit of a smarter build tool as a side effect of this change.
>>
>> It'd be silly if it said "this module is missing, and I'll download it, but I refuse to actually look at it once it's downloaded!"
> 
> Agreed. I think we should have a pragma that means "add this file to
> the build" (possibly by adapting the existing pragma(lib, ...)), and
> pragma(liburl, ...) should imply that other pragma.

pragma(resolve_all_dependencies_just_like_rdmd_does)?

As a data point: the Glasgow Haskell Compiler has a "--make" option which discovers and includes all dependent libraries, in the "rdmd" style. Only recently (GHC 7?) did they make it the default behaviour of the compiler.

I'd *really* like to see "dmd --make". Even better, "dmd --disable-make", because "--make" is the default.
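
For illustration, the hypothetical flag would make these two commands roughly equivalent (neither --make nor --disable-make exists in dmd today):

$ dmd --make app.d         # compile app.d plus every module it imports
$ rdmd --build-only app.d  # approximately what rdmd already does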

Graham
June 14, 2011
On 14/06/2011 14:53, Andrei Alexandrescu wrote:
> http://www.wikiservice.at/d/wiki.cgi?LanguageDevel/DIPs/DIP11
>
> Destroy.
>
>
> Andrei

This doesn't seem like the right solution to the problem - the correct solution, in my opinion, is to have a build tool/package manager handle this, not the compiler.

Problems I see:
 * Remote server gets hacked, everyone using the library now
   executes malicious code
 * Remote source changes how it is built, your code suddenly breaks and
   has to be updated, rather than being handled automatically
 * Adds a lot of unnecessary bloat and/or dependency on external modules
   + Want to compress source code? dmd now depends on decompression libs
   + Want to use git? dmd now depends on git
   + Remote code uses new compression method that an older dmd doesn't
     support
 * Remote server is down - build takes forever while waiting
   + Make dmd time out after a couple of seconds - build fails
 * Assumes the build machine has internet connectivity; if it doesn't,
   building suddenly gets a lot more complicated
 * Source code changes location - the build breaks unless a redirect is
   possible, and if the protocol changes even a redirect won't help

I could go on. I believe the real solution to this is to have (as discussed a lot recently) a proper D package management tool like PHP's pecl, Ruby's gem or Perl's CPAN. Of course, this doesn't mean we have to lose the ability to list dependencies in D etc. In fact, it seems like the perfect opportunity to get people to switch to using D to build projects (everyone should; I do, and it's the best build tool I've *ever* used).

Hypothetical D Package Manager:
foobar
 |- pragma(depend, "foo", "v1.2.x");
 `- pragma(depend, "bar", "v1.4.3");

$ dpm install foobar
 -> Do packages exist locally, are they the right version?
 -> Do they exist remotely, and do the given versions exist?
 -> Get the remote packages
 -> Get the D Build Tool to build it (or use a binary, if available)
$ dbt build foobar
 -> Is there a default.dbt or foobar.dbt file?
 -> If not, attempt to build a binary, use -lib to attempt to build as
    a library
 -> If there is, pass it to dmd, it's actually a D file describing how
    to build

Of course, the dbt file would have access to some helper functions, e.g. library("myDir").build for building a library out of all the files in myDir (there should be a way to specify individual files etc.). dbt would obviously take care of compiler flags, choice of compiler, and so on.
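
To make that concrete, a hypothetical default.dbt might read as below; dbt, its library() helper, and everything else here are invented names for illustration, not a real tool:

    // default.dbt -- an ordinary D file describing how to build foobar.
    import dbt;  // hypothetical helper module supplied by the build tool

    void main()
    {
        // Collect every .d file under src/ into a library build; the
        // helper would choose compiler flags and invoke dmd -lib itself.
        library("src").build("foobar");
    }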

I started implementing this the other day, got a few lines into a main() then realised I didn't have enough time to build the tool I wanted :>

-- 
Robert
http://octarineparrot.com/
June 14, 2011
On Tue, 14 Jun 2011 12:35:44 -0400, Andrei Alexandrescu <SeeWebsiteForEmail@erdani.org> wrote:

> On 6/14/11 11:32 AM, Graham Fawcett wrote:
>> On Tue, 14 Jun 2011 15:59:15 +0000, Adam D. Ruppe wrote:
>>
>>>> One other interesting aspect is that the string literal can be
>>>> CTFE-constructed,
>>>
>>> Oh, or it could be in version {} blocks. I like that.
>>>
>>> I think we should actually whip up a working model. It needn't be a
>>> compiler feature at this point - we can use pragma(msg, "BUILD: " ~
>>> param) for now and have a helper program scan dmd's output.
>>
>> +1, sounds fun. :)
>>
>> Rather than pragma(msg), you could also use pragma(liburl), and run dmd
>> with "-ignore -v". You can parse the pragma from there. (I think you'd
>> need to write `pragma(liburl, "name-in-quotes", "url-in-quotes")`, a
>> slight diversion from Andrei's syntax, but otherwise it would work.)
>>
>> Graham
>
> I just realized that one advantage of the download being integrated in the compiler is that the compiler is the sole tool with full knowledge and control of what modules are imported. A tool could repeatedly run the compiler with -v and see what modules it couldn't load, to then download them. (Also, of course, a tool could rely on extralinguistic library management control that has its own advantages and disadvantages as we discussed.)

I think it should be split as follows:

dmd: determine *what* to download (i.e. I need to import module x.y.z)
external tool: determine *where* and *how* to download it. (i.e. module x.y.z lives on http://somerepository.org/x, go get it and save it)

The advantages being:

1. there exists umpteen billion already-existing tools that fetch and install data over the network
2. dmd does not contain parts that have nothing to do with compiling, which could potentially screw up the compiling part.
3. Depending on the tool language, the barrier to development of it would be significantly reduced.  Most people feel quite uncomfortable messing with compiler source code, but have no problems editing something like a shell script, or even a simple d-based tool.
4. The compiler is written in C++, and hence the part that does this would have to be too... yuck!

-Steve
June 14, 2011
On 6/14/11 1:22 PM, Robert Clipsham wrote:
> On 14/06/2011 14:53, Andrei Alexandrescu wrote:
>> http://www.wikiservice.at/d/wiki.cgi?LanguageDevel/DIPs/DIP11
>>
>> Destroy.
>>
>>
>> Andrei
>
> This doesn't seem like the right solution to the problem - the correct
> solution, in my opinion, is to have a build tool/package manager handle
> this, not the compiler.
>
> Problems I see:
> * Remote server gets hacked, everyone using the library now
> executes malicious code

This liability is no different from a traditional setup.

> * Remote source changes how it is built, your code suddenly breaks and
> has to be updated, rather than being handled automatically

This is a deployment issue affecting this approach and any other relying on downloading stuff.

> * Adds a lot of unnecessary bloat and/or dependency on external modules
> + Want to compress source code? dmd now depends on decompression libs

Indeed, I think compression will be commonly requested. The same happened with Java - initially it relied on downloading .class files, but jar files soon followed.

It's been a feature asked for in this forum, independently of downloads. A poster implemented a complete rdmd-like program that deals with .zip files.

> + Want to use git? dmd now depends on git

Not if the server can serve files, or if you use a different tool.

> + Remote code uses new compression method that an older dmd doesn't
> support

If compression handling is needed, dmd can standardize on it just like jar files do.

> * Remote server is down - build takes forever while waiting

So does downloading or building with another tool.

> + Make dmd time out after a couple of seconds - build fails

So would build directed with any other tool.

> * Assumes the build machine has internet connectivity; if it doesn't,
> building suddenly gets a lot more complicated

Fair point.

> * Source code changes location - the build breaks unless a redirect is
> possible, and if the protocol changes even a redirect won't help

See my answer with a central repo.

My understanding is that you find automated download during the first build untenable, but manual download prior to the first build acceptable. I don't see such a large fracture between the two cases as you do.


Andrei
June 14, 2011
On 6/14/11 1:58 PM, Steven Schveighoffer wrote:
> I think it should be split as follows:
>
> dmd: determine *what* to download (i.e. I need to import module x.y.z)

It can't (unless it does the download too) due to transitive dependencies.

> external tool: determine *where* and *how* to download it. (i.e. module
> x.y.z lives on http://somerepository.org/x, go get it and save it)
>
> The advantages being:
>
> 1. there exists umpteen billion already-existing tools that fetch and
> install data over the network
> 2. dmd does not contain parts that have nothing to do with compiling,
> which could potentially screw up the compiling part.
> 3. Depending on the tool language, the barrier to development of it
> would be significantly reduced. Most people feel quite uncomfortable
> messing with compiler source code, but have no problems editing
> something like a shell script, or even a simple d-based tool.
> 4. The compiler is written in C++, and hence the part that does this
> would have to be too... yuck!

Not sure I grok 3 and 4, but as far as I can tell the crux of the matter is that dependencies are already embedded in .d files. That's why I think it's simpler to just let dmd take care of them all instead of maintaining dependency description files separately from the .d files.

The umpteen billion tools don't know what it takes to download and build everything starting from one or a few root modules. They could execute the download, yes (and of course we'll use such a library for that), but we need a means to drive them.
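
As a rough sketch of such a driver (all names are hypothetical, and the quoted error text is only approximately what dmd prints): run the compiler with -v, scrape the modules it failed to load, fetch them, and repeat until nothing is missing.

    import std.process : execute;
    import std.regex : matchAll, regex;

    // Hypothetical: map a module name to a URL and save its source at path.
    void fetchModule(string name, string path) { /* repository-specific */ }

    void buildWithDownloads(string root)
    {
        // dmd reports a missing import roughly as:
        //   module foo.bar is in file 'foo/bar.d' which cannot be read
        auto missing =
            regex(`module (\S+) is in file '([^']+)' which cannot be read`);
        for (;;)
        {
            auto r = execute(["dmd", "-v", "-o-", root]);
            auto hits = matchAll(r.output, missing);
            if (hits.empty)
                break;                   // all imports resolved (or a real error)
            foreach (m; hits)
                fetchModule(m[1], m[2]); // download, then try the build again
        }
    }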

I think Adam's tool is a good identification of the problem the pragma solves, and an alternative solution to it.


Andrei
June 14, 2011
On 14/06/2011 20:14, Andrei Alexandrescu wrote:
> On 6/14/11 1:58 PM, Steven Schveighoffer wrote:
>> I think it should be split as follows:
>>
>> dmd: determine *what* to download (i.e. I need to import module x.y.z)
>
> It can't (unless it does the download too) due to transitive dependencies.

It can work with the build tool though - build foo -> depends on bar. The build tool gets bar -> builds bar -> depends on baz, etc.

-- 
Robert
http://octarineparrot.com/
June 14, 2011
On Tue, 14 Jun 2011 15:14:28 -0400, Andrei Alexandrescu <SeeWebsiteForEmail@erdani.org> wrote:

> On 6/14/11 1:58 PM, Steven Schveighoffer wrote:
>> I think it should be split as follows:
>>
>> dmd: determine *what* to download (i.e. I need to import module x.y.z)
>
> It can't (unless it does the download too) due to transitive dependencies.

dmd: I need module foo.awesome.  Where is it?
filesystem: nope, don't have it
dmd: damn, I guess I'll need to check with the downloader, hey downloader you have that?
downloader: hm... oh, yeah!  I'll get it for you
filesystem: got it
dmd: ok, now what's in foo.awesome?  Oh, hm... foo.awesome needs bar.gnarly.  Let me guess, filesystem...
filesystem: yeah, I suck today, go ask downloader
...

How hard is that?  I mean the actual downloading of files is pretty straightforward; at some point the problem reduces to "download a file".  Why do we have to reinvent *that* wheel?
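
In code, that handshake might amount to something like this inside the compiler's import lookup (a sketch under invented names; "ddownload" stands in for whatever external downloader is configured):

    import std.file : exists;
    import std.process : spawnProcess, wait;

    // If the module isn't on the filesystem, ask the external downloader
    // for it, then check the filesystem again. The compiler never learns
    // *where* or *how* the file was fetched.
    bool resolveImport(string moduleName, string path)
    {
        if (exists(path))
            return true;
        auto pid = spawnProcess(["ddownload", moduleName, path]);
        return wait(pid) == 0 && exists(path);
    }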

>
>> external tool: determine *where* and *how* to download it. (i.e. module
>> x.y.z lives on http://somerepository.org/x, go get it and save it)
>>
>> The advantages being:
>>
>> 1. there exists umpteen billion already-existing tools that fetch and
>> install data over the network
>> 2. dmd does not contain parts that have nothing to do with compiling,
>> which could potentially screw up the compiling part.
>> 3. Depending on the tool language, the barrier to development of it
>> would be significantly reduced. Most people feel quite uncomfortable
>> messing with compiler source code, but have no problems editing
>> something like a shell script, or even a simple d-based tool.
>> 4. The compiler is written in C++, and hence the part that does this
>> would have to be too... yuck!
>
> Not sure I grok 3 and 4, but as far as I can tell the crux of the matter is that dependencies are already embedded in .d files. That's why I think it's simpler to just let dmd take care of them all instead of maintaining dependency description files in separation from the .d files.

And it would, why wouldn't it?  I think you may not be getting something here...

> The umpteen billion tools don't know what it takes to download and build everything starting from one or a few root modules. They could execute the download, yes (and of course we'll use such a library for that), but we need a means to drive them.

dmd would drive them.

> I think Adam's tool is a good identification and alternative solution for the problem that the pragma solves.

I haven't seen it.  Just thinking out loud...

-Steve
June 14, 2011
On 14/06/2011 20:07, Andrei Alexandrescu wrote:
> On 6/14/11 1:22 PM, Robert Clipsham wrote:
>> On 14/06/2011 14:53, Andrei Alexandrescu wrote:
>>> http://www.wikiservice.at/d/wiki.cgi?LanguageDevel/DIPs/DIP11
>>>
>>> Destroy.
>>>
>>>
>>> Andrei
>>
>> This doesn't seem like the right solution to the problem - the correct
>> solution, in my opinion, is to have a build tool/package manager handle
>> this, not the compiler.
>>
>> Problems I see:
>> * Remote server gets hacked, everyone using the library now
>> executes malicious code
>
> This liability is not different from a traditional setup.

Perhaps, but with a proper package management tool this can be mitigated with SHA checksums etc.; no such protection is possible with a direct get. Admittedly, this line of defense falls if the intermediate server is hacked.
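
A minimal sketch of that check, assuming a (hypothetical) package index that publishes an uppercase-hex SHA-256 alongside each archive:

    import std.digest : toHexString;
    import std.digest.sha : sha256Of;
    import std.file : read;

    // Verify a downloaded archive against the published checksum before
    // unpacking or building anything from it.
    bool verifyPackage(string archive, string expectedSha256)
    {
        auto digest = sha256Of(cast(const(ubyte)[]) read(archive));
        // toHexString yields uppercase hex; compare against the same case.
        return toHexString(digest)[] == expectedSha256;
    }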

>> * Remote source changes how it is built, your code suddenly breaks and
>> has to be updated, rather than being handled automatically
>
> This is a deployment issue affecting this approach and any other relying
> on downloading stuff.

It doesn't affect anything if a proper package management/build tool is in use, as the remote code specifies how it is built, rather than the local code.

>> * Adds a lot of unnecessary bloat and/or dependency on external modules
>> + Want to compress source code? dmd now depends on decompression libs
>
> Indeed, I think compression will indeed be commonly requested. The same
> has happened about Java - initially it relied on downloading .class
> files, but then jar files were soon to follow.
>
> It's been a feature asked in this forum, independently of downloads. A
> poster implemented a complete rdmd-like program that deals with .zip files.
>
>> + Want to use git? dmd now depends on git
>
> Not if the server can serve files, or if you use a different tool.

But then you lose the advantages of using git to get the source at all.

>> + Remote code uses new compression method that an older dmd doesn't
>> support
>
> If compression handling is needed, dmd can standardize on it just like
> jar files do.
>
>> * Remote server is down - build takes forever while waiting
>
> So does downloading or building with another tool.

Not so if you get all the source at once rather than depending on getting it during build.

>> + Make dmd time out after a couple of seconds - build fails
>
> So would build directed with any other tool.
>
>> * Assumes the build machine has internet connectivity; if it doesn't,
>> building suddenly gets a lot more complicated
>
> Fair point.

For the previous few points, where you're unable to download the package for whatever reason, you end up having to duplicate build instructions: "do this, but if it fails, here's how to do it all manually".

>> * Source code changes location - the build breaks unless a redirect is
>> possible, and if the protocol changes even a redirect won't help
>
> See my answer with a central repo.
>
> My understanding is that you find automated download during the first
> build untenable, but manual download prior to the first build
> acceptable. I don't see such a large fracture between the two cases as
> you do.

I don't have a problem with automatically downloading source during a first build, I do see a problem with getting the compiler to do it though. I don't believe the compiler should have anything to do with getting source code, unless the compiler also becomes a package manager and build tool.

-- 
Robert
http://octarineparrot.com/