November 30, 2012
On Friday, 30 November 2012 at 17:12:10 UTC, Rob T wrote:
> On Friday, 30 November 2012 at 08:05:25 UTC, Jacob Carlborg wrote:
>> That's the exact same thing as I'm proposing, except it's the compiler handling it.
>
> Which would be better because it is integrated and not an external tool.
>

Why would it be better, and why would you care one way or the other? Some people are pointing out that integrating features that relate to code, but are not actual code, is not necessarily desirable (e.g. comment processing). Many of us use git, so why don't we have dmd do our diffs against the git repository for us as well? Hyperbole, but what happened to the single responsibility principle? Let the compiler do its thing, compile - but let a more knowledgeable entity than the source give it its flags.

What I don't understand is how the writer of the source would know how it is to be built everywhere and in every way a client might want. They probably wouldn't - so leave the description of that out of the source entirely.

When rdmd does work for the simple case, you wouldn't complain: "well, this stinks, having to type rdmd instead of dmd." No, you'd use it and be happy with its simplicity. Lack of integration into dmd has not hurt at all. It does not satisfy all cases, but it seems to add great value without disrupting dmd. Why wouldn't a more complex build system want the same thing?

[snip]
>
> No doubt the compiler should be a library. Why isn't it?

20/20 hindsight maybe.

> If it was a library, then perhaps it could use itself in some very interesting ways. The compiler should also accept plugins for extensibility. I have not looked at the code yet, but I suspect what we have under the hood will make me want to cry.

The beauty of encapsulation ... don't look.

> To me, building is just an ugly hack and patch process caused by a broken system that is unable to build itself. It's a total mess. The best place to fix the problem is right at the source.
>

I don't think it is that bad at all - but then I'm using rdmd for everything for now. And I agree that the best place to fix the problem is right at the source... the problem is which source. I would say the build source, not the D source, as they serve different purposes. The issue is that right now we don't have a standard build source, and it sounds like you are advocating that the two go together.

November 30, 2012
>>> In any event, I'd ask how do the current build systems do it? They read
>>> and parse through the source files to learn about dependencies
>>
>> No, they:
>>
>> 1. Run "$ dmd -o- -c main.d -deps=deps.txt" which will write out all dependencies of "main.d" to "deps.txt"
>
> Personally, I don't see how that would work using the current form of the output. I tried it with Make to figure out dependencies and the problem I immediately ran into was that the output did not contain full path information for the project's modules, and without that information, there was no way to combine builds from related projects under a separate folder.

It's easy to work around that. It's certainly orders of magnitude easier than parsing the source files. Besides, parsing them isn't even enough because imports can be inside static if blocks or templates. You need pretty much an entire D frontend to correctly find dependencies from source files.
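
To give a feel for it, here is a rough sketch in D of pulling the file paths out of a deps file (the line format is assumed here, roughly "a (a.d) : private : std.stdio (/path/to/stdio.d)"; check it against your compiler's actual output):

   // Rough sketch: gather the files mentioned in a deps.txt produced by
   //   dmd -o- -c main.d -deps=deps.txt
   // Every parenthesised group on a line is treated as a file path, so
   // both the importing and the imported file land in the list; that is
   // fine when all you want is the full set of source files.
   import std.stdio : File, writeln;
   import std.regex : regex, matchAll;
   import std.algorithm : sort, uniq;

   void main()
   {
       string[] files;
       auto re = regex(`\(([^)]+)\)`);
       foreach (line; File("deps.txt").byLine)
           foreach (m; matchAll(line, re))
               files ~= m[1].idup;

       foreach (path; files.sort.uniq)
           writeln(path);
   }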
November 30, 2012
On 11/30/2012 12:05 PM, Jacob Carlborg wrote:
> On 2012-11-29 23:06, Rob T wrote:
>
>> For the moment, let's talk about ddoc, or unit testing in D. That's the
>> difference: it's not an external tool set, it's instead a part of the
>> standard language feature set. BTW, IMO ddoc was implemented poorly,
>> using comments, which I fully agree with you would be a very bad way to
>> go about implementing the feature. In that case, I would rather use Ruby.
>
> The built-in support for unit testing is too simplistic. I think one
> needs an external tool anyway that makes use of the built-in unit test
> support.
>
> I just want to be able to do something like:
>
> $ test a.d b.d
>
> And it will run all unit tests in the modules "a" and "b". In D I need
> to manually create a test module which imports all the modules I want to
> test.

AFAIK you don't need to import a module to run its tests. So
   rdmd --main -unittest a.d b.d
should work.
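
For instance, with a minimal module (the module name and function are just illustrative):

   // a.d -- nothing but a unittest block; rdmd's --main supplies an empty
   // main and -unittest compiles the tests in.
   module a;

   int twice(int x) { return x * 2; }

   unittest
   {
       assert(twice(2) == 4);
   }

   $ rdmd --main -unittest a.d

Roughly speaking, a passing run is silent and a failing assert aborts with an AssertError.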

Then we should also be able to make rdmd more extensible. There is, e.g., the quite interesting --eval switch, but I feel it could be made more customizable without a lot of extra work.

> This will give the most basic functionality. These are a few things
> that are missing:
>
> * Run a single test
> * Names or context for the tests
> * Nice report of which tests failed
> * Continue running other tests if a given test failed
>
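
Module-level granularity, at least, can be had today by installing a custom module unit tester; a rough sketch (hypothetical, not an existing tool) that reports per module and keeps going after a failure:

   // Sketch of a per-module test runner built on the runtime hook.
   module testrunner;

   import core.runtime : Runtime;
   import std.stdio : writefln;

   shared static this()
   {
       Runtime.moduleUnitTester = function bool()
       {
           size_t failed;
           foreach (m; ModuleInfo)          // all modules linked into the binary
           {
               if (m is null) continue;
               auto test = m.unitTest;      // null if the module has no tests
               if (test is null) continue;
               try
               {
                   test();
                   writefln("PASS %s", m.name);
               }
               catch (Throwable t)
               {
                   failed++;
                   writefln("FAIL %s: %s", m.name, t.msg);
               }
           }
           return failed == 0;              // false reports overall failure
       };
   }

   void main() {}

Running a single test and per-test names would still need something on top of this, since the runtime only exposes one aggregated unitTest function per module.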

-- 
Dmitry Olshansky
November 30, 2012
On Friday, 30 November 2012 at 17:59:23 UTC, jerro wrote:
> It's easy to work around that. It's certainly orders of magnitude easier than parsing the source files. Besides, parsing them isn't even enough because imports can be inside static if blocks or templates. You need pretty much an entire D frontend to correctly find dependencies from source files.

It may be easy, but it's not obvious, which makes it hard. That's the problem with external builds.

You make a very good point about the static ifs and so on, which means the best place to get that kind of information is directly from the compiler during the build process. But why assume that the compiler can only partly build, rather than perform full builds, including an installation?

It just seems like a good idea to use D as the build language rather than something else.


November 30, 2012
On Friday, 30 November 2012 at 19:16:56 UTC, Rob T wrote:
> On Friday, 30 November 2012 at 17:59:23 UTC, jerro wrote:
>> It's easy to work around that. It's certainly orders of magnitude easier than parsing the source files. Besides, parsing them isn't even enough because imports can be inside static if blocks or templates. You need pretty much an entire D frontend to correctly find dependencies from source files.
>
> It may be easy, but it's not obvious, which makes it hard. That's the problem with external builds.

If you are writing a build tool, I don't imagine writing those few extra lines to convert relative paths to absolute would be a problem.
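
Something along these lines, say (a sketch, with a made-up helper name; invocationDir is assumed to be the absolute directory the compiler was run from):

   // Turn a possibly-relative dependency path into a normalized absolute one.
   import std.path : absolutePath, buildNormalizedPath, isAbsolute;

   string toAbsolute(string dep, string invocationDir)
   {
       return dep.isAbsolute
           ? buildNormalizedPath(dep)
           : buildNormalizedPath(absolutePath(dep, invocationDir));
   }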

> You make a very good point about the static ifs and so on, which means the best place to get that kind of information is directly from the compiler during the build process

And you already can get that info from the compiler using -deps flag.


November 30, 2012
On Friday, 30 November 2012 at 19:35:38 UTC, jerro wrote:
> If you are writing a build tool, I don't imagine writing those few extra lines to convert relative paths to absolute would be a problem.

How does the script know what is a relative path and what is not? It's not that easy, and the programmer should not have to be fooling around solving non-productive problems like this.

>> You make a very good point about the static ifs and so on, which means the best place to get that kind of information is directly from the compiler during the build process
>
> And you already can get that info from the compiler using -deps flag.

But it does not supply enough information, and the -deps approach externalizes the build process, which only perpetuates the externalized build mess. The whole thing was designed around an external build process, but I don't think it has to be done that way.


December 01, 2012
On 2012-11-30 18:12, Rob T wrote:

> Personally, I don't see how that would work using the current form of
> the output. I tried it with Make to figure out dependencies and the
> problem I immediately ran into was that the output did not contain full
> path information for the project's modules, and without that information,
> there was no way to combine builds from related projects under a separate
> folder. What I find is that with D, people seem to be building in
> simple ways, everything under one folder. This works perhaps for many
> people, but not for everyone. Currently I want to send all build output to
> a separate folder outside my project folder onto a separate drive, but I
> can't do something even that simple. Sure, I can hack it with perhaps a
> symbolic link, but that's a hack which sucks.

That command will output the full path to the source files. Am I missing something?

Example of output: http://pastebin.com/mCWGHyn7

When you say "build output" are you referring to the object files? In that case these flags are available:

  -odobjdir      write object & library files to directory objdir
  -offilename    name output file to filename
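
For example (paths made up; the directories have to exist already):

  dmd main.d -od../build/obj -of../build/bin/myapp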


> No doubt the compiler should be a library. Why isn't it?

I don't know. The compiler is fairly old, especially the backend.

> If it was a
> library, then perhaps it could use itself in some very interesting ways.

Yes, perhaps for CTFE, instead of embedding an interpreter.

> The compiler should also accept plugins for extensibility.

Absolutely, that would be nice.

> I have not
> looked at the code yet, but I suspect what we have under the hood will
> make me want to cry.

Don't look; personally I think the code looks horrible.

> If there's information inside the source, then the compiler could use
> that information during a build. A very simple example of this, would be
> the imports. So instead of manually dumping a deps file, and working
> some build script magic, the compiler could have that information
> available internally, thereby saving the programmer from hacking away at
> an external build script to get it. My guess is that's the least of the
> advantages; there's probably a lot more that could be done.

The compiler does have all the knowledge about which source files a build depends on; that's why we can get the output using the -deps flag. It just doesn't compile the source files unless you explicitly tell it to.


-- 
/Jacob Carlborg
December 01, 2012
On 2012-11-30 20:16, Rob T wrote:

> It just seems like a good idea to use D as the build language rather
> than something else.

You can do that with an external build tool as well.

-- 
/Jacob Carlborg
December 01, 2012
On 2012-11-30 19:36, Dmitry Olshansky wrote:

> AFAIK you don't need to import a module to run its tests. So
>     rdmd --main -unittest a.d b.d
> should work.

RDMD is the external tool that I was talking about, and it fixes some of the problems I mentioned.

> Then we should also be able to make rdmd more extensible. There is,
> e.g., the quite interesting --eval switch, but I feel it could be made
> more customizable without a lot of extra work.

RDMD should be built as a library as well.

-- 
/Jacob Carlborg
December 01, 2012
On 2012-11-30 21:02, Rob T wrote:
> On Friday, 30 November 2012 at 19:35:38 UTC, jerro wrote:
>> If you are writing a build tool, I don't imagine writing those few
>> extra lines to convert relative paths to absolute would be a problem.
>
> How does the script know what is a relative path and what is not? It's
> not that easy, and the programmer should not have to be fooling around
> solving non-productive problems like this.

The -deps output contains the full path for all source files, or is there something I'm missing here?

-- 
/Jacob Carlborg