December 02, 2009
Andrei Alexandrescu wrote:

> 
> I'm amazed that virtually nobody uses rdmd. I can hardly fathom how I managed to make-do without it.
> 
> Andrei

rdmd is a life saver; I use it all the time.
December 02, 2009
Andrei Alexandrescu wrote:
> Leandro Lucarella wrote:
>> Walter Bright, el  1 de diciembre a las 13:45 me escribiste:
>>> Leandro Lucarella wrote:
>>>> I develop twice as fast in Python than in D. Of course this is only me,
>>>> but that's where I think Python is better than D :)
>>> If that is not just because you know the Python system far better
>>> than the D one, then yes indeed it is a win.
>>
>> And because you have less noise (and much more and better libraries
>> I guess :) in Python, less complexity to care about.
>>
>> And don't get me wrong, I love D, because it's a very expressive language
>> and when you need speed, you need static typing and all the low-level
>> support. They are all necessary evil. All I'm saying is, when I don't need
>> speed and I have to do something quickly, Python is still a far better
>> language than D, because of they inherent differences.
>>
>>>> I think only not having a compile cycle (no matter how fast compiling is)
>>>> is a *huge* win. Having an interactive console (with embedded
>>>> documentation) is another big win.
>>> That makes sense.
>>
>> I guess D can greatly benefit from a compiler that can compile and run
>> a multiple-files program with one command (AFAIK rdmd only support one
>> file programs, right?) and an interactive console that can get the ddoc
>> documentation on the fly. But that's not very related to the language
>> itself, I guess it's doable, the trickiest part is the interactive
>> console, I guess...
>>
> 
> I'm amazed that virtually nobody uses rdmd. I can hardly fathom how I managed to make-do without it.
> 
> Andrei


I use it almost exclusively, and find it an extremely useful and efficient tool. The only time I use DMD directly is when I'm done coding and testing, and want to compile the final library file or executable.

For libraries, I define a unit.d file in the library root directory that looks something like this:

  #!/usr/local/bin/rdmd --shebang -w -unittest
  module unit;

  import std.stdio;

  // Import entire library.
  import mylib.moduleA;
  import mylib.moduleB;
  ...

  void main()  { writeln("All unittests passed."); }

Then I mark unit.d as executable, and run it whenever I want to test changes I've made to the library.

-Lars
December 02, 2009
Hello retard,

> Tue, 01 Dec 2009 14:24:01 -0800, Walter Bright wrote:
> 
>> dsimcha wrote:
>> 
>>> My biggest gripe about static verification is that it can't help you
>>> at all with high-level logic/algorithmic errors, only lower level
>>> coding errors.  Good unit tests (and good asserts), on the other
>>> hand, are invaluable for finding and debugging high-level logic and
>>> algorithmic errors.
>>> 
>> Unit tests have their limitations as well. Unit tests cannot prove a
>> function is pure, for example.
>> 
> Sure, unit tests can't prove that.
> 
>> Both unit tests and static verification are needed.
>> 
> But it doesn't lead to this conclusion. Static verification is
> sometimes very expensive and real world business applications don't
> need those guarantees that often. It's ok if a web site or game
> crashes every now and then. If I need serious static verification, I
> would use tools like Coq, not D..
> 

Static verification in Coq is very expensive, but who really does that for real-world programs? I think we are talking about automatic static verification with no or minimal programmer assistance. It gets you assurances for a larger project with multiple programmers: that the various parts plug in correctly (typechecking), and that they do not affect other parts of the program in unexpected ways (const/pure/safe). Then you are on good ground to verify your program logic yourself (debugging/pre- and postconditions/unittests/asserts/invariants).


December 02, 2009
retard wrote:
> Tue, 01 Dec 2009 14:24:01 -0800, Walter Bright wrote:
>> Unit tests have their limitations as well. Unit tests cannot prove a
>> function is pure, for example.
> 
> Sure, unit tests can't prove that.
> 
>> Both unit tests and static verification are needed.
> 
> But it doesn't lead to this conclusion. Static verification is sometimes very expensive

Not if it's built in to the compiler. I aim to bring the cost of it down to zero.

> and real world business applications don't need those guarantees that often.

Having your accounting software write checks in the wrong amount can be very very bad. And frankly, if you can afford your software unwittingly emitting garbage data, you don't need that software for your business apps.

> It's ok if a web site or game crashes every now and then.

If Amazon's web site goes down, they likely lose millions of dollars a minute. Heck, I once lost a lot of business because the web site link to the credit card system went down. Few businesses can afford to have their ecommerce web sites down.

> If I need serious static verification, I would use tools like Coq, not D..

There's a lot of useful stuff in between a total formal proof of correctness and nothing at all. D can offer proof of various characteristics that are valuable for eliminating bugs.
December 02, 2009
retard wrote:
> The thing is, nowadays when all development should follow the principles of clean code (book), agile, and tdd/bdd, this cannot happen. You write tests first, then the production code. They say that writing tests and code takes less time than writing only the more or less buggy production code. Not writing tests is a sign of a novice programmer and they wouldn't hire you if you didn't advertise your TDD skills.

And therein lies the problem. You need the programmers to follow a certain discipline. I don't know if you've managed programmers before, but they don't always follow discipline, no matter how good they are. The root problem is there's no way to *verify* that they've followed the discipline, convention, procedure, whatever.

But with mechanical checking, you can guarantee certain things. How are you going to guarantee each member of your team put all the unit tests in? Each time they change anything?

> In this particular case you use a dummy test db fixture system, write tests for 'a is int' and 'b is int'. With these tests in place, the functionality provided by D's type system is only a subset of the coverage the tests provide. So D cannot offer any advantage anymore over e.g. Python.

Where's the advantage of:

    assert(a is int)

over:

    int a;

? Especially if I have to follow the discipline and add them in everywhere?
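In Python terms, that discipline amounts to a hand-written runtime check in every function that needs it (a hypothetical sketch; the function name is made up):

```python
# Without static typing, every function that needs an int has to assert
# it by hand, and the error only surfaces when the bad call actually runs.
def set_count(a):
    assert isinstance(a, int), "a must be an int"
    return a

set_count(5)            # fine
try:
    set_count("5")      # caught only at runtime, and only if this line runs
except AssertionError:
    caught = True
```

With `int a;` the compiler performs the same check on every call site, for free and before the program ever runs.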
December 02, 2009
Wed, 02 Dec 2009 03:16:58 -0800, Walter Bright wrote:

> retard wrote:
>> The thing is, nowadays when all development should follow the principles of clean code (book), agile, and tdd/bdd, this cannot happen. You write tests first, then the production code. They say that writing tests and code takes less time than writing only the more or less buggy production code. Not writing tests is a sign of a novice programmer and they wouldn't hire you if you didn't advertise your TDD skills.
> 
> And therein lies the problem. You need the programmers to follow a certain discipline. I don't know if you've managed programmers before, but they don't always follow discipline, no matter how good they are. The root problem is there's no way to *verify* that they've followed the discipline, convention, procedure, whatever.
> 
> But with mechanical checking, you can guarantee certain things. How are you going to guarantee each member of your team put all the unit tests in? Each time they change anything?
> 
>> In this particular case you use a dummy test db fixture system, write tests for 'a is int' and 'b is int'. With these tests in place, the functionality provided by D's type system is only a subset of the coverage the tests provide. So D cannot offer any advantage anymore over e.g. Python.
> 
> Where's the advantage of:
> 
>      assert(a is int)
> 
> over:
> 
>      int a;
> 
> ? Especially if I have to follow the discipline and add them in everywhere?

The case I commented on was about fetching values from a db, IIRC. So the connection between the SQL database and D loses all type information, unless you build some kind of high-level SQL interface which checks the types (note that up-to-date checking cannot be done with dmd unless it allows fetching stuff from the db at compile time, or you first dump the table parameters to some text file before compiling). You can't just write:

  typedef string[] row;
  row[] a = sql_engine.execute("select * from foobar;").result;
  int b = cast(int)a[0][0];
  string c = cast(string)a[0][1];

and somehow expect that the first column of row 0 is an integer and the next column a string. You still need to postpone the checking to runtime with some validation function:

  typedef string[] row;
  row[] a = sql_engine.execute("select * from foobar;").result;

  void runtime_assert(T)(string s) { ... }

  runtime_assert!(int)(a[0][0]);
  int b = cast(int)a[0][0];

  string c = a[0][1];

I agree some disciplines are hard to follow - for example, ensuring immutability in an inherently mutable language. But TDD is a bit easier; it's a lot higher level. It's easy to remember that you can't write any code into the production code folder unless there is already code in the test folder. You can verify with code coverage tools that you didn't forget to write some tests. In TDD the whole code looks different: you build it to be easily testable. It has proven to be a good way to write code - almost every company nowadays uses TDD and agile methods such as Scrum.
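The same runtime-validation pattern, sketched in Python (the rows list and helper are hypothetical stand-ins for a real driver's result set):

```python
# Hypothetical stand-in for a DB driver's result set: every value comes
# back as a string, so all type information is lost until runtime.
rows = [["42", "hello"]]

def runtime_assert(typ, s):
    # Validate at runtime that s is convertible to typ.
    try:
        typ(s)
    except (TypeError, ValueError):
        raise TypeError("%r is not a valid %s" % (s, typ.__name__))

runtime_assert(int, rows[0][0])
b = int(rows[0][0])     # 42
c = rows[0][1]          # "hello"
```

The checks run per value, per row, every time - exactly the cost a statically typed interface would avoid, if the schema were known to the compiler.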
December 02, 2009
retard wrote:
> Wed, 02 Dec 2009 03:16:58 -0800, Walter Bright wrote:
> 
>> retard wrote:
>>> The thing is, nowadays when all development should follow the
>>> principles of clean code (book), agile, and tdd/bdd, this cannot
>>> happen. You write tests first, then the production code. They say that
>>> writing tests and code takes less time than writing only the more or
>>> less buggy production code. Not writing tests is a sign of a novice
>>> programmer and they wouldn't hire you if you didn't advertise your TDD
>>> skills.
>> And therein lies the problem. You need the programmers to follow a
>> certain discipline. I don't know if you've managed programmers before,
>> but they don't always follow discipline, no matter how good they are.
>> The root problem is there's no way to *verify* that they've followed the
>> discipline, convention, procedure, whatever.
>>
>> But with mechanical checking, you can guarantee certain things. How are
>> you going to guarantee each member of your team put all the unit tests
>> in? Each time they change anything?
>>
>>> In this particular case you use a dummy test db fixture system, write
>>> tests for 'a is int' and 'b is int'. With these tests in place, the
>>> functionality provided by D's type system is only a subset of the
>>> coverage the tests provide. So D cannot offer any advantage anymore
>>> over e.g. Python.
>> Where's the advantage of:
>>
>>      assert(a is int)
>>
>> over:
>>
>>      int a;
>>
>> ? Especially if I have to follow the discipline and add them in
>> everywhere?
> 
> The case I commented on was about fetching values from a db IIRC. So the connection between SQL database and D loses all type information unless you build some kind of high level SQL interface which checks the types (note that up-to-date checking cannot be done with dmd unless it allows fetching stuff from the db on compile time or you first dump the table parameters to some text file before compiling). You can't just write:
> 
>   typedef string[] row;
>   row[] a = sql_engine.execute("select * from foobar;").result;
>   int b = (int)a[0][0];
>   string c = (string)b[0][1];
> 
> and somehow expect that the first column of row 0 is an integer and the next column a string. You still need to postpone the checking to runtime with some validation function:
> 
>   typedef string[] row;
>   row[] a = sql_engine.execute("select * from foobar;").result;
> 
>   void runtime_assert(T)(string s) { ... }
> 
>   runtime_assert!(int)(a[0][0]);
>   int b = (int)a[0][0];
> 
>   string c = b[0][1];


std.conv.to() to the rescue! :)

  import std.conv;
  ...

  row[] a = sql_engine.execute("select * from foobar;").result;

  int b = to!int(a[0][0]);          // Throws if the conversion fails
  string c = to!string(a[0][1]);

-Lars
December 02, 2009
Wed, 02 Dec 2009 13:12:58 +0100, Lars T. Kyllingstad wrote:

> std.conv.to() to the rescue! :)
> 
>    import std.conv;
>    ...
> 
>    row[] a = sql_engine.execute("select * from foobar;").result;
> 
>    int b = to!int(a[0][0]);          // Throws if conversions fail
>    string c = to!string(a[0][1]);
> 
> -Lars

You also seem to miss the point. The topic of this conversation (I think?) was about static verification. to! throws at runtime.
December 02, 2009
== Quote from retard (re@tard.com.invalid)'s article
> I thought D was supposed to be a practical language for real world problems. This 'D is good because everything can and must be written in D' is beginning to sound like a religion.

You're missing the point.  Mixing languages always adds complexity.  If you want the languages to talk to each other, the glue layer adds complexity that has nothing to do with the problem being solved.  If you don't want the languages to talk to each other, then you're severely limited in terms of the granularity at which they can be mixed.  Furthermore, it's nice to be able to write generic code once and have it always "just be there".

I get very annoyed with languages that target a small niche.  For example, I do a lot of mathy stuff, but I hate Matlab and R because they're too domain-specific. Anytime I write more than 20 lines of code in either of these, I find that the lack of some general-purpose programming capability in these languages or the awkwardness of using it has just added a layer of complexity to my project.

Even Python runs out of steam when you need more performance and realize what a PITA it is to get all the glue working to rewrite parts of your code in C. Heck, even Numpy sometimes feels like a kludge, because it reimplements basic things like arrays (with static typing, mind you) since Python's builtin arrays are too slow.  Therefore, Numpy code is often not very Pythonic.

A practical language should have enough complexity management tools to handle basically any type of complexity you throw at it, whether it be a really complicated business model, insane performance requirements, the need to scale to massive datasets, or the sheer volume of code that needs to be written.  Making more assumptions about what problems you want to solve is what libraries or applications are for.  These complexity management tools should also stay the heck out of the way when you don't need them.  If you can achieve this, your language will be good for almost anything.
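Incidentally, the static-typing point shows up even in Python's own standard library: its array module is a typed array, much like the ones Numpy reimplements for speed. A quick sketch:

```python
# Python's stdlib array module: a statically typed array. The type code
# "i" pins the element type to signed int, and appends are type-checked.
from array import array

a = array("i", [1, 2, 3])
a.append(4)                  # fine: 4 is an int
try:
    a.append("x")            # rejected at runtime: not an int
except TypeError:
    rejected = True
```

So even the "dynamic" ecosystem ends up reinventing static element types the moment performance matters.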
December 02, 2009
Walter Bright:
> But with mechanical checking, you can guarantee certain things.

Usually what mechanical checking guarantees is not even vaguely enough, and such guarantees aren't even about the most important parts :-)
Unit tests are more important, because they cover things that matter more.
It would be better to add many more unit tests to Phobos.


> Where's the advantage of:
>      assert(a is int)
> over:
>      int a;
> ? Especially if I have to follow the discipline and add them in everywhere?

Probably I have missed parts of this discussion, so what I write below may be useless.
But in dynamic code you almost never assert that a variable is an int; you assert that 'a' is able to do its work where it's used. So 'a' can often be an int, a decimal, a multiprecision long, a GMP multiprecision number, or maybe even a float. What you care about is not what 'a' is, but whether it does what it has to, so you care whether it quacks :-) That's duck typing.
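A small Python sketch of that quacking (the function and the values are made up for illustration):

```python
# Duck typing: the function never checks the type of x, only relies on
# the operations it needs (* and +) being supported.
from decimal import Decimal
from fractions import Fraction

def double_plus_one(x):
    return x * 2 + 1

# ints, floats, Decimals and Fractions all "quack" the right way:
results = [double_plus_one(v) for v in (3, 2.5, Decimal("1.5"), Fraction(1, 2))]
```

No assert pins x to one type; any value that supports those two operations works.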

Bye,
bearophile