March 01, 2003
"Walter" <walter@digitalmars.com> wrote in news:b3r0ls$2ai6$1@digitaldaemon.com:

> 
> "Antti Sykari" <jsykari@gamma.hut.fi> wrote in message news:87smu7b1sx.fsf@hoastest1-8c.hoasnet.inet.fi...
>> "Walter" <walter@digitalmars.com> writes:
>> > One thing that is never mentioned about C is that I believe the PC is what pushed C into the mass accepted language that it is. The reasons are:
>> > 1) PCs were slow and small. Writing high performance, memory efficient apps was a requirement, not a luxury.
>> > 2) C was a good fit for the PC architecture.
>> > 3) C was the ONLY high level language that had a decent compiler for it in the early PC days. (The only other options were BASIC and assembler.) Sure, there were shipping FORTRAN and Pascal compilers for the PC early on, but the implementations were so truly terrible they were useless (and I and my coworkers really did try to get them to work).
>> I recently visited my university's library and examined some programming language textbooks from the late 1970s and early 1980s.  Many of them didn't even mention C, and those that did usually didn't consider it to be much of a language.  Algol, Pascal, Cobol and Fortran were the languages of the day, with an occasional side note for Lisp, Prolog and Smalltalk. Although C had existed since around 1973, it was still a cryptic-looking, system-oriented niche language used only by some UNIX researchers and didn't seem to be of much interest.
> 
> You're right. I started programming in 1975, and had heard of all those languages (and many more, like Bliss, Simula, APL) except for C, which I hadn't heard of until 1983.

When I went to university in 1984, they had just switched the computer science department's core teaching language from Pascal to C.


March 02, 2003
Hi, Mark.

Perhaps it would be more productive if we focused discussion on specific language constructs which improve expressiveness.  Here are some I would like to discuss.

My bent is always towards features that give us more power while helping us to write faster applications.  I've used most of these enough in high-performance applications to know that they can be very valuable.

Template frameworks (or equivalent, such as virtual classes):

This is much like copying and pasting code from an entire module into your module, with some modifications.  This would allow me to efficiently reuse code designed for mutually recursive data structures like graphs.  There seems to be no speed overhead for using this feature.
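A minimal sketch of the idea, in roughly current D syntax and with invented names (GraphModule, Graph, Node): one template wraps a group of mutually recursive classes, so the whole "module" can be stamped out per element type.

template GraphModule(T)
{
    class Graph
    {
        Node[] nodes;
        void add(Node n) { nodes ~= n; }
    }

    class Node
    {
        Graph owner;
        T data;
        this(Graph g, T d) { owner = g; data = d; g.add(this); }
    }
}

alias GraphModule!(int) IntGraph;   // instantiate the whole framework for int

void main()
{
    IntGraph.Graph g = new IntGraph.Graph;
    IntGraph.Node  n = new IntGraph.Node(g, 42);
}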

Dynamic Inheritance (dynamic extension?)

This is where you add class members on the fly.  The simple version is slow, but we've been using a method that has exactly zero speed penalty (actually, there seems to be a speed-up due to cache efficiency).  The trick is using arrays of properties to represent members of classes, rather than a structure-style layout with contiguous fields.  This allows us to write efficient code with a shared object database model where the database objects are shared between tools.  Without it, each tool has to come up with some hack to add its custom data to the objects that exist in the database.
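As a concrete illustration of the property-array layout (a rough sketch only, with invented names, and assuming objects are identified by integer indices): each "member" lives in its own parallel array, so a tool can extend the class simply by declaring another array keyed by the same index.

// Core database: the members of a Node are parallel property arrays
// rather than contiguous fields of a struct.
struct NodeDatabase
{
    int[]    fanout;    // fanout[n] is the fanout of node n
    double[] delay;     // delay[n] is the delay of node n

    size_t create()
    {
        fanout ~= 0;
        delay  ~= 0.0;
        return fanout.length - 1;
    }
}

// A tool "dynamically extends" Node by adding its own property array,
// without touching the database's layout or recompiling it.
struct RouterExtension
{
    int[] congestion;   // congestion[n] belongs to node n

    void matchSize(size_t nodeCount)
    {
        congestion.length = nodeCount;   // new slots are zero-initialized
    }
}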

Extended support for relationships between classes.

Most languages that borrow heavily from C have some equivalent to pointers.  Even Java has object handles as a basic type.  Using these, we can write template libraries of container classes to implement relationships between classes, such as linked lists or hash tables. However, the simple model in which a relationship is owned solely by the class containing the container-class member is not very efficient, and it leaves some tasks for the programmer to implement.  By having language constructs that allow container classes to add members to both the parent and child classes of a relationship, we gain both efficiency and safety.  For example, if class Graph owns a linked list of Nodes, and Nodes point back to Graph, we have to remember to NULL out the back pointer of a Node when we remove it from the linked list.  This leads to lots of errors.  Also, the next pointer isn't embedded in the child class, as would be the natural and efficient implementation if we were doing it by hand without the container class.
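D cannot currently let a container library inject fields into two unrelated classes, but template mixins come close, so here is a rough sketch of the idea under that assumption (all names invented). The point is that the next pointer ends up embedded in the child, and the back pointer is nulled by the container rather than by the programmer.

// Mixed into the parent class of the relationship.
template ListParent(Child)
{
    Child firstChild;

    void insert(Child c)
    {
        c.owner = this;
        c.nextSibling = firstChild;
        firstChild = c;
    }

    void removeFirst()
    {
        if (firstChild !is null)
        {
            Child c = firstChild;
            firstChild = c.nextSibling;
            c.owner = null;          // back pointer cleared automatically
            c.nextSibling = null;
        }
    }
}

// Mixed into the child class of the relationship.
template ListChild(Parent, Child)
{
    Parent owner;        // back pointer to the parent
    Child  nextSibling;  // next pointer embedded directly in the child
}

class Graph { mixin ListParent!(Node); }
class Node  { mixin ListChild!(Graph, Node); }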

Support for code generators.

Code generators add much power to a programmer's toolbox.  GUI generators are the obvious example.  Where I work, we even use data structure code generators.  Full recursive destructors are also generated, as is highly efficient memory management for each class. Adding simple support for code generators can be done by extending the language so that generators only have to write D code, not read it. This means that classes have to be able to be extended simply by creating new files with appropriate declarations.
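D has no partial classes today, so a generated file cannot silently add members to an existing class; the closest current approximation is a template mixin that the hand-written class pulls in, sketched below with invented names (the file split is only notional).

// ---- what the generator would emit ----
template GeneratedNodeMembers()
{
    int toolColor;                        // extra field added by a tool
    void clearToolColor() { toolColor = 0; }
}

// ---- hand-written class ----
class Node
{
    int id;
    mixin GeneratedNodeMembers!();        // splice in the generated declarations
}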

Compiler extensions

This is a tough feature, but it can be very powerful.  Basically, the compiler needs to be written in D, and end users need to be able to make extensions directly to the compiler by writing new files in D.  A custom compiler gets compiled for each user application, so these new features can be included efficiently.  How much of the compiler gets exposed, and in what manner, makes a big difference.  For example, if D supported descriptions of parsers in a yacc-like manner, we could add syntax to D with yacc-like rules in our applications.  It seems possible to allow users to add new built-in types.  Complex numbers could be such a language extension.  A simple compiler I wrote allows users to write code generators that run off the compiler's representation of the user's code.  This is used, for example, to generate the container class code in both parent and child classes, as described above under support for relationships, as well as recursive destructors.  The possibilities here seem huge.

Bill

March 02, 2003
Though it's not addressed to me, I'll take the liberty of commenting a bit.

Bill Cox wrote:
> Hi, Mark.
> 
> Perhaps it would be more productive if we focused discussion on specific language constructs which improve expressiveness.  Here are some I would like to discuss.

Mark has given us an ocean of information which is mostly very interesting, but we can't cope with the vast amount. :) Thanks for trying to make him think in more pragmatic terms.

> Dynamic Inheritance (dynamic extension?)
> 
> This is where you add class members on the fly.  The simple version is slow, but we've been using a method that has exactly zero speed penalty (actually, there seems to be a speed-up due to cache efficiency).  The trick is using arrays of properties to represent members of classes, rather than a structure-style layout with contiguous fields.  This allows us to write efficient code with a shared object database model where the database objects are shared between tools.  Without it, each tool has to come up with some hack to add its custom data to the objects that exist in the database.

Great idea. It'd also be very useful for scripting languages; a scripting language and D could then communicate fairly directly. Not to mention that it could reduce executable size when large GUI frameworks are used, because less "generality" is needed (maybe via mix-ins?).

> Support for code generators.
> 
> Code generators add much power to a programmer's toolbox.  GUI generators are the obvious example.  Where I work, we even use data structure code generators.  Full recursive destructors are also generated, as is highly efficient memory management for each class. Adding simple support for code generators can be done by extending the language so that generators only have to write D code, not read it. This means that classes have to be able to be extended simply by creating new files with appropriate declarations.

> Compiler extensions
> 
> This is a tough feature, but it can be very powerful.  Basically, the compiler needs to be written in D, and end users need to be able to make extensions directly to the compiler by writing new files in D.  A custom compiler gets compiled for each user application, so these new features can be included efficiently.  How much of the compiler gets exposed, and in what manner, makes a big difference.  For example, if D supported descriptions of parsers in a yacc-like manner, we could add syntax to D with yacc-like rules in our applications.  It seems possible to allow users to add new built-in types.  Complex numbers could be such a language extension.  A simple compiler I wrote allows users to write code generators that run off the compiler's representation of the user's code.  This is used, for example, to generate the container class code in both parent and child classes, as described above under support for relationships, as well as recursive destructors.  The possibilities here seem huge.

Right. This would require compilers to have a similar internal structure, though. I planned to write an external tool which does something like that and yields D source. I wonder whether I'll ever get around to it, but I value any ideas. Then a backend interface could be attached to it, for which I'd implement a compiler into C and possibly CIL/.NET.

And other code generation tools (GUI constructor toolkits and the like) could use it somehow to parse D code with extensions and/or make additions to it. That is, its core functionality should be a library.

-i.

March 02, 2003
Someone at my place of work says that the big thing that makes him think D will not do well is the lack of an Eval() function.

Eval("D code snippet") or Eval("D function definitions", "D function call")
I suppose are what he's after.  Some other languages such as Lisp and Perl
have these.
It would require at least a Dscript interpreter, or a small compiler in the
runtime library.
I'm not sure everyone would benefit from it but occasionally the most
straightforward way to solve a task is to write a program that writes
programs based on runtime data.

D has regexps already.

Another convenient feature that I see in ML and Haskell is a construct that can execute some code based on the type of an object.  This is similar to function overloading in utility, but written with a more switch-like syntax. It would also replace code that does a bunch of ifs to decide whether an object is a descendant of one of certain classes.  I believe they call it pattern matching, and I think it has even more functionality than what I've mentioned here.  (But I'm just learning Haskell and OCaml.)
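For comparison, here is the cast-and-test chain that such a construct would replace, written as a plain D sketch with made-up class names; an ML-style match would collapse the branches into a single declarative construct.

class Shape  { }
class Circle : Shape { double radius; }
class Square : Shape { double side; }

double area(Shape s)
{
    // Each branch asks "is s a descendant of this class?" via a downcast.
    if (auto c = cast(Circle) s) return 3.14159 * c.radius * c.radius;
    if (auto q = cast(Square) s) return q.side * q.side;
    return 0.0;
}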

I think this is the main reason OCaml and Haskell are great languages with which to write parsers.  If you make D into a language which makes it very easy to write a D parser, you've made life easier for many, many people.

Thought about bootstrapping DMD yet?  ;)

Sean

 > Perhaps it would be more productive if we focused discussion on specific
 > language constructs which improve expressiveness.  Here are some I would
 > like to discuss.


March 02, 2003
Since the feature of program histories is local in nature (as local as local variables are), it would be straightforward to compute them only when needed.

The good side of tangible program histories is that they can raise the level of abstraction of the code and make some bugs obvious.  The bad side is that they make the structure ("what really happens and when") of the code less obvious.  The interesting thing is that it's a declarative feature, and understanding it leads to a different way of thinking than traditional imperative languages do.
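To make the contrast concrete, here is the kind of hand-written bookkeeping (a plain D sketch) that an accumulator or counter history would make implicit.

double average(double[] samples)
{
    double sum = 0.0;   // "accumulator" history: sum of the values assigned to x
    int count = 0;      // "counter" history: number of loop iterations
    foreach (s; samples)
    {
        double x = s * 2;
        sum += x;
        count++;
    }
    return count > 0 ? sum / count : 0.0;
}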

Might be that something similar will pop up some day in a mainstream programming language...

-Antti

"Sean L. Palmer" <seanpalmer@directvinternet.com> writes:

> I certainly wouldn't want the compiler keeping profiling histories in my code, at least not if they weren't used.  Perhaps the virtual properties would exist only if you tried to use them.
>
> Sean
>
> "Antti Sykari" <jsykari@gamma.hut.fi> wrote in message news:87ptpac4bd.fsf@hoastest1-8c.hoasnet.inet.fi...
>> Mark Evans <Mark_member@pathlink.com> writes:
>> > Todd Proebsting has worked on many languages -- his stuff is worth reading.
>>
>> Indeed.  I found this exploratory article quite interesting:
>>
>> ftp://ftp.research.microsoft.com/pub/tr/tr-2000-54.ps
>>
>> "We propose a new language feature, a program history, that significantly reduces bookkeeping code in imperative programs. A history represents previous program state that is not explicitly recorded by the programmer. By reducing bookkeeping, programs are more convenient to write and less error-prone. Example program histories include a list that represents all the values previously assigned to a given variable, an accumulator that represents the sum of values assigned to a given variable, and a counter that represents the number of times a given loop has iterated. Many program histories can be implemented with low overhead."
>>
>> -Antti

March 02, 2003
I believe Larry Wall said that languages need more pronouns.

I propose extending the with construct to allow multiple aliases within the scope of the with body.

instead of:

{
    int myverylongidentifier;
    int myaccessofdifficulttoaccessmember;
    {
        alias myverylongidentifier id;
        alias myaccessofdifficulttoaccessmember member;
        if (id > 5 && id < 10)
            member = member + id;
    }
}

we could write:

{
    int myverylongidentifier;
    int myaccessofdifficulttoaccessmember;
    with (myverylongidentifier id, myaccessofdifficulttoaccessmember member)
        if (id > 5 && id < 10)
            member = member + id;
}

This allows one to conveniently build one's own pronouns.

For those times when one isn't trying to bring names into scope so much as to build a convenient temporary name for some complex expression, I also propose adding an "it" keyword which is equivalent to the innermost enclosing old-style with-statement expression:

with (a * b + c)
    myvar = it;

Seems more convenient than:

{
    mytype it = a * b + c;
    myvar = it;
}

and also would replace this:

int member1;
struct mybigstruct { int member1, member2; }
mybigstruct mybigstructvar;
{
    alias mybigstruct it;
    myvar = it.member1 + it.member2;
}

with this:

int member1;
struct mybigstruct { int member1, member2; }
mybigstruct mybigstructvar;
with (mybigstructvar)
    myvar = it.member1 + it.member2;

But use of "it" could in some cases be less ambiguous than without;  for instance using the existing with syntax it is not clear:

int member1;
struct mybigstruct { int member1, member2; }
mybigstruct mybigstructvar;
with (mybigstructvar)
    myvar = member1 + member2;  // which member1 are we talking about here?

Essentially it's syntactic sugar for alias.

The with statement should be able to handle any construct that alias handles.

Thoughts?

Sean

> Perhaps it would be more productive if we focused discussion on specific language constructs which improve expressiveness.  Here are some I would like to discuss.


March 02, 2003
"Sean L. Palmer" <seanpalmer@directvinternet.com> wrote in message news:b3tu41$m74$1@digitaldaemon.com...
> I believe Larry Wall said that languages need more pronouns.
>
> I propose extending the with construct to allow multiple aliases within the scope of the with body.
>
[...]
>
> we could write:
>
> {
>     int myverylongidentifier;
>     int myaccessofdifficulttoaccessmember;
>     with (myverylongidentifier id, myaccessofdifficulttoaccessmember member)
>         if (id > 5 && id < 10)
>             member = member + id;
> }
>
> This allows one to conveniently build one's own pronouns.
>

Ah, a Lisp aficionado?  This is the "let" construct, which in Lisp is implemented with a macro.  Perhaps we ought to consider (again) a "real" preprocessing pass in the compiler... and not of the C ilk.  This complicates things, of course, but imagine a pair of compiler passes running before the others: a parser generation pass and a parser application pass. Do something like this to implement your above "with" (again, I'm brainstorming kitchen-sink ideas):

syntax with (decl_list) {body} is {
      replace decl_list  is { var expr [, decl_list] } by {
         "\expr.type var = \expr;"
      }
      "{ \decl_list \body }"
}

So anything in quotes gets pasted into the output of this syntax transform. You can define 'replace' rules which transform expressions in the input. The rules will be applied recursively as necessary (through the []-delimited optional expansion).  And, true to the C approach, use '\' to escape out of the quotes to paste in parameters to the syntax transform or the outputs of replacement rules. Escaped expressions will have properties like those in the normal language, just a different set of them. For example, above we write \expr.type to paste in the type of expr.  And with the above transform in a library somewhere, you'll have your new "with" syntax, although I'd probably rename it "let", since there is precedent for that and "with" already does something different.

And to play off the "it" idea, do something like this:

syntax if_with (expr) {body} is {
    "{ \expr.type it = \expr; if (it) { \body }}"
}

"if_with" will now be a sort of lexical optimization, as it will save you declaring and setting the variable in order to check and use it.

 - Dan "Kitchen Sink" Liebgold


March 03, 2003
Aside from all the subtle gotchas you have to watch out for when doing textual macro preprocessing, I really don't relish the idea of doing:

import my_convenient_macro_library;

my_convenient_macro_library.with (foo)
{
    // do something with foo
}

Yeah, it's nice to be able to make your own syntax, but you can always use an external preprocessor for that.  That's not the issue.  The issue is what level of functionality should be available in the core D spec and guaranteed to be in every implementation.  What convenience features can I rely upon to be there all the time?

Things like this are core features, not library features.  And preferably not user-defined features.  People will end up extending D, but the vanilla baseline is what gets used for 99% of all work, since it's expected to be fairly portable.

Sean

"Dan Liebgold" <dliebgold@yahoo.com> wrote in message news:b3u1ic$nsq$1@digitaldaemon.com...
> "Sean L. Palmer" <seanpalmer@directvinternet.com> wrote in message news:b3tu41$m74$1@digitaldaemon.com...
> > I believe Larry Wall said that languages need more pronouns.
> >
> > I propose extending the with construct to allow multiple aliases within
> the
> > scope of the with body
> >
> [...]
> >
> > we could write:
> >
> > {
> >     int myverylongidentifier;
> >     int myaccessofdifficulttoaccessmember;
> >     with (myverylongidentifier id,  myaccessofdifficulttoaccessmember
> > member)
> >         if (id > 5 && id < 10)
> >             member = member + id;
> > }
> >
> > This allows one to conveniently build one's own pronouns.
> >
>
> Ah, a Lisp afficionado?  This is the "let" construct, which in Lisp is implemented with a macro.  Perhaps we ought to consider (again) a "real" preprocessing pass in the compiler... and not of the C ilk.  This complicates things, of course, but imagine a pair of compiler passes
running
> before the others, a parser generation pass and a parser application pass. Do something like this to implement your above "with"  (again I'm brainstorming kitchen sink ideas):
>
> syntax with (decl_list) {body} is {
>       replace decl_list  is { var expr [, decl_list] } by {
>          "\expr.type var = \expr;"
>       }
>       "{ \decl_list \body }"
> }
>
> So anything in quotes gets pasted into the output of this syntax
transform.
> You can define 'replace' rules which transform expressions in the input.
The
> rules will be applied recursively as necessary (through the []-delimited optional expansion).  And true to the C approach, use '\' to escape out of the quotes to paste in parameters to the syntax transform or replacement rule outputs. And escaped expressions will have properties like in the normal language, just a different set of properties. For example above we write \expr.type to paste in the type of expr.  And with the above
transform
> in a library somewhere, you'll have your new "with" syntax, although I'd probably rename it "let" since there is precedence for that and "with" already does something different.
>
> And to play off the "it" idea, do something like this:
>
> syntax if_with (expr) {body} is {
>     "{ \expr.type it = \expr; if (it) { \body }}"
> }
>
> "if_with" will now be a sort of lexical optimization, as it will save you declaring and setting the variable in order to check and use it.
>
>  - Dan "Kitchen Sink" Liebgold
>
>


March 03, 2003
In article <b3v6q3$1bmn$1@digitaldaemon.com>, Sean L. Palmer says...
>
>Aside from all the subtle gotchas you have to watch out for when doing textual macro preprocessing, I really don't relish the idea of doing:
>
>import my_convenient_macro_library;
>
>my_convenient_macro_library.with (foo)
>{
>    // do something with foo
>}
>

As Mark T quoted a few posts back: "Library design is language design". Something basic like the "with" macro would be part of the *standard* library, and would not need importation or any special specification. Thus, unlike in most languages, the implementation of many basic constructs would be available to expert D practitioners, and that is a good thing.

Syntax-transforming macros are one of the basic Lisp ideas that Eric Raymond was alluding to when he said: "LISP is worth learning for a different reason — the profound enlightenment experience you will have when you finally get it. That experience will make you a better programmer for the rest of your days, even if you never actually use LISP itself a lot."

Dan


March 03, 2003
Farmer wrote:
> Mark Evans <Mark_member@pathlink.com> wrote in
> news:b3pg8m$19te$1@digitaldaemon.com: 
> 
> Hi,
> 
> in your last post you often mentioned the term "languages expressiveness".
> 
> What is this exactly? Is there any academic definition for it (That is easily understood) ?
> How can you measure this? [ the kernel language? :-) ]

I think I can try to define it. A language being expressive means that many problems can be expressed directly and intuitively in its terms. Since "directly" and especially "intuitively" cannot be measured well, it is and remains a rather vague goal, which is nonetheless very much worth pursuing. It might affect the popularity of a language.
For example, "Perl is a very expressive shell scripting language" means that Perl is felt to be easy to use for shell scripting tasks.
"Python is a very expressive rapid prototyping language." And so on.

I also work on projects where performance is important, but so is safety. Do you prefer a fast elevator or a safe one? Presumably both. But if it can't be both, it had better be safe, or else you're risking your life.

And I beg you to show more respect towards the researchers. They usually research problems which are important. However, since they strive for insight rather than a product, they may choose implementation approaches and tools which are not appropriate for commercial software development, simply because those are better suited to that particular purpose. Or it doesn't really matter why; it may be personal preference as well. They give you knowledge, which is purely to your advantage and can be profitably used in an industrial product. Just usually not with the tools they used to prove their thesis. Having a "research focus" rather than a "pragmatic purpose" is a good goal, since insight is always helpful in a pragmatic sense.

-i.