Thread overview
Another example of a use of C macros that has not been discussed
Jan 14, 2003  Paul Sheer
Jan 15, 2003  Kelvin Lee
Jan 15, 2003  Paul Sheer
Jan 15, 2003  Evan McClanahan
Jan 16, 2003  Paul Sheer
Jan 16, 2003  Evan McClanahan
Jan 21, 2003  Walter
Jan 21, 2003  Walter
January 14, 2003
Example below.

(Kelvin Lee writes:)
> 
> A smart enough compiler should simply assign p a pointer to the string "abc 123 xyz", while executing some code to assign the right thing to q.
> 
> If the compiler is so smart, the following kind of macro processing is completely unnecessary.
> 
> In addition, if this kind of optimization can be turned off, debugging these kinds of "macros" would become much easier.
> 
> BTW, can the D compiler do something like this?
> 

I think this is about more than optimizations.

A macro processor is for creating patterned constant data elegantly and maintainably. ---> it's an interesting definition, because a compiler should be able to explicitly create patterned data so the code doesn't have to.

For embedded systems, knowing that the data will be compiled into the .rodata (const) section is also important. That is the point of the const keyword.

Here is another example that you can't even do with cpp:

Here are two pieces of code/data which are very similar.
I should be able to use the macro processor to write
this once, substituting only the non-canonical pieces.

/* SHA1 in C */
static u_int32_t *sha1(int32_t *x, int x_length, int len) {
  u_int32_t *res;
  int i, j;
  int32_t t, w[80];
  int32_t a=1732584193, b=-271733879, c=-1732584194, d=271733878, e=-1009589776;
  memset(w, '\0', sizeof(w));
  x[len >> 5] |= 0x80 << (24 - len % 32);
  x[(x_length = (((len + 65 + 511) >> 9) << 4)) - 1] = len;
  for(i = 0; i < x_length; i += 16) {
    int32_t oa=a, ob=b, oc=c, od=d, oe=e;
    for(j = 0; j < 80; j++) {
      if(j < 16) w[j] = x[i + j];
      else w[j] = rol(w[j-3] ^ w[j-8] ^ w[j-14] ^ w[j-16], 1);
      t = adu(adu(rol(a, 5), ft(j, b, c, d)), adu(adu(e, w[j]), kt(j)));
      e = d; d = c; c = rol(b, 30); b = a; a = t; }
    a = adu(a, oa); b = adu(b, ob); c = adu(c, oc); d = adu(d, od); e = adu(e, oe); }
  res = (u_int32_t *) malloc ((padding + 5) * sizeof(u_int32_t));
  res[0] = a; res[1] = b; res[2] = c; res[3] = d; res[4] = e;
  memset((res + 5), '\0', padding * sizeof(u_int32_t));
  return res; }

/* SHA1 in JavaScript */
const char *java_script_func =
"function sha1(x, len) {\r\n\
  var i, j, t, w = Array(80);\r\n\
  var a=1732584193, b=-271733879, c=-1732584194, d=271733878, e=-1009589776;\r\n\
  x[len >> 5] |= 0x80 << (24 - len % 32);\r\n\
  x[(((len + 65 + 511) >> 9) << 4) - 1] = len;\r\n\
  for(i = 0; i < x.length; i += 16) {\r\n\
    var oa=a, ob=b, oc=c, od=d, oe=e;\r\n\
    for(j = 0; j < 80; j++) {\r\n\
      if(j < 16) w[j] = x[i + j];\r\n\
      else w[j] = rol(w[j-3] ^ w[j-8] ^ w[j-14] ^ w[j-16], 1);\r\n\
      t = adu(adu(rol(a, 5), ft(j, b, c, d)), adu(adu(e, w[j]), kt(j)));\r\n\
      e = d; d = c; c = rol(b, 30); b = a; a = t; }\r\n\
    a = adu(a, oa); b = adu(b, ob); c = adu(c, oc); d = adu(d, od); e = adu(e, oe); }\r\n\
  return Array(a, b, c, d, e); }\r\n\
\r\n\
";

You can see that the JavaScript and C functions are
similar, and have to produce the same result, because
this is a CGI where the web browser's POST will be
compared to the server-side hash.

Now if cpp could handle substitutions inside string
constants, I would not have to write these out twice
in full, and I could maintain one macro instead of
two separate pieces of code.

You might say that such an example occurs too rarely.
But having this power quickly shows its usefulness,
to the point where it becomes indispensable in the
same way that Lisp macros do.

Here is a diff -u just for interest:

-function sha1(x, len) {
-  var i, j, t, w = Array(80);
-  var a=1732584193, b=-271733879, c=-1732584194, d=271733878, e=-1009589776;
+static u_int32_t *sha1(int32_t *x, int x_length, int len) {
+  u_int32_t *res;
+  int i, j;
+  int32_t t, w[80];
+  int32_t a=1732584193, b=-271733879, c=-1732584194, d=271733878, e=-1009589776;
+  memset(w, '\0', sizeof(w));
   x[len >> 5] |= 0x80 << (24 - len % 32);
-  x[(((len + 65 + 511) >> 9) << 4) - 1] = len;
-  for(i = 0; i < x.length; i += 16) {
-    var oa=a, ob=b, oc=c, od=d, oe=e;
+  x[(x_length = (((len + 65 + 511) >> 9) << 4)) - 1] = len;
+  for(i = 0; i < x_length; i += 16) {
+    int32_t oa=a, ob=b, oc=c, od=d, oe=e;
     for(j = 0; j < 80; j++) {
       if(j < 16) w[j] = x[i + j];
       else w[j] = rol(w[j-3] ^ w[j-8] ^ w[j-14] ^ w[j-16], 1);
       t = adu(adu(rol(a, 5), ft(j, b, c, d)), adu(adu(e, w[j]), kt(j)));
       e = d; d = c; c = rol(b, 30); b = a; a = t; }
     a = adu(a, oa); b = adu(b, ob); c = adu(c, oc); d = adu(d, od); e = adu(e, oe); }
-  return Array(a, b, c, d, e); }
+  res = (u_int32_t *) malloc ((padding + 5) * sizeof(u_int32_t));
+  res[0] = a; res[1] = b; res[2] = c; res[3] = d; res[4] = e;
+  memset((res + 5), '\0', padding * sizeof(u_int32_t));
+  return res; }

-paul





January 15, 2003
First of all, I don't disagree that preprocessing is sometimes useful (as in your
example).
But my point is just that a smart compiler can certainly be helpful in some
situations, eliminating kinds of preprocessing that are unnecessary
(as in my example). And the result is that the code becomes cleaner to
read, easier to debug, and possibly more efficient.

BTW, there is a conjecture that the number of bugs in your C program is directly proportional to the number of defined macros.

Kiyo

"Paul Sheer" <psheer@icon.co.za> wrote in message news:b00cdt$kdc$1@digitaldaemon.com...
>
> I think this is about more than optimizations.
>
> A macro processor is for creating patterned constant data elegantly and maintainably. ---> it's an interesting definition, because a compiler should be able to explicitly create patterned data so the code doesn't have to.
>
> For embedded systems, knowing that the data will be compiled into the .rodata (const) section is also important. That is the point of the const keyword.
>
> Here is another example that you can't even do with cpp:
>
> Here are two pieces of code/data which are very similar.
> I should be able to use the macro processor to write
> this once, substituting only the non-canonical pieces.
>
> [source code removed]
>
> You can see that the JavaScript and C functions are
> similar, and have to produce the same result, because
> this is a CGI where the web browser's POST will be
> compared to the server-side hash.
>
> Now if cpp could handle substitutions inside string
> constants, I would not have to write these out twice
> in full, and I could maintain one macro instead of
> two separate pieces of code.
>
> You might say that such an example occurs too rarely.
> But having this power quickly shows its usefulness,
> to the point where it becomes indispensable in the
> same way that Lisp macros do.
>
> [source code removed]
>
> -paul

January 15, 2003
On Wed, 15 Jan 2003 13:38:59 +1100, Kelvin Lee wrote:
> First of all, I don't disagree that preprocessing is sometimes useful (as in your
> example).
> But my point is just that a smart compiler can certainly be helpful in some
> situations, eliminating kinds of preprocessing that are unnecessary
> (as in my example). And the result is that the code becomes cleaner to
> read, easier to debug, and possibly more efficient.

agreed

> 
> BTW, there is a conjecture that the number of bugs in your C program is directly proportional to the number of defined macros.
> 

not in my code

there is a fundamental problem which I would actually like some feedback on, Kelvin, as follows:

Paul Graham is always raving about macros in Lisp. I didn't know what he was going on about until I tried to use macros in C as much as possible. Then it became clear how much easier macros can make a complex programming task. A macro is LIKE AN OBJECT with overloading capabilities, but

 - it's more powerful (because you can overload ANYTHING)
 - it's lighter
 - it's less typing to declare and use
 - it's less prone to unwanted emergent behavior

Now C macros are not proper macros in the Lisp sense.
But they could be even more powerful.

As for objects - very few people use objects for the
intended purposes of overloading and inheritance. Most
people use objects merely because it's the only clean
way they know of to make an *interface*. What most people
think of as an object is really only an
interface. (You can prove this to yourself by asking a
number of C++ programmers what an object is. You'll
be surprised by their answers :-)

There is no use of OO that isn't better solved
with macros and lexical closures. This is why the Lisp
guys laugh at OO.

Now I know how to create a clean interface in C
without having to use OO. So I don't mind if a language
has objects so long as I don't *have* to use them.

But I *do* want more powerful macro support than cpp.

-paul


January 15, 2003
Paul Sheer wrote:
> There is no use of OO that isn't better solvable
> with macros and lexical closures. This is why the Lisp
> guys laugh at OO.
> 
> Now I know how to create a clean interface in C
> without having to use OO. So I don't mind if a language
> has objects so long as I don't *have* to use them.
> 
> But I *do* want more powerful macro support than cpp.

Lisp guys laugh at OO?  What is CLOS then?  I agree with the macros thing, though.  I would like some clean, lightweight mechanism for syntax extension, but it doesn't seem to be a priority.

Evan

January 16, 2003
> Lisp guys laugh at OO?

> What is CLOS then?

it's going backwards

> I agree with the macros thing, though.  I would like some clean, lightweight mechanism for syntax extension,

> but it doesn't seem to be a priority.

do you mean that it *should* be a priority?

-paul


January 16, 2003
Paul Sheer wrote:

>>What is CLOS then? 
> 
> it's going backwards
> 

I've met some pretty hardcore Lisp hackers who would disagree with you. I've also met some who would agree. I don't think it's something you can generalize.

>>I agree with the macros thing, though.  I would like some clean, lightweight mechanism for syntax extension,
>>but it doesn't seem to be a priority.
> 
> do you mean that it *should* be a priority?

Well, I don't think I'm the one who gets to decide that. I would like a feature like that. However, Walter has never weighed in on extensibility from what I've read, so I really don't have any idea.

Evan

January 21, 2003
"Kelvin Lee" <kiyolee@*hongkong*.com> wrote in message news:b02h9m$24h7$1@digitaldaemon.com...
> BTW, there is a conjecture that the number of bugs in your C program is directly proportional to the number of defined macros.

Since #define's are bugs, you are correct by definition <g>.


January 21, 2003
"Paul Sheer" <psheer@icon.co.za> wrote in message news:b032a5$2eu3$1@digitaldaemon.com...
> But I *do* want more powerful macro support than cpp.

I agree that cpp, as a macro text processor, is woefully inadequate. Most of what it does can be better handled by syntactical constructs, and things that would make preprocessing really useful it cannot do.

D should not have a preprocessor built in. For macro processing needs, it would be best to use an external macro processor that is best suited to the problem being solved. There are enough of them around; there is no need to invent a new one.