August 17, 2001
Tobias Weingartner wrote:
> In article <9lgki2$24sf$1@digitaldaemon.com>, Johnny Lai wrote:
> 
>>Well I'm disappointed that people seem to feel that the preprocessor
>>and templates are not features that are worth having...
>>
> 
> Ok, these are reasons to look at.  Maybe not valid, but they are
> there...
> ...

Well, first, not all preprocessing is at compile time.

In this light, most IDEs and language-aware editors use a form of preprocessing.  Not as extensive as if you could say, e.g., "Take all the green words, and move the first letter to the back", but then how often would one want to do that?  Of course, it could be quite nice to, e.g., locate all references to green words matching the regexp ..., or that kind of thing.  IDEs sometimes provide this kind of facility.

Or consider JavaDoc, or Eiffel's short command.  These are a sort of preprocessor.  The Eiffel short command will even follow up the chains of inheritance while constructing a list of all routines/variables visible to clients of a class.  It's a kind of preprocessing.

What about anything that builds and maintains a project file?  Some of them merely store meta-information, but some of them actively analyze the code to extract the information.

Now let's consider compile time preprocessing:
Some of the work that this could do is better done by generics.
Some of the work that this could do is better done by const variables.
Some of the work that this could do is better done by inlining of functions.
The only remaining task that I can think of would be better done by an alias construct.  It's true that the only form that comes to my mind right now is a preprocessor-ish form:
Alias(t_right, "tree->node.right")
That sort of thing.  I don't want the second argument to be a string, though.  That's just the way it appeared in my thoughts (of how to do it with a preprocessor).  And I'm not sure that it's a desirable thing for a preprocessor to do.  (I just escaped maintaining some C code that was full of the equivalent done with the C preprocessor... this needs careful thought.  That was pretty unreadable code.)
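For concreteness, here is a minimal C++ sketch of the substitutions listed above (the macro and function names are illustrative only, not from the original post):

    // 1. Constant macro -> const variable
    // #define BUFFER_SIZE 1024
    const int BUFFER_SIZE = 1024;

    // 2. Function-like macro -> inline function (type-checked, and the
    //    argument is evaluated exactly once)
    // #define SQUARE(x) ((x) * (x))
    inline int square(int x) { return x * x; }

    // 3. "Generic" macro -> template
    // #define MAX(a, b) ((a) > (b) ? (a) : (b))
    template <typename T>
    T maxOf(T a, T b) { return a > b ? a : b; }

    // 4. Abbreviation macro -> local reference (the alias-like case above)
    // #define t_right (tree->node.right)
    struct Node { Node* left; Node* right; };
    struct Tree { Node node; };

    void rotate(Tree* tree) {
        Node*& t_right = tree->node.right;  // an alias, but still type-checked
        t_right = t_right ? t_right->left : 0;
    }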

What are the other reasonable uses of a preprocessor?

August 18, 2001
> I personally miss the preprocessor when I use Java.

Agreed, but only because of Java's limitations (no templates, etc.), and not because I *want* to use macros.

> And templates! Yes they are complex stuff and I wish they were easier but...
> they allow you to do things that you would otherwise be unable to do, or have to pay dearly for.

Agreed 100%!  I would never consider writing a large-scale program in a language without template-style functionality.  It's really astounding how much simplification and generality one can attain with templates -- not just talking about the canonical (and IMO poor) example of "container classes", but for complex related type hierarchies, parser tools, math structures, etc.

The stigma of templates being useful just for container classes and a few other isolated cases (which a language could well implement as a special feature) is really unfortunate.  I suppose that's because container classes are the next obvious step for a C programmer.  But if you spend a few weeks experimenting with languages like Haskell (not useful for practical programming, but a GREAT thing to learn), you come back to C++ templates with a completely new perspective and start writing code in a much more compact and general way!

(Here I'm advocating template-style functionality -- I actually think the C++ syntax for templates is pretty kludgy, but that's a minor issue in the grand scheme of things.)
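As one hypothetical illustration of that generality beyond containers: a single exponentiation-by-squaring routine can serve ints, doubles, matrices, polynomials, modular arithmetic -- anything supplying a multiplication and an identity element.

    // One generic routine usable for any T with operator* (illustrative example).
    template <typename T>
    T power(T base, unsigned n, T identity) {
        T result = identity;
        while (n > 0) {
            if (n & 1) result = result * base;
            base = base * base;
            n >>= 1;
        }
        return result;
    }

    int main() {
        int    a = power(2, 16, 1);        // 65536
        double b = power(1.01, 365, 1.0);  // compound growth over a year
        return (a > 0 && b > 0) ? 0 : 1;
    }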

> Unfortunately, right now in C++, it is difficult and unintuitive to do, but
> it is possible (there's a nice book that discusses some of these things, and
> you'll be amazed at what they can get templates to do - Generative Programming by Krzysztof Czarnecki, and Ulrich W. Eisenecker).

Agreed 101%!

-Tim


August 18, 2001
>
> I nearly jumped for joy upon seeing D's "debug" attribute. At the risk of getting stomped on for proposing a new keyword, how about adding a "config" attribute, with an identifier:
>
>    config (MacOSX)
>    {
>       // do OS X specific stuff
>    }
>    config (Win32)
>    {
>       // do Win32 specific stuff
>    }
>    config (PalmOS)
>    {
>       // do PalmOS specific stuff
>    }
>
> again, with compiler switches to pass in the identifier?

I have nothing incredibly useful to say, but I REALLY like that idea.  I was already convinced that with garbage collection, single inheritance + interfaces, and the debug keyword, D is my new language of choice if/when it becomes a reality.

Toss in the config keyword, and my Win32 vs Linux woes would be greatly reduced WITHOUT having to use the messy preprocessor, and I would throw Walter a party for the accomplishment personally  :)
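For contrast, a sketch of the preprocessor ladder such a construct would replace (_WIN32, __linux__ and __APPLE__ are the usual compiler-predefined macros; exact spellings vary by toolchain):

    #if defined(_WIN32)
        // do Win32 specific stuff
    #elif defined(__APPLE__)
        // do Mac OS X specific stuff
    #elif defined(__linux__)
        // do Linux specific stuff
    #else
    #   error "unsupported platform"
    #endif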




August 20, 2001
Tobias Weingartner wrote:
> 
> In article <9lgki2$24sf$1@digitaldaemon.com>, Johnny Lai wrote:
> > I personally miss the preprocessor when I use Java. Otherwise how do we do code tuned for multiple platforms in 1 source? Or to include debugging code and removing at will (not just asserts)? Sometimes it's nice to have an extra level of indirection ^_^.. maybe what we need instead is a preprocessor language that can be applied to any/many language(s) when needed.
> 
> So much in such a small paragraph.  I'll try to address these all in sequence.
> 
> First, using a preprocessor to tune/port source to multiple platforms is a *VERY* bad idea.  As someone who has worked at porting various things to many different platforms, I can attest to that.  (10+ platforms, 2 huge products, 5.5M+ lines of source)

	If the preprocessor is not the way, what is?  I'm not saying we need a
text macro preprocessor, but we need something, I would assume.  Where I work
we have moved a lot of code to java and we now have to maintain a tree
of common code and a tree of platform specific code for each platform.
Because of this we can't use straight make either.  Java does not
provide the platform independence it claims to and I don't have high
hopes for a natively compiled language.
	I must admit that I despise a number of the problems that the
preprocessor causes, but conditional compilation for platforms should
really be addressed if the language is to be seriously used.
Conditional imports would be the absolute minimum, allowing you to hide
platform specifics in a module.  This could be overkill for issues like
filesystem directory separators and the like.
	Anyhow, you say the preprocessor is bad for multi-platform support.  It
sounds like you have experience.  What would be better?  Platform
independence is too idealistic.  I won't buy that D can deliver it and
still be generally useful.

Dan
August 20, 2001
	I admit that I don't care for the problems that came with the
preprocessor, but it had its pluses.  Rather than just blast D (since
it looks 'cool' and all) I was curious how I would address conditional
compilation.  I'll consider two cases and I'm curious how they would be
addressed in D.

	First would be a small project.  Say I want to write a wizzle decoder.
It's a basic project that takes files containing wizzle-encoded data,
de-wizzles it, and presents it to the user.  For the most part the
de-wizzling is number crunching, but there will inevitably be issues of
dealing with basic system interfaces to get data (files) and present it
(files, a window, audio, I dunno).  In C/C++ I would write a file for
each platform with the code for getting data and presenting it, and use
#ifdefs to select which files I want to #include.  The makefile would
handle compiling the right code.
	How would I do this in D?  Unless I put each implementation in a
separate directory, I would have to give each file a different name, and
hence a different module name.  I would then have to do tricks to point
at these different directories, and the organizational overhead just
seems a bit steep for de-wizzling. ;-)  This also excludes ports to
platforms that don't have hierarchical file systems.  (Loth the
mainframe.)  This also doesn't address the rare few programs that are so
small (even with an #ifdef or two) that they would otherwise fit nicely
into one file.  I guess that is a peeve I have with java.  It tends to
require more files than I feel should be necessary for some smallish
projects.
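
A sketch of the C/C++ arrangement described above (file and macro names are hypothetical):

    // get_data.cpp -- picks the platform-specific input code at compile time.
    #if defined(_WIN32)
    #   include "get_data_win32.cpp"   // Win32 file I/O
    #elif defined(__unix__)
    #   include "get_data_unix.cpp"    // POSIX file I/O
    #else
    #   error "no de-wizzling input backend for this platform"
    #endif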

	Second, say I'm making an OS kernel.  I've decided that the world is
ready for Danux.  Now here, having a directory for each platform is
pretty justified even without D's requirements.  My question here has to
do with configuration management.  If you've compiled linux you know
that there are a lot of compile time options.  Let's face it, Danux is
going to have to be just as flexible if not more so.  How would I go
about supporting the ability to pass compile time configuration options
into D to tell it what I want compiled and how do I act upon that
information?
	The first thing that comes to mind is to abuse the debug(identifier)
syntax.  I don't know if it would work though to do imports that way.
Let's face it though, that is abuse.  I would need to tell the program
which OS modules/drivers/functionality is being supported for this
build.  I would also need to tell it option settings, like whether a driver
is being built statically or is run-time loadable, and what the
max/default sizes for given properties should be.  Short of writing a program
that will in turn write my program with the correct options (that would
be a preprocessor, wouldn't it?)  I don't really know how I would handle
this in D.  Let's face it.  We are not just de-wizzling data here.  We
need power.

Dan
August 20, 2001
The Aug 2001 issue of CUJ (I think... the Aug part is pukka) has an article that discusses eliminating common uses of the preprocessor for such tasks.  (A C++-related article.)

He says that (not exact quote, the mag is not in front of me): "using #ifdef to write multiplatform code does not create platform independent code, but code that is dependent on multiple platforms!"

Taking his hints, I have found that keeping the common, platform-independent interface in the .hpp file and having multiple .cpp files, one for each platform, does simplify the code clutter.  The appropriate files are to be chosen in the makefile during the platform-specific build.  The downside is that the number of files increases, but I guess the clear distinction between the 'interface' and the 'implementation' compensates for it.
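A minimal sketch of that layout, with hypothetical names (the makefile fragment is shown as a comment):

    // display.hpp -- the common, platform-independent interface.
    #ifndef DISPLAY_HPP
    #define DISPLAY_HPP
    void show_frame(const unsigned char* pixels, int width, int height);
    #endif

    // display_win32.cpp and display_x11.cpp each implement show_frame()
    // for one platform.  The makefile chooses exactly one of them during
    // the platform-specific build, e.g.:
    //
    //   display.o: display_$(PLATFORM).cpp display.hpp
    //           $(CXX) -c -o $@ $<
    //
    // Neither implementation file needs a single #ifdef.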


a <hursh@infonet.isl.net> wrote in message news:3B806CC6.FE5992@infonet.isl.net...
> Tobias Weingartner wrote:
> >
> > In article <9lgki2$24sf$1@digitaldaemon.com>, Johnny Lai wrote:
> > > I personally miss the preprocessor when I use Java. Otherwise how do we do code tuned for multiple platforms in 1 source? Or to include debugging code and removing at will (not just asserts)? Sometimes it's nice to have an extra level of indirection ^_^.. maybe what we need instead is a preprocessor language that can be applied to any/many language(s) when needed.
> >
> > So much in such a small paragraph.  I'll try to address these all in sequence.
> >
> > First, using a preprocessor to tune/port source to multiple platforms is a *VERY* bad idea.  As someone who has worked at porting various things to many different platforms, I can attest to that.  (10+ platforms, 2 huge products, 5.5M+ lines of source)
>
> If the preprocessor is not the way, what is?  I'm not saying we need a
> text macro preprocessor, but we need something, I would assume.  Where I work
> we have moved a lot of code to java and we now have to maintain a tree
> of common code and a tree of platform specific code for each platform.
> Because of this we can't use straight make either.  Java does not
> provide the platform independence it claims to and I don't have high
> hopes for a natively compiled language.
> I must admit that I despise a number of the problems that the
> preprocessor causes, but conditional compilation for platforms should
> really be addressed if the language is to be seriously used.
> Conditional imports would be the absolute minimum, allowing you to hide
> platform specifics in a module.  This could be overkill for issues like
> filesystem directory separators and the like.
> Anyhow, you say the preprocessor is bad for multi-platform support.  It
> sounds like you have experience.  What would be better?  Platform
> independence is too idealistic.  I won't buy that D can deliver it and
> still be generally useful.
>
> Dan


August 20, 2001
In article <3B806CC6.FE5992@infonet.isl.net>, a wrote:
> Tobias Weingartner wrote:
> > 
> > First, using a preprocessor to tune/port source to multiple platforms is a *VERY* bad idea.  As someone who has worked at porting various things to many different platforms, I can attest to that.  (10+ platforms, 2 huge products, 5.5M+ lines of source)
> 
> 	If the preprocessor is not the way, what is?  I'm not saying we need a
> text macro preprocessor, but we need something, I would assume.  Where I work
> we have moved a lot of code to java and we now have to maintain a tree
> of common code and a tree of platform specific code for each platform.
> Because of this we can't use straight make either.  Java does not
> provide the platform independence it claims to and I don't have high
> hopes for a natively compiled language.

Separate files for each platform (or API/feature) is the way to go.  It helps in keeping the spaghetti code away as more platforms are added.  It also helps in coalescing support for platforms as bugs, APIs and features migrate into a cohesive mass.


> 	I must admit that I despise a number of the problems that the
> preprocessor causes, but conditional compilation for platforms should
> really be addressed if the language is to be seriously used.

Certainly.  However, I believe strongly that it should be discouraged.  How many times do you see '#ifdef __linux', when they really mean to say '#ifdef SVR4'?  Most people don't know which standards define which API; giving them "platform"-dependent compilation usually means they make the wrong choice...


> Conditional imports would be the absolute minimum, allowing you to hide platform specifics in a module.  This could be overkill for issues like filesystem directory separators and the like.

Conditional imports are not necessary.  Build libraries to abstract system and API differences.  Import libraries, and use their common API.
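
For instance, a hypothetical sketch of such a library: the application only ever calls make_directory(), and the platform difference lives in whichever implementation file was built into the library.

    // pathutil.hpp -- the common API every application uses.
    #ifndef PATHUTIL_HPP
    #define PATHUTIL_HPP
    #include <string>
    bool make_directory(const std::string& path);
    #endif

    // pathutil_posix.cpp -- one of the per-platform implementations
    // compiled into the "pathutil" library (a Win32 build would compile
    // pathutil_win32.cpp instead).
    #include <sys/stat.h>
    #include <sys/types.h>
    #include "pathutil.hpp"

    bool make_directory(const std::string& path) {
        return ::mkdir(path.c_str(), 0755) == 0;
    }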


> 	Anyhow, you say the preprocessor is bad for multi-platform support.  It
> sounds like you have experience.  What would be better?  Platform
> independence is too idealistic.  I won't buy that D can deliver it and
> still be generally useful.

Not using the preprocessor for multi-platform work.  It sounds hard, and it can be.  It sounds like overkill, and it can be.  But really, once you get beyond the 3-4 machine level, a little foresight, and forbearance of the overkill things, will make life a *lot* easier...


--Toby.
August 20, 2001
In article <3B807834.59635C0C@infonet.isl.net>, a wrote:
> 
> 	First would be a small project.  Say I want to write a wizzle decoder.
> It's a basic project that takes files containing wizzle-encoded data,
> de-wizzles it, and presents it to the user.  For the most part the
> de-wizzling is number crunching, but there will inevitably be issues of
> dealing with basic system interfaces to get data (files) and present it
> (files, a window, audio, I dunno).  In C/C++ I would write a file for
> each platform with the code for getting data and presenting it, and use
> #ifdefs to select which files I want to #include.  The makefile would
> handle compiling the right code.
> 	How would I do this in D?  Unless I put each implementation in a
> separate directory, I would have to give each file a different name, and
> hence a different module name.  I would then have to do tricks to point
> at these different directories, and the organizational overhead just
> seems a bit steep for de-wizzling. ;-)  This also excludes ports to
> platforms that don't have hierarchical file systems.  (Loth the
> mainframe.)  This also doesn't address the rare few programs that are so
> small (even with an #ifdef or two) that they would otherwise fit nicely
> into one file.  I guess that is a peeve I have with java.  It tends to
> require more files than I feel should be necessary for some smallish
> projects.

Ouch, why would you tie the module name and the import name together?  Just because some implementation of displaying data is in linux-display.d and another is in openbsd-display.d, you should still be able to use "import display" in the main program to get whichever was compiled.  Think "libraries": linux-display.d would be compiled and put in a library called "display" (traditional Unix would use libdisplay.a for a static library).


> 	Second, say I'm making an OS kernel.  I've decided that the world is
> ready for Danux.  Now here, having a directory for each platform is
> pretty justified even without D's requirements.  My question here has to
> do with configuration management.  If you've compiled linux you know
> that there are a lot of compile time options.  Let's face it, Danux is
> going to have to be just as flexible if not more so.  How would I go
> about supporting the ability to pass compile time configuration options
> into D to tell it what I want compiled and how do I act upon that
> information?

Great.  There are a number of good configuration tools out there.  I will agree that conditional compilation can make things easier in these cases. However, if properly constructed, a simple "if" statement can be as effective in this case.  Since it evaluates to "if(0)", and the compiler removes that part of the code, it is just as if conditional compilation happened.
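
A quick sketch of the if(0) idea, assuming the constant comes from a generated configuration header or a compiler -D switch (names hypothetical):

    // Supplied by the build configuration in a real project.
    const bool ENABLE_SCSI_DRIVER = false;

    void init_drivers() {
        if (ENABLE_SCSI_DRIVER) {
            // With the constant false, any reasonable compiler removes this
            // branch entirely -- the same effect as #ifdef, except the code
            // inside is still parsed and type-checked on every platform.
            // init_scsi_controller();   // hypothetical driver hook
        }
    }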


> 	The first thing that comes to mind is to abuse the debug(identifier)
> syntax.  I don't know if it would work though to do imports that way.
> Let's face it though, that is abuse.  I would need to tell the program
> which OS modules/drivers/functionality is being supported for this
> build.  I would also need to tell it option settings, like whether a driver
> is being built statically or is run-time loadable, and what the
> max/default sizes for given properties should be.  Short of writing a program
> that will in turn write my program with the correct options (that would
> be a preprocessor, wouldn't it?)  I don't really know how I would handle
> this in D.  Let's face it.  We are not just de-wizzling data here.  We
> need power.

Write more generic code.  Seriously, one of my main problems with open code today is that it is targeted at Linux, instead of the broader Unix community out there.  It used to be that Windows programs/utilities were the hardest to port to a "standard" Unix environment (Solaris, *BSD, OSF, etc.).  Today that has largely been replaced by the huge amount of Linux-centric code out there.  Code that depends on certain behaviour of the Linux API, rather than following the written standard that exists.

--Toby.
August 20, 2001
It just seems to me that all the usual platform-specific bs can go into the D standard library--that is, threads and semaphores, sockets, maybe graphics, basic filesystem functions, and then the basic types that seem to depend on word size, etc.  Byte order also needs to be addressed in some instances; maybe have a library call for NetworkByteOrder(...).  (BTW, if D is the big success we hope it is, and it spreads to some old-school and some brand-spankin' new systems, will 'int', etc. be changed to mean a 16- or 64-bit integer??)
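
As a rough sketch of the kind of byte-order helper being suggested (the wrapper names are hypothetical, mirroring the NetworkByteOrder(...) idea above; htonl/ntohl are the standard socket-library functions):

    #include <cstdint>
    #include <arpa/inet.h>  // htonl/ntohl on POSIX; Winsock provides the same names

    // Thin, hypothetical library wrappers around the standard conversions.
    inline std::uint32_t to_network_byte_order(std::uint32_t host_value) {
        return htonl(host_value);
    }

    inline std::uint32_t to_host_byte_order(std::uint32_t network_value) {
        return ntohl(network_value);
    }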

My question is this:  Is there anything in the typical platform murk that can't be handled by an extensive base library?

Another 2 cents,
Brent


August 20, 2001
Brent Schartung wrote:

> My question is this:  Is there anything in the typical platform murk that can't be handled by an extensive base library?

For the stuff you mentioned, the technical, under-the-covers stuff of computing, I would guess that a standard library would work well.  Such a thing (particularly if it covers threading and synchronization) would be a Good Idea.

The trouble, though, comes when you are dealing with the more mundane (but perhaps more important) part of the program - the structure of its user interface.  How do you write a class library that works equally well with graphical and text-based user interfaces?  If you just deal with graphical ones, do you assume that the window is local (like Mac and Windows) or remote (like X-Windows)?  Do you choose a synchronous event model, which will *trash* performance for people connected via a modem or the Internet, or do you choose an asynchronous event model, which will make your program much more complex?  Do you recognize and use graphics accelerators, or ignore them?

I'm not saying that it can't be done - in fact, I think that it *should* be done.  But when you get to that level, things are no longer trivial.