Thread overview
Concepts, Techniques, and Models of Computer Programming
Jan 20, 2003
Mark Evans
Jan 21, 2003
Mark Evans
Jan 21, 2003
Antti Sykari
Jan 21, 2003
Mark Evans
Jan 21, 2003
Ilya Minkov
Jan 21, 2003
Ilya Minkov
Jan 22, 2003
Mark Evans
Jan 22, 2003
Daniel Yokomiso
Jan 22, 2003
Mark Evans
Jan 22, 2003
Daniel Yokomiso
Jan 22, 2003
Robert Medeiros
Jan 22, 2003
Daniel Yokomiso
Jan 23, 2003
Daniel Yokomiso
Feb 27, 2003
Walter
Feb 27, 2003
Mark Evans
Feb 27, 2003
Walter
Feb 27, 2003
Mike Wynn
Feb 28, 2003
Bill Cox
Feb 28, 2003
Walter
Mar 01, 2003
Antti Sykari
Mar 01, 2003
Walter
Mar 01, 2003
Patrick Down
Mar 01, 2003
Mark Evans
Mar 01, 2003
Walter
Mar 01, 2003
Farmer
Mar 01, 2003
Walter
Mar 03, 2003
Ilya Minkov
Mar 05, 2003
Farmer
Mar 05, 2003
Walter
Mar 05, 2003
Mark Evans
Mar 05, 2003
Bill Cox
Mar 05, 2003
Mark Evans
Mar 05, 2003
Dan Liebgold
Mar 05, 2003
Walter
Mar 05, 2003
Bill Cox
Mar 06, 2003
Daniel Yokomiso
Functional C++ one better
Mar 06, 2003
Mark Evans
Mar 07, 2003
Walter
Mar 07, 2003
Sean L. Palmer
Mar 08, 2003
Daniel Yokomiso
Mar 09, 2003
Sean L. Palmer
Mar 10, 2003
Burton Radons
Mar 06, 2003
Daniel Yokomiso
Expressiveness of a language (Was: Re: Concepts, Techniques, and Models of Computer Programming)
Mar 05, 2003
Antti Sykari
Mar 07, 2003
Farmer
Mar 01, 2003
Antti Sykari
Mar 01, 2003
Sean L. Palmer
Tangible program histories (was Re: Concepts, Techniques, and Models of Computer Programming)
Mar 02, 2003
Antti Sykari
Mar 02, 2003
Bill Cox
Mar 02, 2003
Ilya Minkov
Mar 02, 2003
Sean L. Palmer
Mar 02, 2003
Sean L. Palmer
Re: Concepts etc.; a new macro system?
Mar 02, 2003
Dan Liebgold
Mar 03, 2003
Sean L. Palmer
Mar 03, 2003
Dan Liebgold
Mar 04, 2003
Sean L. Palmer
Jan 24, 2003
Mark Evans
Jan 27, 2003
Robert Medeiros
Feb 27, 2003
Walter
January 20, 2003
This post is not just another language citation.  It's about language fundamentals (although the interesting language Oz serves as a reference).

This new book is important for D.  The D newsgroup torrent of discussion has little regard for fundamentals or cohesion of design from a computational standpoint.  The criteria are vague, e.g. "must feel like C" and "must be easier than C++" and "I'd like this feature."  Not that Walter isn't trying.  The poor cohesion of C++ exacerbates the problem.  C++ itself is a mish-mash, yet serves as the starting point for D.  For that matter, C is a mish-mash -- see previous posts on IMP.  So we have a lot of mish-mash piled up.  A review of language fundamentals may help D more clearly delineate a proper design point and track it carefully.  (Walter, read: "make my life easier.")

This book demonstrates how languages and their "paradigms" boil down to certain fundamentals (the "kernel language").  Adding just one feature to the kernel enables an entirely new programming paradigm (e.g. class of languages).

My own feeling is that D should pay more attention to the functional paradigm which is extremely powerful in a variety of applications.  Languages with functional power (like OCaml and Mathematica) leave poor-cousin imitators like C++ STL in the dust.  STL was in some respects a vain attempt to graft functional programming onto C++.

I'm not sure what to make of Oz just yet, but it culminates years of research along these lines.  The book is down to earth.  Before the D practicality police shoot me down, here are some quotes for you:

"The number of different computation models that are known to be useful is much
smaller than the number of programming languages....The main criterium for
presenting a model is whether it is useful in practice."
"We find that a good programming style requires using programming concepts that
are usually associated with different computation models.  Languages that
implement just one computation model make this difficult."  (MAJOR POINT FOR D
TO CONSIDER.)
"Concurrency can be simple."

I would love to see D's "kernel language" written down.  The kernel language is not a virtual machine, it is a semantic specification.

Enjoy,
Mark

Oz the language
http://www.mozart-oz.org/

PDF book draft:
http://www.info.ucl.ac.be/people/PVR/book.html
"Concepts, Techniques, and Models of Computer Programming"
by
PETER VAN ROY
SEIF HARIDI
(c) 2001-2003

"One approach to study computer programming is to study programming languages. But there are a tremendously large number of languages, so large that it is impractical to study them all. How can we tackle this immensity? We could pick a small number of languages that are representative of different programming paradigms. But this gives little insight into programming as a unified discipline. This book uses another approach.

"We focus on programming concepts and the techniques to use them, not on programming languages. The concepts are organized in terms of computation models. A computation model is a formal system that defines how computations are done. There are many ways to define computation models. Since this book is intended to be practical, it is important that the computation model should be directly useful to the programmer. We will therefore define it in terms of concepts that are important to programmers: data types, operations, and a programming language. The term computation model makes precise the imprecise notion of 'programming paradigm'. The rest of the book talks about computation models and not programming paradigms. Sometimes we will use the phrase programming model. This refers to what the programmer needs: the programming techniques and design principles made possible by the computation model.

"Each computation model has its own set of techniques for programming and reasoning about programs. The number of different computation models that are known to be useful is much smaller than the number of programming languages. This book covers many well-known models as well as some less-known models. The main criterium for presenting a model is whether it is useful in practice. Each computation model is based on a simple core language called its kernel language. The kernel languages are introduced in a progressive way, by adding concepts one by one. This lets us show the deep relationships between the different models. Often, just adding one new concept makes a world of difference in programming. For example, adding destructive assignment (explicit state) to functional programming allows to do [sic] object-oriented programming. When stepping from one model to the next, how do we decide on what concepts to add? We will touch on this question many times in the book. The main criterium is the creative extension principle. Roughly, a new concept is added when programs become complicated for technical reasons unrelated to the problem being solved. Adding a concept to the kernel language can keep programs simple, if the concept is chosen carefully. This is explained in Section 2.1.2 and Appendix E.

"A nice property of the kernel language approach is that it lets us use different models together in the same program. This is usually called multiparadigm programming. It is quite natural, since it means simply to use the right concepts for the problem, independent of what computation model they originate from. Multiparadigm programming is an old idea. For example, the designers of Lisp and Scheme have long advocated a similar view. However, this book applies it in a much broader and deeper way than was previously done."


January 21, 2003
http://www.ps.uni-sb.de/alice/manual/tour.html
http://www.ps.uni-sb.de/Papers/abstracts/Kornstaedt2001.html

This paper showcases an Oz-inspired language, the Alice variant of Standard ML.
A major difference between Alice and Oz is static vs. dynamic typing.  The
following quote is worth pondering in that regard, since D is statically typed
(and properly so given its intent -- but so is Standard ML):
"Its powerful static type system is one of the major pros of ML.  However, there
are programming tasks where it is not possible to  perform all typing
statically. For example, consider exchange of data structures between separate
processes. To accompany such tasks of open programming, Alice complements its
static type system with a controlled form of dynamic typing."

The major point about Oz is not that it's another language with nice features we should borrow.  Oz is merely a reference implementation of the kernel language.  The kernel language is the big deal.  It serves to unify and integrate language design.  So forget about performance and dynamic typing.  The kernel language defines and classifies language semantics, one of the dangling, unresolved issues in the whole D development.

I suspect that much debate on the D newsgroup would evaporate if D were scrutinized along the lines of this book -- the closest thing to a Scientific Method for programming languages that non-mathematicians can understand.

Mark


January 21, 2003
Mark Evans <Mark_member@pathlink.com> writes:

> This new book is important for D.  The D newsgroup torrent of discussion has little regard for fundamentals or cohesion of design from a computational standpoint.  The criteria are vague, e.g. "must feel like C" and "must be easier than C++" and "I'd like this feature."  Not that Walter isn't trying.  The poor cohesion of C++ exacerbates the problem.  C++ itself is a mish-mash, yet serves as the starting point for D.  For that matter, C is a mish-mash -- see previous posts on IMP.  So we have a lot of mish-mash piled up.  A

It's instructive to note that the contemporary form of C++, which is perceived to be a pile of mish-mash, was certainly not intended to become that.  I guess it "just happened" when features were added.

But they were added with good intentions, and are even occasionally useful.  The language might not be the simplest in the world, but when one understands why the features are there and the context in which they entered the language, suddenly C++ doesn't seem all that complex any more.

And the design decisions of C++ have, I trust, been backed up by well-defined principles.  In "The Design and Evolution of C++", Stroustrup lists a set of rules, which are divided into categories of general rules, design support rules, language-technical rules and low-level programming support rules.  I'll quote some of them here, since many of them are relevant to contemporary language design as well.

Among general rules, there were:
- C++'s evolution must be driven by real problems.
- Don't get involved in a sterile quest for perfection.
- C++ must be useful now.

These three rules imply that the purpose of C++ was to become a very practical language, driven by real needs of real people.  As is D, I assume...  I'd imagine that D will eventually get features that it currently cannot even dream about, and that will probably make the language more complex.  Not all useful language features, or even programming paradigms, have been invented yet - and who knows if D will one day support one of them.

Another remarkable aspect of C++ is:
- All features must be affordable.
- What you don't use, you don't pay for.

C++ was designed from the beginning to be as efficient as possible, which is reasonable because back then, in the evil mid-eighties, processor time was limited and memory was scarce.  (Not that it isn't today, at least for the hordes of game programmers who, for some reason, are found in large quantities in this newsgroup. ;)  That constraint was kind of bad, since it caused the lack of garbage collection in C++.  (And of a portable way to get, or even print, a stack trace without a debugger, which would be nice.  Java has this, and I don't know whether it's a performance issue, since you can investigate the stack inside a debugger anyway.)  And it was kind of good, too, since it showed that object-oriented programs can be efficient.

However, D has the same problem as C++ had in the eighties -- C++ also had the following rules, and for good reason:

- Use traditional (dumb) linkers.
- No gratuitous incompatibilities with C.

Without these, C++ would've probably been a much cleaner language, but on the other hand, it might not have existed at all... or at least attracted the masses like it eventually did.  Similarly, D is attempting to be attractive for the masses that already know C++ or Java.  Which is kind of nice, since I like C-style syntax :) (Although I'm of the opinion that parts of C's declaration syntax, such as function pointers, could benefit from redesigning.  As well as certain other parts.)

Finally, my favourite design rules, which I'd like to strive for myself (and which I'd like D - or any language - to develop towards) are:

- Don't try to force people.
- It is more important to allow a useful feature than to prevent every
misuse.

C++ doesn't assume that the programmer is stupid; it does not try to prevent the misuse of pointers, manual memory allocation, silly casts or what-have-you.  D seems to go in the same direction.  Which is, again, nice.  At least if the dangerous features are kept as difficult to misuse as possible (which might be hard).

On the other hand, C++ (and D) allows several different programming styles and, often, many ways to do the same thing.

I had a class today, in a course called "Principles of Programming Languages".  One of the points presented was that a programming language should provide only one way to do a thing -- for example, in C, there are four ways to increment a variable (++x, x++, x = x + 1 and x += 1).  But I would consider that a richness, and languages of a single philosophy, such as Pascal or Eiffel, I'd consider mostly just too restrictive.

Anyway, the fundamental design rules enumerated in chapter 4 of "The Design and Evolution of C++" make good reading for anyone, whether they are C++ or D programmers, language designers or just interested in the topic.

To the point, then:

The design of D seems to have taken the shopping-list approach: take the goodies of C++, Java and Eiffel and put them together; add some widely-used language-defined types (such as strings and dynamic arrays); leave out the preprocessor (plus a couple of other omissions, which can be seen at http://www.digitalmars.com/d/); and voilà.  This seems like a less "scientific" method than the one that produced C++, and -- I might be wrong though -- there seems not to be a well-defined set of rules and principles to guide the design.  Maybe it would help in determining the purpose and nature of D to write down and prioritize some rules that would be applied when new features are requested.

(Whoops, it appears that I got a bit sidetracked.  Yet another off-topic post, then. Back to the topic:)

I didn't read the book mentioned in the subject yet, just browsed through the draft available on the web page and read a couple of pages from the start.  But it seems like a very good book, one that could even become a classic.  (Or then again, it might turn out to be crap and fade into the tombs of history.  But you never know ;)

Oz the language, however, seems like it really doesn't have to care about commercial success, so it can do whatever it wants to.  (Like adopt a non-C-like syntax, for instance. :)

-Antti
January 21, 2003
Bjarne Stroustrup has long lamented the end result of all those good intentions, so I'm afraid even its father would not agree with you about C++.

The kernel language technique is a methodical, scientific way to define and analyze language features which is far more practical than lambda-calculus but still precisely defined -- unlike our endless newsgroup discussions.  The authors show how minor changes to the kernel induce whole new paradigms of programming.  When you analyze these problems you find, in the end, just a few key concepts at the heart of everything.

It's something like a Turing machine equivalency demonstration but at a much higher level that is practical for real-world language design.

Mark


January 21, 2003
Antti Sykari wrote:
> It's instructive to note that the contemporary form of C++, which is
> perceived to be a pile of mish-mash, was certainly not intended to
> become that.  I guess it "just happened" when features were added.
> 
> But they were added with good intentions, and are even occasionally
> useful.  The language might not be the simplest in the world, but when
> one understands why the features are there and the context in which
> they entered the language, suddenly C++ doesn't seem all that complex
> any more.

Well, it's drifting towards Perl, in the sense that C++ is not too hard to program in, but it's a "Tower of Babel":

"... go down, confuse their language, so they will not understand one another."

Genesis 11:7.

It's the same language, it's simply confused. :>
Very confused. :>

> 
> And the design decisions of C++ have, I trust, been backed up by
> well-defined principles.  In "The Design and Evolution of C++",
> Stroustrup lists a set of rules, which are divided into categories of
> general rules, design support rules, language-technical rules and
> low-level programming support rules.  I'll quote some of them here,
> since many of them are relevant to contemporary language design as
> well.

It's not only him. These "committees" can spoil anything: I want this, I want that. There every voice counts, but here it is a fair dictatorship of intellect :> ... oh, I meant monarchy!

> These three rules imply that the purpose of C++ was to become a very
> practical language, driven by real needs of real people.  

Which are (over-)represented by the committee :>

> 
> However, D has the same problem as C++ had in the eighties -- C++ had
> also the following rules, and then for good reason:
> 
> - Use traditional (dumb) linkers.
> - No gratuitous incompatibilities with C.
> 
> Without these, C++ would've probably been a much cleaner language, but
> on the other hand, it might not have existed at all... or at least
> attracted the masses like it eventually did.  Similarly, D is
> attempting to be attractive for the masses that already know C++ or
> Java.  Which is kind of nice, since I like C-style syntax :) (Although
> I'm of the opinion that parts of C's declaration syntax, such as
> function pointers, could benefit from redesigning.  As well as certain
> other parts.)

Hey, there are tons of wonderful languages... Java is almost the worst of the best... I mean of the modern ones. Where are they? In universities? I can't get my friends to "help me out in OCaml", or "Sather", or anything like that, but in C, C++, D - no problem.

> Finally, my favourite design rules, which I'd like to strive for
> myself (and which I'd like D - or any language - to develop towards)
> are:
> 
> - Don't try to force people.

It is worse when people force something they don't 100% understand :> which wouldn't happen here.

> - It is more important to allow a useful feature than to prevent every
> misuse.

Hm... Right, most academics are *too* restrictive. But being not restrictive at all inadvertently results in a mess.


> On the other hand, C++ (and D) allows several different programming
> styles and, often, many ways to do the same thing.

Why can I read other people's Delphi code as if it were my own? A good example set by the library? The language is not really restrictive... not the way Eiffel is.

But true, it does have a bit less flexibility than C. Then again, C is still not too hard to read, because most of the different ways to do a thing stay within a very local scope. C++ goes global.

I guess the D compiler should restrict most things that "look like bugs" in C code with warnings, and provide ways to shut warnings off, one by one. A compiler should report only 1 error, or up to N warnings (N < 5), then stop compilation, to support a fast, pinpoint-correct debugging style. I'll make a test suite and a list with explanations later.

> Anyway, the fundamental design rules enumerated in the chapter 4 of
> "The Design and Evolution of C++" make a good reading for anyone,
> whether they are C++ or D programmers, language designers or just
> interested in the topic.
> 
> To the point, then:
> 
> The design of D seems to be have taken the shopping-list approach:
> take the goodies of C++, Java and Eiffel and put them together; add
> some widely-used language-defined types (such as string and dynamic
> arrays), leave out the preprocessor, (a couple of other omissions,
> which can be seen at http://www.digitalmars.com/d/) and voilà.  This
> seems like a less "scientific" method than the one that produced C++,
> and -- I might be wrong though -- there seems not to be a well-defined
> set of rules and principles to guide the design.  Maybe it would help
> in determining the purpose and nature of D to write down and
> prioritize some rules that would be applied when new features are
> requested.

It is drifting in the right direction. In the Modula-3 direction, collecting only features which fit well. To an extent, it seems to have the same basic ideas as Modula-3, which is much *less* of a mess than C++. Although no one has formulated any loud rules, good only to be violated. :> Except that an OS had to be written with it, robust, fast and efficient, with all applications, in the shortest possible time.

> (Whoops, it appears that I got a bit sidetracked.  Yet another
> off-topic post, then. Back to the topic:)
> 
> I didn't read the book mentioned in the subject yet, just browsed
> through the draft available on the web page and read a couple of pages
> from the start.  But it seems like a very good book, one that could
> even become a classic.  (Or then again, it might turn out to be crap
> and fade into the tombs of history.  But you never know ;)
> 
> Oz the language, however, seems like it really doesn't have to care
> about commercial success, so it can do whatever it wants to.  (Like
> adopt a non-C-like syntax, for instance. :)
> 
> -Antti


-i.

January 21, 2003
Ilya Minkov wrote:
> Well, it's drifting towards Perl, in the sense that C++ is not too hard to programme in, but it's a "tower of babel":
> 
> "... go down, confuse their language, so they will not understand one another."
> 
> Genesis 11:7.
> 
> It's the same language, it's simply confused. :>
> Very confused. :>
> 

I think *someone over there* wanted to punish the silly people trying to do the impossible things in C++ :>
Climb into the sky, attach a GC and contracts to C++... it's a striking similarity...

I don't believe in *him*, but I'm literate enough to respect what he does :>

-i.

January 22, 2003
> collecting only features which fit well.

The point of the Oz book is to deal with 'features' at the level of the kernel language.  Then by definition, the new capability fits well into the design.

>- What you don't use, you don't pay for.

Right, and the book shows the minimum kernel language required to support various high-level paradigms.

>C++ was designed from the beginning to be as efficient as possible,

Efficiency is irrelevant to the kernel language specification.  It's just a way to define language semantics, like a Turing machine.  Nobody complains that Turing machines are inefficient, but in fact every conceivable computer program (in any language) can be stated as a Turing machine.  The kernel language serves a similar purpose at a higher level which is more appropriate for language designers.

>On the other hand, C++ (and D) allows several different programming styles and, often, many ways to do the same thing.

Functional and logic programming is impossible in D and C++.  Oz covers all paradigms including these, in a clean, coherent design.  This language demonstrates that with proper kernel language considerations, one can support all known programming styles -- without the design-by-committee ugliness of C++.

>- C++'s evolution must be driven by real problems.
>- Don't get involved in a sterile quest for perfection.
>- C++ must be useful now.

Oh please.  I posted comments about the practicality police anticipating this kind of shootout.  You missed.  Please go back and read my police remarks, then read the book before wasting more ammo.  You might find that you're shooting at the good guys.

If it makes you feel better I think a lot of computer science is academic too, but not all of it fits that description, and you should be open-minded enough to consider such a possibility.

Mark


January 22, 2003
"Mark Evans" <Mark_member@pathlink.com> wrote in message news:b0hpdj$1kib$1@digitaldaemon.com...
> This post is not just another language citation.  It's about language fundamentals (although the interesting language Oz serves as a reference).
>

[snip]

It's a very nice book. I read it when they released it for public review some time ago. Their approach of using kernel language features to analyse language expressiveness is very good, but their work has more theory than practice. It's like using lambda calculus: it's more expressive than a Turing machine, but it's impractical for everyday use. Oz is a well designed language, but it lacks practical use. IMO language designers will use the ideas from this book in their languages, discarding some, and some years from now a practical language with a powerful set of kernel features will emerge. Pretty much like C (if compared to a plain Turing machine), Haskell (if compared to pure lambda calculus) or ??? (if compared to Lisp) ;-)




January 22, 2003
>Their approach of using kernel language features to analyse
>language expressiveness is very good, but their work has more theory than
>practice.

It's a 1,000 page book with the minimum theory required to perform the analysis and presented in a practical style.  The idea here is to provide some unifying perspective on language design to help Walter pick a sweet spot based on sound design principles.

I reject vague assertions of impracticality.  The next person who makes that claim had better adduce some evidence.  Oz is a research demonstration language and was intended as such, but the results are general.

> IMO language designers will use the
> ideas from this book in their languages, discarding some

Yes, and that's exactly the point.  The book identifies the fundamental choices that can be made, and also how to make them.  Glad you agree it's a very nice book with a very good approach.

>It's like using lambda calculus

Not hardly!  The authors state right up front that such techniques don't help real-world language designers and programmers.

Mark


January 22, 2003
Hi,

    Comments embedded.

"Mark Evans" <Mark_member@pathlink.com> wrote in message news:b0msu1$1io1$1@digitaldaemon.com...
> >Their approach of using kernel language features to analyse
> >language expressiveness is very good, but their work has more theory than
> >practice.
>
> It's a 1,000 page book with the minimum theory required to perform the
> analysis and presented in a practical style.  The idea here is to provide
> some unifying perspective on language design to help Walter pick a sweet
> spot based on sound design principles.
>
> I reject vague assertions of impracticality.  The next person who makes
> that claim had better adduce some evidence.  Oz is a research demonstration
> language and was intended as such, but the results are general.


    If you re-read what I said you'll see that I never used the word
"impracticality" when talking about Oz. I used it when talking about lambda
calculus in everyday work. Pure lambda calculus looks wonderful, but it's
not practical to use. Oz looks wonderful, but how many libraries were
developed using it by people who weren't involved with the language
design? Compare it to Java. Java has lots of flaws, but is somewhat
consistent (I don't claim full consistency, just some).
    I like studying and researching new ideas, but when people start to
make statements about generality or expressiveness I start reading them
with care. It's like saying that Design by Contract improves software
reliability, instead of saying that it promotes good design of method and
class contracts. Some features look very good (e.g. multi-methods,
Sather-like iterators, generics, tuples, etc.) but some of them will bite
you back sometime. Whenever I add a new feature to my language, I test it
by coding some basic classes, like collections, regex, numerics, report
generation, xml, gui, etc., and checking how the design changed. Sometimes
I discover that something looked very nice but became ugly with usage. I'm
sure that three minutes before I release the first alpha compiler people
will send me several posts complaining about unforeseen side-effects of
some features, be it syntax or semantics. Walter feels this every day, as
people here post problems with almost anything he puts in D.
    Oz is a very good language, but IMHO it still needs more usage before
it becomes evident what is good and what should be changed, like any other
language.


>
> > IMO language designers will use the
> > ideas from this book in their languages, discarding some
>
> Yes, and that's exactly the point.  The book identifies the fundamental
> choices that can be made, and also how to make them.  Glad you agree it's
> a very nice book with a very good approach.
>
> >It's like using lambda calculus
>
> Not hardly!  The authors state right up front that such techniques don't
> help real-world language designers and programmers.
>
> Mark

    Well, Erlang and Common Lisp are real-world languages that make heavy
use of lambda calculus ;-) I'm just saying that some features may look nice
(e.g. concurrency/parallelism) but can be very tricky if you don't add the
correct amount of sugar. There are a lot of good computer scientists out
there trying to define the "correct" semantics for concurrency, and yet
there's no prevailing set of concurrency primitives.

    Best regards,
    Daniel Yokomiso.

"If you want to be happy be."



