March 10, 2006
Georg Wrede wrote:
> 
> If we, with D, can maintain backwards compatibility (no more than what Walter -- and all of us -- currently aim for), then such would not be a problem here either. We could skip the "Major version every 2 years, bug fixes in between" mantra, and make D more into a pipeline -- like a continuous process. (We're on the Net: no need to plan the manual books and boxes, and the ad campaign 18 months in advance.)
> 
> This sounds daring, I admit, but we've seen it done. And if we do it on purpose (vs. ending up there), then we can do it in style and elegance. This is exactly what we have the /deprecated/ keyword for. If we only made major steps, who'd care about that -- there'd be so much else to plan ahead when switching to the new version at the sweat shop. (And most of the existing software would stay on the old version anyway.)

Knowing that all post-1.0 compilers are _guaranteed_ to be available on the net _indefinitely_ is one major factor for today's code shops when they decide on their next language.
March 10, 2006
Hasan Aljudy wrote:
>> 
>> OOP _is_ a set of tools. Nothing more, nothing less.
> 
> no my friend, it's a lot more!
> oop is a way of thinking (in terms of coding, of course). It's a
> different way of doing analysis and design.
> Object oriented analysis and design produce completely different results
> from procedural analysis and design.
> 
> You can do procedural analysis and design but write code that uses classes and employs inheritance, but your code will still be procedural, because when you thought about the problem you did so procedurally, and the code will inevitably reflect that.

OOP is a toolbox in a programming toolshed (and inheritance is a tool in
that particular toolbox). Just to make this great analogy complete ;)

Nothing is as simple as you try to put it. When analyzing/designing, you should create structures, objects and functions so that they reflect the topology of the problem at hand. It IS wrong to decide which tool to use before you know the problem. But of course one knows how to use some tools better than others, and this will be reflected in the way one works.

> 
>>>OOP is not a set of tools, it's a set of principles. Inheritance is not there just for the sake of it, it serves a very good purpose: increasing cohesion by pulling common/shared code in a super class.
>> 
>> 
>> Yes, and by this you get tight coupling and reduced reusability and extendability.
> 
> If that happens, it generally means you don't know what you're doing; you're using inheritance the wrong way.

One can be lucky, but usually strong cohesion reduces flexibility. Using inheritance in all cases is the wrong way; use it where it actually helps toward a solution!

> 
>> Principles just get in the way for practicality. Using
>> inheritance just for the sake of it, or because of principles, is nothing
>> even close to practical.
> 
> You don't use inheritance because it's a principle (that's the same as
> using just for its own sake).

Yes, which is what I said.

> You use inheritance when you find that you need to. i.e. when you find yourself writing too much duplicate code.

Yes, I'm glad you agree. And usually you need very little inheritance, even though you might program everything using objects, whether it is to encapsulate state, functionality, or resources.
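
To give a rough (made-up) D sketch of what I mean -- state is encapsulated in objects and reused by composition, with no class hierarchy in sight; all the names here are invented for illustration:

    import std.stdio;

    // A small component that only encapsulates some state.
    class Counter
    {
        private int count;
        void bump() { count++; }
        int total() { return count; }
    }

    // Reuse it by composition instead of inheriting from it.
    class RequestHandler
    {
        private Counter hits;
        this() { hits = new Counter; }
        void handle() { hits.bump(); /* ...the actual work... */ }
        int handled() { return hits.total(); }
    }

    void main()
    {
        auto h = new RequestHandler;
        h.handle();
        writefln(h.handled()); // prints 1
    }

Everything here is still "objects", but the only relationship between the classes is "has a", not "is a".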

>> But then my programming principles encompass pragmatism :)
> 
> Funny you should say so.
> 
> Is skipping the analysis and design phases (i.e. diving right into
> writing code) pragmatic or not?

Of course not, I've never said that. In your previous posts you have preached the view that we should use OOP (and with it inheritance), and that there never is a need to use anything else. I am just politely disagreeing with you.

> You can say it's pragmatic .. if you're too lazy to do it! However, you'll actually get the most out of your time if you do a proper analysis and design first.

You might get an even better result if you analyze, design and implement (and refactor) continuously, to get a better understanding of the system and the problems at hand; some OOP ways of doing things (especially inheritance) make this difficult. The waterfall method is a software engineering fossil.

March 10, 2006
David Medlock wrote:
> 
> Outside of simple Shape examples in textbooks, implementation inheritance has largely been a failure for reusable or easy-to-maintain code.  OOP is supposed to be about black-box reusability of objects, i.e. components, but then its main claim to fame is a way to break that abstraction.

Inheritance isn't even much of a problem so long as the trees remain very shallow, but things tend to fall apart beyond that.  I think it says something about the deep inheritance model that the design work is usually done via an external tool such as UML.  Also, I believe that it should be possible to figure out what's going on through code inspection, and complex object hierarchies make this extremely difficult.


Sean
March 11, 2006
Seems like we agree on principles, but we don't agree on what OOP is.

See, you think OOP is a set of tools, when it's actually a set of principles.

It's like thinking about books as a set of words rather than a set of ideas.
You won't see the benefit of writing or reading them! They'd be just a useless set of words!


Lars Ivar Igesund wrote:
> Hasan Aljudy wrote:
> 
>>>OOP _is_ a set of tools. Nothing more, nothing less.
>>
>>no my friend, it's a lot more!
>>oop is a way of thinking (in terms of coding, of course). It's a
>>different way of doing analysis and design.
>>Object oriented analysis and design produce completely different results
>>from procedural analysis and design.
>>
>>You can do procedural analysis and design but write code that uses
>>classes and employs inheritance, but your code will still be
>>procedural, because when you thought about the problem you did so
>>procedurally, and the code will inevitably reflect that.
> 
> 
> OOP is a toolbox in a programming toolshed (and inheritance is a tool in
> that particular toolbox). Just to make this great analogy complete ;) 

I have to disagree! (duh)

> 
> Nothing is as simple as you try to put it. 

I'm *not* trying to make it look simple. It *is* complicated.

> When analyzing/designing, you
> should create structures, objects and functions so that they reflect the
> topology of the problem at hand. It IS wrong to decide which tool to use
> before you know the problem. But of course one knows how to use some tools
> better than others, and this will be reflected in the way one works.

ok ..

> 
>>>>OOP is not a set of tools, it's a set of principles. Inheritance is not
>>>>there just for the sake of it, it serves a very good purpose: increasing
>>>>cohesion by pulling common/shared code in a super class.
>>>
>>>
>>>Yes, and by this you get tight coupling and reduced reusability and
>>>extendability.
>>
>>If that happens, it generally means you don't know what you're doing;
>>you're using inheritance the wrong way.
> 
> 
> One can be lucky, but usually strong cohesion reduces flexibility. Using
> inheritance in all cases is the wrong way; use it where it actually helps
> toward a solution!

I'm not promoting inheritance. It's quite the opposite. I'm saying that OOP is *not* about inheritance.
Inheritance is just a tool.
Just because your code uses inheritance doesn't mean it's object oriented.
If you don't know how to apply object oriented principles, don't blame the paradigm.

When you design your object model, your goal should be to achieve a high level of cohesion and a low level of coupling.

Sometimes you can use inheritance to achieve that goal. Sometimes inheritance can stand in your way.

That's why OOP is *not* a set of tools. If you keep thinking about object orientation as a set of tools then you will not get much out of it.
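
To put it in (hypothetical) D terms -- the class and method names are invented, this is only a sketch -- low coupling is about what your code depends on, not about whether inheritance shows up somewhere:

    // The consumer depends only on this small interface...
    interface Logger
    {
        void log(in char[] msg);
    }

    // ...and this is just one possible implementation of it.
    class FileLogger : Logger
    {
        void log(in char[] msg) { /* append msg to a file */ }
    }

    class Service
    {
        private Logger logger;           // coupled to the interface only
        this(Logger l) { logger = l; }
        void run() { logger.log("running"); }
    }

    void main()
    {
        auto s = new Service(new FileLogger);
        s.run();
    }

Service doesn't care how Logger is implemented; you could swap FileLogger for something else later without touching Service at all. Inheritance may or may not be involved -- it is simply not the point of the design.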

> 
> 
>>>Principles just get in the way for practicality. Using
>>>inheritance just for the sake of it, or because of principles, is nothing
>>>even close to practical.
>>
>>You don't use inheritance because it's a principle (that's the same as
>>using just for its own sake).
> 
> 
> Yes, which is what I said.

See. We agree on principles ;)

> 
> 
>>You use inheritance when you find that you need to. i.e. when you find
>>yourself writing too much duplicate code.
> 
> 
> Yes, I'm glad you agree. And usually you need very little inheritance, even
> though you might program everything using objects, whether it is to
> encapsulate state, functionality, or resources.
> 
> 
>>>But then my programming principles encompass pragmatism :)
>>
>>Funny you should say so.
>>
>>Is skipping the analysis and design phases (i.e. diving right into
>>writing code) pragmatic or not?
> 
> 
> Of course not, I've never said that. In your previous posts you have
> preached the view that we should use OOP (and with it inheritance), and
> that there never is a need to use anything else. I am just politely
> disagreeing with you.

I'm just trying to change your opinion on OOP. I'm not saying you should use it all the time.
I myself don't use OO all the time. I'm not a purist. However, I often find myself converting procedural code to OO code, as part of the refactoring process.

> 
> 
>>You can say it's pragmatic .. if you're too lazy to do it! However,
>>you'll actually get the most out of your time if you do a proper
>>analysis and design first.
> 
> 
> You might get an even better result if you analyze, design and implement
> (and refactor) continuously, to get a better understanding of the system and
> the problems at hand; some OOP ways of doing things (especially inheritance)
> make this difficult. The waterfall method is a software engineering
> fossil.

The waterfall method is stupid. I totally agree.

Like I said above, I usually find myself converting procedural code to object oriented code during the refactoring process.
March 11, 2006
Georg Wrede wrote:
> Hasan Aljudy wrote:
> 
>> Is skipping the analysis and design phases (i.e. diving right into writing code) pragmatic or not?
>> You can say it's pragmatic .. if you're too lazy to do it! However, you'll actually get the most out of your time if you do a proper analysis and design first.
> 
> 
> I must live in the wrong country, or spend too much time with self-taught programmers, earning their living programming.
> 
> Very seldom do I see one who actually thinks before he hits the keyboard. When I ask, they frown and say "I've already got it covered!" Yeah, right. And just looking at their code proves them wrong.

Haha.
I myself don't apply what I'm saying. Seldom do I plan my projects on paper. Even when I do put something on paper, it's minimal.

I usually have everything in my head! Which is probably why it takes me a loooong time to get things done!!

Fortunately I'm still a student and not in the industry, yet.

> 
> With all the UML and stuff floating around, one would think they would have understood the meaning of planning and analysis. And the funniest thing is, on all these programmers' doors it says "analyst", not "programmer".
> 
> But "UML is for suits." Glad at least the bosses try to get something thought out before it's done. :-(
March 11, 2006
Anders F Björklund wrote:
> Bruno Medeiros wrote:
> 
>>> I am afraid it will be: "virtualization"
>>
>> What do you mean? Isn't virtualization at the core a hardware concept?
> 
> Like this: http://www.kernelthread.com/publications/virtualization/
> 
> In this case: D#, or other bytecode ?
> 
> --anders

D progs running in a VM? It's an issue orthogonal to the programming language itself.

-- 
Bruno Medeiros - CS/E student
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
March 11, 2006
David Medlock wrote:
> 
> see:
> http://okmij.org/ftp/Computation/Subtyping/#Problem
> 
> -DavidM

Hum, nice article.

-- 
Bruno Medeiros - CS/E student
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
March 11, 2006
David Medlock wrote:
<snip>
> see:
> http://okmij.org/ftp/Computation/Subtyping/#Problem
> 
> -DavidM

A nice example of not understanding OOP :)

WOW .. I think I'm beginning to understand what Schock means when he says everyone out there thinks they are doing object oriented programming, but only very few of them really are.

There's no "book" that you can follow literally to produce good code. It's a matter of trial and error. If you try to subclass CSet from CBag and discover that it doesn't work, or that it produces more problems than it solves, then don't whine about it; just implement CSet differently. It's not OOP's fault, nor is it inheritance's fault.
If you try to open a door with a screw-driver and it doesn't work, should you blame the screw-driver, or blame yourself for not understanding how doors work?

Apparently subclassing CSet from CBag wasn't such a good idea. Don't blame the object oriented paradigm for it. Nowhere in the paradigm does it say that you should subclass CSet from CBag!
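
Just to spell out why it falls apart, here's a small (contrived) D sketch of the Bag/Set problem:

    class Bag
    {
        int[] items;
        void put(int x) { items ~= x; }          // a bag keeps duplicates
        size_t count() { return items.length; }
    }

    class Set : Bag
    {
        override void put(int x)                 // a set silently drops them
        {
            foreach (i; items)
                if (i == x) return;
            super.put(x);
        }
    }

    // Code written against Bag assumes put() always grows the bag:
    void addTwice(Bag b, int x)
    {
        auto before = b.count();
        b.put(x);
        b.put(x);
        assert(b.count() == before + 2);         // holds for a Bag, not for a Set
    }

    void main()
    {
        addTwice(new Bag, 7);   // fine
        addTwice(new Set, 7);   // assert trips: a Set doesn't behave like a Bag
    }

The caller's perfectly reasonable assumption about Bag is broken by Set, so Set isn't a Bag in any behavioural sense -- which is exactly why it shouldn't be a subclass, not an argument against inheritance in general.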

Aside from that, the real culprit here is C++, which allows you to deal with objects BY VALUE!! Polymorphism dies at that point.
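
(In D this particular trap can't even be set, since class objects are always handled by reference; a contrived sketch:)

    import std.stdio;

    class Shape  { void describe() { writefln("some shape"); } }
    class Circle : Shape { override void describe() { writefln("a circle"); } }

    void main()
    {
        Shape s = new Circle();   // s is a reference, nothing gets copied
        s.describe();             // prints "a circle" -- the override wins
    }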

And, really, C++ doesn't support OO, it just presents an awfully complicated set of features!!

I came to really hate C++ lately.
March 12, 2006
Bruno Medeiros wrote:

> D progs running in a VM? It's an issue orthogonal to the programming language itself.

Maybe so, but both Java and C# make a big deal out of running in a VM.
If everything is "going managed", then that leaves D "with assembler" ?

What I meant was that the trend seems to be towards scripting languages and virtual machines. Especially with WinFX looming around the corner ?
(What if C++ and Win32 become "the sandbox", and C# and WinFX the real OS interface ? Could leave current apps where 16-bit ones are now)

Just a thought. (I happen to like assembler, plain old C - and D too...)
But a D frontend to a CLR/JVM backend would be a neat thing to have ?

--anders
March 12, 2006
"Hasan Aljudy" <hasan.aljudy@gmail.com> wrote in message news:dupijv$isn$1@digitaldaemon.com...
> Walter Bright wrote:
>> The trouble with OOP is, well, not everything is an object. For example, take the trig function sin(x). It's not an object. Of course, we could bash it into being an object, but that doesn't accomplish anything but obfuscation.
>
> That's because we've been taught math in a procedural way ;)
> Ideally, x, the angle, would be an object, and sin is a method on that
> object.
> However, I think that we're so used to thinking about math functions in a
> procedural way that it's better they stay procedural.
>
> Like you said, it'll be a bit confusing if it was an object, but that's not because it can't be an object, but mainly because that's not how we think about it.

If we think of a number as a number rather than an object, then it's just obfuscation to force it into being an object.
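
A (contrived) D comparison of the two, using a hypothetical Angle wrapper:

    import std.math;

    // Hypothetical wrapper, just to show the extra ceremony involved.
    class Angle
    {
        double radians;
        this(double r) { radians = r; }
        double sine()  { return sin(radians); }
    }

    void main()
    {
        double a = sin(0.5);                 // a number and a function: clear
        double b = (new Angle(0.5)).sine();  // same value, more machinery
        assert(a == b);                      // identical results either way
    }

Nothing is gained by the wrapper; it just restates sin(x) with more steps.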