November 16, 2017
On Thursday, 16 November 2017 at 06:51:58 UTC, rikki cattermole wrote:
> On 16/11/2017 6:35 AM, Ola Fosheim Grostad wrote:
> Thing is, it is a failure, the way most people use it.

You can say that about most things: exceptions, arrays, pointers, memory, structs with public fields... But I guess what you are saying is that many people aren't good at modelling...

> When used correctly it is a very nice additive to any code base.
> It just can't be the only one.

Well, it can be in a flexible OO language (niche languages). However, it was never meant to be used out of context, i.e. it was not meant for "pure math".
November 16, 2017
On Thursday, 16 November 2017 at 07:12:16 UTC, Ola Fosheim Grostad wrote:
> But I guess what you are saying is that many people aren't good at modelling...

I just want to add that I believe most people are much better at OO modelling than at other modelling strategies (ER, SA, NIAM, etc.), simply because people are good at understanding stereotypes, even without training.

The ability to communicate with your customer is therefore a good reason to use OO modelling: you can get feedback on your take on their world.
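To illustrate the point, here is a minimal sketch (a made-up library domain; all names are invented) of the kind of stereotype-based class model a customer can read back and correct:

#include <string>
#include <vector>

// Hypothetical library domain: each class is a "stereotype" the
// customer already knows from their own world, so the model itself
// becomes the shared vocabulary for feedback.
class Book {
public:
    std::string title;
    std::string author;
};

class Member {
public:
    std::string name;
    std::vector<const Book*> borrowed; // a member borrows books
};

class Library {
public:
    std::vector<Book> catalogue;
    std::vector<Member> members;
};

A question like "can a member borrow two copies of the same title?" is something the customer can answer directly from a model like this, without knowing anything about programming.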
November 16, 2017
On Thursday, 16 November 2017 at 06:35:30 UTC, Ola Fosheim Grostad wrote:
> No, classes are a powerful modelling primitive. C++ got that right. C++ is also fairly uniform because of it.

Yes, I agree that classes are a powerful modelling primitive, but my point was that Stroustrup made classes the 'primary focus of program design'. Yes, that made it more uniform alright... uniformly more complicated. And why? Because he went on to throw C into the mix, since performance in Simula was so poor and would not scale. C promised the efficiency and scalability he was after. But an efficient and scalable 'class-oriented' language meant complexity was inevitable.

It wasn't a bad decision on his part. It was right for the time, I guess. But it set the stage for C++'s demise, I believe.

> People who harp about how OO is a failure don't know how to do real-world modelling...

I would never say OO itself is a failure. But the idea that it should be the 'primary focus of program design'... I think that is a failure, and I think that view is generally accepted these days.


>> I have to wonder whether that conclusion sparked the inevitable demise of C++.
>
> There is no demise...

If the next C++ doesn't get modules, that'll be the end of it... for sure.

>> Eric should be asking a similar question about Go... what decision has been made that sparked Go's inevitable demise - or in the case of Go, 'decision' would be 'decisions'.
>
> Go is growing...

Yeah... but into what? It's all those furry gopher toys, t-shirts, and playful colors... I think that's what's attracting people to Go. Google is the master of advertising, after all. It would work well in a kindergarten. But it makes me want to puke. It's so fake.

>> a := b
>
> A practical shorthand; if you don't like it, then don't use it.

It was just a senseless, unnecessary change. The immediate impression I got was that they were trying to undo a decision made back when B was developed, rather than doing it because it really assists the modern programmer (what language uses that? None that I use, that's for sure). And I get that feeling about other decisions they've made... as if they are just trying to correct the past. They should be focused on the future. They should have got some experienced younger programmers at Google to design a language instead. I bet it wouldn't look anything like Go.
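To be clear about what the shorthand does: `a := b` declares a new variable and infers its type from the right-hand side. The nearest analogue in C++ terms is `auto` (a toy sketch, not Go code):

#include <string>

int main() {
    std::string b = "hello";
    // Go's `a := b` declares a and infers its type from b;
    // C++11's `auto` is the closest equivalent shorthand:
    auto a = b; // a is deduced to be std::string
    return a.size() > 0 ? 0 : 1;
}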


November 16, 2017
On Thursday, 16 November 2017 at 11:24:09 UTC, codephantom wrote:
> On Thursday, 16 November 2017 at 06:35:30 UTC, Ola Fosheim Grostad wrote:
> Yes, I agree that classes are a powerful modelling primitive, but my point was that Stroustrup made classes the 'primary focus of program design'. Yes, that made it more uniform alright... uniformly more complicated. And why? Because he went on to throw C into the mix, since performance in Simula was so poor and would not scale. C promised the efficiency and scalability he was after. But an efficient and scalable 'class-oriented' language meant complexity was inevitable.

Nah, he is just making excuses. Simula wasn't particularly slow as a design; it used a GC similar to the one in D, and bounds checks on arrays, like D. C++ was just a simple layer over C and evolved from that. It had nothing to do with language design; it was all about cheap implementation. The initial version of C++ was cheap and easy to do.

> I would never say OO itself is a failure. But the idea that it should be the 'primary focus of program design'... I think that is a failure, and I think that view is generally accepted these days.

Uhm, no? What do you mean by 'primary focus of program design', and in which context?

> If the next C++ doesn't get modules, that'll be the end of it...for sure.

I like namespaces. Flat is generally better when you want explicit qualifications.
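E.g. a toy sketch (invented names) of what explicit qualification buys you:

#include <cstdio>

namespace net {
    void connect() { std::puts("net::connect"); }
}

namespace db {
    void connect() { std::puts("db::connect"); }
}

int main() {
    // Flat namespaces with explicit qualification: each call site
    // states exactly which connect() is meant, no import magic needed.
    net::connect();
    db::connect();
    return 0;
}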

> Yeah... but into what? It's all those furry gopher toys, t-shirts, and playful colors... I think that's what's attracting people to Go. Google is the master of advertising, after all. It would work well in a kindergarten. But it makes me want to puke. It's so fake.

It is the runtime and standard library. And stability. Nice for smaller web services.

> correct the past. They should be focused on the future. They should have got some experienced younger programmers at Google to design a language instead. I bet it wouldn't look anything like Go.

Go isn't exciting and has some shortcomings that are surprising, but they managed to reach a stable state, which is desirable when writing server code. It is this stability that has ensured they could improve on the runtime. ("Experienced young programmers" is a rather contradictory term, btw. :-)


November 16, 2017
On Tuesday, 14 November 2017 at 09:43:07 UTC, Ola Fosheim Grøstad wrote:
> On Tuesday, 14 November 2017 at 06:32:55 UTC, lobo wrote:
>> "[snip]...Then came the day we discovered that a person we incautiously gave commit privileges to had fucked up the games’s AI core. It became apparent that I was the only dev on the team not too frightened of that code to go in. And I fixed it all right – took me two weeks of struggle. After which I swore a mighty oath never to go near C++ again. ...[snip]"
>>
>> Either no one manages SW in his team, so that this "bad" dev could run off and build a monster architecture, which would take weeks, or this guy has no idea how to revert a commit.
>
> ESR got famous for his cathedral vs bazaar piece, which IMO was basically just a not very insightful allegory over waterfall vs evolutionary development models, but since many software developers don't know the basics of software development he managed to become infamous for it… But I think embracing emergence has hurt open source projects more than it has helped them. D bears signs of too much emergence too, and is still trying to correct those «random moves» with DIPs.
>
> ESR states «C is flawed, but it does have one immensely valuable property that C++ didn't keep – if you can mentally model the hardware it's running on, you can easily see all the way down. If C++ had actually eliminated C's flaws (that is, been type-safe and memory-safe) giving away that transparency might be a trade worth making. As it is, nope.»
>
> I don't think this is true: you can reduce C++ down to the level where it is just like C. If he cannot mentally model the hardware in C++, that basically just means he has never tried to get there…

The sheer amount of inscrutable cruft and rules, plus the moving target of continuously changing semantics (an order of magnitude or two bigger than C's), added to the fact that you still need to know C's gotchas, makes it one or two orders of magnitude more difficult to mentally model the hardware. By that logic you can also mentally model the hardware with Intercal; if you haven't managed it, that just means you haven't tried hard enough.

>
> I also think he is in denial if he does not see that C++ is taking over C. Starting a big project in C today sounds like a very bad idea to me.

Even worse in C++, with its standards changing every few years.




November 16, 2017
On Tuesday, 14 November 2017 at 16:38:58 UTC, Ola Fosheim Grostad wrote:
> On Tuesday, 14 November 2017 at 11:55:17 UTC, codephantom wrote:
>> [...]
>
> Well, in another thread he talked about the Tango split, so not sure where he is coming from.
>
>> [...]
>
> No, the starting point for C++ was that Simula is better for a specific kind of modelling than C.
>
>> [...]
>
> It is flawed... ESR got that right, not sure how anyone can disagree. The only thing C has going for it is that CPU designs have been adapted to C for decades. But that is changing. C no longer models the hardware in a reasonable manner.
>
Because of the flawed interpretation of UB by the compiler writers, not because of a property of the language itself.
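The canonical example (a toy sketch; exact behaviour depends on the compiler and optimization level):

#include <climits>
#include <cstdio>

// Signed overflow is undefined behaviour, so the compiler is allowed
// to assume x + 1 never wraps and fold this whole function to `true`,
// even though the underlying two's-complement hardware would wrap.
bool still_bigger(int x) {
    return x + 1 > x;
}

int main() {
    // With optimizations on, many compilers print 1 here, although
    // INT_MAX + 1 wraps to INT_MIN on the actual machine.
    std::printf("%d\n", (int)still_bigger(INT_MAX));
    return 0;
}

Built without optimizations this typically prints 0 (the wrap actually happens); with -O2 it typically prints 1 (the check is folded away). That difference is the "interpretation" being complained about.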
November 16, 2017
On Thursday, 16 November 2017 at 18:06:22 UTC, Patrick Schluter wrote:
> On Tuesday, 14 November 2017 at 16:38:58 UTC, Ola Fosheim Grostad wrote:
>> changing. C no longer models the hardware in a reasonable manner.
>>
> Because of the flawed interpretation of UB by the compiler writers, not because of a property of the language itself.

No, I am talking about the actual hardware, not UB. In the 80s there was an almost 1-to-1 correspondence between C and CPU internals. CPUs are still designed for C, but the more code shifts away from C, the more rewarding it will be for hardware designers to move to more parallel designs.
November 16, 2017
On Thursday, 16 November 2017 at 18:02:10 UTC, Patrick Schluter wrote:
> The sheer amount of inscrutable cruft and rules, plus the moving target of continuously changing semantics (an order of magnitude or two bigger than C's), added to the fact that you still need to know C's gotchas, makes it one or two orders of magnitude more difficult to mentally model the hardware.

I don't feel that way; most of what C++ adds to C happens at the type-system or textual level. The core language is similar to C.
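E.g. a sketch of what I mean (invented example): nothing here needs more than the C-like core, and the mental model of what the machine does is the same as for the equivalent C.

#include <cstddef>

// Plain pointer-and-loop code: this compiles to the same kind of
// load/multiply/store sequence a C compiler would emit; none of the
// C++-only machinery has to show up in the generated instructions.
void scale(float* data, std::size_t n, float k) {
    for (std::size_t i = 0; i < n; ++i)
        data[i] *= k;
}

int main() {
    float buf[4] = {1.0f, 2.0f, 3.0f, 4.0f};
    scale(buf, 4, 2.0f);
    return buf[0] == 2.0f ? 0 : 1;
}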

> Even worse in C++, with its standards changing every few years.

But those features are mostly shorthand for things that are already in the language. E.g. lambdas are just objects, and move semantics is just an additional nominal reference type with barely any semantics attached to it (some rules for coercion to regular references)...
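E.g. roughly (a hand-written sketch of the desugaring; the real compiler-generated closure type is unnamed):

// The lambda... (the init-capture needs C++14)
auto add_n = [n = 10](int x) { return x + n; };

// ...is essentially shorthand for a compiler-generated function object:
struct AddN {
    int n;
    int operator()(int x) const { return x + n; }
};

int main() {
    AddN add_n_desugared{10};
    // Both are plain objects with a call operator; same mental model.
    return add_n(5) == add_n_desugared(5) ? 0 : 1;
}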

So while these things make a difference, they don't change my low-level mental model of C++, which remains as close to C today as it did in the 90s.
November 16, 2017
On Thursday, 16 November 2017 at 11:52:45 UTC, Ola Fosheim Grostad wrote:
> On Thursday, 16 November 2017 at 11:24:09 UTC, codephantom wrote:
>> I would never say OO itself is a failure. But the idea that it should be the 'primary focus of program design'... I think that is a failure, and I think that view is generally accepted these days.
>
> Uhm, no? What do you mean by 'primary focus of program design' and in which context?

In the 90s (and a bit into the 00s) there was a pretty extreme "everything must be an object; OO is the solution to everything" movement in the industry.  Like most tech fads, it was associated with a lot of marketing and snake oil from people promising anything managers would pay money to hear (e.g., "use OO and your projects will be made up of reusable objects that you can simply drop into your next project!").

Look around most programming languages today and you'll see objects, so in that sense OOP never failed.  What failed was the hype train.  It's no different from most other tech fads (well, except XML, which has declined drastically since its hype passed).
November 16, 2017
On Tuesday, 14 November 2017 at 09:43:07 UTC, Ola Fosheim Grøstad wrote:
> ESR got famous for his cathedral vs bazaar piece, which IMO was basically just a not very insightful allegory over waterfall vs evolutionary development models, but since many software developers don't know the basics of software development he managed to become infamous for it…

Everything ESR says is worth taking with a good dose of salt, but his "The Art of Unix Programming" isn't a bad read.