August 10, 2021

On Tuesday, 10 August 2021 at 08:22:04 UTC, surlymoor wrote:

> On Saturday, 7 August 2021 at 12:15:15 UTC, IGotD- wrote:
>
> > [...]
>
> Those examples don't feel all that insulting; then again, I am a stupid programmer. However, something that would sting is being deprived of generics because the Elder Gods deemed them too confusing.

Indeed. However, after years of gatherings at the Elder Mountains, a new revelation of the Book of Wisdom has been made visible in the Old World, an event deemed to happen only every few centuries, upon which the Elder Gods have summoned the runes of generics, and great rejoicing has spread throughout the land, with festivities across all villages.

https://go.googlesource.com/proposal/+/refs/heads/master/design/43651-type-parameters.md

https://go2goplay.golang.org/

August 10, 2021

On Tuesday, 10 August 2021 at 06:23:39 UTC, Paulo Pinto wrote:

> On Monday, 9 August 2021 at 23:09:30 UTC, H. S. Teoh wrote:
> > ...
>
> I love how people love to hate Java, yet have no words of hate against OOP in Smalltalk, SELF, Eiffel, Sather, C#, VB, Python (check methods from typeof(func)), Dart.
>
> How has D's multiparadigm helped to have a better marketshare than some of those extremist OOP languages?

I may have started some of the jump-on-Java trend with the bumper car comment.

This may get long: there are historical evolutions to cover, an old guy is in the mood to ramble, and maybe I can pass on some learnin's and save a few younglings from some programming-tool-choice angst.

I'm not going to speak for others, Paulo, but I find Java to be fun, not hated; more smirky-amusing than anything.

Now rambling to the room. Being a fellow Canuck, I've followed the works of James Gosling with some interest, for a lot of years now. Back when it was Oak, and the smart agent in Oak was Duke, and Duke is still a very cool mascot.

Be sure to check out sc, the spreadsheet calculator: a console TUI spreadsheet with vi-like key bindings. Something James also wrote, way back when, as public domain code.

And some personal reasons behind the smirking at Java.

When Java was first being hyped (and it was hyped, something like $500 million was allocated to advertising by Sun), I was programming in Forth on a Vax for a phone company. PCs were toys and core business adjacent, not core business tools, in my mind at the time.

The web was a distraction, for science and amusement, viewed as business adjacent, not core. At the time. Could see the potential of Mosaic and the WWW, but would not have recommended it to a manager as a smart move for critical work. It was for the marketing department, not a competitor to telephony. The web did not seem like big-boy-pants internet technology at the time, unlike the more serious uses: Gopher, Telnet, and FTP. :-)

Attitude on the web changed fairly quickly, but Java still seemed like a toy. Real programmers wrote to machine code, not a play area sandbox. But, $500 million in advertising budget caught the eye of many a leader, and it was fun.

Our Forth system was set to be superseded by an amalgam overlord system in C++, consolidating some 20 or 30 other large systems. Watched in predicted horror as three years of 300 people's time was wasted on project management and Grady Booch cloud-diagram planning. 300 people. 3 years. The writing was on the wall when zero user-level demo screens were available years after starting. "C++ isn't going to work, this needs to be in Java" was the next recommendation.

I think the Forth project we worked on (for many more years after the story from above) was finally retired a few years ago after being outsourced to a Vaxen emulator in Bangalore a decade ago. The amalgam is still in a continuing cycle of development startup followed by cancel and fail, last I heard. 25ish years later. Yet, unless given the runway we initially enjoyed with the Forth system, I would not recommend Forth for large scale systems. You need time to build an application vocabulary when using Forth in the large, and there is little chance for cost recovery during that phase. Given lead time, absolutely, but that'd be a rare management decision in the 3rd millennium.

C++ rose to fame, in part, due to human pride. A sense of "I figured it out, aren't I smart". That's a powerful draw. For many, C++ is very smart. Smarter than some of us prideful programmers though.

Java did not rise to fame on merit (it might have, more slowly). It rose to fame on aggressive marketing and expensive hype. Once the lock-in had started, many companies poured billions into JVM technology, to ensure its success at making reasonably fast bumper cars run in a sandbox.

That's all ok, but I was young when it was just starting, and it smelled like a toy then. It still kinda smells like a toy, because of that initial bias.

Useful for business-adjacent pretty reporting and such. Haven't been paid to work in Java since 1998-ish, but have followed it along the way. Back then 1.2 was just coming out, with the whole J2EE thing (or J2SE, or JDK, can't quite remember the marketing speak without looking it up). I'm not sure, but I'd wager some of that Java 1.2 is still in the marketing application we wrote, if the company still exists.

Still enjoy exploring OpenJDK 11 now, and have an intrinsic FUNCTION JVM built into a version of GnuCOBOL, which should eventually be solid enough for us to advertise GnuCOBOL as ready to run any enterprise Java classes in an engine embedded in COBOL via JNI. At its core, JDK Java is written in C and GnuCOBOL compiles via C.

I'm still that kind of biased internally, COBOL for core business, Java for some fluffy bits and pretty pictures. Also realize that the amount of .class code in the world is immense. Why not leverage it to help an enterprise pursue its goals, while counting all the beans in COBOL, integrated tightly?

My bias to OOP is similar. Watched too many large scale failures come and go with C++ and Java rip and replace projects. But when determined, and with deep enough pockets, most failures can be silently buried, small successes over-hyped, until it all becomes legacy code anyway, ready to be replaced by Go, Ruby, Rust, Zig, or whatever is the lang hype du jour at the time.

Except for the life expectancy of source code thing, I'd happily hype D to any management at any large company. But at 6ish years of life expectancy before being forced to manually update a codebase, it'd be a hard sell for large systems. Will recommend D for business adjacent short span productivity applications from now on though.

How many sites still run Python 2 because the code borks in Python 3? How many sites will be on Java 8 until 2038 or beyond? The borks might be extremely trivial to a programmer, parens around arguments to print for instance, but when a non-tech boss sees the first

SyntaxError: Missing parentheses in call to 'print'. Did you mean print(name)?

they may just stick with Python 2 and get on with running their business. They paid for that code already. 10 years is nothing in that kind of time frame. They will wait until the pain of stagnation (or being mocked as archaic legacy) exceeds a threshold. Then they will pour money at the tech team, oftentimes just to save face, and in a sour mood. There will always be another consultant ready to hype a tech and promise a shortcut to the promised land.

Slow, long tail growth is the best kind of growth, in my old guy opinion.

D is making sound decisions in the now it seems, growing slowly, which is sound in the long term. Now to convince the D dev team that we, the stupid programmers, leave behind code that may not age well if even a small language detail changes. Cover that base and businesses will follow.

To keep a little bit more in thread: some of the decisions may be guard-rail decisions, protecting the programmer from the programmer. Those aren't all bad.

COBOL isn't active at 60 because it's great tech in the internet era; it's still active, with a huge market share, because it grew that big with a never-shrinking codebase that still compiles. Pennies are still pennies, core banking is still the same banking.

If I'm not mistaken, the oldest still-running codebase is a contract management system, MOCAS, for the U.S. Department of Defense. 60+ years old, mostly COBOL. There are billions on the line for the first team that can answer a still-open call to convince the DoD that a replacement won't drop any features or muck up any contracts in play at the time of transition. Some have tried; all have failed to convince the brass, so far.

More famous, but still old, is the SABRE airline reservation system, in COBOL/CICS. And there are programs in the IRS supporting the Individual Master File and Business Master File projects still using (probably; these are not overly public source codes) COBOL sources written in 1962, recompiled for each new mainframe iteration.

Java 8 is set to enjoy, er, suffer a similar fate. As is Python 2, Perl 5, ... (whether the Python/Perl/... core contributors like it or not). Ruby, maybe not, but maybe. If you can convince a company to pay for some code, they are not going to want to pay for the same code again, even decades later.

Ranting almost over. D design decisions seem like very sane long-view decisions. But source codes need a longer life expectancy, or D may end up relegated to short- and mid-range development planning, ad infinitum. That's ok, if that is the end goal or an acceptable fate.

D does not feel like a design effort with stupid in mind. The guard rails and preferred paths in D seem well planned and well placed. Still new to D, could be completely wrong in the depths. Protecting against tricks some people may feel comfortable with, while keeping simple mistakes from being catastrophic, painful to find and fix, or costly to maintain, is not limiting, in my opinion; at least not in a bad way. That goes for D or any other programming language tool chain.

Have good, make well, excuse the long read and slow drift.

August 10, 2021

On Tuesday, 10 August 2021 at 06:23:39 UTC, Paulo Pinto wrote:

> I love how people love to hate Java, yet have no words of hate against OOP in Smalltalk, SELF, Eiffel, Sather, C#, VB, Python (check methods from typeof(func)), Dart.

In this case, Java was already raised as an example, so it was easiest to talk about it when criticizing the dogmatism about OO.

Of those other languages, I really only know C#. Yes, the criticism about lack of free functions does apply to C# just like it applies to Java. In general I don't consider C# quite as restrictive as Java, because at least it has user-defined value types and operator overloading.

I'll grant that this is probably not because Java would trust the programmer less than C# does, but because of language simplicity. I guess I dislike scarce-featured languages in general - I'm not very tempted by Go either, not that I have tried it. C is kind of fun to play with because of the low-level tricks, but I don't consider it a good general-purpose language either.

> How has D's multiparadigm helped to have a better marketshare than some of those extremist OOP languages?

I don't think I have to explain the bandwagon fallacy to you, so I'm assuming you mean "How has D's multiparadigm helped to have a better marketshare than if it had an extremist attitude towards OOP?".

Consider, as an example, a simple bug report I submitted recently (https://issues.dlang.org/show_bug.cgi?id=22151):

void main()
{   *&main = *&main;
}

If D had a similar OO attitude to Java and C#, the example would have to be something like:

struct S
{   static freeFun()
    {   *&freeFun = *&freeFun;
    }
}

Note that the comparison is fair - I'm still taking advantage of D's expressive power in the second example, by dropping the unnecessary void and visibility attribute. If every single small code snippet (or utility module) has to contain an enclosing struct, class or union, that adds up to a HUGE readability price.

Yes, allowing free functions does complicate modules, and includes the need for function hijack protection. But that pays back in a big way when it comes to readability.

Also remember that you cannot have UFCS without free functions, except if you have something like C#'s this parameters for static member functions. And frankly, that solution has NO upsides whatsoever compared to D's. The programmer still needs to define the needless class, and language designers need to think about the module system and hijack protection just as hard as with free functions.
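
To make that concrete, here is a minimal sketch of UFCS resting on a plain free function (the function name is made up for the example):

import std.stdio;

// A free function at module scope, taking the type it "extends" as its first parameter.
int doubled(int x)
{
    return x * 2;
}

void main()
{
    writeln(doubled(5)); // traditional call: prints 10
    writeln(5.doubled);  // the same free function through UFCS member syntax
}

No enclosing struct or static class required; the module itself is the namespace.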

August 10, 2021
On Tuesday, 10 August 2021 at 06:23:39 UTC, Paulo Pinto wrote:

> I love how people love to hate Java, yet have no words of hate against OOP in Smalltalk, SELF, Eiffel, Sather, C#, VB, Python (check methods from typeof(func)), Dart.
>
> How has D's multiparadigm helped to have a better marketshare than some of those extremist OOP languages?

I think in both cases a better comparison is Scala, since it's so close to Java. Martin Odersky cleaned up the OOP and made it multiparadigm (with functional programming) and a lot of people liked it. Since it's a programming language, people do complain about Scala, but they complain about other things. The features that stand out about Java are the boilerplate and the lack of expressiveness, and in my opinion, those criticisms are correct. In one sense, the Java language designers agree, since they've copied so much from languages like Scala.
August 10, 2021
On Tue, Aug 10, 2021 at 06:23:39AM +0000, Paulo Pinto via Digitalmars-d wrote:
> On Monday, 9 August 2021 at 23:09:30 UTC, H. S. Teoh wrote:
> > ...
> > No offense taken.  I just find the OO extremism of Java to be
> > laughable, that's all.  Why not just admit that OO isn't the be-all
> > and end-all of programming, and call free global functions what they
> > are.
> > 
> > This is why D's multi-paradigm approach makes so much more sense: sometimes one paradigm doesn't fit the problem, why not acknowledge it and allow a different paradigm to step in.  Not every problem is a nail to which the OO hammer must be brought to bear.
[...]
> I love how people love to hate Java, yet have no words of hate against
> OOP in Smalltalk, SELF, Eiffel, Sather, C#, VB, Python (check methods
> from typeof(func)), Dart.

Whoa, slow down right there.  My rant was against shoehorning every programming problem into a single paradigm, not against OOP itself.  OOP certainly has its uses -- that's why D has it too!  The problem comes when OOP zealots try to shove it down everyone's throats even when the glove clearly does not fit the problem at hand.

I picked Java because that's the language I am most familiar with that exhibits this syndrome.  I don't know enough (well, any) Smalltalk or Eiffel to be able to criticize them coherently, and I have not written a single line of C#, so I don't have much to go on there.  Python I have written in very small scales, and it does have some nice things about it.  But I've not written enough non-trivial Python code to be able to criticize it effectively.

But the point is that single-paradigm languages ultimately suffer from the shoehorn problem: trying to shoehorn every programming problem into a single mold, even when it doesn't really fit.  Certain algorithms are simply better expressed as an imperative loop that mutates variables; it's a royal pain in the neck to try to write the equivalent of such code in Lisp, for example.  It's certainly *possible*, thanks to Turing completeness, but exhibits all the symptoms of dressing up a duck as an aquatic chicken with webbed feet: it's awkward, requires excessive paraphrases, and is more complex than it really needs to be.  The same can be said about writing range-based code in a language like C: you can do it, but it's awkward, error-prone, and just plain ugly.

In a multi-paradigm language, you can choose the best paradigm to express the algorithm at hand, in the most convenient, clear way, without having to resort to the shoe-horn. It's faster to write, easier to read, and more maintainable in the long run. You can treat a duck as a duck instead of an aquatic chicken with webbed feet. It's so refreshing not to have to paraphrase yourself all the time.
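
A toy illustration in D (a made-up computation): both versions below compute the same sum of squared evens, and you get to pick whichever paradigm reads better for the problem at hand.

import std.algorithm : filter, map, sum;
import std.range : iota;
import std.stdio : writeln;

void main()
{
    // Functional style: a lazy range pipeline.
    auto total = iota(1, 11)          // 1 .. 10
        .filter!(x => x % 2 == 0)     // keep the evens
        .map!(x => x * x)             // square them
        .sum;                         // 4 + 16 + 36 + 64 + 100

    // Imperative style: a plain loop mutating a variable.
    int total2 = 0;
    foreach (x; 1 .. 11)
        if (x % 2 == 0)
            total2 += x * x;

    writeln(total, " ", total2); // prints 220 220
}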


> How has D's multiparadigm helped to have a better marketshare than some of those extremist OOP languages?

I don't understand where marketshare enters the equation. We're talking about how well a language fits a problem domain, not about the fashion contest that is today's programming language "market".

(And incidentally, which language is most popular has very little to do with its technical merit, it is primarily determined by how much marketing capital it has and how well the marketing department sells it. With a large-enough marketing budget and a spark of marketing genius, you can sell *anything* to anyone.  I have no interest in what's popular and what's not; what I look for is technical excellence, which is the true value.)


T

-- 
Дерево держится корнями, а человек - друзьями. (A tree is held up by its roots, and a person by their friends.)
August 10, 2021
On 8/9/2021 8:15 AM, bachmeier wrote:
> I did not study unsigned math in college.

There are no classes in unsigned math.

> I took one programming class and I've done a lot of independent study. Maybe I could figure out how to work with unsigned math, but why would I want to? I have better things to do with my time.

It's worth spending 5 minutes to learn what 2's complement arithmetic is.

https://en.wikipedia.org/wiki/Two%27s_complement
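
For a quick taste, a minimal sketch in D (assuming the usual 32-bit int):

import std.stdio;

void main()
{
    // In two's complement, negation is bitwise-not plus one.
    int x = 5;
    writeln(~x + 1); // prints -5, the same bit pattern as -x

    // The same 32 bits read differently as signed vs unsigned.
    int i = -1;
    uint u = cast(uint) i;
    writeln(u); // prints 4294967295 (all bits set)
}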
August 10, 2021
On Tue, Aug 10, 2021 at 10:59:58AM +0000, Brian Tiffin via Digitalmars-d wrote:
[...]
> *Now rambling to the room.* Being a fellow Canuck, I've followed the works of James Gosling with some interest, for a lot of years now. Back when it was Oak, and the smart agent in Oak was Duke, and Duke is still a very cool mascot.

Haha, didn't realize Gosling was Canadian.  Hooray for fellow Canuckians!  ;-)


[...]
> When Java was first being hyped (and it was hyped, something like $500
> million was allocated to advertising by Sun), [...]
> [...] Real programmers wrote to machine code, not a play area sandbox.
> But, $500 million in advertising budget caught the eye of many a
> leader, and it was fun.

Yep, that's the perception I had from the 90's.  I was young and idealistic, and immediately smelled the hype behind Java -- and distrusted it.  Almost 3 decades later, I still distrust it.  But now, with history in hindsight, it is also abundantly clear that what drives programming language popularity is marketing budget.  Technical merit plays at best a secondary role (if even).


[...]
> Java did not rise to fame on merit (it might have, more slowly). It rose to fame on aggressive marketing and expensive hype.  Once the lock-in had started, many companies poured billions into JVM technology, to ensure its success at making reasonably fast bumper cars run in a sandbox.

Yes, and that is why marketing hype is ultimately the decider of technology adoption, not the technical merit itself.  First, with a large enough budget, you attract the attention of the VIPs who make the decisions at the top level. They dictate the use of said tech among the lower ranks, and then over time the tech becomes entrenched, ensuring its continued use.  Technical excellence does not play a major role here.  If it works reasonably well, it will stay. And the longer it stays, the harder it becomes to displace it. Inertia is a powerful force.


[...]
> My bias to OOP is similar.  Watched too many large scale failures come and go with C++ and Java rip and replace projects.

As Joel Spolsky once said:

	... the single worst strategic mistake that any software company
	can make: They decided to rewrite the code from scratch.

;-)


> But when determined, and with deep enough pockets, most failures can be silently buried, small successes over-hyped, until it all becomes legacy code anyway, ready to be replaced by Go, Ruby, Rust, Zig, or whatever is the lang hype du jour at the time.

Haha, this reminds me of one of my favorite Walter quotes:

	I've been around long enough to have seen an endless parade of
	magic new techniques du jour, most of which purport to remove
	the necessity of thought about your programming problem.  In the
	end they wind up contributing one or two pieces to the
	collective wisdom, and fade away in the rearview mirror.
	-- Walter Bright


But yeah, it's the hype that drives adoption.  The technical merit, not so much. If at all.


[...]
> Slow, long tail growth is the best kind of growth, in my old guy opinion.

I agree.

But that's a hard sell in today's age of instant gratification. Nobody wants to -- nor has the time to -- gradually build up an infrastructure that will last for the long term.  They want, and need, a solution NOW, and you better be able to deliver NOW, otherwise they will just move on to the next salesman who promises the here and now.

And it's hard to fault them when the investors are knocking on their door asking when the promised results will materialize.


> D is making sound decisions in the now it seems, growing slowly, which is sound in the long term.  Now to convince the D dev team that we, the stupid programmers, leave behind code that may not age well if even a small language detail changes.  Cover that base and businesses will follow.
[...]

I think this is what Andrei has been saying every so often: stop deleting, start adding.  I.e., never remove old features, just add new ones to be used in favor if necessary.  If needed, deprecate and hide away old features (hide away the docs so that new people don't accidentally use it for new code), but never, ever remove it. If a new, better way to do something is discovered, add it to the language, highlight it front and center in the docs, but don't ever remove the old stuff.  If we need to redesign Phobos, for example, do it in a new namespace, don't remove/replace the old one.  Make it possible for the two to coexist.  Accretion rather than replacement.

OT1H the idealist in me screams "please break my code, fix the language to remove all the broken stuff!".  OTOH the long-term code maintainer in me shakes his fists every time a compiler upgrade breaks one of my old projects.

The solution to the conundrum is, accretion rather than replacement. Let the old code compile.  Complain about deprecations if you have to, but don't ever make it not compile (unless it was an outright bug to have compiled in the first place).  But new code can use the new stuff and benefit from it.
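
For what it's worth, D already has a mechanism that fits this philosophy: the deprecated attribute, which complains but keeps old code compiling. A minimal sketch, with made-up function names:

import std.stdio;

// Old API: still compiles, but flagged so new code is nudged away from it.
deprecated("Use renderNew instead.")
void renderOld()
{
    writeln("old behavior, still works");
}

// New API: added alongside the old one, not in place of it.
void renderNew()
{
    writeln("new behavior");
}

void main()
{
    renderOld(); // emits a deprecation message, but old code keeps building
    renderNew();
}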


T

-- 
Trying to define yourself is like trying to bite your own teeth. -- Alan Watts
August 10, 2021
On 8/9/2021 12:01 PM, bachmeier wrote:
> There's an important difference though. Signed/unsigned mistakes are a choice the programming language designer makes - Walter made his choice and Gosling made a different choice. You're more or less stuck with the limitations of floating point.

The thing about two's complement arithmetic, and IEEE floating point, is that their faults are very well known. You can move your understanding of them from one language to the next.
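
Two of those classic, well-documented faults, sketched in D (assuming 32-bit int and IEEE-754 doubles):

import std.stdio;

void main()
{
    // Two's complement fault: comparing signed with unsigned silently
    // converts the signed operand, so -1 becomes 4294967295.
    int i = -1;
    uint u = 1;
    writeln(i > u); // prints true

    // IEEE floating point fault: 0.1 and 0.2 have no exact binary form.
    double a = 0.1, b = 0.2;
    writeln(a + b == 0.3); // prints false on typical hardware
}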

It's worthwhile to invest a few minutes learning about them if you're going to spend your career programming.

Just like it's worthwhile learning about how tire adhesion works when looking forward to decades of driving. (It can save your life.)
August 10, 2021
On 8/9/2021 12:23 PM, H. S. Teoh wrote:
> As Knuth once said:
> 
> 	People who are more than casually interested in computers should
> 	have at least some idea of what the underlying hardware is like.
> 	Otherwise the programs they write will be pretty weird.
> 	-- D. Knuth
> 
> That includes knowing the ugly realities of 2's complement arithmetic.

A friend of mine (a very smart one) back in college one day decided to learn programming. He got the Fortran specification(!), read it, and wrote a program.

The program ran correctly, but incredibly slowly. Baffled, he took it to his programmer friend for help. The friend laughed, and said here's the problem: you're writing to a file in a loop:

    loop
        open the file
        append a character
        close the file

instead of:

    open the file
    loop
        append a character
    close the file

My friend said he followed the spec, which said nothing at all about how file I/O actually worked.
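
For the curious, the same trap and its fix, sketched in D (the file name is made up):

import std.stdio;

void slowWay()
{
    // Re-opening and closing the file on every iteration: correct, but slow.
    foreach (i; 0 .. 1000)
    {
        auto f = File("out.txt", "a"); // "a" = append mode
        f.write('x');
        f.close();
    }
}

void fastWay()
{
    // Open once, append in the loop, close once.
    auto f = File("out.txt", "a");
    foreach (i; 0 .. 1000)
        f.write('x');
    f.close();
}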
August 10, 2021
On Tuesday, 10 August 2021 at 17:47:48 UTC, H. S. Teoh wrote:
> I think this is what Andrei has been saying every so often: stop deleting, start adding.  I.e., never remove old features, just add new ones to be used in favor if necessary.  If needed, deprecate and hide away old features (hide away the docs so that new people don't accidentally use it for new code), but never, ever remove it. If a new, better way to do something is discovered, add it to the language, highlight it front and center in the docs, but don't ever remove the old stuff.  If we need to redesign Phobos, for example, do it in a new namespace, don't remove/replace the old one.  Make it possible for the two to coexist.  Accretion rather than replacement.
>
> OT1H the idealist in me screams "please break my code, fix the language to remove all the broken stuff!".  OTOH the long-term code maintainer in me shakes his fists every time a compiler upgrade breaks one of my old projects.
>
> The solution to the conundrum is, accretion rather than replacement. Let the old code compile.  Complain about deprecations if you have to, but don't ever make it not compile (unless it was an outright bug to have compiled in the first place).  But new code can use the new stuff and benefit from it.

Of course, the obvious counterargument is that this approach is exactly how you end up with a language like C++: powerful in the hands of experts, with a large and successful ecosystem, but packed to the gills with pitfalls and footguns for beginners to hurt themselves with.

Ultimately, that's the question D's leadership has to answer. What's more important to D's future: existing users of D, or potential future users of D?

C++ has a lot of inertia, and doesn't necessarily need to attract new users to continue being a successful language. But I'm not sure the same is true for D. If C++ holds onto legacy features that cause pain for beginners, people will grumble and put up with it, because C++ is so entrenched that they have no better option. If D does the same thing, people are more likely to just leave for Rust, or Go, or C#.