August 09, 2021

On Sunday, 8 August 2021 at 18:26:10 UTC, Brian Tiffin wrote:

>

On Saturday, 7 August 2021 at 12:15:15 UTC, IGotD- wrote:

>

...
Language designers seem to have a big brother attitude towards programmers and think they will save the world by introducing limitations.

Examples.
...
2.
Somewhat related: when Java was designed, the designer (James Gosling, I believe) claimed that programmers were too stupid to understand the difference between signed and unsigned math (despite often several years of university education) and removed unsigned math entirely from the language. The impact is that when unsigned math is required, you are forced into conversions and library solutions. Not ideal when a HW API deals with unsigned numbers, for example.

You are welcome to add any other examples that you find significant for the discussion.

This applies to D to some extent, but it can often be found in other languages and in the mentality of several language designers.

The question is, do you think language designers go too far when trying to "save" programmers from misuse?
Do you think there are approaches that both prevent bugs and do not limit the language?

Just to point out that using Java as an example seems a bit off. Java (well, the JVM) is the ultimate in restrictive: a compiler that doesn't emit CPU code, but safe(r) emulation in a sandbox. ...

If you are talking about Java 1.0 - 1.2, yeah.

Since around 2000, AOT compilation has been available in commercial JDKs such as Excelsior JET, IBM RealTime WebSphere, and Aonix; even GCC had GCJ until 2009.

Nowadays PTC and Jamaica are still around, GraalVM exists, the JIT cache from JRockit has been made part of OpenJDK, and IBM RealTime WebSphere is now part of IBM OpenJ9.

August 09, 2021
On Sunday, 8 August 2021 at 08:59:16 UTC, Walter Bright wrote:
> All he said was "thank god for mom!"

Just an off-topic philosophical question here, but was he thanking god, his mom, or both?
August 09, 2021

On Sunday, 8 August 2021 at 09:49:41 UTC, Dylan Graham wrote:

>

On Saturday, 7 August 2021 at 12:15:15 UTC, IGotD- wrote:

>

[snip]

Zig does stuff like this and it's why I can't take that language seriously. At all. To paraphrase what I was told by Zig's community and BDFL: "if there's multiple ways to do something then obviously dumb dumb programmer will get it completely wrong".

I like programming languages that help me catch bugs (D or Rust), not languages that treat me like a 3 year old.

To be fair, I don't think the lack of unsigned integers is a big deal. Many operators do not care whether their operands are signed or unsigned, and for the remaining ones (divide, modulus, right shift) the language can define standard library functions.

But I do agree in general that either Java does not have much faith in programmer ability, or its designers value implementation simplicity much more than readability. I can see no other reasons for disallowing free functions, or for requiring curly braces on every single if or while statement, or for requiring public to be repeated on every public member instead of just once before the first one, as in C++. (Haven't done much Java, forgive me if I recall the details wrong.)

August 09, 2021

On Monday, 9 August 2021 at 12:14:27 UTC, Dukc wrote:

>

On Sunday, 8 August 2021 at 09:49:41 UTC, Dylan Graham wrote:

>

On Saturday, 7 August 2021 at 12:15:15 UTC, IGotD- wrote:

>

[snip]

Zig does stuff like this and it's why I can't take that language seriously. At all. To paraphrase what I was told by Zig's community and BDFL: "if there's multiple ways to do something then obviously dumb dumb programmer will get it completely wrong".

I like programming languages that help me catch bugs (D or Rust), not languages that treat me like a 3 year old.

To be fair, I don't think the lack of unsigned integers is a big deal. Many operators do not care whether their operands are signed or unsigned, and for the remaining ones (divide, modulus, right shift) the language can define standard library functions.

But I do agree in general that either Java does not have much faith in programmer ability, or its designers value implementation simplicity much more than readability. I can see no other reasons for disallowing free functions, or for requiring curly braces on every single if or while statement, or for requiring public to be repeated on every public member instead of just once before the first one, as in C++. (Haven't done much Java, forgive me if I recall the details wrong.)

You can kind of fake free functions with import static or method references.

Compound statements aren't required for if/while in Java; that's only a convention inherited from C best practices. Apple's goto fail bug showed the world why they are a good idea.

Class members without a visibility specifier default to package visibility, so they are visible to the other classes in the package or module (as of Java 9+).

August 09, 2021

On Saturday, 7 August 2021 at 12:15:15 UTC, IGotD- wrote:

>

Somewhat related: when Java was designed, the designer (James Gosling, I believe) claimed that programmers were too stupid to understand the difference between signed and unsigned math (despite often several years of university education) and removed unsigned math entirely from the language. The impact is that when unsigned math is required, you are forced into conversions and library solutions. Not ideal when a HW API deals with unsigned numbers, for example.

Programmers are humans that write programs. I've surely written more than a million lines of code in my life (who knows how much, but that's definitely a lower bound) and I did not study unsigned math in college. I took one programming class and I've done a lot of independent study. Maybe I could figure out how to work with unsigned math, but why would I want to? I have better things to do with my time.

But set all that aside. Anyone that's taught a university class will agree that you can't assume someone understands something just because they attended a lecture and took a test over it.

I don't necessarily disagree that there are some cases of overly restrictive language design. I didn't last very long with Go for that reason. I just think unsigned math is not the best example. Switching to safe by default would be a better example.

August 09, 2021

On Monday, 9 August 2021 at 15:15:53 UTC, bachmeier wrote:

>

On Saturday, 7 August 2021 at 12:15:15 UTC, IGotD- wrote:

>

Somewhat related: when Java was designed, the designer (James Gosling, I believe) claimed that programmers were too stupid to understand the difference between signed and unsigned math (despite often several years of university education) and removed unsigned math entirely from the language. The impact is that when unsigned math is required, you are forced into conversions and library solutions. Not ideal when a HW API deals with unsigned numbers, for example.

Programmers are humans that write programs. I've surely written more than a million lines of code in my life (who knows how much, but that's definitely a lower bound) and I did not study unsigned math in college. I took one programming class and I've done a lot of independent study. Maybe I could figure out how to work with unsigned math, but why would I want to? I have better things to do with my time.

But set all that aside. Anyone that's taught a university class will agree that you can't assume someone understands something just because they attended a lecture and took a test over it.

I don't necessarily disagree that there are some cases of overly restrictive language design. I didn't last very long with Go for that reason. I just think unsigned math is not the best example. Switching to safe by default would be a better example.

Doesn't unsigned math exist because we are limited in how many bits a number can hold? There's no concept of signed/unsigned for float/double/real. Which one is simpler and introduces less complexity for the programmer writing the code?

August 09, 2021

On 8/7/21 8:15 AM, IGotD- wrote:

>

This is a general discussion which applies to all computer languages, spanning several decades. What I have observed is that language designers see programmers misuse the language and introduce possible bugs, and therefore remove features from languages. An analogy would be limiting the functionality of cars because people are sometimes involved in accidents, like automatic speed limiters (soon to be law in several countries).

Language designers seem to have a big brother attitude towards programmers and think they will save the world by introducing limitations.

Examples.

Array indexes should be signed instead of unsigned because somehow programmers mess up loops, among other things. Bjarne Stroustrup considered his unsigned index to be a historic mistake. While unsigned array indexes make perfect sense, the bias towards signed seems to be that programmers are stupid. The question is, if you can't make a for loop with unsigned math, shouldn't you look for another job?

If people who have made mistakes with unsigned math in loops were disqualified from programming, this place would be a ghost town.

Also note that signed indexes are allowed (you can also use negative indexes)! It's the array LENGTH being unsigned which is the problem. Note also that due to D's promotion rules, using any unsigned value poisons all your other values to be unsigned (usually unexpectedly).

>

Somewhat related: when Java was designed, the designer (James Gosling, I believe) claimed that programmers were too stupid to understand the difference between signed and unsigned math (despite often several years of university education) and removed unsigned math entirely from the language. The impact is that when unsigned math is required, you are forced into conversions and library solutions. Not ideal when a HW API deals with unsigned numbers, for example.

"too stupid" seems like an incorrect assessment. More like "too careless". Consider that it's not really unsigned math or signed math, but when you are doing math between signed and unsigned values, what should happen? That's where most people get into trouble. Note that signed math and unsigned math is identical, it's just most people aren't doing math around the value 2^31, but they do a lot around the value 0.

I would love for D to use signed indexes for arrays, especially with 64-bit integers.

>

The question is, do you think language designers go too far when trying to "save" programmers from misuse?
Do you think there are approaches that both prevent bugs and do not limit the language?

Until the world of programming is ruled by perfect AI, please keep trying to fix my stupid human mistakes, thanks!

However, I do know of cases that have gone too far. Like Swift eliminating for loops -- that one stung.

-Steve

August 09, 2021

On Monday, 9 August 2021 at 16:14:49 UTC, Mark wrote:

>

Doesn't unsigned math exist because we are limited in how many bits a number can hold? There's no concept of signed/unsigned for float/double/real. Which one is simpler and introduces less complexity for the programmer writing the code?

I'm definitely not qualified to answer such questions, but I can say I hate unsigned integers with a passion. See for instance https://forum.dlang.org/post/rdrqedmbknwrppbfixll@forum.dlang.org

August 09, 2021
On Mon, Aug 09, 2021 at 04:14:49PM +0000, Mark via Digitalmars-d wrote: [...]
> Isn't unsigned math exist because we are limited in how many bits the number can hold? There's no concept of signed/unsigned for float/double/real. Which one is simpler that would introduce less complexity for the programmer writing code?

Floating-point, if anything, is *more* complex and comes with more pitfalls than signed/unsigned mistakes. Worse yet, it *appears* to behave the way most people imagine real numbers behave, but in fact doesn't, which makes mistakes more likely, and also harder to catch, because they usually only crop up in special corner cases while generally appearing to work correctly.

Cf.: https://docs.oracle.com/cd/E19957-01/806-3568/ncg_goldberg.html

At the end of the day, it boils down to: you have to learn how your programming language and its types work. If you imagine you can just cowboy your way through programming without fully understanding what you're doing, you only have yourself to blame when the results are disappointing.

As Walter once said:

	I've been around long enough to have seen an endless parade of
	magic new techniques du jour, most of which purport to remove
	the necessity of thought about your programming problem.  In the
	end they wind up contributing one or two pieces to the
	collective wisdom, and fade away in the rearview mirror.
	-- Walter Bright


T

-- 
Windows 95 was a joke, and Windows 98 was the punchline.
August 09, 2021

On Monday, 9 August 2021 at 16:27:28 UTC, Steven Schveighoffer wrote:

>

However, I do know of cases that have gone too far. Like Swift eliminating for loops -- that one stung.

-Steve

Is this correct? For loops with the classical C syntax were removed, but you can still have for loops over ranges, and the x...y syntax makes an integer range of your liking. This is similar to foreach (i; 0 .. 3) in D.

It's just a syntax change, and bugs with ranges are probably just as easy to make as with the old C syntax.