November 17, 2014
On 11/17/2014 1:08 PM, "Ola Fosheim Grøstad" <ola.fosheim.grostad+dlang@gmail.com> wrote:
> I've been saying that for SOME OPERATIONS they are too, and that is not without
> evidence. Just plot it out for a 65xx, 680xx, Z80, etc., CPU and it becomes
> self-evident. Any system-level programmer should be able to do it in a few minutes.

When designing a language data type, you don't design it for "some" operations. You design it so that it works best most of the time, or at least let the user decide.

You can always add a sentinel for specific cases. But C forces its use for all strings for practical purposes. The design is backwards, and most of the time a sentinel is the wrong choice.
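To make the cost concrete, here is a minimal C sketch (the StrView type and its helpers are hypothetical, for illustration only): when the length travels with the pointer, taking the length is O(1) and a substring is just pointer arithmetic, whereas a sentinel forces an O(n) scan and makes non-terminal substrings require a copy or a mutation.

#include <stddef.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical length-carrying string view, for illustration only. */
typedef struct {
    const char *ptr;
    size_t      len;
} StrView;

/* O(1): the length is stored, not rediscovered by scanning for '\0'. */
static size_t view_len(StrView s) { return s.len; }

/* A substring is just pointer arithmetic; with a sentinel it would
   require either a copy or writing a '\0' into the original buffer. */
static StrView view_slice(StrView s, size_t start, size_t end) {
    return (StrView){ s.ptr + start, end - start };
}

int main(void) {
    const char *c_str = "filename.ext";
    StrView v = { c_str, strlen(c_str) };  /* one O(n) scan, then done */
    StrView stem = view_slice(v, 0, 8);    /* "filename": no copy, no '\0' */
    printf("%.*s\n", (int)view_len(stem), stem.ptr);
    return 0;
}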

BTW, I learned how to program on a 6800. I'm not ignorant of those machines. And frankly, C is too high-level for the 6800 (and the other 8-bit CPUs). The idea that C maps well onto those processors is mistaken, which is hardly surprising, as C was developed for the PDP-11, a 16-bit machine.

Yes, I know that people did use C for 8 bit machines.
November 17, 2014
On Monday, 17 November 2014 at 22:03:48 UTC, Walter Bright wrote:
> You can always add a sentinel for specific cases. But C forces its use for all strings for practical purposes. The design is backwards, and most of the time a sentinel is the wrong choice.

Ok, but I would rather say it like this: the language C doesn't really provide strings, it only provides literals in a particular format. So the literal format is a trade-off between having something generic and simple and having something more complex and possibly limited (a 255-character limit is not good enough in the long run).
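For contrast, a Pascal-style length prefix (a hypothetical sketch; this is not anything C ever specified) shows where that 255-character ceiling comes from:

#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical Pascal-style string: the first byte holds the length,
   so 255 is the hard ceiling on what can be represented. */
typedef struct {
    uint8_t len;
    char    data[255];
} PString;

static void pstr_set(PString *p, const char *src) {
    size_t n = strlen(src);
    if (n > 255) n = 255;  /* longer input is silently truncated */
    p->len = (uint8_t)n;
    memcpy(p->data, src, n);
}

int main(void) {
    PString p;
    pstr_set(&p, "hello");
    printf("%d: %.*s\n", (int)p.len, (int)p.len, p.data);  /* length is O(1) */
    return 0;
}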

I think there is a certain kind of beauty to the minimalistic approach taken with C (well, at least after ANSI-C came about in the late 80s). I like the language better than the libraries…

> BTW, I learned how to program on a 6800. I'm not ignorant of those machines. And frankly, C is too high-level for the 6800 (and the other 8-bit CPUs). The idea that C maps well onto those processors is mistaken.

Yes, I agree, but those instruction sets are simple. :-) With only 256 bytes of built-in RAM (IIRC), the 6800 was kind of skimpy on memory! We used it in high school for our classes in digital circuitry/projects.

(It is very difficult to discuss performance on x86; there is just too much clutter and machinery in the core that can skew results.)
November 18, 2014
On 11/17/2014 3:15 PM, "Ola Fosheim Grøstad" <ola.fosheim.grostad+dlang@gmail.com> wrote:
> Ok, but I would rather say it like this: the language C doesn't really provide
> strings, it only provides literals in a particular format. So the literal format
> is a trade-off between having something generic and simple and having something
> more complex and possibly limited (a 255-character limit is not good enough in
> the long run).

The combination of the inescapable array-to-pointer decay when calling a function, coupled with the Standard library, which is part of the language and takes char* as strings, means that for all practical purposes C does provide strings, and pretty much forces them on the programmer.
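A minimal sketch of the decay (illustrative only): even a declared array bound on a parameter is ignored by the compiler, so the length cannot cross a function boundary.

#include <stdio.h>

/* The parameter is declared as an array, but C rewrites it to a
   pointer; the bound 16 is silently discarded. */
void takes_array(char s[16]) {     /* really: char *s */
    printf("%zu\n", sizeof s);     /* sizeof(char *): 8 on most 64-bit targets */
}

int main(void) {
    char buf[16] = "hi";
    printf("%zu\n", sizeof buf);   /* 16: the size is known at the call site */
    takes_array(buf);              /* decays to &buf[0]; the 16 is lost */
    return 0;
}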


> I think there is a certain kind of beauty to the minimalistic approach taken
> with C (well, at least after ANSI-C came about in the late 80s). I like the
> language better than the libraries…

C is a brilliant language. That doesn't mean it hasn't made serious mistakes in its design. The array decay and 0 strings have proven to be very costly to programmers over the decades.

November 18, 2014
On Monday, 17 November 2014 at 19:24:49 UTC, Walter Bright wrote:
> On 11/17/2014 3:00 AM, "Ola Fosheim Grøstad" <ola.fosheim.grostad+dlang@gmail.com> wrote:
>> I am saying
>> that when you have <32KiB RAM total it makes sense to save space by not storing
>> the string length.
>
> I know what you're saying.
>
> You're saying without evidence that sentinels are faster. They are not.
> You're saying without evidence that 0 terminated strings use less memory. They do not.
>
> (It does not save space when "filename" and "filename.ext" cannot be overlapped.)

Stop wasting time with the mouth breather.
November 18, 2014
On Tuesday, 18 November 2014 at 02:35:41 UTC, Walter Bright wrote:
> On 11/17/2014 3:15 PM, "Ola Fosheim Grøstad" <ola.fosheim.grostad+dlang@gmail.com> wrote:
>> Ok, but I would rather say it like this: the language C doesn't really provide
>> strings, it only provides literals in a particular format. So the literal format
>> is a trade-off between having something generic and simple and having something
>> more complex and possibly limited (a 255-character limit is not good enough in
>> the long run).
>
> The combination of the inescapable array-to-pointer decay when calling a function, coupled with the Standard library, which is part of the language and takes char* as strings, means that for all practical purposes C does provide strings, and pretty much forces them on the programmer.
>
>
>> I think there is a certain kind of beauty to the minimalistic approach taken
>> with C (well, at least after ANSI-C came about in the late 80s). I like the
>> language better than the libraries…
>
> C is a brilliant language. That doesn't mean it hasn't made serious mistakes in its design. The array decay and 0 strings have proven to be very costly to programmers over the decades.

Heartbleed is a nice example.

Think of the amount of money spent in developer time, delivering software updates to customers, and buying new hardware where the firmware cannot be replaced.

This is just one case; the CVE list gets updated every day, and 90% of the issues are the usual C suspects: pointer misuse and out-of-bounds accesses.

Anyone writing C code should be following practices like https://wiki.debian.org/Hardening

--
Paulo
November 18, 2014
On Tuesday, 18 November 2014 at 04:58:43 UTC, Anonymous Coward wrote:
> Stop wasting time with the mouth breather.

Please write under your full name.
November 18, 2014
On Tuesday, 18 November 2014 at 02:35:41 UTC, Walter Bright wrote:
> C is a brilliant language. That doesn't mean it hasn't made serious mistakes in its design. The array decay and 0 strings have proven to be very costly to programmers over the decades.

I'd rather say that it is the industry that has misappropriated C, which in my view basically was "typed portable assembly" with very few built-in presumptions by design. This is important for getting control over layout, and this transparency is a quality that only C gives me. BCPL might be considered to have more presumptions (such as string length), being a minimal "bootstrapping subset" of CPL.

You always had the ability in C to implement arrays as a variable-sized struct with a length and a trailing data section, so I'd say that C provided type-safe variable-length arrays. Many people don't use it. Many people don't know how to use it. Ok, but then they don't understand that they are programming in a low-level language and are responsible for creating their own environment. I think C's standard lib mistakenly created an illusion of high-level programming that the language only partially supported.
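A minimal sketch of that idiom (the Array type and helpers are hypothetical; C99 spells the trailing section as a flexible array member, while pre-C99 code used a one-element array and the same arithmetic):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* A length field plus a trailing data section, allocated in one block. */
typedef struct {
    size_t len;
    char   data[];   /* C99 flexible array member */
} Array;

static Array *array_new(size_t len) {
    Array *a = malloc(sizeof(Array) + len);
    if (a) a->len = len;
    return a;
}

/* Bounds checking becomes possible because the length travels with the data. */
static int array_get(const Array *a, size_t i, char *out) {
    if (i >= a->len) return -1;  /* out of bounds: caught, not undefined */
    *out = a->data[i];
    return 0;
}

int main(void) {
    Array *a = array_new(3);
    if (!a) return 1;
    memcpy(a->data, "abc", 3);
    char c;
    printf("%d\n", array_get(a, 5, &c));  /* prints -1: the overrun is caught */
    free(a);
    return 0;
}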

Adding the ability to pass structs by value as parameters was probably not worth the implementation cost at the time… Having a "magic struct/tuple" that transfers a length or end pointer along with the head pointer does not fit the C design. If added, it should have been done as a struct, and to make that work you would have to add operator overloading. There's an avalanche effect of features and additional language design issues there.
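A sketch of such a "magic struct" with an end pointer (hypothetical names, purely illustrative): it can be passed by value, but without operator overloading every access has to be spelled as a function call, which is exactly the avalanche described above.

#include <stddef.h>
#include <stdio.h>

/* Hypothetical head/end-pointer slice: passing it by value carries the
   bounds along with the data pointer. */
typedef struct {
    const char *head;
    const char *end;
} Slice;

static ptrdiff_t slice_len(Slice s) { return s.end - s.head; }

/* Without operator overloading, s[i] has to be written as a call. */
static char slice_at(Slice s, ptrdiff_t i) {
    return s.head[i];  /* caller is responsible for i < slice_len(s) */
}

int main(void) {
    const char msg[] = "hello";
    Slice s = { msg, msg + 5 };
    for (ptrdiff_t i = 0; i < slice_len(s); i++)
        putchar(slice_at(s, i));
    putchar('\n');
    return 0;
}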

I think K&R deserve credit for being able to say no and stay minimal, and I think the Go team deserves the same credit. As you've experienced with D, saying no is hard, because there are often good arguments for features being useful, and it is difficult to say in advance with certainty what kind of avalanche effect adding a feature will have (in terms of semantics, special casing, and new needs for additional support/features, plus the time to complete implementation and debugging). So saying no until practice shows that a feature is sorely missed is a sign of good language design practice.

The industry wanted portability and high speed, and insisted on moving as a flock after C and BLINDLY after C++. Seriously, the media frenzy around C++ was hysterical, despite C++ being a bad design from the start. The C++ media noise was worse than with Java, IIRC. Media are incredibly shallow when they are trying to sell mags/books based on the "next big thing", and they can accelerate adoption beyond merits, of which both C++ and Java are good examples.

There were alternatives such as Turbo Pascal, Modula-2/3, Simula, Beta, ML, Eiffel, Delphi and many more. Yet, programmers thought C was cool because it was "portable assembly" and "industry standard" and "fast" and "safe bet". So they were happy with it, because the C compilers emitted fast code. And fast was more important to them than safe. Well, they got what they deserved, right?

Not adding additional features is not a design mistake if you try hard to stay minimal and don't claim to support high-level programming. The mistake is in using a tool as if it supports something it does not.

You might be right that K&R set the bar too high for adding extra features. Yet others might be right that D has been too willing to add features. As you know, the perfect balance is difficult to find and is dependent on the use context, so it only materializes after the fact (after implementation). And C's use context has expanded way beyond the original one, where people were not afraid to write assembly.

(But the incomprehensible type notation for function pointers was a design mistake, since that was a feature of the language itself.)
November 18, 2014
On Tuesday, 18 November 2014 at 11:15:28 UTC, Ola Fosheim Grøstad wrote:
> On Tuesday, 18 November 2014 at 02:35:41 UTC, Walter Bright wrote:
>> C is a brilliant language. That doesn't mean it hasn't made serious mistakes in its design. The array decay and 0 strings have proven to be very costly to programmers over the decades.
>
> I'd rather say that it is the industry that has misappropriated C, which in my view basically was "typed portable assembly" with very few built-in presumptions by design.

Lint was created in 1979, when it was already clear that most AT&T developers weren't writing correct C code!

>
> I think K&R deserve credit for being able to say no and stay minimal, and I think the Go team deserves the same credit.

Of course, two of them are from the same team.

> The industry wanted portability and high speed, and insisted on moving as a flock after C and BLINDLY after C++. Seriously, the media frenzy around C++ was hysterical, despite C++ being a bad design from the start. The C++ media noise was worse than with Java, IIRC. Media are incredibly shallow when they are trying to sell mags/books based on the "next big thing", and they can accelerate adoption beyond merits, of which both C++ and Java are good examples.
>
> There were alternatives such as Turbo Pascal, Modula-2/3, Simula, Beta, ML, Eiffel, Delphi and many more. Yet, programmers thought C was cool because it was "portable assembly" and "industry standard" and "fast" and "safe bet".

This was a consequence of UNIX spreading into the enterprise. Just as we have to endure JavaScript to target the browser, we were forced to code in C to target UNIX.

Other OSes just followed along, as we started wanting to port those big-iron utilities to smaller computers.

If UNIX had been written in XPTO-LALA, we would all be coding in XPTO-LALA today.


--
Paulo
November 18, 2014
On Tuesday, 18 November 2014 at 08:28:19 UTC, Paulo Pinto wrote:
> This is just one case; the CVE list gets updated every day, and 90% of the issues are the usual C suspects: pointer misuse and out-of-bounds accesses.

Sure, but these are not strictly language issues, since the same developers would turn off bounds checking at the first opportunity anyway!

Professionalism does not involve blaming the tool; it involves picking the right tools and process for the task. Unfortunately, the IT industry has over time suffered from a lack of formal education and immature markets. Software is considered to work when it crashes only once every 24 hours; we would not accept that from any other utility.

I've never heard anyone in academia claim that C is anything more than a small step up from assembler (i.e. low-level), so why allow intermediate-skill programmers to write C code if, for the same application, you would not allow an excellent programmer to write the same program in assembly (about the same risk of a crash)? People get what they deserve.

Never blame the tool for bad management. You get to pick the tool and the process, right? Neither the tool nor testing will ensure correct behaviour on its own. You have many factors that need to play together (mindset, process, and the tool set).

If you want a compiler that works, you're probably better off writing it in ML than in C, but people implement it in C. Why? Because they FEEL like it… It is not rational. It is emotional.
November 18, 2014
On Tuesday, 18 November 2014 at 12:02:01 UTC, Paulo Pinto wrote:
> On Tuesday, 18 November 2014 at 11:15:28 UTC, Ola Fosheim Grøstad wrote:
>> I'd rather say that it is the industry that has misappropriated C, which in my view basically was "typed portable assembly" with very few built-in presumptions by design.
>
> Lint was created in 1979 when it was already clear most AT&T developers weren't writing correct C code!

Sure, but most operating system vendors considered it a strategic move to ensure the availability of high-level languages on their mainframes. E.g. Univac provided Algol, and gave the developers of Simula a significant rebate on the purchase of a Univac to ensure that Simula would be available for high-level programming.

>> There were alternatives such as Turbo Pascal, Modula-2/3, Simula, Beta, ML, Eiffel, Delphi and many more. Yet, programmers thought C was cool because it was "portable assembly" and "industry standard" and "fast" and "safe bet".
>
> This was a consequence of UNIX spreading into the enterprise. Just as we have to endure JavaScript to target the browser, we were forced to code in C to target UNIX.

Nobody was forced to write code in C to target anything; it was a choice. And a choice that grew out of a focus on performance and the fact that people still dropped down to write machine language quite frequently. Mentality matters.

JavaScript is different, since it is "the exposed VM" in the browser, but even there you don't have to write in JavaScript. You can write in a language that compiles to JavaScript.