May 17, 2017
On 5/17/17 8:27 PM, H. S. Teoh via Digitalmars-d wrote:
> On Wed, May 17, 2017 at 04:16:59PM -0700, Walter Bright via Digitalmars-d wrote:
>> On 5/17/2017 1:46 PM, H. S. Teoh via Digitalmars-d wrote:
>>> People aren't willing to accept that their cherished choice of
>>> language may have been the wrong one, especially if they have
>>> invested much of their lives in mastering said language.
>>
>> It may not be the developers that initiate this change. It'll be the
>> managers and the customers who force the issue - as those are the
>> people who'll pay the bill for the problems.
>
> That may or may not force a shift to a different language. In fact, the
> odds are heavily stacked against a language change. Most management are
> concerned (and in many cases, rightly so) about the cost of rewriting
> decades-old "proven" software as opposed to merely plugging the holes in
> the existing software.  As long as they have enough coders plugging away
> at the bugs, they're likely to be inclined to say "good enough".

What will cause a shift is a continuous business loss.

Suppose businesses A and B are competing in the same space, and A has the larger market share but suffers a customer data breach. B absorbs many of A's customers and takes over the market, and it turns out that the reason B wasn't affected was that it used a memory-safe language.

Business cases like this will continue to pile up until it is considered irresponsible to use a non-memory-safe language. It will be even more obvious when companies like B are much smaller and less well funded than companies like A, yet still overtake them because of that advantage.

At least, this is the only way I can see C ever "dying". And of course by dying, I mean that it just won't be selected for large startup projects. It will always live on in low level libraries, and large existing projects (e.g. Linux).

I wonder to what extent something like D in betterC mode could take over some of these tasks?

-Steve
May 17, 2017
On Wed, May 17, 2017 at 08:58:31PM -0400, Steven Schveighoffer via Digitalmars-d wrote:
> On 5/17/17 8:27 PM, H. S. Teoh via Digitalmars-d wrote:
[...]
> What will cause a shift is a continuous business loss.
> 
> Suppose businesses A and B are competing in the same space, and A has the larger market share but suffers a customer data breach. B absorbs many of A's customers and takes over the market, and it turns out that the reason B wasn't affected was that it used a memory-safe language.
> 
> Business cases like this will continue to pile up until it is considered irresponsible to use a non-memory-safe language. It will be even more obvious when companies like B are much smaller and less well funded than companies like A, yet still overtake them because of that advantage.

This is a possible scenario, but I don't see it as particularly likely, because where data breaches are concerned, memory safety is only part of the equation. Other factors also determine the overall susceptibility of a system: secure coding practices in the broader sense, including resource management, proper use of cryptography, privilege separation, access control, data sanitization, etc.  In spite of C's flaws, it *is* still possible to build a relatively secure system in C. It's harder, no argument there, but possible; it depends on whether the company enforces secure coding practices (or not). Conversely, in a memory-safe language you can still make blunders that allow breaches, such as SQL injection, in spite of memory safety.
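To make that last point concrete, here is a minimal sketch in Python (the table, data, and function names are hypothetical, invented purely for illustration) of how a memory-safe language still permits SQL injection when a query is assembled by string formatting rather than parameter binding:

```python
import sqlite3

# Hypothetical schema and data, just for demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def find_user_unsafe(name):
    # Vulnerable: attacker-controlled input is spliced into the SQL text,
    # so it can change the shape of the query itself.
    query = "SELECT name FROM users WHERE name = '%s'" % name
    return conn.execute(query).fetchall()

def find_user_safe(name):
    # Safe: the driver binds the value separately from the SQL text,
    # so the input can only ever be treated as data.
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)).fetchall()

payload = "' OR '1'='1"
print(find_user_unsafe(payload))  # every row leaks, despite memory safety
print(find_user_safe(payload))    # no match: no user is literally named that
```

Neither version ever touches memory unsafely; the breach in the first one is purely a matter of how untrusted input is handled.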



> At least, this is the only way I can see C ever "dying". And of course by dying, I mean that it just won't be selected for large startup projects. It will always live on in low level libraries, and large existing projects (e.g. Linux).
[...]

If that's your definition of "dying", then C has already been steadily dying over the past decade or two. :-)  Since the advent of Java, C#, and the rest of that ilk, programmers have been leaving C in droves and adopting these other languages instead. I don't have concrete data to back this up, but my suspicion is that the majority of new projects started today are not in C but in Java, C#, JavaScript, and similar languages, with a smaller percentage in C++, depending on the type of project. Perhaps in gaming circles C++ is still dominant, but in the business-applications world and the web-apps world the trend is definitely toward Java, C#, et al. C's role has largely shrunk to embedded software and low-level stuff like OSes (mainly POSIX) and low-level network code. (Unfortunately, C still accounts for a significant amount of low-level network code, especially code running on network hardware like routers and firewalls, which is why security issues in C code remain a concern today.)

Nevertheless, there is still an ongoing stream of exploits and security incidents in the web-programming world, which is largely built on supposedly memory-safe languages like Java and JavaScript. (Well, there is that disaster called PHP that's still prevalent on the web; maybe it accounts for some percentage of these exploits. But that's mostly down to the implementation of PHP rather than the language itself, since AFAIK it doesn't let you manipulate memory directly in an unsafe way as C does. It does, however, let you do a lot of other stupid things, security-wise, that still pose problems even though the language is technically memory-safe. That's why I said memory safety only goes so far: you need a lot more than that to stand up to today's security threats.)


T

-- 
People who are more than casually interested in computers should have at least some idea of what the underlying hardware is like. Otherwise the programs they write will be pretty weird. -- D. Knuth
May 18, 2017
On Wednesday, 17 May 2017 at 20:41:43 UTC, Walter Bright wrote:
> On 5/17/2017 3:21 AM, Joakim wrote:
>> Hmm, this talk has become the most-viewed from this DConf, by far beating
>> Scott's keynote.  Wonder how, as this seems to be the only link to it, hasn't
>> been posted on reddit/HN.  I guess people like panels, the process panel last
>> year is one of the most viewed videos also.

Heh, someone just posted it to HN:

https://hn.algolia.com/?query=dconf2017%20walter%20safety

> I received -2 net votes on Hackernews for suggesting that the takeaway from the WannaCry fiasco for developers should be to use memory safe languages.
>
> Maybe the larger community isn't punished enough yet.

HN votes are an isolated case, but I'm not sure how much wider recognition there is that memory safety is a big part of the problem and that there exist viable languages that offer a way out, as Andrei said in the panel.  There is a long way to go in publicizing these new languages that offer better solutions.
May 18, 2017
On Thursday, 18 May 2017 at 00:58:31 UTC, Steven Schveighoffer wrote:
> On 5/17/17 8:27 PM, H. S. Teoh via Digitalmars-d wrote:
>> On Wed, May 17, 2017 at 04:16:59PM -0700, Walter Bright via Digitalmars-d wrote:
>>> On 5/17/2017 1:46 PM, H. S. Teoh via Digitalmars-d wrote:
>>>> [...]
>>>
>>> It may not be the developers that initiate this change. It'll be the
>>> managers and the customers who force the issue - as those are the
>>> people who'll pay the bill for the problems.
>>
>> That may or may not force a shift to a different language. In fact, the
>> odds are heavily stacked against a language change. Most management are
>> concerned (and in many cases, rightly so) about the cost of rewriting
>> decades-old "proven" software as opposed to merely plugging the holes in
>> the existing software.  As long as they have enough coders plugging away
>> at the bugs, they're likely to be inclined to say "good enough".
>
> What will cause a shift is a continuous business loss.
>
> Suppose businesses A and B are competing in the same space, and A has the larger market share but suffers a customer data breach. B absorbs many of A's customers and takes over the market, and it turns out that the reason B wasn't affected was that it used a memory-safe language.
>
> Business cases like this will continue to pile up until it is considered irresponsible to use a non-memory-safe language. It will be even more obvious when companies like B are much smaller and less well funded than companies like A, yet still overtake them because of that advantage.
>
> At least, this is the only way I can see C ever "dying". And of course by dying, I mean that it just won't be selected for large startup projects. It will always live on in low level libraries, and large existing projects (e.g. Linux).
>
> I wonder how much something like D in betterC mode can take over some of these tasks?
>
If you get it to compile for and run on an AVR, Cortex-R0, or other 16-bit µCs, then it would have a chance to replace C. As it stands, C is the only general "high-level" language that can be used on some classes of CPUs.
D requires, AFAICT, at least a 32-bit system with virtual memory, which is already a steep requirement for embedded work.
C will remain relevant in everything below that.

May 18, 2017
On Wed, 2017-05-17 at 17:27 -0700, H. S. Teoh via Digitalmars-d wrote:
> […]
> odds are heavily stacked against a language change. Most management
> are
> concerned (and in many cases, rightly so) about the cost of rewriting
> decades-old "proven" software as opposed to merely plugging the holes
> in
> the existing software.  As long as they have enough coders plugging
> away
> at the bugs, they're likely to be inclined to say "good enough".
[…]

If a lump of software is allowed into the "it works, do not touch it" category, then that is the beginning of the end for that product and that company. The accountants probably won't have realised it at the time they made that decision, but they have just signed the death warrant for that part of their organisation.

An organisation that keeps all of its software in development at all times may appear to spend more on development, but it is keeping its codebase in a fit state for evolution. As the market changes, the organisation can change without massive upheaval.

The difference here is between an organisation that treats software as a cost and one that treats it as an asset. As long as you do not measure the asset in lines of code, obviously.

The rather interesting anecdote of the moment is FORTRAN (and Fortran). Various code bases written in the 1960s must still be compilable by current Fortran compilers because no one is allowed to alter the 1960s source code. This makes Fortran one of the weirdest languages, and its compiler writers some of the best. Note, though, that all the organisations that followed the "the source code is fine as it is" line are now having real trouble hiring FORTRAN and Fortran developers, cf. the UK government and NASA. I believe some organisations are having to pay £2000 per day for these people.

So for the accountants: you need to look further than the next three months when it comes to your assets and bottom line over the lifetime of the organisation.

-- 
Russel.
=============================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder@ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel@winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder

May 18, 2017
On Thursday, 18 May 2017 at 05:07:38 UTC, Patrick Schluter wrote:
> On Thursday, 18 May 2017 at 00:58:31 UTC, Steven Schveighoffer wrote:
>> On 5/17/17 8:27 PM, H. S. Teoh via Digitalmars-d wrote:
>>> [...]
>>
>> What will cause a shift is a continuous business loss.
>>
>> Suppose businesses A and B are competing in the same space, and A has the larger market share but suffers a customer data breach. B absorbs many of A's customers and takes over the market, and it turns out that the reason B wasn't affected was that it used a memory-safe language.
>>
>> Business cases like this will continue to pile up until it is considered irresponsible to use a non-memory-safe language. It will be even more obvious when companies like B are much smaller and less well funded than companies like A, yet still overtake them because of that advantage.
>>
>> At least, this is the only way I can see C ever "dying". And of course by dying, I mean that it just won't be selected for large startup projects. It will always live on in low level libraries, and large existing projects (e.g. Linux).
>>
>> I wonder how much something like D in betterC mode can take over some of these tasks?
>>
> If you get it to compile for and run on an AVR, Cortex-R0, or other 16-bit µCs, then it would have a chance to replace C. As it stands, C is the only general "high-level" language that can be used on some classes of CPUs.
> D requires, AFAICT, at least a 32-bit system with virtual memory, which is already a steep requirement for embedded work.
> C will remain relevant in everything below that.

https://www.mikroe.com/products/#compilers-software

One of the few companies that thinks there is more to AVR, Cortex-R0, or other 16-bit µCs than just C.

In this specific case, they also sell BASIC and Pascal (TP-compatible) compilers.

There are other companies selling alternatives to C that are still in business; one just has to look beyond FOSS.
May 18, 2017
On Thursday, 18 May 2017 at 00:58:31 UTC, Steven Schveighoffer wrote:
> On 5/17/17 8:27 PM, H. S. Teoh via Digitalmars-d wrote:
>> [...]
>
> What will cause a shift is a continuous business loss.
>
> Suppose businesses A and B are competing in the same space, and A has the larger market share but suffers a customer data breach. B absorbs many of A's customers and takes over the market, and it turns out that the reason B wasn't affected was that it used a memory-safe language.
>
> Business cases like this will continue to pile up until it is considered irresponsible to use a non-memory-safe language. It will be even more obvious when companies like B are much smaller and less well funded than companies like A, yet still overtake them because of that advantage.
>
> At least, this is the only way I can see C ever "dying". And of course by dying, I mean that it just won't be selected for large startup projects. It will always live on in low level libraries, and large existing projects (e.g. Linux).
>
> I wonder how much something like D in betterC mode can take over some of these tasks?
>
> -Steve

Is there any way other than doing good work that's recognised as such and happens to be written in D? And I guess open-sourcing the dmd back end will, in time, make it a more acceptable language for such work.
May 18, 2017
On Thursday, 18 May 2017 at 06:36:55 UTC, Paulo Pinto wrote:
> On Thursday, 18 May 2017 at 05:07:38 UTC, Patrick Schluter wrote:
>> On Thursday, 18 May 2017 at 00:58:31 UTC, Steven Schveighoffer wrote:
>>> On 5/17/17 8:27 PM, H. S. Teoh via Digitalmars-d wrote:
>>>> [...]
>>>
>>> What will cause a shift is a continuous business loss.
>>>
>>> Suppose businesses A and B are competing in the same space, and A has the larger market share but suffers a customer data breach. B absorbs many of A's customers and takes over the market, and it turns out that the reason B wasn't affected was that it used a memory-safe language.
>>>
>>> Business cases like this will continue to pile up until it is considered irresponsible to use a non-memory-safe language. It will be even more obvious when companies like B are much smaller and less well funded than companies like A, yet still overtake them because of that advantage.
>>>
>>> At least, this is the only way I can see C ever "dying". And of course by dying, I mean that it just won't be selected for large startup projects. It will always live on in low level libraries, and large existing projects (e.g. Linux).
>>>
>>> I wonder how much something like D in betterC mode can take over some of these tasks?
>>>
>> If you get it to compile for and run on an AVR, Cortex-R0, or other 16-bit µCs, then it would have a chance to replace C. As it stands, C is the only general "high-level" language that can be used on some classes of CPUs.
>> D requires, AFAICT, at least a 32-bit system with virtual memory, which is already a steep requirement for embedded work.
>> C will remain relevant in everything below that.
>
> https://www.mikroe.com/products/#compilers-software
>
> One of the few companies that thinks there is more to AVR, Cortex-R0, or other 16-bit µCs than just C.
>
> In this specific case, they also sell BASIC and Pascal (TP-compatible) compilers.
>
> There are other companies selling alternatives to C that are still in business; one just has to look beyond FOSS.

The thing with C is that it is available from the tiniest systems to the biggest. I remember my former workplace, where the company's assets were communication protocols (mainframe, telecom, LAN, industrial). The same sources were used on Z80, x86 (from the 80186 to the Pentium), 68030, ARM, AVR and 8051 (granted, the last two didn't use much of the C code). Apart from C, I'm not aware of any language capable of that spread.
This doesn't mean that it won't change, or that something similar wasn't possible with other languages. Pascal was a good contender in the 90s, but paradoxically it was the success of Turbo Pascal that killed it (i.e. it left the ISO standard with no chance of being appealing). As for BASIC, the big issue is that it is not even portable within a platform or between versions.
Don't get me wrong, the products you listed are nice and a good step in the right direction, but they are far from there yet.
I love D, but it is not unfair to note that it lacks platform diversity (I wanted to use D for years for our project at work, but the lack of a Solaris/SPARCv9 port was an insurmountable obstacle).
May 18, 2017
On 5/17/2017 10:07 PM, Patrick Schluter wrote:
> D requires afaict at least a 32 bit system

Yes.

> with virtual memory,

No.

May 18, 2017
On Thursday, 18 May 2017 at 08:24:18 UTC, Walter Bright wrote:
> On 5/17/2017 10:07 PM, Patrick Schluter wrote:
>> D requires afaict at least a 32 bit system
>
> Yes.

What are the technical reasons for this limitation?
* LLVM has 16-bit targets.
* Nobody would use druntime on 16-bit anyway, and wouldn't generate module info either.
* User code is the user's problem.
That leaves the front end and LDC as the potential sources of limitation.