May 11, 2017
On Tuesday, 9 May 2017 at 14:13:31 UTC, Walter Bright wrote:
> 2. it may not be available on your platform

I just had to use valgrind for the first time in years at work (mostly Python code there) and I realized that there's no version that works on the latest OS X version. So valgrind runs on about 2.5% of computers in existence.

Fun!
May 11, 2017
On Thursday, 11 May 2017 at 21:20:35 UTC, Jack Stouffer wrote:
> On Tuesday, 9 May 2017 at 14:13:31 UTC, Walter Bright wrote:
>> 2. it may not be available on your platform
>
> I just had to use valgrind for the first time in years at work (mostly Python code there) and I realized that there's no version that works on the latest OS X version. So valgrind runs on about 2.5% of computers in existence.
>
> Fun!

Use ASAN.
May 11, 2017
On 05/10/2017 08:06 AM, Patrick Schluter wrote:
> On Wednesday, 10 May 2017 at 06:28:31 UTC, H. S. Teoh wrote:
>> On Tue, May 09, 2017 at 09:19:08PM -0400, Nick Sabalausky
> [...]
>> Perhaps I'm just being cynical, but my current unfounded hypothesis is
>> that the majority of C/C++ programmers ...
>
> Just a nitpick, could we also please stop conflating C and C++
> programmers? My experience is that C++ programmers are completely
> clueless when it comes to C programming. They think they know C, but
> they're generally far off. The thing is that C has evolved with C99 and C11
> and the changes have not all been adopted by C++ (and Microsoft actively
> stalling the adoption of C99 in Visual C didn't help either).

I wouldn't know the difference all that well anyway. Aside from a brief stint playing around with the Marmalade engine, the last time I was still really using C *or* C++ was back when C++ *did* mean little more than "C with classes" (and there was this new "templates" thing that was considered best avoided for the time being because all the implementations were known buggy). I left them when I could tell the complexity of getting things done (in either) was falling way behind the modern curve, and there were other languages which offered sane productivity without completely sacrificing low-level capabilities.

May 11, 2017
On 05/11/2017 11:53 AM, Jonathan M Davis via Digitalmars-d wrote:
>
> In a way, it's amazing how successful folks can be with software that's
> quite buggy. A _lot_ of software works just "well enough" that it gets the
> job done but is actually pretty terrible. And I've had coworkers argue to me
> before that writing correct software really doesn't matter - it just has to
> work well enough to get the job done. And sadly, to a great extent, that's
> true.
>
> However, writing software that works just "well enough" does come at a
> cost, and if security is a real concern (as it increasingly is), then that
> sort of attitude is not going to cut it. But since the cost often comes
> later, I don't think that it's at all clear that we're going to really see a
> shift towards languages that prevent such bugs. Up front costs tend to have
> a powerful impact on decision making - especially when the cost that could
> come later is theoretical rather than guaranteed.
>
> Now, given that D is also a very _productive_ language to write in, it
> stands to reduce up front costs as well, and that combined with its ability
> to reduce the theoretical security costs, we could have a real win, but with
> how entrenched C and C++ are and how much many companies are geared towards
> not caring about security or software quality so long as the software seems
> to get the job done, I think that it's going to be a _major_ uphill battle
> for a language like D to really gain mainstream use on anywhere near the
> level that languages like C and C++ have. But for those who are willing to
> use a language that makes it harder to write code with memory safety issues,
> there's a competitive advantage to be gained.
>

All very, unfortunately, true. It's like I say, the tech industry isn't engineering, it's fashion. There is no meritocracy here, not by a long shot. In tech: What's popular is right and what's right is popular, period.

May 11, 2017
On 05/10/2017 02:28 AM, H. S. Teoh via Digitalmars-d wrote:
>
> I'd much rather the compiler say "Hey, you! This piece of code is
> probably wrong, so please fix it! If it was intentional, please write it
> another way that makes that clear!" - and abort with a compile error.
>

In the vast majority of cases, yes, I agree. But I've seen good ideas of useful heads-ups the compiler *could* provide get shot down in favor of silence because making it an error would, indeed, be a pedantic pain.

As I see it, an argument against warnings is an argument against lint tools. And lint messages are *less* likely to get heeded, because the user has to actually go ahead and bother to install and run them.


>> That puts me strongly in the philosophy of "Code containing warnings:
>> Allowed while compiling, disallowed when committing (with allowances
>> for mitigating circumstances)."
>
> I'm on the fence about the former.  My current theory is that being
> forced to write "proper" code even while refactoring actually helps the
> quality of the resulting code.

I find anything too pedantic to be an outright error will *seriously* get in my way and break my workflow on the task at hand when I'm dealing with refactoring, debugging, playing around with an idea, etc., if I'm required to compulsively "clean them all up" at every little step along the way (it'd be like working with my mother hovering over my shoulder...).

And that's been the case even for things I would normally want to be informed of. Dead/unreachable code and unused variables are two examples that come to mind.

> The problem is that
> it's not enforced by the compiler, so *somebody* somewhere will
> inevitably bypass it.

I never understood the "Some people ignore it, therefore it's good to remove it and prevent anyone else from ever benefiting" line of reasoning.

I don't want all "caution" road signs ("stop sign ahead", "hidden driveway", "speed limit decreases ahead", etc.) ripped out of the ground and tossed just because there are some jackasses who ignore them and cause trouble. Bad things happen when people ignore road signs, and they do ignore road signs, therefore let's get rid of road signs. That wouldn't make any shred of sense, would it? It's the same thing here:

I'd rather have somebody somewhere bypass that enforcement than render EVERYONE completely unable to benefit from it, ever. When the compiler keeps silent about a code smell instead of emitting a warning, that's exactly the same as emitting a warning but *requiring* that *everybody* *always* ignores it.

"Sometimes" missing a heads-up is better than "always" missing it.


>> C/C++ doesn't demonstrate that warnings are doomed to be useless and
>> "always" ignored. What it demonstrates is that warnings are NOT an
>> appropriate strategy for fixing language problems.
>
> Point.  I suppose YMMV, but IME unless warnings are enforced with
> -Werror or equivalent, after a while people just stop paying attention
> to them, at least where I work.

So nobody else should have the opportunity to benefit from them?

Because that's what the alternative is. As soon as we buy into the "error" vs "totally ok" false dichotomy, we start hitting (and this is exactly what did happen in D many years ago) cases where a known code smell is too pedantic to be justifiable as a build-breaking error. So if we buy into the "error/ok" dichotomy, those code smells are forced into the "A-Ok!" bucket, guaranteeing that nobody benefits.

Those "X doesn't fit into the error vs ok dichotomy" realities are exactly why DMD wound up with a set of warnings despite Walter's philosophical objections to them.


> That's why my eventual conclusion is that anything short of enforcement
> will ultimately fail. Unless there is no way you can actually get an
> executable out of badly-written code, there will always be *somebody*
> out there that will write bad code. And by Murphy's Law, that somebody
> will eventually be someone in your team, and chances are you'll be the
> one cleaning up the mess afterwards.  Not something I envy doing (I've
> already had to do too much of that).

And when I am tasked with cleaning up that bad code, I *really* hope it's from me being the only one to read the warnings, and not because I just wasted the whole day tracking down some weird bug only to find it was caused by something the compiler *could* have warned me about, but chose not to because the compiler doesn't believe in warnings out of fear that somebody, somewhere might ignore it.

May 11, 2017
On 05/11/2017 10:20 PM, Nick Sabalausky (Abscissa) wrote:
> On 05/10/2017 02:28 AM, H. S. Teoh via Digitalmars-d wrote:
>>
>> I'm on the fence about the former.  My current theory is that being
>> forced to write "proper" code even while refactoring actually helps the
>> quality of the resulting code.
>
> I find anything too pedantic to be an outright error will *seriously*
> get in my way and break my workflow on the task at hand when I'm dealing
> with refactoring, debugging, playing around with an idea, etc., if I'm
> required to compulsively "clean them all up" at every little step along
> the way

Another thing to keep in mind is that deprecations are nothing more than a special type of warning. If code must be either "error" or "non-error" with no in-between, then that rules out deprecations. They would be forced to either become fatal errors (thus defeating the whole point of keeping an old symbol around marked as deprecated) or go away entirely.

May 12, 2017
On Thursday, 11 May 2017 at 15:53:40 UTC, Jonathan M Davis wrote:
> On Monday, May 08, 2017 23:15:12 H. S. Teoh via Digitalmars-d wrote:
>> Recently I've had the dubious privilege of being part of a department wide push on the part of my employer to audit our codebases (mostly C, with a smattering of C++ and other code, all dealing with various levels of network services and running on hardware expected to be "enterprise" quality and "secure") and fix security problems and other such bugs, with the help of some static analysis tools. I have to say that even given my general skepticism about the quality of so-called "enterprise" code, I was rather shaken not only to find lots of confirmation of my gut feeling that there are major issues in our codebase, but even more by just HOW MANY of them there are.
>
> In a way, it's amazing how successful folks can be with software that's quite buggy. A _lot_ of software works just "well enough" that it gets the job done but is actually pretty terrible. And I've had coworkers argue to me before that writing correct software really doesn't matter - it just has to work well enough to get the job done. And sadly, to a great extent, that's true.
>
> However, writing software that works just "well enough" does come at a cost, and if security is a real concern (as it increasingly is), then that sort of attitude is not going to cut it. But since the cost often comes later, I don't think that it's at all clear that we're going to really see a shift towards languages that prevent such bugs. Up front costs tend to have a powerful impact on decision making - especially when the cost that could come later is theoretical rather than guaranteed.
>
> Now, given that D is also a very _productive_ language to write in, it stands to reduce up front costs as well, and that combined with its ability to reduce the theoretical security costs, we could have a real win, but with how entrenched C and C++ are and how much many companies are geared towards not caring about security or software quality so long as the software seems to get the job done, I think that it's going to be a _major_ uphill battle for a language like D to really gain mainstream use on anywhere near the level that languages like C and C++ have. But for those who are willing to use a language that makes it harder to write code with memory safety issues, there's a competitive advantage to be gained.
>
> - Jonathan M Davis

D wasn't ready for mainstream adoption until quite recently, I think.  The documentation for Phobos when I started looking at D in 2014 was perfectly clear if you were more theoretically minded, but not for other people.  In a previous incarnation I tried to get one trader who writes Python to look at D, and he was terrified of it because of the docs. And I used to regularly have compiler crashes, and ldc was always too far behind dmd.  If you wanted to find commercial users, there didn't seem to be many, and it was hard to point to successful projects in D that people would have heard of or could recognise - at least not enough of them.  Perception has threshold effects and isn't linear.  There wasn't that much on the numerical front either. The D Foundation didn't exist and Andrei played superhero in his spare time.

All that's changed now in every respect.  I can point to the documentation and say we should have docs like that and with runnable tests /examples.  Most code builds fine with ldc, plenty of numerical libraries - thanks Ilya - and perception is quite different about commercial successes.  Remember what's really just incremental in reality can be a step change in perception.

I don't think the costs of adopting D are tiny upfront.  Putting aside the fact that people expect better IDE support than we have, and that we have quite frequent releases (not a bad thing, but it's where we are in maturity) and some of them are a bit unfinished and others break things for good reasons, build systems are not that great even for middling projects (200k sloc).  Dub is an amazing accomplishment for Sonke as one of many projects part time, but it's not yet so mature as a build tool.  We have extern(C++) which is great, and no other language has it.  But that's not the same thing as saying it's trivial to use a C++ library from D (and I don't think it's yet mature bugwise). No STL yet. Even for C compare the steps involved vs LuaJIT FFI.  Dstep is a great tool but not without some friction and it only works for C.

So one should expect to pay a price with all of this, and I think most of the price is upfront (also because you might want to wrap the libraries you use most often). And the price is paid by having to deal with things people often take for granted, so even if it's small in the scheme of things, it's more noticeable.

A community needs energy coming into it to grow, but if there's too quick an influx of newcomers that wouldn't be good either.  Eg if dconf were twice the size it would be a very different experience, not only in a positive way.

I think new things often grow not by taking the dominant player head on, but by growing in the interstices.  By taking hold in obscure niches nobody cares about you gain power to take on bigger niches, and over time it turns out some of those niches weren't so unimportant after all.  It's a positive for the health of D that it's dismissed and yet keeps growing; just imagine if Stroustrup had had a revelation, written a memo "the static if tidal wave" (BG 1995), persuaded the committee to deprecate all the features and mistakes that hold C++ back and stolen all D's best features in a single language release.  A challenger language doesn't want all people to take it seriously, because it doesn't have the strength to win a direct contest.  It just needs more people to take it seriously.

The best measure of the health of the language and its community might be: are more people using the language to get real work done, is it helping them do so, and what is the quality of new people becoming involved?  If those things are positive and external conditions are favourable, then I think it bodes well for the future.

And by external conditions I mean that people have gotten used to squandering performance and users' time - see Jonathan Blow on Photoshop for example.  If you have an abundance of a resource and keep squandering it, eventually you will run out of abundance.  Storage prices are collapsing, data sets are growing, Moore's Law isn't what it was, and even with dirt cheap commodity hardware it's not necessarily the case that one is I/O bound any more.  An NVMe drive does 2.5 GB/sec, and we are happy when we can parse JSON at 200 MB/sec.  People who misquote Knuth seem to write slow code, and life is too short to be waiting unnecessarily.  At some point people get fed up with slow code.

Maybe it's wrong to think about there being one true inheritor of the mantle of C and C++.  Maybe no new language will gain the market share that C has, and if so that's probably a good thing.  Mozilla probably never had any moments when they woke up and thought hmm maybe we should have used Go instead, and I doubt people writing network services think maybe Rust would have been better.

I said to Andrei at dconf that principals rather than agents are much more likely to be receptive towards the adoption of D.  If you take an unconventional decision and it doesn't work out, you look doubly stupid - it didn't work out, and on top of that nobody else made that mistake: what were you thinking?  So by far the best strategy - unless you're in a world of pain, and desperate for a way out - is to copy what everyone else is doing.

But if you're a principal - ie in some way an owner of a business - you haven't got the luxury of fooling yourself, not if you want to survive and flourish.  The buck stops here, so it's a risk to use D, but it's also a risk not to use D - you can't pretend the conventional wisdom is without risk when it may not suit the problem that's before you. And it's your problem today and it's still your problem tomorrow, and that leads to a different orientation towards the future than being a cog in a vast machine where the top guy is measured by whether he beats earnings next quarter.

The web guys do have a lot of engineers but they have an inordinate influence on the culture.  Lots more code gets written in enterprises and you never hear about it because it's proprietary and people aren't allowed to or don't have time to discuss it.  And maybe it's simply not even interesting to talk about, which doesn't mean it's not interesting to you, and economically important.

D covers an enormous surface area - a much larger potential domain set than Go or Rust.  Things are more spread out, hence the amusing phenomenon on Reddit and the like of people thinking that because they personally don't know anyone that uses D, nothing is happening and adoption isn't growing.  So assessing things by adoption within the niches where people are chatty is interesting but doesn't tell you much.

I don't think most users post on the forum much.  It's the subset of people who, for intrinsic or instrumental reasons, like posting on the forum that do.

So if I am right about the surface area and the importance of principals then you should over time see people popping up from areas you had never thought of that have the power to make decisions and trust their own judgement because they have to.  That's how you know the language is healthy - that they start using D and enough of them have success with it.

Liran at Weka had never heard of D not long before he based his company on it.  I had never imagined a ship design company might use Extended Pascal, let alone that D might be a clearly sensible option for automated code conversion and be a great fit for new code.

And I am sure Walter is right about the importance of memory safety.  But outside of certain areas D isn't in a battle with Rust; memory safety is one more appealing modern feature of D.  To say it's important to get it right isn't to say it has to defeat Rust. Not that you implied this, but some people at dconf seemed to implicitly think that way.


Laeeth


May 12, 2017
On Friday, May 12, 2017 04:08:52 Laeeth Isharc via Digitalmars-d wrote:
> And I am sure Walter is right about the importance of memory safety.  But outside of certain areas D isn't in a battle with Rust; memory safety is one more appealing modern feature of D. To say it's important to get it right isn't to say it has to defeat Rust. Not that you implied this, but some people at dconf seemed to implicitly think that way.

I think that we're far past the point that any language is going to beat everyone else out. Some languages will have higher usage than others, and it's going to vary quite a lot between different domains. Really, it's more of a question of whether a language can compete well enough to be relevant and be used by a lot of developers, not whether it's used by most developers.

For instance, D and Go are clearly languages that appeal to a vastly different set of developers, and while they do compete on some level, I think that they're ultimately just going to be used by very different sets of people, because they're just too different (e.g. compare Go's complete lack of generics with D's templates). Rust, on the other hand, seems to have a greater overlap with D, so there's likely to be greater competition there (certainly more competition with regards to replacing C++ in places where C++ is replaced), but they're still going to appeal to different sets of developers to an extent, just like C++ and D have a lot of overlap but don't appeal to the same set of developers. I fully expect that both Rust and D have bright futures, but I also don't really expect either to become dominant. That's just too hard for a language to do, especially since older languages don't really seem to go away. The programming language ecosystem just becomes more diverse. At most, a language is dominant in a particular domain, not the software industry as a whole.

I would love for D to become a serious player in the programming language space such that you see D jobs out there like we currently see C/C++ or Java jobs out there (right now, as I understand it, even Sociomantic Labs advertises for C++ programmers, not D programmers). But ultimately, what I care about is being able to use D when I program and have enough of an ecosystem around it that there are useful libraries and frameworks that I can use and build upon, because D is the language that I prefer and want to program in. Having D destroy C/C++ or Java or C# or Rust or whatever really isn't necessary for that. It just needs to become big enough that it has a real presence, whereas right now, it seems more like the folks who use it professionally are doing so in stealth mode (even if they're not doing so purposefully). Anyone who wants to get a job somewhere and work in D is usually going to have a hard time of it right now, even though such jobs do exist. As it stands, I think a relatively small percentage of D's contributors are able to use D for their day jobs. And if we can really change _that_, then we'll have gotten somewhere big, regardless of what happens with other languages.

- Jonathan M Davis

May 12, 2017
On 2017-05-09 16:13, Walter Bright wrote:

> I agree. But one inevitably runs into problems relying on valgrind and
> other third party tools:
>
> 1. it isn't part of the language
>
> 2. it may not be available on your platform
>
> 3. somebody has to find it, install it, and integrate it into the
> dev/test process
>
> 4. it's incredibly slow to run valgrind, so there are powerful
> tendencies to skip it
>
> valgrind is a stopgap measure, and has saved me much grief over the
> years, but it is NOT the solution.

AddressSanitizer [1] is a tool similar to Valgrind which is built into the Clang compiler; just add an additional flag. It instruments the binary with the help of the compiler, so execution speed will not be that much slower compared to a regular build.

Clang also contains ThreadSanitizer [2], which is supposed to detect data races.

[1] https://clang.llvm.org/docs/AddressSanitizer.html
[2] https://clang.llvm.org/docs/ThreadSanitizer.html

-- 
/Jacob Carlborg
May 12, 2017
On Saturday, 6 May 2017 at 09:53:52 UTC, qznc wrote:
> On Saturday, 6 May 2017 at 06:26:29 UTC, Joakim wrote:
>> [...]
>
> Hm, Sociomantic removes the live captures the next day?
>
> One request: Chop the panel discussion into one clip per question/topic, please. Alternatively, provide some means to easily jump to the start of each question.

Video of the exchange is now back up:

https://www.youtube.com/watch?v=Lo6Q2vB9AAg#t=24m37s

The question now starts at the 22m19s mark.