November 28, 2014
inline:

On Friday, 28 November 2014 at 05:14:31 UTC, Mike wrote: <snip>
> As I see it, D has a lot of contributors willing to maintain and enhance it, but the barrier to entry and learning curve is quite high for anything significant.

 I say there are very few. I have to challenge your data that there are a lot of contributors. Most projects list their maintainers; what is the number of FTE D maintainers?

That is a loaded question, of course: the surface area of the language
(http://dlang.org/comparison.html)
is a long list of sacred cows relative to the number of FTE maintainers.


> Therefore, the number of contributors that actually make a significant difference is quite small.
> Also, since it is a volunteer effort, there's really not much accountability.  Contributors are free to make significant contributions, and then leave the D scene without anyone able to maintain their work.  Contributors are also free to do partial implementations, and move onto something else without completing what they started.
>
> I think anyone wishing to leverage D for a critical project should be aware of that, and perhaps D should be more transparent about it.  That doesn't mean D shouldn't be used for anything of importance, it just means that by using D we enter into an implicit contract requiring us to either be willing to become significant contributors ourselves, or be willing to partner with and fund D's development.
>

How much $ funding would it take for me to consider it?
For example, funding for regressions to be fixed, but only for the core (none of the silly things, which is a long list: GC, associative arrays, demangle, annotations, higher-order functions, pseudo members, transmogrify, goto, try/catch, variadics, traits, variant, SQLite, etc.). I am not saying to remove them, just make them downstream, like GNU does, not part of the regression-tested D core. If I wanted to use a big language I'd use the CLR or the JRE.


> When I first approached D, I had high expectations, and as I learned more, I also became disappointed

Yes, that is the pattern: come in, and leave. There have to be more people staying and more D shops (mine is a pure, 100% D shop).
The regressions are painful, and I propose a way to let people experiment and compete on rich features: make a small, stable core that can be maintained, so that people stay.

> and said a few things in this forum I wish I could take back. I misunderstood the culture of this community and how it works, and perhaps I still do. But what is more apparent now is that we can't expect to rely on D if we're not willing to make significant contributions ourselves, or fund its development. And, while using D is itself a small contribution, it's not enough.
>
>> Let's make D core small, please. List of things that are core: char arrays. What else should be core?
>> The key question in this thread: how small should core D be to be maintainable and stable? How can it be split?
>
> I would like D core to be small as well, but for different reasons:  I would like a nimble language that can be used for both the smallest and largest of machines, and let the libraries differentiate.  But, I don't think it should be partitioned based on the size of D's human resources, partially because that is going to fluctuate drastically throughout time.
>  Rather, the division should be made based on architectural and design goals.

The culture of the D contributors is 'to bravely peer into the gates of hell', as in experiment. As a user of D, I want it to be small, stable, and maintained. I proposed a path that allows for both, without scaring off people who use D for real projects. If that is not possible, then I missed the fact that D is an experimental language.

>
> You might consider doing what other organizations have done:  Allocating your D talent to fix the problems you have encountered by submitting pull requests, or open issues and place bounties on them at https://www.bountysource.com/teams/d/issues.
>
> Regardless, I'd still be interested in knowing, more specifically, what problems you're encountering.
>
> Mike

My problem: I have 3 developers (and am still hiring more) who work for me and say: please, let's not use D, it is not stable release to release (and has a bunch of things no one needs).
I told them: we are a D shop because, long term, I anticipate that D and Walter will succeed.
So I hope the maintainers consider the users of D and don't chase them away.
Vic

November 28, 2014
On Fri, 28 Nov 2014 20:00:35 +0000
Vic via Digitalmars-d <digitalmars-d@puremagic.com> wrote:

> etc. I am not saying to remove them, just make them downstream, like GNU does, not part of the regression-tested D core. If I wanted to use a big language I'd use the CLR or the JRE.
if i wanted some crippled language that has almost nothing in it, i could take any of the scripting languages out there, or some esoteric compiler.

having dynamic arrays, associative arrays and strings in the language is a must for me: i don't want to reinvent the wheel each time or add some libraries for such simple things (yes, C, i'm talking about you!).
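
just as a tiny sketch of what i mean (nothing here needs a third-party library, it's all in the language or druntime):

    import std.stdio;

    void main () {
        int[] nums = [1, 2, 3];                    // dynamic array, built in
        nums ~= 4;                                 // appending is a language operation
        string[int] names = [1: "one", 2: "two"];  // associative array literal
        names[3] = "three";
        string s = "hello";                        // a string is just immutable(char)[]
        writeln(nums, " ", names[3], " ", s.length);
    }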

and having a lot of things in phobos is great too: i don't have to download another library for levenshtein distance, for example (yes, i really used that and was very happy to find it right there in std.algorithm).
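
the usage is basically a one-liner; a minimal sketch:

    import std.algorithm : levenshteinDistance;

    void main () {
        // "kitten" -> "sitting" takes 3 edits: k->s, e->i, insert g
        assert(levenshteinDistance("kitten", "sitting") == 3);
    }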

and i really enjoyed the fact that all this is a part of the language that undergoes testing on par with basic features.

> > When I first approached D, I had high expectations, and as I learned more, I also became disappointed
> 
> Yes, that is the pattern. Come in, and leave. There has to be
> more people staying and more D shops (like my shop is pure 100% D
> ).
> The regressions are painful, and I propose a way to let people
> experiment and compete on rich features. My proposal is to make a
> small, stable core that can be maintained, so that people stay.
you can fork and experiment as you want. my private build of DMD has more than 100KB of patches applied at build time, for example.

> My problem: I have 3 developers (and am still hiring more) who work for me and say: please, let's not use D, it is not stable release to release (and has a bunch of things no one needs).
this has an easy solution: just stick with the chosen D version: nobody forces anyone to upgrade.

even more: we have dfix now, which will help to make the upgrade process a lot less painful.


November 28, 2014
On Friday, 28 November 2014 at 20:20:55 UTC, ketmar via Digitalmars-d wrote:

> this has an easy solution: just stick with the chosen D version: nobody forces anyone to upgrade.

Amen.

November 28, 2014
On Fri, Nov 28, 2014 at 11:06:14PM +0000, Mike via Digitalmars-d wrote:
> On Friday, 28 November 2014 at 20:20:55 UTC, ketmar via Digitalmars-d wrote:
> 
> >this has an easy solution: just stick with the chosen D version: nobody forces anyone to upgrade.
> 
> Amen.

Yeah, it's not wise to keep upgrading the compiler, even in C/C++, since many enterprise products tend to exhibit subtle breakages with new compilers. (You may have heard of Linus ranting about newer versions of GCC that "miscompile" the kernel, for example -- I remember back in the day, the word was that you had to use egcs to compile the kernel, otherwise if you get a kernel panic you're on your own, the devs can't (and won't) help you). The code that I work on in my day job has a lot of compiler-specific hacks, and would *certainly* become unusable if we were to build it with a non-standard compiler -- standard being what's shipped in the development environment that we are required to use for building it, which hasn't changed for the last 5 years at least (so it's definitely *not* a new compiler!).

Of course, being stuck with an obsolete version of D is Not Nice, so one idea that occurs to me is to do infrequent, preplanned upgrades.

For example, say you standardize on dmd 2.066 for the current product release. This means that you will *only* compile with 2.066 and nothing else, and developers are required to use only the "official" compiler version for their work.  This also means that you, being the boss, decide what feature sets are / aren't allowed in your codebase. For example, glancing over Google's C++ style guide, many C++11 features are banned outright -- even though ostensibly they must have *some* value to *somebody*, otherwise they wouldn't be in the spec! So basically, you decide on the compiler version and feature set, and stick with that.
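
One way to make the "official compiler only" rule hard to break, just as a sketch (the module name here is made up), is a static assert on the predefined __VERSION__ constant, which is 2066L for dmd 2.066, so that a build with any other compiler fails immediately:

    // enforce_compiler.d: hypothetical guard module compiled into every build
    module enforce_compiler;

    // __VERSION__ is the compiler-supplied version constant (2066L for dmd 2.066)
    static assert(__VERSION__ == 2066,
        "this codebase must be built with the official dmd 2.066 compiler");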

Now, the product may take a long time to develop, ship, and maintain, so by the time it's in maintenance mode, it could be several years from now. By then, upstream dmd may already be at 2.074, say, but your codebase is still only 2.066. At some point, you'll want to upgrade -- but that should be done wisely.

Never ever upgrade your mainline development wholesale -- if you run into problems (e.g., a language feature you depended on has changed and it's non-trivial to update the code to work around it) you don't want to find yourself in the unpleasant position of having a non-working codebase. Instead, start out a pilot branch with one or two dedicated developers who will (attempt to) build the code with the latest compiler. If you're extremely, extremely, extreeeeeemely lucky, all goes well and you can run some unittests, then some controlled field testing, then switch over.

More likely, though, you'll run into some issues -- compiler bugs that got fixed and code that relied on that is now broken, etc.. So you take the time to investigate -- on the pilot branch -- how to work around it. Once everything is figured out, start investing more effort into porting the codebase over to the new compiler -- at the same time evaluating possible new feature sets to include in your list of permitted features -- while still keeping the 2.066 compiler and the original branch around. Don't ever throw out the old compiler until the older product release is no longer supported. Ideally, the new branch will coincide with the next major release of your product, which should give you some time to sort out compiler bugs (new and old), language workarounds, etc.. Once the new release is stabilized enough to be shippable, you can start deprecating the older codebase and eventually drop it.
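
While both branches coexist, version-specific workarounds can also live side by side in the source, guarded by the same constant; a sketch (the function and the workaround are purely hypothetical):

    // hypothetical example: keep the 2.066 workaround and the post-upgrade
    // code path together while the pilot branch is being evaluated
    int[] copyData(int[] data)
    {
        static if (__VERSION__ >= 2074)
        {
            // new compiler: rely on the fixed behaviour
            return data.dup;
        }
        else
        {
            // dmd 2.066: spell the copy out explicitly as a workaround
            auto copy = new int[](data.length);
            copy[] = data[];
            return copy;
        }
    }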

Basically, this isn't too much different from what you'd do with any other 3rd party product that you depend on -- you never just drop in a new 3rd party release with no planning and no migration path, that's just asking for trouble. You always keep the current environment running until the new one has been proven to be stable enough to use, then you switch over.


T

-- 
"Outlook not so good." That magic 8-ball knows everything! I'll ask about Exchange Server next. -- (Stolen from the net)
November 29, 2014
On Friday, 28 November 2014 at 20:00:36 UTC, Vic wrote:
>
> I say there are very few. I have to challenge your data that there are a lot of contributors. Most projects list their maintainers; what is the number of FTE D maintainers?

You can see the lists here:
https://github.com/D-Programming-Language/dlang.org/graphs/contributors
https://github.com/D-Programming-Language/dmd/graphs/contributors
https://github.com/D-Programming-Language/druntime/graphs/contributors
https://github.com/D-Programming-Language/phobos/graphs/contributors

There's not going to be a high FTE count because D is a volunteer effort with, from what I can tell, no significant financial support.  This is not a secret and users should be aware of this when they choose to use D.  As I said in my previous post, IMO, users should expect to either contribute programming talent or funds if they intend to use it for anything critical.

>
> How much $ funding would it take for me to consider it?
> For example, funding for regressions to be fixed, but only for the core (none of the silly things, which is a long list: GC, associative arrays, demangle, annotations, higher-order functions, pseudo members, transmogrify, goto, try/catch, variadics, traits, variant, SQLite, etc.). I am not saying to remove them, just make them downstream, like GNU does, not part of the regression-tested D core.
> If I wanted to use a big language I'd use the CLR or the JRE.

Can't the features that you don't need just be ignored?  The .Net Framework is HUGE, but that doesn't prevent me from using a very small subset of it for my work.

> The regressions are painful, and I propose a way to let people experiment and compete on rich features. My proposal is to make a small, stable core that can be maintained, so that people stay.

Do you have a bugzilla issue for each of the regressions you're referring to?  The regressions that I see listed here (http://wiki.dlang.org/Beta_Testing#Available_Downloads) are quite small in number, and arguably in code that is rarely employed.

>
> The culture of the D contributors is 'to bravely peer into the gates of hell', as in experiment. As a user of D, I want it to be small, stable, and maintained. I proposed a path that allows for both, without scaring off people who use D for real projects. If that is not possible, then I missed the fact that D is an experimental language.
>

I don't see D as an experimental language.  "Experimental" implies it's trying features that may or may not work, and is trying to see how they pan out.  From what I can tell, the vast majority of D's features are borrowed from other time-tested, proven languages, and the D specific features that I've employed have been powerful and liberating.

>
> My problem: I have 3 developers (and am still hiring more) who work for me and say: please, let's not use D, it is not stable release to release (and has a bunch of things no one needs).

1) Not stable release-to-release?
Which regressions?  Please provide links to the bug reports.  Without these, there's nothing actionable, and your argument against D is questionable.

Ask your developers for something specific and actionable.  You may be pleasantly surprised by the helpfulness of this community...or not.

2) has a bunch of things no one needs...
The things you don't need don't need to be employed.

This seems to be your fundamental premise: because D is a feature-rich language with limited resources, those resources are spread too thin, making D unmaintainable.  I'm not convinced.  First of all, that doesn't appear to be how this effort works.  There isn't a CEO setting priorities and cracking a whip over contributors' heads. Contributors largely work on the things that interest them. Second, if D were to move non-core features downstream, the contributors who work on those features would also move downstream, thinning D's core resources even further.

If regressions are the problem, please file bug reports and bring them to this community's attention.  You may find some helpful people willing to address them for free simply because it interests them or it was something they were meaning to get to, and just needed a little motivation.  If there isn't anyone willing to address them, the problem could potentially be monetized.

Mike



November 29, 2014
On Friday, 28 November 2014 at 23:38:40 UTC, H. S. Teoh via Digitalmars-d wrote:

> Of course, being stuck with an obsolete version of D is Not Nice, so one
> idea that occurs to me is to do infrequent, preplanned upgrades.

Yes, the important word there is "preplanned".  We still use Visual Studio 6.0 (1998) for some of our products.  I spent the last month working in Visual Studio 2008.

Mike

November 29, 2014
On Friday, 28 November 2014 at 23:38:40 UTC, H. S. Teoh via Digitalmars-d wrote:
>
>
> For example, say you standardize on dmd 2.066 for the current product release. This means that you will *only* compile with 2.066 and nothing else, and developers are required to use only the "official" compiler version for their work.  This also means that you, being the boss, decide what feature sets are / aren't allowed in your codebase. For example, glancing over Google's C++ style guide, many C++11 features are banned outright -- even though ostensibly they must have *some* value to *somebody*, otherwise they wouldn't be in the spec! So basically, you decide on the compiler version and feature set, and stick with that.
>
> Now, the product may take a long time to develop, ship, and  maintain, so
> by the time it's in maintenance mode, it could be several years from now. By then, upstream dmd may already be at 2.074, say, but your codebase is still only 2.066. At some point, you'll want to upgrade -- but that should be done wisely.
>
> Never ever upgrade your mainline development wholesale -- if  you run
> into problems (e.g., a language feature you depended on has changed and it's non-trivial to update the code to work around it) you don't want to find yourself in the unpleasant position of having a non-working
> codebase. Instead, start out a pilot branch with one or two dedicated developers who will (attempt to) build the code with the latest
> compiler. If you're extremely, extremely, extreeeeeemely lucky, all goes well and you can run some unittests, then some controlled field testing,
> then switch over.
>
> More likely, though, you'll run into some issues -- compiler bugs that got fixed and code that relied on that is now broken, etc.. So you take the time to investigate -- on the pilot branch -- how to work around it.
> Once everything is figured out, start investing more effort into porting the codebase over to the new compiler -- at the same time evaluating possible new feature sets to include in your list of permitted  features
> -- while still keeping the 2.066 compiler and the original branch around. Don't ever throw out the old compiler until the older product release is no longer supported. Ideally, the new branch will coincide with the next major release of your product, which should give you some time to sort out compiler bugs (new and old), language workarounds, etc.. Once the new release is stabilized enough to be shippable, you can start deprecating the older codebase and eventually drop it.
>
> T

Well said. I don't know of any "sane" software company that isn't governed by a process like that.

Here at SRLabs we stick with what you have described, with one variation: one person (and only one) is constantly trying to build branches of the projects with the latest compiler, testing our internal pull requests.

Many bugs are avoided because they are caught by the more recent compiler version, and it's very rare for that person not to be able to keep pace with the other developers.

From time to time, during the development of a product, we are confident enough in the state of the "unstable" branch to just jump everyone to a more recent compiler version.

But when the product is released, we stick with the version of the compiler used to release it until its end of life.

---
Paolo


November 29, 2014
On Friday, 28 November 2014 at 20:00:36 UTC, Vic wrote:
> (...)

IMHO you should be ready to debug the compiler.

DMD's codebase can be obscure at times (while the LDC mid-end is pretty straightforward) so the first time looking for the source of a bug may be a frustrating and time-consuming learning experience, but you'll be empowered to tackle compiler bugs without relying on volunteers to do it for you, and without stopping D from evolving.
November 29, 2014
On Saturday, 29 November 2014 at 15:37:12 UTC, Elie Morisse wrote:
> IMHO you should be ready to debug the compiler.
>
> DMD's codebase can be obscure at times (while the LDC mid-end is pretty straightforward) so the first time looking for the source of a bug may be a frustrating and time-consuming learning experience, but you'll be empowered to tackle compiler bugs without relying on volunteers to do it for you, and without stopping D from evolving.

If this is the long-term strategy, then it makes D unsuitable for contract work. You really don't want any uncertain factors when bidding on a contract, since they will skew your estimates. Stability is more important than features in a commercial setting.
November 29, 2014
On Saturday, 29 November 2014 at 15:46:03 UTC, Ola Fosheim Grøstad wrote:
<snip>
>
> If this is a long term strategy then this makes D unsuitable for contract work. You really don't want any uncertain factors when bidding on a contract since that will skew your estimates. Stability is more important than features in a commercial setting.

yes