December 12, 2013
On 12/12/2013 11:12 AM, Daniel Murphy wrote:
> "ed" <sillymongrel@gmail.com> wrote in message
> news:ibnfbsvxqzjxyfpnzseh@forum.dlang.org...
>>
>> I'm writing my C code with DMD. When it's tested and tweaked I do a final
>> compile with the C compiler (and test once more), then commit for our QA to
>> pick up.  Occasionally I'll compile with the C compiler to ensure I haven't
>> leaked any D into the code and to minimise the #include fixups at the end.
>>
>
> I used to do this for all my university assignments in C/C++/Java.
>
>

I've actually taken a course where submissions in D were allowed.
December 12, 2013
On Thursday, 12 December 2013 at 09:01:17 UTC, Paulo Pinto wrote:
> Currently I advocate that C and C++ development should always be
> done with warnings-as-errors enabled, coupled with static
> analyzers at the very least during CI builds, breaking the build
> if anything is found.

I literally can't imagine any large C project surviving for long
without all of that being mandatory. It descends into a state of
unmaintainable insanity so fast.

That said, there are very different kinds of C projects. When I say
that coding C-style in D is less convenient than plain C, I don't
mean some "normal" but performance-intensive application. I can't
imagine anyone picking C motivated only by performance - it is
more about having very fine-grained control. Of course, with modern
optimizers C can no longer be called a "macro assembler", but it is
still much, much closer to that than D is. To remove all "smart"
side-effects in D you need to get rid of all druntime, avoid
using some language features and resort to inline assembly
relatively often. It is definitely possible and Adam has done
some very nice work to prove it. But it leaves you with a very
crippled language that does not even help you in sticking with
that crippled subset. At this point you really start asking
yourself - what does this give me over raw C to motivate the
transition? So far I don't see anything convincing.
December 12, 2013
On Thursday, 12 December 2013 at 11:16:07 UTC, Dicebot wrote:
> what does this give me over raw C to motivate the
> transition? So far I don't see anything convincing.

Every time I write #define in one of my 8-bit μC pet projects, I know a reason.
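For instance, here is a rough sketch (names purely illustrative) of what
those #define idioms become in D - typed, scoped, and debuggable:

    enum ubyte LED_PIN = 5;          // instead of: #define LED_PIN 5
    enum size_t BUFFER_SIZE = 64;    // manifest constant, no preprocessor

    // instead of: #define MIN(a, b) ((a) < (b) ? (a) : (b))
    T min(T)(T a, T b) { return a < b ? a : b; }   // type-checked, single evaluation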
December 12, 2013
On 12 December 2013 21:16, Dicebot <public@dicebot.lv> wrote:

> On Thursday, 12 December 2013 at 09:01:17 UTC, Paulo Pinto wrote:
>
> Currently I advocate that C and C++ development should always be done with warnings-as-errors enabled, coupled with static analyzers at the very least during CI builds, breaking the build if anything is found.
>>
>
> I literally can't imagine any large C project surviving for long without all of that being mandatory. It descends into a state of unmaintainable insanity so fast.
>

I feel quite the opposite; I would say that about C++, personally.

I've built a C codebase from the ground up over the course of a decade with
~25 programmers.
It takes discipline, and a certain sense of simplicity in your solutions.
I personally advocate C over C++ for this very reason: it emphasises
simplicity in your solutions. It's impossible to get carried away and
create the sort of unmaintainable bullshit that C++ leads to.
I like C, I just find it verbose and prone to boilerplate, which has a
tendency to waste programmers' time... and what is more valuable than a
programmer's time?


> That said, there are very different kinds of C projects. When I say that coding C-style in D is less convenient than plain C, I don't mean some "normal" but performance-intensive application. I can't imagine anyone picking C motivated only by performance - it is more about having very fine-grained control. Of course, with modern optimizers C can no longer be called a "macro assembler", but it is still much, much closer to that than D is. To remove all "smart" side-effects in D you need to get rid of all druntime, avoid using some language features and resort to inline assembly relatively often. It is definitely possible and Adam has done some very nice work to prove it. But it leaves you with a very crippled language that does not even help you in sticking with that crippled subset. At this point you really start asking yourself - what does this give me over raw C to motivate the transition? So far I don't see anything convincing.
>

I still consider C a macro assembler... I can easily (and usually do) visualise the asm output I expect the compiler to produce while I'm coding. If I'm writing performance intensive code, I am constantly disassembling and checking that the compiler is producing the code I am expecting. This feels normal to me.

What would you want inline assembly for in D? Inline assembly is almost always a mistake, unless you're writing a driver. You can't possibly schedule code better than the compiler. And in my experience, without breaking the ABI, I don't know of any constructs I could produce manually in assembly that I can't easily coerce the compiler to generate for me (with better scheduling). Perhaps prefetching or branch-prediction hinting, which the compiler would typically require a profile-guided optimisation pass to generate, but there are intrinsics for inserting those manually which don't interrupt the compiler's ability to reschedule the function.


December 12, 2013
On Thursday, 12 December 2013 at 11:42:12 UTC, Manu wrote:
> On 12 December 2013 21:16, Dicebot <public@dicebot.lv> wrote:
>
>> On Thursday, 12 December 2013 at 09:01:17 UTC, Paulo Pinto wrote:
>>
>>> Currently I advocate that C and C++ development should always be
>>> done with warnings-as-errors enabled, coupled with static
>>> analyzers at the very least during CI builds, breaking the build
>>> if anything is found.
>>>
>>
>> I literally can't imagine any large C project surviving for long
>> without all of that being mandatory. It descends into a state of
>> unmaintainable insanity so fast.
>>
>
> I feel quite the opposite; I would say that about C++, personally.
>
> I've built a C codebase from the ground up over the course of a decade with
> ~25 programmers.
> It takes discipline, and a certain sense of simplicity in your solutions.
> I personally advocate C over C++ for this very reason: it emphasises
> simplicity in your solutions. It's impossible to get carried away and
> create the sort of unmaintainable bullshit that C++ leads to.
> I like C, I just find it verbose and prone to boilerplate, which has a
> tendency to waste programmers' time... and what is more valuable than a
> programmer's time?
>

I favor C++ over C, thanks to the safer constructs it offers and a
type safety closer to the Pascal family of languages, something C
will never be able to offer.

However, I seldom code in C or C++ nowadays outside of hobby
projects, as the enterprise world is all about GC-enabled
languages, with a little C++ for the performance hotspots.

In any case, given my enterprise experience with subcontractors, I
think it is very hard to find good developers who are able to
write error-free C or C++ code without lots of enforced guidelines
dragging them, screaming, along the way.

--
Paulo



December 12, 2013
On Thursday, 12 December 2013 at 11:42:12 UTC, Manu wrote:
> I've built a C codebase from the ground up over the course of a decade with
> ~25 programmers.
> It takes discipline, and a certain sense of simplicity in your solutions.

It may work if you can afford to guarantee a certain level of competence for the majority of programmers on the team, but I think that is the exception in practice, not the rule. Also, I had somewhat larger teams in mind, as tends to happen with enterprise C :)

> and what is more valuable than a
> programmer's time?

At some point new servers + server maintenance become more expensive than programmers' time. Much more expensive.

> I still consider C a macro assembler... I can easily (and usually do)
> visualise the asm output I expect the compiler to produce while I'm coding.
> If I'm writing performance intensive code, I am constantly disassembling
> and checking that the compiler is producing the code I am expecting. This
> feels normal to me.

Have you used many different compilers? I am afraid that doing that on a regular basis is a feat of strength beyond my imagination :)

> What would you want inline assembly for in D? Inline assembly is almost
> always a mistake, unless you're writing a driver.

I can't find the code Adam used to provide minimal druntime stubs for compiling C-like programs, but he was forced to use inline assembly there in a few cases. Can't remember the details, sorry.

And of course I am speaking about drivers / kernels / bare metal. I can't imagine any other domain where using C is still absolutely necessary for practical reasons.

> You can't possibly
> schedule code better than the compiler.
> ...

I am not implying that one should do any of this by hand because the compiler is bad at it. I have not actually used inline assembly with C even a single time in my life. That wasn't my point.
December 12, 2013
On 12 December 2013 22:21, Dicebot <public@dicebot.lv> wrote:

> On Thursday, 12 December 2013 at 11:42:12 UTC, Manu wrote:
>
>> I've built a C codebase from the ground up over the course of a decade with
>> ~25 programmers.
>> It takes discipline, and a certain sense of simplicity in your
>> solutions.
>>
>
> It may work if you can afford to guarantee a certain level of competence for the majority of programmers on the team, but I think that is the exception in practice, not the rule. Also, I had somewhat larger teams in mind, as tends to happen with enterprise C :)


Completely true. Fortunately I've always worked on tech/engine teams, which are mostly populated with seniors, or competent up-and-comers.

>> and what is more valuable than a
>> programmer's time?
>>
>
> At some point new servers + server maintenance become more expensive than programmers' time. Much more expensive.


But that's not a concern for typical programmers. That's the
responsibility of sysadmins.
What I meant was, 'what's more valuable [to a programmer]...'


>> I still consider C a macro assembler... I can easily (and usually do)
>> visualise the asm output I expect the compiler to produce while I'm
>> coding.
>> If I'm writing performance intensive code, I am constantly disassembling
>> and checking that the compiler is producing the code I am expecting. This
>> feels normal to me.
>>
>
> Have you used many different compilers? I am afraid that doing that on a regular basis is a feat of strength beyond my imagination :)


Yup. Over the past 10 years, my day job involved:
VisualC (it's changed a LOT over the years), GCC (for many architectures),
CodeWarrior (for many architectures), SNC (for many architectures), Clang,
and some other proprietary compilers. You learn each of their quirks with
time, and also how to reconcile the differences between them. You also
learn every preprocessor trick imaginable...
Worse than the compilers are the standard libraries, which are anything but
standard. In the end the ONLY function from the CRT that we called was
sprintf(). We had our own implementations of everything else we used.

I'm absolutely conscious of these sorts of issues when I consider my
approach to D. Many of my vocal opinions stem from a desire to mitigate
these sorts of problems in the future, and make sure it is possible to
directly express codegen concepts that I've previously only been able to
indirectly express in C compilers, which often requires some coercion for
different compilers, and invariably leads to #ifdef.
It's important to be able to explicitly express low-level codegen concepts;
even if they are rarely used features, it means it's possible to write
code that is reliably portable. Sadly, most people really don't care too
much about portability.

>> What would you want inline assembly for in D? Inline assembly is almost
>> always a mistake, unless you're writing a driver.
>>
>
> I can't find the code Adam used to provide minimal druntime stubs for compiling C-like programs, but he was forced to use inline assembly there in a few cases. Can't remember the details, sorry.
>

Right. It's usually necessary for hacks that have to interact with, or bend/subvert, the ABI. But that's a pretty rare necessity, not a requirement in day-to-day code.


> And of course I am speaking about drivers / kernels / bare metal. I can't
> imagine any other domain where using C is still absolutely necessary for practical reasons.
>

You mean C-like native languages? There's not really anything C offers that
C++/D doesn't also offer at the lowest level.
Our choice to use C rather than C++ was, in a sense, a funny way to enforce
a coding standard. Like I say, it forces simplicity and a consistent
approach to problems.


>> You can't possibly
>> schedule code better than the compiler.
>> ...
>>
>
> I am not implying that one should do any of this by hand because the compiler is bad at it. I have not actually used inline assembly with C even a single time in my life. That wasn't my point.
>

The only thing I've ever had to use it for in recent years is manually fiddling with flags registers, or interacting with hardware-specific concepts that C/C++ doesn't have ways to express (interrupt levels, privilege control, MMU control, context switching). Also, SIMD. C++ compilers traditionally didn't have any vector support, so before intrinsics were common, you had to do all SIMD code in asm >_< .. Fortunately, that's a thing of the past.
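For what it's worth, in D that now looks roughly like the sketch below,
using the core.simd vector types (support varies by compiler and target;
this assumes x86-64):

    import core.simd;

    // Vector addition without inline asm or compiler-specific intrinsics.
    float4 addFour(float4 a, float4 b)
    {
        return a + b;   // maps to a single SSE addps on x86-64
    }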


December 12, 2013
On Thursday, 12 December 2013 at 12:21:31 UTC, Dicebot wrote:
> I can't find the code Adam used to provide minimal druntime stubs for compiling C-like programs, but he was forced to use inline assembly there in a few cases. Can't remember the details, sorry.

http://arsdnet.net/dcode/minimal.zip (not sure if it still compiles on new dmd, I haven't played with it for months and druntime is a moving target)

The main inline asm usage was to make system calls on Linux without libc or to poke the hardware on bare metal; there isn't a lot of it that is strictly necessary.
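The syscall case looks roughly like this sketch (x86-64 Linux, DMD-style
inline asm, register assignments per the kernel ABI; the function name is
just illustrative):

    // write() syscall on x86-64 Linux without libc or druntime.
    void rawWrite(const(char)* buf, size_t len)
    {
        asm
        {
            mov RAX, 1;     // SYS_write
            mov RDI, 1;     // fd 1 = stdout
            mov RSI, buf;   // buffer pointer
            mov RDX, len;   // byte count
            syscall;
        }
    }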

On Thursday, 12 December 2013 at 11:16:07 UTC, Dicebot wrote:
> But it leaves you with a very
> crippled language that does not even help you in sticking with
> that crippled subset. At this point you really start asking
> yourself - what does this give me over raw C to motivate the
> transition? So far I don't see anything convincing.

There are still some nice benefits: you can use D's compile-time features, exceptions, classes, custom array types; a lot of the language actually works if you spend the time on it. Though I never did anything serious with it, I stopped at the proof-of-concept phase.
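For instance, CTFE needs no runtime support at all, so even the stripped-down
setup can do things like this trivial sketch:

    // Evaluated entirely by the compiler; nothing of it exists at runtime.
    int fib(int n) { return n < 2 ? n : fib(n - 1) + fib(n - 2); }
    enum fib10 = fib(10);       // fib10 == 55, computed at compile time
    static assert(fib10 == 55);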
December 12, 2013
On 12/12/2013 3:16 AM, Dicebot wrote:
> To remove all "smart"
> side-effects in D you need to get rid of all druntime, avoid
> using some language features and resort to inline assembly
> relatively often.

I don't see why you'd have to resort to inline assembler in D any more than in C.

> But it leaves you with a very crippled language

Not more "crippled" than C is.

> that does not even help you in sticking with that crippled subset.

Is there a point to having a compiler flag that'll warn you if you use "pure"?

> At this point you really start asking
> yourself - what does this give me over raw C to motivate the
> transition? So far I don't see anything convincing.

Off the top of my head:

1. compile speed
2. dependable sizes of basic types
3. unicode
4. wchar_t that is actually usable
5. thread local storage
6. no global errno being set by the math library functions
7. proper IEEE 754 floating point
8. no preprocessor madness
9. modules
10. being able to pass array types to functions without them degenerating to pointers
11. inline assembler being a part of the language rather than an extension that is in a markedly different format for every compiler
12. forward referencing (no need to declare everything twice)
13. no need for .h files
14. no ridonculous struct tag name space with all those silly

    typedef struct S { ... } S;

declarations.
15. no need for precompiled headers
16. struct alignment as a language feature rather than an ugly extension kludge
17. no #include guard kludges
18. #define BEGIN { is thankfully not possible
19. no need for global variables when qsorting
20. no global locale madness

And if you use D features even modestly, such as auto, purity, out variables, @safe, const, etc., you can get a large improvement in clarity in function APIs.
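For example, a signature along these lines (purely illustrative, not from any
particular codebase) states guarantees that a C prototype cannot express:

    // pure: no hidden global state; @safe: no unchecked pointer tricks;
    // const and out document exactly what is read and what is written.
    @safe pure bool tryParseHex(const(char)[] text, out uint value)
    {
        if (text.length == 0)
            return false;
        foreach (c; text)
        {
            uint digit;
            if (c >= '0' && c <= '9')      digit = c - '0';
            else if (c >= 'a' && c <= 'f') digit = c - 'a' + 10;
            else if (c >= 'A' && c <= 'F') digit = c - 'A' + 10;
            else return false;
            value = value * 16 + digit;
        }
        return true;
    }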


December 12, 2013
On Thursday, 12 December 2013 at 17:56:12 UTC, Walter Bright wrote:
> 5. thread local storage

I think this is a negative. D's TLS has caused me more problems than it has fixed: for example, if you write an in-process COM server on Windows XP, it will crash the host application if you hit almost any druntime call. Why? Because the TLS stuff isn't set up properly when the DLL is loaded.

Windows Vista managed to fix this, but there are a lot of people who still use XP, and this is a big problem.

Same thing when running D on bare metal. Maybe I can fix this by setting up the segment registers or reading the executable, I don't know, but __gshared just works in that environment, whereas TLS doesn't.
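To be clear about the distinction being discussed (a trivial illustration):

    int perThreadCounter;          // D default: one copy per thread (TLS)
    __gshared int processCounter;  // one copy per process, like a plain C global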

As I understand it, the Android and Macintosh operating systems have, or at least had, TLS problems too.


I agree with the rest of them, but D's default TLS has been a big pain for me.