July 06, 2016
On Wednesday, 6 July 2016 at 10:26:27 UTC, qznc wrote:
> If you want to distribute a binary

gods save me! why should i do that? i'm a GPL fanatic. and when i'm doing contract work, i know exactly what machines my client will have.

> for x86 you only have the 386 instructions. Ok, 686 is probably common enough today. For more special instructions, you could guard them and provide a fallback.

nope. just write in the readme: "you need at least a Nehalem-grade CPU to run this". maybe also check it at startup and fail if the CPU is too old. that's all, several lines of code. it's no different from demanding a 64-bit system. also note that some 64-bit systems can run 32-bit apps, but not vice versa.
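something like this, for example (a minimal sketch using druntime's core.cpuid; i'm using sse42 as a rough proxy for "Nehalem-grade"):

=== cpucheck.d ===
import core.cpuid : sse42;
import core.stdc.stdlib : exit;
import std.stdio : stderr;

void main () {
  // fail early on anything older than Nehalem (no SSE4.2)
  if (!sse42) {
    stderr.writeln("you need at least a Nehalem-grade CPU (SSE4.2) to run this.");
    exit(1);
  }
  // ... from here on the code is free to assume SSE4.2 ...
}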

> GCC has a switch (-mx32) to store pointers as 32bit on a 64bit system. That is probably very close to what you want.

except that i'd have to either build everything with that flag and hope for the best, or pay for things i don't need.
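(for context, the quoted switch selects the x32 ABI -- 64-bit registers with 32-bit pointers; a build sketch, with hello.c standing in for a real source file:)

# gcc -mx32 hello.c -o hello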
July 07, 2016
On 06/07/16 07:01, Walter Bright wrote:

> Apple has dropped all 32 bit support

No. For ARM, 32-bit is still relevant. On OS X, the Simulator (used to test iOS applications) runs iOS applications as x86 (both 32- and 64-bit), even though the iOS devices themselves run ARM.

Apparently some users are still running 32-bit applications on OS X because they depend on plugins that are only available as 32-bit; think audio and music software.

-- 
/Jacob Carlborg
July 08, 2016
On Sunday, 3 July 2016 at 06:23:05 UTC, Walter Bright wrote:
> Thanks for taking the time to write this. Let me see if I can help.

Wow, this was very well handled. Thanks for keeping a cool head and answering in a constructive, friendly and informative manner. It's even more admirable coming from someone who says he didn't use to be exactly a "people person" / good boss / group leader, or whatever the expression was.

> Sometimes I idly wonder what would have happened if D were available in the 80's. Sort of like if you put a modern car for sale in the 1960's.

I've also thought about that from time to time. I think D would have been very "mainstream-successful". Starting from where it actually started, I think things have worked out well for D, despite its still limited success. Looking back over all these years, I think D's marketing mistake was the garbage collection. Given its target audience and design trade-offs, I believe adoption of the language was disproportionately affected by that choice. If D had started with stronger support for nogc, even at the cost of delaying some other nice features, I believe adoption would have been considerably stronger (and would have snowballed more easily) -- irrespective of the actual engineering merit of that D variant vs. the true D. (It would also have avoided all the current piecemeal work of trying to remove GC allocation from Phobos, etc.; also, notice that nogc marketing would probably have been even more important in the 80s.)
July 08, 2016
On Wednesday, 6 July 2016 at 04:56:07 UTC, Walter Bright wrote:
> It's certainly doable, but in an age of priorities I suspect the time is better spent on

\o/

> improving 64 bit code generation.

/o\

July 08, 2016
On Friday, 8 July 2016 at 01:17:55 UTC, Luís Marques wrote:

>> Sometimes I idly wonder what would have happened if D were available in the 80's. Sort of like if you put a modern car for sale in the 1960's.
>
> I've also thought about that from time to time. I think D would have been very "mainstream-successful". Starting from where it actually started, I think things have worked out well for D, despite its still limited success. Looking back over all these years, I think D's marketing mistake was the garbage collection. Given its target audience and design trade-offs, I believe adoption of the language was disproportionately affected by that choice. If D had started with stronger support for nogc, even at the cost of delaying some other nice features, I believe adoption would have been considerably stronger (and would have snowballed more easily) -- irrespective of the actual engineering merit of that D variant vs. the true D. (It would also have avoided all the current piecemeal work of trying to remove GC allocation from Phobos, etc.; also, notice that nogc marketing would probably have been even more important in the 80s.)

This is a futile discussion. D is in many respects a "hindsight language" with regard to C/C++.[1] People naturally lacked that hindsight back in the '80s, and a lot of D's features would have been frowned upon as "Don't need it!" (templates), "Waste of memory!" (e.g. `array.length`), etc. And remember, computers and computing power were not as common as they are today. You were also dealing with a different crowd: there are far more programmers around now than there were in the '80s, with different expectations. In the '80s most programmers were either hard-core nerds (hence the nerdy image programmers have) or people who had lost their jobs elsewhere and had gone through retraining programs to become programmers, and thus were not really interested in the subject matter.

As for GC, it's hard to tell. When D was actually (not hypothetically) created, GC was _the_ big thing. Java had just taken off, people were pissed off with C/C++, and programming was becoming more and more common. Not having a GC might actually have been a drawback back in the day. People would have complained: "Ah, D is like C++, no automatic memory management; I might as well stick to C++ or go for Java!" So no, I think D is where it is because things are the way they are, and "what if" discussions are useless. D has to keep on keeping on; there's no magic.

[1] Sometimes I think that D should be careful not to become a language that is itself looked back on by yet another "hindsight language".
July 08, 2016
On Sunday, 3 July 2016 at 06:23:05 UTC, Walter Bright wrote:
> If the program is compiled with -g and it crashes (seg faults) you'll usually at least get a stack trace. Running it under a debugger will get you much more information.

Only on Windows, and that's a common source of frustration for me :(
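For what it's worth, on Linux there's a partial workaround (it doesn't help on OS X): druntime's etc.linux.memoryerror can turn a segfault into a thrown Error, which then gets the usual trace printing. A minimal sketch:

=== segv.d ===
void main () {
  version (linux) {
    // turns SIGSEGV into a thrown Error, so the runtime's
    // normal stack-trace printing kicks in
    import etc.linux.memoryerror : registerMemoryErrorHandler;
    registerMemoryErrorHandler();
  }
  int* p = null;
  *p = 42; // would otherwise die with a bare "Segmentation fault"
}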
July 08, 2016

On 7/8/16 8:22 AM, Luís Marques via Digitalmars-d wrote:
> On Sunday, 3 July 2016 at 06:23:05 UTC, Walter Bright wrote:
>> If the program is compiled with -g and it crashes (seg faults) you'll usually at least get a stack trace. Running it under a debugger will get you much more information.
>
> Only on Windows, and that's a common source of frustration for me :(
>
I've had reasonable success using lldb on the Mac.
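For reference, a typical session looks something like this (crash.d is just a placeholder name):

# dmd -g crash.d
# lldb ./crash
(lldb) run
... program stops at the crash ...
(lldb) bt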

July 08, 2016
On Friday, 8 July 2016 at 15:17:33 UTC, Luís Marques wrote:
> On Sunday, 3 July 2016 at 06:23:05 UTC, Walter Bright wrote:
>> If the program is compiled with -g and it crashes (seg faults) you'll usually at least get a stack trace. Running it under a debugger will get you much more information.
>
> Only on Windows, and that's a common source of frustration for me :(

=== z00.d ===
void func () {
  assert(0, "BOOM!");
}

void main () {
  func();
}

# dmd -g z00.d
# ./z00

core.exception.AssertError@z00.d(2): BOOM!
----------------
??:? _d_assert_msg [0xb7534687]
z00.d:2 void z00.func() [0x80489f2]
z00.d:6 _Dmain [0x80489ff]
??:? rt.dmain2._d_run_main(int, char**, extern (C) int function(char[][])*).runAll().__lambda1() [0xb7566326]
??:? void rt.dmain2._d_run_main(int, char**, extern (C) int function(char[][])*).tryExec(scope void delegate()) [0xb75661a0]
??:? void rt.dmain2._d_run_main(int, char**, extern (C) int function(char[][])*).runAll() [0xb75662d3]
??:? void rt.dmain2._d_run_main(int, char**, extern (C) int function(char[][])*).tryExec(scope void delegate()) [0xb75661a0]
??:? _d_run_main [0xb75660ff]
??:? main [0x8048a83]
??:? __libc_start_main [0xb6f3f696]

what am i doing wrong? O_O
July 08, 2016
On Friday, 8 July 2016 at 12:46:03 UTC, Chris wrote:

> As for GC, it's hard to tell. When D was actually (not hypothetically) created, GC was _the_ big thing. Java had just taken off, people were pissed off with C/C++, programming and coding was becoming more and more common. Not having GC might actually have been a drawback back in the day. People would have complained that "Ah, D is like C++, no automatic memory management, I might as well stick to C++ or go for Java!" So no, I think D is where it is, because things are like they are, and "what if" discussions are useless. D has to keep on keeping on, there's no magic.

Yep. If you're going to pick any feature to use to sell a new language, lack of GC is the worst. The only ones who care (and it's a small percentage) are the ones least likely to switch, given their existing tools, libraries, and knowledge.