June 10, 2015
On Wednesday, 10 June 2015 at 14:29:51 UTC, Thiez wrote:
> On Wednesday, 10 June 2015 at 09:23:54 UTC, Chris wrote:
>> One big difference between the D community and other languages' communities is that D people keep criticizing the language and see every little flaw in every little corner, which is good and which is why D is the way it is.
>
> Or perhaps D simply has more flaws to criticize.

How can you tell that e.g. Nim has fewer flaws when it's still so young? It's too early to tell.

> On Wednesday, 10 June 2015 at 09:23:54 UTC, Chris wrote:
>> Other languages' communities are more like "This is theeeeeee
>> language of the future, it's super-duper, no question asked,
>> none permitted either!"
>
> Perhaps you are depicting other communities as a bunch of group-think hipsters because you are insecure about your own community?
>
> Look, I can make baseless accusations too. Wouldn't you agree it would be nicer (and more effective, I imagine) to promote your community by calling attention rather to its positive qualities, rather than demonizing other communities? Especially when your negative portrayals of other communities are not accompanied by any evidence?
>
> I'm sure you're a smart person and will for each of the communities in question be able to find evidence of at least one person who at some point in time acted in the way you suggested. Of course such a thing would not prove that the behaviour is representative of the community, so please don't.

I've been following post-C(++) programming languages for quite a while now. Back in the day Java was a big thing, and Python was also hip. Then we had Ruby and whatnot. The baseline was always "it's a cool language, it's the future"; flaws would hardly ever be mentioned, and critical voices were silenced. All the benchmarking tricks the Java community used to make people believe it's as fast as native code - while you know from your own experience that it's not - are just one example. Ah, and there was Ajax, remember? How's jQuery doing, by the way? I've used some of these technologies and none of them lived up to my expectations. But the pattern is always the same: "It's theeee thing, wow, a must-have!" Sorry, but whenever I hear a language is (almost) perfect and theee way to go, I grow suspicious. If all communities were as critical as D's, why then do we have so much mediocre technology out there?

I am interested in Nim and welcome it. But it's too early to say whether it's good or mediocre. I wonder, though: when you look Nim up on Wikipedia, it states:

Influenced by
Ada, Modula-3, Lisp, C++, Object Pascal, Python, Oberon

Did they really never get any inspiration from D?? I wonder. Seems a bit odd, but well.
June 10, 2015
On Wednesday, 10 June 2015 at 15:37:46 UTC, Chris wrote:
> I am interested in Nim and welcome it. But it's too early to say whether it's good or mediocre.

Yeah, I think it would be nice if one could change the culture of programming so that people could easily combine any two languages in the same project. But that takes either significant creator goodwill/cooperation or platforms like .NET/JVM. I could see myself wanting to do some things in "Prolog", some things in "Lisp" and some things in "C". Today that takes too much FFI work.
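A quick illustration of that FFI friction, sketched with Python's ctypes calling into the C standard library (assuming a Unix-like system where `find_library("c")` resolves): even for one trivial function, you have to locate the library and declare the signature by hand.

```python
import ctypes
import ctypes.util

# Locate and load the C standard library; the path differs per platform,
# which is already part of the FFI friction.
libc = ctypes.CDLL(ctypes.util.find_library("c"))

# Declare the foreign signature by hand -- nothing checks it against the
# real C declaration, so a mistake here is a silent bug.
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

print(libc.strlen(b"FFI"))  # -> 3
```

Multiply that by every entry point in a real Prolog or Lisp runtime's C API, and "too much FFI work" becomes concrete.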

A problem that both Nim and D share is that they aim broad. I think that makes them a harder sell, as it tends to make the language more complex and unpolished. I think most languages that gain traction start out focused: C was very focused on OS dev. C++ piggybacked on that by adding abstractions. PHP was very focused on web scripting, Perl on text processing, Erlang on fault tolerance, Smalltalk on interactive programming. Pascal piggybacked on Algol going too big, IIRC. Turbo Pascal's success was IDE-focused, IMO.

>  I wonder, though, when you look Nim up on Wikipedia it states:
>
> Influenced by
> Ada, Modula-3, Lisp, C++, Object Pascal, Python, Oberon
>
> Did they really never get any inspiration from D?? I wonder. Seems a bit odd, but well.

Probably related to the main creator's programming experience, but as far as credits go, one should really credit the first language/author to bring about a concept (e.g. Lisp, Simula, BCPL, etc.).
June 10, 2015
On Wednesday, 10 June 2015 at 16:02:36 UTC, Ola Fosheim Grøstad wrote:
> Yeah, I think it would be nice if one could change the culture of programming so that people easily could combine any 2 languages in the same project.

But shouldn't there be one language that's right for everyone?


(BTW I wanted to use that line in my dconf talk and forgot to!)
June 10, 2015
On Wednesday, 10 June 2015 at 16:02:36 UTC, Ola Fosheim Grøstad wrote:
> Probably related to the main creator's programming-experience, but as far as credits go one should really credit the first language/author to bring about a concept. (e.g. Lisp, Simula, BCPL etc)

I wonder why Walter was inspired to add modules to D, Walter?
June 10, 2015
On Wednesday, 10 June 2015 at 16:02:36 UTC, Ola Fosheim Grøstad wrote:
> Yeah, I think it would be nice if one could change the culture of programming so that people easily could combine any 2 languages in the same project. But that takes either significant creator-goodwill/cooperation or platforms like .NET/JVM. I could see myself wanting to do some things in "Prolog", some things in "Lisp" and some things in "C". Today that takes too much FFI work.

Wasn't LLVM supposed to solve that, being a "virtual machine" for compilation to low-level native code?
June 10, 2015
On Wednesday, 10 June 2015 at 16:22:51 UTC, Idan Arye wrote:
> Wasn't LLVM supposed to solve that, being a "virtual machine" for compilation to low level native code?

It may still be possible. Apple just announced that the default format for submitting iOS apps will be bitcode from now on, which people are speculating is some form of LLVM bitcode:

http://arstechnica.com/apple/2015/06/app-thinning-will-be-a-major-boon-for-8gb-and-16gb-iphones-and-ipads/

Apple will then compile the bitcode for you on their servers, before sending the final binary to users.
June 10, 2015
On Wednesday, 10 June 2015 at 16:34:40 UTC, Joakim wrote:
> Apple will then compile the bitcode for you on their servers, before sending the final binary to users.

Thanks for the link. That's pretty interesting. I suspect it means they plan to change CPUs in two years or so. But it makes me feel a bit uneasy that Apple gets to control the program at that level.
June 10, 2015
On Wednesday, 10 June 2015 at 16:34:40 UTC, Joakim wrote:
> On Wednesday, 10 June 2015 at 16:22:51 UTC, Idan Arye wrote:
>> Wasn't LLVM supposed to solve that, being a "virtual machine" for compilation to low level native code?
>
> May still be possible, Apple just announced that the default format to submit apps for iOS will be bitcode from now on, which people are speculating is some form of llvm bitcode:
>
> http://arstechnica.com/apple/2015/06/app-thinning-will-be-a-major-boon-for-8gb-and-16gb-iphones-and-ipads/
>
> Apple will then compile the bitcode for you on their servers, before sending the final binary to users.

Apple is catching up with Microsoft on Windows 8.x/10 here.

http://channel9.msdn.com/Shows/Going+Deep/Mani-Ramaswamy-and-Peter-Sollich-Inside-Compiler-in-the-Cloud-and-MDIL

http://channel9.msdn.com/Shows/Going+Deep/Inside-NET-Native

--
Paulo
June 10, 2015
On 6/10/2015 9:22 AM, Joakim wrote:
> I wonder why Walter was inspired to add modules to D, Walter?

It never occurred to me not to. Modules are hardly an innovative idea. It'd be like not supporting the + operator.
June 10, 2015
On Wednesday, 10 June 2015 at 15:13:41 UTC, Brian Rogoff wrote:
> On Wednesday, 10 June 2015 at 15:09:21 UTC, anonymous wrote:
>> On Wednesday, 10 June 2015 at 15:08:08 UTC, anonymous wrote:
>>> any community dumb enough to buy merchandise with a programming language's name on it is full of idiots.
>>> bye.
>>
>> p.s., Nim has the absolute worst community out of any of these languages.
>> http://slashdot.org/comments.pl?sid=6771453&cid=48860921
>
> You're not doing the D community any great credit with this post, either. Try and stay classy.

Translation: "Let me try to shame you because I don't have any actual argument..."