February 15, 2017
On Wednesday, 15 February 2017 at 14:44:55 UTC, Ola Fosheim Grøstad wrote:
> Another example is Swift. Swift managed to take over Objective-C rather quickly IMO, but Swift has also absorbed the non-C semantics of Objective-C, thus it did not require changing existing practice significantly.

Swift took over quickly because Apple has mandated it. While I'm happy about that, there's no denying that Swift wouldn't be where it is without the weight of Apple behind it. I'd go as far as to say that Swift's success is assured (unless Apple drops it, which looks unlikely) and that because Swift has money behind it, more money will follow, and so will a thriving ecosystem, on and off OS X.

As a PL, Swift looks nice, but they'll have to come up with a more complete story around concurrency.



February 15, 2017
On Wednesday, 15 February 2017 at 16:41:31 UTC, bpr wrote:
> Swift took over quickly because Apple has mandated it. While I'm happy about that, there's no denying that Swift wouldn't be where it is without the weight of Apple behind it. I'd go as far as to say that Swift's success is assured (unless Apple

It may have been assured, but the adoption _rate_ comes from having a good match on semantics and existing practices.

Replace Swift with Haskell and the adoption would have been much slower.

> As a PL, Swift looks nice, but they'll have to come up with a more complete story around concurrency.

Do you mean parallel execution (CPU) or concurrency as a modelling paradigm? One cannot really assume that Apple hardware has more than 2 CPUs. So as a starting point you can presume that one core is taken by the main UI event loop and the other one is taken by real-time code. Whatever is left is for Apple's "Operation Queues" (dispatch queues, basically worker threads working in a FIFO manner IIRC).

https://developer.apple.com/library/content/documentation/General/Conceptual/ConcurrencyProgrammingGuide/
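The dispatch-queue model described above can be sketched in a few lines of Swift using Grand Central Dispatch. This is a minimal illustration, not production code: the queue label is an arbitrary placeholder, and `sync` is used only to keep the example deterministic, where real code would typically submit work with `async`.

```swift
import Dispatch

// A serial DispatchQueue runs submitted work items one at a
// time in FIFO order, backed by a worker thread from a pool.
let queue = DispatchQueue(label: "com.example.worker")  // label is arbitrary

var results: [Int] = []

for i in 1...3 {
    // sync blocks the caller until the work item finishes,
    // which keeps this demo deterministic.
    queue.sync {
        results.append(i * i)
    }
}

print(results)  // [1, 4, 9]
```

The point is that the programmer hands closures to a queue and lets the runtime decide how to map them onto however many cores exist, which is exactly why "how many cores does the hardware have?" stops being the application's problem.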

February 15, 2017
On Wednesday, 15 February 2017 at 17:08:37 UTC, Ola Fosheim Grøstad wrote:
> modelling paradigm? One cannot really assume that Apple hardware has more than 2 CPUs.

Typo: I meant that one cannot assume that Apple hardware has more than 2 cores (so one has to write applications that perform well with only 2 cores).


February 15, 2017
On Wednesday, 15 February 2017 at 16:07:18 UTC, Ola Fosheim Grøstad wrote:
> I think Go has benefitted some from having limited and stable language semantics and continuously improving on the implementation. IMO that should make it attractive in the server space, i.e. you get low tooling-related maintenance cost and still get real benefits from recompiling with new versions of the compiler.

That is a very good point; I had never considered the consequences of
semantics stabilization on tooling. It strengthens (along with DIP1005)
my opinion that D is fine as it is and that no new feature should be
added for at least two years, while fixing the implementation should
get the focus. That doesn't mean we can't add anything new: better C++
integration or memory management semantics would be fine, as long as
the project is already underway.

There's little point in having more features if what's already there is
half broken and not well-defined.
February 15, 2017
On Wednesday, 15 February 2017 at 19:47:28 UTC, Cym13 wrote:
> There's little point in having more features if what's already there is half broken and not well-defined.

This is what Manu and deadalnix have been saying for the past three years. It's fallen on deaf ears.
February 15, 2017
On Wednesday, 15 February 2017 at 19:47:28 UTC, Cym13 wrote:
> There's little point in having more features if what's already there is
> half broken and not well-defined.

+1
February 15, 2017
On Wednesday, 15 February 2017 at 20:53:58 UTC, Jack Stouffer wrote:
> On Wednesday, 15 February 2017 at 19:47:28 UTC, Cym13 wrote:
>> There's little point in having more features if what's already there is half broken and not well-defined.
>
> This is what Manu and deadalnix have been saying for the past three years. It's fallen on deaf ears.

Isn't that a little uncharitable?
February 15, 2017
On Wednesday, 15 February 2017 at 17:53:43 UTC, Ola Fosheim Grøstad wrote:
> Typo: I mean't that one cannot assume that Apple hardware has more than 2 cores (so one has to write applications that perform well with only 2 cores).

You're missing what I consider to be 'the Big Picture', namely that Swift will become popular on non-Apple platforms, and it needs to be fairly capable to compete with Go, Java, and C++, and others. IBM is already backing server side Swift to some degree.

February 15, 2017
On Wednesday, 15 February 2017 at 21:16:51 UTC, Meta wrote:
> Isn't that a little uncharitable?

I just spent about 20 minutes listing out all of my problems with the language, and how some things are pretty broken. But I deleted it and I'm not going to post it.

It was just another rant, one that doesn't change anything. All I'll say is that the current language maintainers know what's broken. And I sincerely hope they work to fix those things before adding in a bunch of new DIPs which will further complicate matters, especially with regard to function signatures.
February 15, 2017
On Wednesday, 15 February 2017 at 21:46:32 UTC, bpr wrote:
> You're missing what I consider to be 'the Big Picture', namely that Swift will become popular on non-Apple platforms, and it needs to be fairly capable to compete with Go, Java, and C++, and others. IBM is already backing server side Swift to some degree.

I don't know if that will happen anytime soon. I think the functionality that the original creator has suggested for system-level programming has to be in place first. Currently the language/runtime is geared towards best practices inherited from Foundation/Objective-C/Cocoa/macOS... Which makes sense, but makes Swift a second-rate citizen in other environments.