July 11, 2016
On 2016-07-11 14:23, Luís Marques wrote:

> Doesn't seem to work for me on 10.11.5. Maybe you need to enable that on
> the latest OSes?

It works for me. I don't recall specifically enabling crash reports. Are you looking at "All Messages"? You can also look at ~/Library/Logs/DiagnosticReports to see if a new file shows up.

> In any case, that will probably get you a mangled stack
> trace, right?

Well, OS X doesn't know anything about D mangling ;). But it will demangle C++ symbols.

> It would still be useful (especially if the stack trace is
> correct; in LLDB I get some crappy ones sometimes) but it would not be
> as convenient as the stack trace on Windows generated by the druntime.

Yes, of course.

-- 
/Jacob Carlborg
July 11, 2016
Garbage collection allows many syntax "liberalizations" that lack of garbage collection renders either impossible or highly dangerous.  (In this definition of "garbage collection" I'm including variations like reference counting.)  For an example of this consider the dynamic array type.  You MUST have garbage collection to use that safely...unless you require the freeing of memory with every change in size.  C++ does that with the STL, but if you want the dynamic types built into the language, then you need garbage collection built into the language.  (This is different from saying it needs to be active everywhere, but once you've got it, good places to use it keep showing up.)

One of the many advantages of the dynamic array type being built into the language is that arrays of different sizes are reasonably comparable by methods built into the language.  This is used all over the place.  In D I almost never need to use "unchecked conversion".

On 07/11/2016 02:30 AM, Chris via Digitalmars-d wrote:
> On Sunday, 10 July 2016 at 03:25:16 UTC, Ola Fosheim Grøstad wrote:
>>
>> Just like there is no C++ book that does not rant about how great RAII is... What do you expect from a language evangelist? The first Java implementation Hotspot inherited its technology from StrongTalk, a Smalltalk successor. It was not a Java phenomenon, and FWIW both Lisp, Simula and Algol68 were garbage collected.
>
> Please stop intentionally missing the point. I don't care if Leonardo Da Vinci already had invented GC - which wouldn't surprise me - but this is not the point. My point is that GC became a big thing in the late '90s and early 2000s, which is in part owed to Java having become the religion of the day (not Lisp or SmallTalk)[1]. D couldn't have afforded not to have GC when it first came out. It was expected of a (new) language to provide GC by then - and GC had become a selling point for new languages.
>
> [1] And of course computers had become more powerful and could handle the overhead of GC better than in the '80s.
>
>> What was "new" with Java was compile-once-run-everywhere. Although, that wasn't new either, but it was at least marketable as new.
>>
>>> Java was the main catalyst for GC - or at least for people demanding it. Practically everybody who had gone through IT courses, college etc. with Java (and there were loads) wanted GC. It was a given for many people.
>>
>> Well, yes, of course Java being used in universities created a demand for Java and similar languages. But GC languages were extensively used in universities before Java.
>
>>> Yes, it didn't last long. But the fact that they bothered to introduce it, shows you how big GC was/is.
>>
>> No, it shows how demanding manual reference counting was in Objective-C on regular programmers. GC is the go-to solution for easy memory management, and has been so since the 60s. Most high level languages use garbage collection.
>
> It wasn't demanding. I wrote a lot of code in Objective-C and it was perfectly doable. You even have features like `autorelease` for return values. The thing is that Apple had become an increasingly popular platform and more and more programmers were writing code for OS X. So they thought they'd make it easier and reduce potential memory leaks (introduced by not so experienced Objective-C coders) by adding GC, especially because a lot of programmers expected GC "in this day and age".
>

July 12, 2016
On Monday, 11 July 2016 at 18:14:11 UTC, Paulo Pinto wrote:
> Actually NeXTStep drivers were written in Objective-C.
>

NeXT was a cool concept, but it was sad that they picked such an annoying language to build it.

> They are not alone; as of Android N, Google is making it pretty clear that if one tries to circumvent the constrained set of NDK APIs and work around the JNI to
> access existing shared objects, the application will simply be killed.

I don't do Android programming, but NDK is actually fairly rich in comparison to Apple OSes without Objective-C bindings AFAIK. The problem seems to be more in the varying hardware configurations / quality of implementation.

Not using Java on Android sounds like a PITA to be honest.

> If you check the latest BUILD, the current approach being evangelised is .NET Native for 90% of the code, C++/CX or plain C++ with WRL for glueing to low level code until C# gets the missing features from System C#, and C++ for everything else.

I don't know much about .NET Native, does it apply to or will they bring it to .NET Core?

A change in recent years is that Microsoft appears to invest more in their C++ offering, so apparently they no longer see C# as a wholesale replacement.

> The WinRT, User Driver Framework, the new container model and Linux subsystem, the Checked C, input to the C++ Core

I haven't paid much attention to WinRT lately, they have a Linux subsystem?

July 12, 2016
On Tuesday, 12 July 2016 at 03:25:38 UTC, Ola Fosheim Grøstad wrote:
> On Monday, 11 July 2016 at 18:14:11 UTC, Paulo Pinto wrote:

> I don't do Android programming, but NDK is actually fairly rich in comparison to Apple OSes without Objective-C bindings AFAIK. The problem seems to be more in the varying hardware configurations / quality of implementation.

Not really, it is a real pain to use and feels like a half-baked
solution developed by people who were forced by their manager to
support anything other than Java.

The iOS and WP SDKs have much better support for C++, especially the
integration with native APIs via Objective-C++ and C++/CX, and the
debugging tools.

>
> Not using Java on Android sounds like a PITA to be honest.

Yes, the Android team goes to great lengths to make it feel like that.


> I don't know much about .NET Native, does it apply to or will they bring it to .NET Core?

Yes, it is called CoreRT.

>
> A change in recent years is that Microsoft appears to invest more in their C++ offering, so apparently they no longer see C# as a wholesale replacement.

Not really, the big loser is C.

After the OS Dev team won the political war against the DevTools team,
thanks to the Longhorn debacle, the wind shifted toward the whole
"going native" theme.

In parallel, the whole Midori effort was ramped down and its learnings
were brought back to the production side of Microsoft.

Also, contrary to what Microsoft tried to push with C++/CX on WinRT,
not many besides game developers decided to embrace it.

So the result is C# getting the nice features from System C#, plus AOT
compilation to native code via the Visual C++ backend.

At the same time, the internal efforts to clean up C++ code were taken outside, and the C++ Core Guidelines were born.

Also, Kenny Kerr, a very vocal C++ MVP (and MSDN Magazine contributor) who opposed C++/CX, was hired and is now driving the effort to create a WinRT projection using plain standard modern C++.

>
>> The WinRT, User Driver Framework, the new container model and Linux subsystem, the Checked C, input to the C++ Core
>
> I haven't paid much attention to WinRT lately, they have a Linux subsystem?

Yes, it will be available in the upcoming Windows 10 Anniversary Update.

It is built on top of the Drawbridge picoprocesses that are now a Windows 10 feature.

Basically it only supports x64 ELF binaries and uses the picoprocess infrastructure to redirect Linux syscalls into NT ones.

It is a collaboration between Microsoft and Ubuntu and there are quite a few Channel 9 videos describing how the whole stack works.

July 12, 2016
On Tuesday, 12 July 2016 at 07:01:05 UTC, Paulo Pinto wrote:

> Also contrary to what Microsoft tried to push with C++/CX on WinRT, besides
> game developers not many decided to embrace it.
>

We didn't embrace it at all; we just have no choice but to use it for a lot of Xbox One SDK calls. Any files with CX enabled compile much slower, and we try to encapsulate them as much as possible.

April 11, 2021
On Monday, 11 July 2016 at 09:30:37 UTC, Chris wrote:
> On Sunday, 10 July 2016 at 03:25:16 UTC, Ola Fosheim Grøstad wrote:
>>
>> Just like there is no C++ book that does not rant about how great RAII is... What do you expect from a language evangelist? The first Java implementation Hotspot inherited its technology from StrongTalk, a Smalltalk successor. It was not a Java phenomenon, and FWIW both Lisp, Simula and Algol68 were garbage collected.
>
> Please stop intentionally missing the point. I don't care if Leonardo Da Vinci already had invented GC - which wouldn't surprise me -

Leonardo Da Vinci was coding in Haskell but he was calling it Haskellius idioma programatoribus...

> but this is not the point. My point is that GC
> became a big thing in the late '90s and early 2000s, which is in part owed to Java having become the religion of the day (not Lisp or SmallTalk)[1]. D couldn't have afforded not to have GC when it first came out. It was expected of a (new) language to provide GC by then - and GC had become a selling point for new languages.
>
> [1] And of course computers had become more powerful and could handle the overhead of GC better than in the '80s.
>
>> What was "new" with Java was compile-once-run-everywhere. Although, that wasn't new either, but it was at least marketable as new.
>>
>>> Java was the main catalyst for GC - or at least for people demanding it. Practically everybody who had gone through IT courses, college etc. with Java (and there were loads) wanted GC. It was a given for many people.
>>
>> Well, yes, of course Java being used in universities created a demand for Java and similar languages. But GC languages were extensively used in universities before Java.
>
>>> Yes, it didn't last long. But the fact that they bothered to introduce it, shows you how big GC was/is.
>>
>> No, it shows how demanding manual reference counting was in Objective-C on regular programmers. GC is the go-to solution for easy memory management, and has been so since the 60s. Most high level languages use garbage collection.
>
> It wasn't demanding. I wrote a lot of code in Objective-C and it was perfectly doable. You even have features like `autorelease` for return values. The thing is that Apple had become an increasingly popular platform and more and more programmers were writing code for OS X. So they thought they'd make it easier and reduce potential memory leaks (introduced by not so experienced Objective-C coders) by adding GC, especially because a lot of programmers expected GC "in this day and age".


April 11, 2021
On Sunday, 11 April 2021 at 15:42:55 UTC, Alessandro Ogheri wrote:
> On Monday, 11 July 2016 at 09:30:37 UTC, Chris wrote:
>>
>> Please stop intentionally missing the point. I don't care if Leonardo Da Vinci already had invented GC - which wouldn't surprise me -
>
> Leonardo Da Vinci was coding in Haskell but he was calling it Haskellius idioma programatoribus...

I appreciate a good joke as much as the next guy, but was it really necessary to reply to a four-year-old thread for this?