November 02, 2017
On 02/11/17 07:13, H. S. Teoh wrote:
> There is another side to this argument, though.  How many times have
> *you* reviewed the source code of the software that you use on a daily
> basis?  Do you really *trust* the code that you theoretically *can*
> review, but haven't actually reviewed?  Do you trust the code just
> because some random strangers on the internet say they've reviewed it
> and it looks OK?

This question misses the point. The point is not that you, personally, review every piece of code that you use. That is, if not completely impossible, at least highly impractical.

The real point is that it is *possible* to review the code you use. You don't have to personally review it, so long as someone did.

I think the best example of how effective this capability is comes from the case where it supposedly failed: OpenSSL and Heartbleed.

Recap: some really old code in OpenSSL had a vulnerability that could remotely expose secret keys from within the server. The model came under heavy criticism because it turned out that, despite OpenSSL being a widely used library, its code was so convoluted that nobody reviewed it.
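
For the sake of illustration, here is a rough D sketch of the *class* of bug involved (a length field taken on trust). This is not OpenSSL's actual code, and the function names are made up:

// Hypothetical sketch, not OpenSSL's code: the reply is sized from the
// attacker-supplied length field, so the copy reads past the real payload
// and leaks whatever memory happens to follow it.
ubyte[] heartbeatBuggy(const(ubyte)[] payload, size_t claimedLen)
{
    auto reply = new ubyte[claimedLen];
    reply[] = payload.ptr[0 .. claimedLen];  // unchecked @system overread
    return reply;
}

// Same idea with the length clamped to what was actually received.
ubyte[] heartbeatChecked(const(ubyte)[] payload, size_t claimedLen) @safe
{
    import std.algorithm.comparison : min;
    return payload[0 .. min(claimedLen, payload.length)].dup;
}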

The result: a massive overhaul effort, led by the OpenBSD team, which resulted in a compatible fork called LibreSSL.

In other words, even when the "many eyes" assumption fails, the recovery is much faster than when the code is closed.

Shachar
November 02, 2017
On Thu, Nov 02, 2017 at 07:08:51AM +0000, codephantom via Digitalmars-d wrote:
> On Wednesday, 1 November 2017 at 18:42:07 UTC, Bo wrote:
> > Linux as a market that is so fragmented on the desktop level.
> 
> This demonstrates an all-too-common misunderstanding of the purpose of Linux, and of open source in general (depending on what licence is used).
> 
> Fragmentation is an important, necessary, and inevitable outcome of open source.
> 
> Open source provides the freedom for the user to adapt the software to their own environment. That's the whole point... to enable that kind of 'fragmentation'.
[...]

Yeah, I'm not sure 'fragmentation' is the right word to use, but 'customizability' is certainly a big, *huge* factor in my choosing to use Linux instead of Windows. And I don't mean just customization in the way of choosing "themes", which is purely cosmetic.  I mean reaching into the guts of the system and changing how it works, according to my liking.

In fact, I despise the so-called "desktop metaphor" -- I think it's a silly idea that doesn't match how the machine works -- so I reconfigure my X server to use Ratpoison instead, a "window" manager that eliminates the mouse and basically maximizes everything: single screen, single window, no toolbars, no title decorations, nothing, with keyboard controls for everything.  Almost every other Linux user (to say nothing of Windows users) wouldn't even be able to *breathe* in such a setup, but that's OK, because Linux is not tied to a single UI, whether GUI or otherwise.  You can use whatever GUI or "desktop" environment you wish, and it will all still work.  This flexibility allows everyone to customize their environment to what suits them best, rather than have some predefined, unchangeable default shoved down everyone's throats.

With Windows, there is no way to go that far... even what it *does* allow you to do can cause random stuff to break, 'cos programs are written with the assumption that you *never* change how things work. (Try changing mouse focus to lazy focus sometime...  and watch how many applications malfunction, behave oddly, or just plain break. And this is not even a major customization!)

Understandably, though, most non-programmer types prefer the familiarity and comfort of Windows' default environment.  That's why Windows will still be around for the next little while. :-P


T

-- 
I see that you JS got Bach.
November 02, 2017
On Thu, Nov 02, 2017 at 09:16:02AM +0000, Dave Jones via Digitalmars-d wrote:
> On Thursday, 2 November 2017 at 08:59:05 UTC, Patrick Schluter wrote:
> > On Thursday, 2 November 2017 at 06:28:52 UTC, codephantom wrote:
> > > 
> > > But Ken Thompson summed it all up nicely: "You can't trust code that you did not totally create yourself."
> > 
> > Even that is wrong. You can trust code you create yourself only if it was reviewed by others as involved as you. I do not trust the code I write. The code I write generally conforms to the problem I think it solves. More than once I was wrong in my assumptions, and therefore my code was wrong, even if perfectly implemented.
> 
> He means trust in the sense that there's no nefarious payload hidden in there, not that it works properly.
[...]

Sometimes the line is blurry, though.  OpenSSL with the Heartbleed bug has no nefarious payload -- but I don't think you could say you "trust" it.  Trust is a tricky thing to define.

But more to the original point: Thompson's article on trusting trust goes deeper than mere code.  The real point is that ultimately, you have to trust some upstream vendor "by faith", as it were, because if you want to be *really* paranoid, you'll have to question not only whether your compiler comes with a backdoor of the kind Thompson describes in the article, but also whether there's something nefarious going on with the *hardware* your code is running on.  I mean, these days, CPUs come with microcode, so even if you had access to a known-to-be-uncompromised disassembler and reviewed the executable instruction by instruction, in a philosophical sense you *still* cannot be sure that when you hand this machine code to the CPU, it will not do something nefarious. What if the microcode was compromised somewhere along the line?  And even if you could somehow review the microcode and verify that it doesn't do anything nefarious, do you really trust that the CPU manufacturer hasn't modified some of the CPU design circuitry to do something nefarious? You can review the VLSI blueprints for the CPU, but how do you know the factory didn't secretly modify the hardware?  If you *really* wish to be 100% sure about anything, you'll have to use a scanning electron microscope to verify that the hardware actually does what the manufacturer says it does and nothing else.
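
For anyone who hasn't read the article, the compiler half of the attack boils down to something like the following D-flavored sketch. It's purely illustrative -- no real compiler works on raw source strings like this, and every name below is invented:

import std.algorithm.searching : canFind;
import std.array : replace;

// Stand-in for the real code generator; the attack happens before it runs.
string translateToMachineCode(string source) { return source; }

// The backdoor lives only in the compiler *binary*: auditing the (clean)
// compiler source tells you nothing, because the bootstrapping binary
// re-inserts these two checks every time the compiler compiles itself.
string compile(string source)
{
    // 1. Compiling the login program? Weaken the password check.
    if (source.canFind("bool login("))
        source = source.replace(`checkPassword(pw)`,
                                `checkPassword(pw) || pw == "backdoor"`);

    // 2. Compiling the compiler itself? Smuggle steps 1 and 2 into the output.
    if (source.canFind("string compile("))
        source = source.replace("/*inject-here*/", selfReplicatingPayload);

    return translateToMachineCode(source);
}

// A quoted copy of the two if-blocks above, so the trick survives rebuilds.
enum selfReplicatingPayload = `/* ...the two checks above, as a string... */`;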

(Not to mention, even if you *were* able to review every atom of your CPU to be sure it does what it's supposed to and nothing else, how do you know your hard drive controller isn't somehow compromised to deliver a different, backdoored version of your code when you run the executable, but deliver the innocent reflection of the source code when you're reviewing the binary? So you'll have to use the electron microscope on your HD controller too. And the rest of your motherboard and everything else attached to it.)

Of course, practically speaking, somewhere along the spectrum between reviewing code and using an electron microscope (and even in the latter case, one has to question whether the microscope manufacturer inserted something nefarious to hide a hardware exploit -- so you'd better build your own electron microscope from the ground up), there is a point where you'd just say, OK, this is good enough, I'll just have to take on faith that below this level, everything works as advertised.  Otherwise, you'd get nothing done, because nobody has a long enough lifetime, nor the patience, nor the requisite knowledge, to review *everything* down to the transistor level.  Somewhere along the line you just have to stop and take on faith that everything past that point isn't compromised in some way.

And yes, I said and meant take on *faith* -- because even peer review is a matter of faith -- faith that the reviewers don't have a hidden agenda and aren't colluding in secret to push one. It's very unlikely to happen in practice, but you can't be *sure*. And that's the point Thompson was getting at.  You have to build up trust from *somewhere* other than ground zero.  And because of that, you should, on the other hand, always be prepared to mitigate unexpected circumstances that may compromise the trust you've placed in something.  Rather than becoming paranoid and locking yourself in a Faraday cage inside an underground bunker, isolated from the big bad world, and building everything from scratch yourself, you decide at what level to start building your trust, and prepare ways to mitigate problems when it turns out that what you trusted wasn't that trustworthy after all.

So if you want to talk about trust, open source code is only the tip of the iceberg.  The recent fiasco about buggy TPM chips generating easily-cracked RSA keys is ample proof of this.  Your OS may be fine, but when it relies on a TPM chip that has a bug, you have a problem. And this is just a *bug* we're talking about.  What if it wasn't a bug, but a deliberate backdoor inserted by the NSA or some agency with an ulterior motive?  Your open source OS won't help you here. And yes, the argument has been made that if only the TPM code were open source, the bug would have been noticed. But again, that depends. Just because the code is open source doesn't guarantee it's getting the attention it needs. And even if it is, there's always the question of whether the hardware it's running on isn't compromised.  At *some* point, you just have to draw the line and take things on faith, otherwise you have no choice but to live in a Faraday cage inside an underground bunker.


T

-- 
2+2=4. 2*2=4. 2^2=4. Therefore, +, *, and ^ are the same operation.
November 02, 2017
On Thu, Nov 02, 2017 at 08:53:07AM +0000, Patrick Schluter via Digitalmars-d wrote:
> On Thursday, 2 November 2017 at 05:13:42 UTC, H. S. Teoh wrote:
[...]
> And that's a nice argument for D (dmd, phobos), as it is quite compact and relatively well written, so that it can be reviewed by mere mortals. Ever tried to read gcc or glibc? Forget about it if you're not an astronaut.

Yeah, I've (tried to) read glibc source code before.  It's ... not for
the uninitiated. :-P

Which brings up another point about open source code: just because you can *see* the code, doesn't guarantee you'll *understand* enough of it to verify its correctness.


> Even while not knowing D all that well, I could understand what was going on in Phobos and check some of the common pitfalls [...]

Yeah, that was one thing that totally amazed me about D the first time I looked at the Phobos source code.  It's sooo readable!!!!!  Very unlike most of the standard library sources of other languages that I've tried to read.  The fact that D allows the Phobos authors to express the complex concepts needed in a standard library in a readable, maintainable way was a big selling point of D for me.

There *are* some dark, dirty corners in Phobos where the code makes you cringe... but generally speaking, these are restricted to a few rare places, rather than being pervasive throughout the code the way, say, glibc source code is.  Or any sufficiently complex C/C++ library, really, which generally tends to slide into macro spaghetti hell, conditional #ifdef nightmare, and/or non-standard compiler extension soup that drowns out any semblance of "normal" C/C++ syntax.
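
To give a feel for what I mean by readable -- this is not a quote from Phobos, just a made-up example in the same style -- here's a tiny generic algorithm that reads almost like its specification, with no macros or #ifdef soup in sight:

import std.range.primitives : isInputRange, empty, front, popFront;

/// Count how many leading elements of `r` satisfy `pred`.
size_t countWhile(alias pred, Range)(Range r)
    if (isInputRange!Range)
{
    size_t n;
    for (; !r.empty && pred(r.front); r.popFront())
        ++n;
    return n;
}

unittest
{
    assert(countWhile!(x => x < 3)([1, 2, 5, 1]) == 2);
    assert(countWhile!(c => c == ' ')("   hi") == 3);
}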


> > At least with open source code disinterested 3rd parties can review the code without undue bias and notice problems (and ostensibly, fix them).  But let's not kid ourselves that open source is *necessarily* better. It *can* be better in some cases, but it depends.  Trust is a far more complex issue than "proprietary is bad, open source is good", as certain open source zealots would have us believe.  It takes more than just being open source; other factors also play a critical role, so just because something is open source guarantees nothing.
> > 
> There are also some open source projects that are maintained by dicks, and working with them makes the whole experience nasty.

Yeah.  There's always the option to fork, of course, which isn't possible with proprietary software.  But even then, they can still make your life a living hell if you're unlucky enough to get on their wrong side.


T

-- 
Philosophy: how to make a career out of daydreaming.
November 02, 2017
On Thu, Nov 02, 2017 at 11:38:21AM +0200, Shachar Shemesh via Digitalmars-d wrote:
> On 02/11/17 07:13, H. S. Teoh wrote:
> > There is another side to this argument, though.  How many times have *you* reviewed the source code of the software that you use on a daily basis?  Do you really *trust* the code that you theoretically *can* review, but haven't actually reviewed?  Do you trust the code just because some random strangers on the internet say they've reviewed it and it looks OK?
> 
> This question misses the point. The point is not that you, personally, review every piece of code that you use. That is, if not completely impossible, at least highly impractical.
> 
> The real point is that it is *possible* to review the code you use. You don't have to personally review it, so long as someone did.

That only shifts one question to another, though: do you trust the "someone" who did review the code?  That is what I mean by "some random strangers on the internet".  When you download, say, glibc, whose authors you presumably never met and may not even have heard of until that point, you're basically trusting that these authors have done their due diligence in reviewing the code and making sure it meets some standard of quality.  But you *don't know* whether they reviewed it or not, and even if they did, you don't know whether their standard of quality matches yours.  After all, they are just some "random strangers on the internet" whom you've never met, and probably never heard of.  Yet you're putting your trust in them to write proper software that will be running on your system.

Please keep in mind, I'm not saying that *in general*, you can't trust the upstream authors.  But the issue here, which is also Thompson's point in his article, is, how do you know whether or not your trust is misplaced?  You can't know for sure.  At some level, you just have to stop and take it *on faith* that these "random online strangers" are giving you good code, because as you said, to go down the complete paranoia road is highly impractical, if not completely impossible.

But if you're going to put your trust in said random online strangers, what makes you think they are more trustworthy than some random anonymous employees of some big corporation, whose proprietary software you're running on your system?  Again, you can't know *for sure*.  At some point, it just comes down to trusting that they have done their jobs well, and without nefarious motives.  So where you put your trust is a matter of faith, not fact, because you can't *objectively* be completely sure unless you go down the paranoia road to personally verifying everything, which is an infeasible, if not outright impossible, task.


> I think the best example of how effective this capability is comes from the case where it supposedly failed: OpenSSL and Heartbleed.
> 
> Recap: some really old code in OpenSSL had a vulnerability that could remotely expose secret keys from within the server. The model came under heavy criticism because it turned out that, despite OpenSSL being a widely used library, its code was so convoluted that nobody reviewed it.

And that's the other thing about open source: sure, the code is available for everyone to read.  But how many will actually understand it?  If it's so convoluted, as you said, nobody will review it. Or if they do, you'd have less confidence that they caught all of the problems.


> The result: a massive overhaul effort, led by the OpenBSD team, which resulted in a compatible fork called LibreSSL.
> 
> In other words, even when the "many eyes" assumption fails, the recovery is much faster than when the code is closed.
[...]

Ahem. It took the LibreSSL folk *years* to clean up the original OpenSSL code and bring it up to equivalent functionality.  That's hardly what I'd call "much faster".

Don't get me wrong; personally I agree with you that open source is better.  All I'm saying is that this eventually boils down to a matter of opinion, because ultimately, you're trusting, on faith, that this way of doing things will produce better results.  Does it actually?  It's hard to say.  I like to interpret the evidence as yes, and I think you do too, but I'm not 100% sure it's not just confirmation bias.  It's hard to be sure, because you can't know until you personally verify everything. But you can't do that, so eventually you have to just trust that it does what you think it does, and hope for the best.  How will things pan out eventually? It's anyone's guess.


T

-- 
English is useful because it is a mess. Since English is a mess, it maps well onto the problem space, which is also a mess, which we call reality. Similarly, Perl was designed to be a mess, though in the nicest of all possible ways. -- Larry Wall
November 03, 2017
On Wednesday, 1 November 2017 at 08:49:05 UTC, Joakim wrote:
> On Wednesday, 1 November 2017 at 00:16:19 UTC, Mengu wrote:
>> On Monday, 30 October 2017 at 13:32:23 UTC, Joakim wrote:
>>>
>>> I don't know how intense your data analysis is, but I replaced a Win7 ultrabook that had a dual-core i5 and 4 GBs of RAM with an Android tablet that has a quad-core ARMv7 and 3 GBs of RAM as my daily driver a couple years ago, without skipping a beat.
>>>  I built large mixed C++/D codebases on my ultrabook, now I do that on my Android/ARM tablet, which has a slightly weaker chip than my smartphone.

How does the performance compare between an i5 laptop and an Android tablet?


>>
>> Why do predictions about the future matter when at the present Windows dominates the desktop and is also strong in the server space?
>
> Because that desktop market matters much less than it did before, see the current mobile dominance, yet the D core team still focuses only on that dying x86 market.  As for the future, why spend time getting D great Windows IDE support if you don't think Windows has much of a future?
>

The concept that you are proposing, that people will get rid of ALL their desktops and laptops for phones or tablets, doesn't seem to be happening right now. At this point, were they to do that, they would end up with a machine that has less power in most cases (there are Atom and Celeron laptops), and probably less memory and disk storage. That solution would be most attractive to Chromebook-type users and very low-end laptop users. And while people buy low-spec laptops and desktops, there are still many laptops and desktops sold with chips that aren't named Atom and Celeron or ARM. If phones and tablets try to get chips as powerful as those for desktops and laptops, they run into the chip maker's problem - the more processing power, the more electricity the chip uses. Phones and tablets don't plug into the wall, and their batteries are smaller than the ones in laptops. And in order to use a phone/tablet as a "lean forward" device (as opposed to "lean back") and do work, people will have to spend money on a "laptop shell" with a screen and keyboard and probably an SSD/HD, which will cancel out most of the cost savings from not buying a laptop.

In the case of trying to court Android development, I read that 95% of Android development is done in Java (and maybe other JVM languages like the now "officially supported" Kotlin) and 5% in C or C++. But that 5% is for applications that have a need for high performance, which is mostly games. Good luck selling game developers on using D to develop for Android, when you can't supply those same game developers a top-notch development environment for the premier platform for performance-critical games - Windows 64-bit.

>> I have seen conflicting reports about what OS is bigger in the server market, but Windows is substantial and the more frequent winner.
>>
>> https://community.spiceworks.com/networking/articles/2462-server-virtualization-and-os-trends
>>
>> https://www.1and1.com/digitalguide/server/know-how/linux-vs-windows-the-big-server-check/
>
> I have never seen any report that Windows is "bigger in the server market."

I linked one that said:

"And what OSes are running in virtual machines and on physical servers around the world? It turns out like with client OSes, Microsoft is dominant. Fully 87.7% of the physical servers and VMs in the Spiceworks network (which are mostly on-premises) run Microsoft Windows Server."

> Last month's Netcraft survey notes,
>
> "which underlying operating systems are used by the world's web facing computers?
>
> By far the most commonly used operating system is Linux, which runs on more than two-thirds of all web-facing computers. This month alone, the number of Linux computers increased by more than 91,000; and again, this strong growth can largely be attributed to cloud hosting providers, where Linux-based instances are typically the cheapest and most commonly available."
> https://news.netcraft.com/archives/2017/09/11/september-2017-web-server-survey.html

Web-facing servers are a subset of all servers. Shared web hosting services are probably a harder target for native-code applications than internal IT servers.

But regardless of whether Windows is dominant, or just widely used, you haven't made predictions that Windows servers are going to die.

>
> Your first link is actually a bad sign for Windows, as it's likely just because companies are trying to save money by having their employees run Windows apps off a virtualized Windows Server, rather than buying a ton more Windows PCs.

I would say that is an unlikely scenario. Companies use virtual machines for servers because they allow the email server and/or HTTP server and/or database server and/or application server to be on one physical machine, and allow the system administrator to reboot an OS or take a server offline when making an upgrade or bug fix without affecting the applications running on the other servers.

> Meanwhile, your second link sees "Linux maintaining a noticeable lead" in the web-hosting market.

Don't know why I linked that as it doesn't even have a percentage breakdown. My intent was to show a web server breakdown but I will concede that Linux is bigger for web servers. However, Windows is still big and you aren't predicting it will die.

>
>> And if desktop OSes were going to go away, the MacOS would go before Windows.
>
> Oh, Apple wants that to happen, one less legacy OS to support, which is why all the Mac-heads are crying, because macOS doesn't get much attention nowadays.  Do you know the last time Apple released a standalone desktop computer?  2014, when they last updated the Mac Mini.  They haven't updated the Mac Pro since 2013.

Why do you think it is that they haven't come out with an iOS Mac Mini or iOS MacBook?

>
> They see the writing on the wall, which is why they're lengthening their release cycles for such legacy products.
>

Do they want them to go away, or do they just see the handwriting on the wall? Given that they still make them, it appears that they don't want them to go away. They can stop making them at any time. And by them, I mean their entire macOS (i.e. non-mobile) line. I think the Mac Mini/Mac Pro pale in sales compared to the iMacs as far as Apple desktop sales go.


If you look at the graph in this article, the iPad has declined more as a percentage of Apple revenue than the macOS line has in the last five years.

https://www.statista.com/statistics/382260/segments-share-revenue-of-apple/


There is a case to be made for supporting Android/iOS cross-compilation. But it doesn't have to come at the expense of Windows 64-bit integration. I'm not sure they even involve the same skill sets. Embarcadero and RemObjects both now support Android/iOS development from their Windows (and macOS, in the case of RemObjects) IDEs.
November 03, 2017
On Thursday, 2 November 2017 at 05:13:42 UTC, H. S. Teoh wrote:
> One thing is clear, though: claiming that Windows is "dead" is, frankly, ridiculous.  Even a non-Windows person like me who rarely has any reason to notice things Windows-related, can see enough circumstantial evidence around me that Windows is still very much alive and kicking.  (Even if in my ideal world there would be no Windows... but then, if the world were my ideal, 90% of computer users out there would probably be very angry about being forced to use obscure text-only interfaces that I'm completely comfortable in.  So it's probably not a bad thing the real world doesn't match my ideal one. :-D)

Congratulations, you find a claim that literally nobody has made in this thread to be ridiculous.  Next you'll say that Walter's claim that Java will replace COBOL is ridiculous, or that Adam's claim that we should write a full crypto stack ourselves in D is a bad idea -- claims that neither of them ever made.

On Friday, 3 November 2017 at 06:20:25 UTC, Tony wrote:
> On Wednesday, 1 November 2017 at 08:49:05 UTC, Joakim wrote:
>> On Wednesday, 1 November 2017 at 00:16:19 UTC, Mengu wrote:
>>> On Monday, 30 October 2017 at 13:32:23 UTC, Joakim wrote:
>>>>
>>>> I don't know how intense your data analysis is, but I replaced a Win7 ultrabook that had a dual-core i5 and 4 GBs of RAM with an Android tablet that has a quad-core ARMv7 and 3 GBs of RAM as my daily driver a couple years ago, without skipping a beat.
>>>>  I built large mixed C++/D codebases on my ultrabook, now I do that on my Android/ARM tablet, which has a slightly weaker chip than my smartphone.
>
> How does the performance compare between an i5 laptop and an Android tablet?

My core i5 ultrabook died in late 2015, so I never ran any performance comparisons.  I'd say that its 2012 Sandy Bridge dual-core i5 was likely a little faster at compiling the same code than the 2014 quad-core Cortex-A15 I'm using in my tablet now.  I've recently been trying out AArch64 support for D on a 2017 Android tablet which has one of the fastest quad-core ARMv8 chips from 2016; I'd guess that's faster than the i5.  But this is all perception, I don't have measurements.

>>> Why do predictions about the future matter when at the present Windows dominates the desktop and is also strong in the server space?
>>
>> Because that desktop market matters much less than it did before, see the current mobile dominance, yet the D core team still focuses only on that dying x86 market.  As for the future, why spend time getting D great Windows IDE support if you don't think Windows has much of a future?
>>
>
> The concept that you are proposing, that people will get rid of ALL their desktops and laptops for phones or tablets, doesn't seem to be happening right now.

To begin with, I never said they'd "ALL" be replaced in the paragraph you're quoting above, but yes, that's essentially what will eventually happen.  And of course it's happening right now, why do you think PC sales are down 25% over the last six years, after rising for decades?  For many people, a PC was overkill but they didn't have a choice of another easier form factor and OS.  Now they do.

> At this point, were they to do that, they would end up with a machine that has less power in most cases (there are Atom and Celeron laptops), and probably less memory and disk storage. That solution would be most attractive to Chromebook-type users and very low-end laptop users. And while people buy low-spec laptops and desktops, there are still many laptops and desktops sold with chips that aren't named Atom and Celeron or ARM. If phones and tablets try to get chips as powerful as those for desktops and laptops, they run into the chip maker's problem - the more processing power, the more electricity the chip uses. Phones and tablets don't plug into the wall, and their batteries are smaller than the ones in laptops. And in order to use a phone/tablet as a "lean forward" device (as opposed to "lean back") and do work, people will have to spend money on a "laptop shell" with a screen and keyboard and probably an SSD/HD, which will cancel out most of the cost savings from not buying a laptop.

You seem wholly ignorant of this market and the various points I've made in this thread.  Do you know what the median Windows PC sold costs?  Around $400.  Now shop around, are you finding great high-spec devices at that price?  The high-spec market that you focus on is a tiny niche, the bulk of the PC market is easily eclipsed by mobile performance, which is why people are already turning in their PCs for mobile.

Battery life on mobile is already much better than laptops, for a variety of reasons including the greater efficiency of mobile ARM chips.  And the Sentio laptop shell I already linked in this thread has a screen, keyboard, and battery but no SSD/HD, which is why it only costs $150, much less than a laptop.

> In the case of trying to court Android development, I read that 95% of Android is done on Java (and maybe other JVM languages like the now "officially supported" Kotlin) and 5% in C or C++. But that 5% is for applications that have a need for high performance, which is mostly games. Good luck selling game developers on using D to develop for Android, when you can't supply those same game developers a top-notch development environment for the premier platform for performance critical games - Windows 64-bit.

I don't think the numbers favor Java quite so much, especially if you look at the top mobile apps, which are mostly games.  I don't know what connection you think there is between the AAA Windows gaming market and mobile games, nobody runs Halo on their mobile device.

btw, the mobile gaming market is now larger than the PC gaming market, so to think that they're sitting around using tools and IDEs optimized for that outdated PC platform is silly:

https://www.digitaltrends.com/gaming/pc-market-grew-in-2016-led-by-mobile-and-pc-gaming/

>>> I have seen conflicting reports about what OS is bigger in the server market, but Windows is substantial and the more frequent winner.
>>>
>>> https://community.spiceworks.com/networking/articles/2462-server-virtualization-and-os-trends
>>>
>>> https://www.1and1.com/digitalguide/server/know-how/linux-vs-windows-the-big-server-check/
>>
>> I have never seen any report that Windows is "bigger in the server market."
>
> I linked one that said:
>
> "And what OSes are running in virtual machines and on physical servers around the world? It turns out like with client OSes, Microsoft is dominant. Fully 87.7% of the physical servers and VMs in the Spiceworks network (which are mostly on-premises) run Microsoft Windows Server."
>
>> Last month's Netcraft survey notes,
>>
>> "which underlying operating systems are used by the world's web facing computers?
>>
>> By far the most commonly used operating system is Linux, which runs on more than two-thirds of all web-facing computers. This month alone, the number of Linux computers increased by more than 91,000; and again, this strong growth can largely be attributed to cloud hosting providers, where Linux-based instances are typically the cheapest and most commonly available."
>> https://news.netcraft.com/archives/2017/09/11/september-2017-web-server-survey.html
>
> Web-facing server is a subset of servers. Shared web hosting services are probably a harder target for native-code applications than internal IT servers.

Web servers are a subset but by far the largest one, so any accounting of market share is going to be determined by them.  Native code has been dying on the server regardless of web or internal servers, but the real distinction is performance.  Facebook writes their backend in C++, the same for any server service that really needs to scale out, which is not likely to be internal IT.

> But regardless of whether Windows is dominant, or just widely used, you haven't made predictions that Windows servers are going to die.

I don't think about niche platforms that hardly anybody uses.

>> Your first link is actually a bad sign for Windows, as it's likely just because companies are trying to save money by having their employees run Windows apps off a virtualized Windows Server, rather than buying a ton more Windows PCs.
>
> I would say that is an unlikely scenario. Companies use virtual machines for servers because it allows for the email server and/or http server and/or database server and/or application server to be on one physical machine, and allow for the system administrator to reboot the OS or take the server offline when making an upgrade/bug fix, and not affect the applications running on the other servers.

I see, so your claim is that process or software isolation is so weak on Windows Server that they run multiple virtualized instances of Windows Server just to provide it.  Or maybe that Windows Server needs to be patched for security so often, that this helps a little with downtime.  I doubt they are running many WinServer instances like you say, given how resource-heavy each Windows Server instance is going to be.  But regardless of how you slice it, this isn't a good sign for Windows.

>> Meanwhile, your second link sees "Linux maintaining a noticeable lead" in the web-hosting market.
>
> Don't know why I linked that as it doesn't even have a percentage breakdown. My intent was to show a web server breakdown but I will concede that Linux is bigger for web servers. However, Windows is still big and you aren't predicting it will die.

I've actually said elsewhere in this forum that the cloud server market is way overblown and will greatly diminish in the coming years because of greater p2p usage, so yeah, I think both linux and Windows on the server will largely die off.

>>> And if desktop OSes were going to go away, the MacOS would go before Windows.
>>
>> Oh, Apple wants that to happen, one less legacy OS to support, which is why all the Mac-heads are crying, because macOS doesn't get much attention nowadays.  Do you know the last time Apple released a standalone desktop computer?  2014, when they last updated the Mac Mini.  They haven't updated the Mac Pro since 2013.
>
> Why do you think it is that they haven't come out with an iOS Mac Mini or iOS MacBook?

The Mac Mini is easy, they're just winding down that legacy form factor, like they did with the iPod for years.  Their only entry in that market is Apple TV running tvOS, which is more iOS than macOS.

As for the iOS MacBook, it's out: it's called the iPad Pro.  Their CEO, Tim Cook, is always boasting about how it's all he uses these days:

https://9to5mac.com/2012/02/14/tim-cook-ipad-80-90-of-tim-cooks-work-is-on-ipad-work-and-consumption/
http://appleinsider.com/articles/15/11/09/apple-ceo-tim-cook-says-he-travels-with-just-an-ipad-pro-and-iphone

>> They see the writing on the wall, which is why they're lengthening their release cycles for such legacy products.
>>
>
> Do they want them to go away, or do they see the handwriting on the wall? The fact that they still make them, it appears that they don't want them to go away. They can stop making them at any time. And by them, I mean their entire macOS (i.e. their non-mobile) line. I think that the Mac Mini/Mac Pro pale in sales to the iMacs as far as Apple desktop sales go.

Simple, they see the writing on the wall, ie much smaller sales than mobile, so they want the legacy product to go away, which means they can focus on the much bigger mobile market.  The only reason they still make them is to milk that market and support their legacy userbase, the same reason they were still selling the iPod Touch all these years after the iPhone came out.

They don't break out iMac sales but given that it's much more expensive than the Mac Mini, it's doubtful that it sells better.  I was only talking about Apple's standalone desktops because they're most comparable to the PC market, but it's true that PC/Mac all-in-ones like the iMac have done better lately, one of the few growing segments.  But when the entire desktop/laptop market is shrinking and the much more expensive all-in-one sales are so small, that doesn't mean much.

> If you look at the graph in this article, the iPad has declined more as a percentage of Apple revenue than the macOS line has in the last five years.
>
> https://www.statista.com/statistics/382260/segments-share-revenue-of-apple/

I don't have access to that chart, but yes, the iPad and tablet markets have been shrinking.  It's possible that more people would rather use their smartphone, which usually has a more powerful chip than Android tablets, with the Dex dock or a Sentio-like laptop shell than a tablet.  But neither group is using a PC: both are mobile, smartphone even more so.

> There is a case to be made for supporting  Android/iOS cross-compilation. But it doesn't have to come at the expense of Windows 64-bit integration. Not sure they even involve the same skillsets. Embarcadero and Remobjects both now support Android/iOS development from their Windows (and macOS in the case of Remobjects) IDEs.

You're right that some of the skills are different and D devs could develop for mobile from a Windows IDE.  But my point was more about general investment and focus, the currently dominant platform, Android, needs it, while the fading platform, Windows, shouldn't get much more.

Frankly, I find it tiresome that some Windows devs in this thread think the reason IDE support isn't better is because somebody is listening to me.  More likely, Rainer or whoever would do that work is already invested in Windows, but doesn't have the time or interest to do much more.

You'd be much better off finding that person and helping or sponsoring them rather than debating me, as I likely have no effect on that person's thinking.  I wish it were otherwise, but I doubt it.
November 03, 2017
On Friday, 3 November 2017 at 09:16:42 UTC, Joakim wrote:
>
>>>> Why do predictions about the future matter when at the present Windows dominates the desktop and is also strong in the server space?
>>>
>>> Because that desktop market matters much less than it did before, see the current mobile dominance, yet the D core team still focuses only on that dying x86 market.  As for the future, why spend time getting D great Windows IDE support if you don't think Windows has much of a future?
>>>
>>
>> The concept that you are proposing, that people will get rid of ALL their desktops and laptops for phones or tablets, doesn't seem to be happening right now.
>
> To begin with, I never said they'd "ALL" be replaced in the paragraph you're quoting above, but yes, that's essentially what will eventually happen.

You said 99% would go away. So "almost all".

> And of course it's happening right now, why do you think PC sales are down 25% over the last six years, after rising for decades?  For many people, a PC was overkill but they didn't have a choice of another easier form factor and OS.  Now they do.

There are other reasons for PC sales declining beyond someone just using a phone or a tablet. Some find their current PC fast enough and see no reason to upgrade as frequently as they did in the past - only a hard drive failure will trigger a PC upgrade for them.

Some have cut down from a desktop and a laptop to just a laptop as the laptops got faster. Or a family replaces some combination of laptops and desktops with a combination of laptops/desktops/tablets/phones.

That 25% is not indicative of 25% of homes getting rid of ALL of their PC/laptops.


>
>> At this point, were they to do that, they would end up with a machine that has less power in most cases (there are Atom and Celeron laptops), and probably less memory and disk storage. That solution would be most attractive to Chromebook-type users and very low-end laptop users. And while people buy low-spec laptops and desktops, there are still many laptops and desktops sold with chips that aren't named Atom and Celeron or ARM. If phones and tablets try to get chips as powerful as those for desktops and laptops, they run into the chip maker's problem - the more processing power, the more electricity the chip uses. Phones and tablets don't plug into the wall, and their batteries are smaller than the ones in laptops. And in order to use a phone/tablet as a "lean forward" device (as opposed to "lean back") and do work, people will have to spend money on a "laptop shell" with a screen and keyboard and probably an SSD/HD, which will cancel out most of the cost savings from not buying a laptop.
>
> You seem wholly ignorant of this market and the various points I've made in this thread.  Do you know what the median Windows PC sold costs?  Around $400.  Now shop around, are you finding great high-spec devices at that price?

You said 99% are going away. You need to talk about a lot more than median prices. But nevertheless, $400 laptops have better specs and performance than $400 tablets and phones. And you are good to go with a laptop. People who want to go down to the coffee shop and work on their term paper on a laptop just take the laptop. People who want to go down to the coffee shop and work on their term paper on a phone or tablet have to bring a keyboard and monitor (phone), or a keyboard and tablet stand and squint at their screen (tablet).


> The high-spec market that you focus on is a tiny niche, the bulk of the PC market is easily eclipsed by mobile performance, which is why people are already turning in their PCs for mobile.

I don't think that phones/tablets can compete performance-wise with $400-and-up machines, which you claim make up over 50% of the market.

> Battery life on mobile is already much better than laptops, for a variety of reasons including the greater efficiency of mobile ARM chips.

That is a common belief, but it is referred to as a myth in many places, including this article, which performed tests on different architectures and ends with:

"An x86 chip can be more power efficient than an ARM processor, or vice versa, but it’ll be the result of other factors — not whether it’s x86 or ARM."

https://www.extremetech.com/extreme/188396-the-final-isa-showdown-is-arm-x86-or-mips-intrinsically-more-power-efficient/3


> And the Sentio laptop shell I already linked in this thread has a screen, keyboard, and battery but no SSD/HD, which is why it only costs $150, much less than a laptop.

I see that 11.6" screen setup with the small storage of a phone as competition for $150 Chromebooks, not $400 Windows laptops. I would prefer to be on my Chromebook and take a call on my cell phone, rather than having my cellphone plugged into a docking station and have to unplug it or put it on speaker phone.


>
>> In the case of trying to court Android development, I read that 95% of Android is done on Java (and maybe other JVM languages like the now "officially supported" Kotlin) and 5% in C or C++. But that 5% is for applications that have a need for high performance, which is mostly games. Good luck selling game developers on using D to develop for Android, when you can't supply those same game developers a top-notch development environment for the premier platform for performance critical games - Windows 64-bit.
>
> I don't think the numbers favor Java quite so much, especially if you look at the top mobile apps, which are mostly games.  I don't know what connection you think there is between the AAA Windows gaming market and mobile games, nobody runs Halo on their mobile device.

I am assuming that game developers work in both spaces - if not concurrently, then by moving between the two.

It also may be incorrect to assume that D would be acceptable in its current incarnation for game development, due to the non-deterministic activity of the garbage collector. In that case, it would have little rationale for Android development. As far as iOS goes, there are two native-code languages with a large lead, and both use Automatic Reference Counting rather than garbage collection, which would presumably give them the advantage for games. But D could potentially compete for non-game development.

>
> btw, the mobile gaming market is now larger than the PC gaming market, so to think that they're sitting around using tools and IDEs optimized for that outdated PC platform is silly:
>
> https://www.digitaltrends.com/gaming/pc-market-grew-in-2016-led-by-mobile-and-pc-gaming/

Are you suggesting they are developing their games for iOS and Android devices ON those devices? Apple has Xcode for developing iOS apps, and it runs on macOS machines only. There is also the Xamarin IDE or IDE plug-in from Microsoft that allows C# on iOS, but it runs on macOS or Windows. For Android, there is Android Studio - "The Official IDE of Android" - which runs on Windows, macOS and Linux. There is no Android version.



>
>> But regardless of whether Windows is dominant, or just widely used, you haven't made predictions that Windows servers are going to die.
>
> I don't think about niche platforms that hardly anybody uses.

It is the dominant internal IT platform. That is not niche and not something that is "hardly used". But what you could say is that given your prediction that Windows sales will decline by 99%, Microsoft will go out of business.


>
>>> Your first link is actually a bad sign for Windows, as it's likely just because companies are trying to save money by having their employees run Windows apps off a virtualized Windows Server, rather than buying a ton more Windows PCs.
>>
>> I would say that is an unlikely scenario. Companies use virtual machines for servers because it allows for the email server and/or http server and/or database server and/or application server to be on one physical machine, and allow for the system administrator to reboot the OS or take the server offline when making an upgrade/bug fix, and not affect the applications running on the other servers.
>
> I see, so your claim is that process or software isolation is so weak on Windows Server that they run multiple virtualized instances of Windows Server just to provide it.  Or maybe that Windows Server needs to be patched for security so often, that this helps a little with downtime.  I doubt they are running many WinServer instances like you say, given how resource-heavy each Windows Server instance is going to be.  But regardless of how you slice it, this isn't a good sign for Windows.

They use virtualization for Linux for the same reason I stated - so the application/http/email/database server can be on an OS that can be rebooted to complete upgrades or a VM can be used as an isolated "sandbox" for testing upgrades of a particular server or some in-house developed software.


>>>> And if desktop OSes were going to go away, the MacOS would go before Windows.
>>>
>>> Oh, Apple wants that to happen, one less legacy OS to support, which is why all the Mac-heads are crying, because macOS doesn't get much attention nowadays.  Do you know the last time Apple released a standalone desktop computer?  2014, when they last updated the Mac Mini.  They haven't updated the Mac Pro since 2013.
>>
>> Why do you think it is that they haven't come out with an iOS Mac Mini or iOS MacBook?
>
> The Mac Mini is easy, they're just winding down that legacy form factor, like they did with the iPod for years.  Their only entry in that market is Apple TV running tvOS, which is more iOS than macOS.
>
> As for the iOS Macbook, it's out, it's called the iPad Pro.  Their CEO, Tim Cook, is always boasting about how it's all he uses these days:
>
> https://9to5mac.com/2012/02/14/tim-cook-ipad-80-90-of-tim-cooks-work-is-on-ipad-work-and-consumption/
> http://appleinsider.com/articles/15/11/09/apple-ceo-tim-cook-says-he-travels-with-just-an-ipad-pro-and-iphone
>

A CEO is a baby user of a PC. What would he do besides email? He has people to do his PowerPoint and documents. Not a good endorsement. And the iPad Pro is twice the price of what you say is the average price of a PC laptop. You could buy a Windows laptop and an Android ZenPad tablet and still have paid less than an iPad.

I'd like to be there when Cook tells all Apple employees they need to turn in their MacBooks for iPads.


>>> They see the writing on the wall, which is why they're lengthening their release cycles for such legacy products.
>>>
>>
>> Do they want them to go away, or do they see the handwriting on the wall? The fact that they still make them, it appears that they don't want them to go away. They can stop making them at any time. And by them, I mean their entire macOS (i.e. their non-mobile) line. I think that the Mac Mini/Mac Pro pale in sales to the iMacs as far as Apple desktop sales go.
>
> Simple, they see the writing on the wall, ie much smaller sales than mobile, so they want the legacy product to go away, which means they can focus on the much bigger mobile market.  The only reason they still make them is to milk that market and support their legacy userbase, the same reason they were still selling the iPod Touch all these years after the iPhone came out.

Why did they fund development of a new iMac Pro, which is coming this December, as well as the new MacBook Pros that came out this June? That contradicts "milk it like the iPod".
November 03, 2017
On Friday, 3 November 2017 at 11:57:58 UTC, Tony wrote:
> On Friday, 3 November 2017 at 09:16:42 UTC, Joakim wrote:
>>
>>>>> Why do predictions about the future matter when at the present Windows dominates the desktop and is also strong in the server space?
>>>>
>>>> Because that desktop market matters much less than it did before, see the current mobile dominance, yet the D core team still focuses only on that dying x86 market.  As for the future, why spend time getting D great Windows IDE support if you don't think Windows has much of a future?
>>>>
>>>
>>> The concept that you are proposing, that people will get rid of ALL their desktops and laptops for phones or tablets, doesn't seem to be happening right now.
>>
>> To begin with, I never said they'd "ALL" be replaced in the paragraph you're quoting above, but yes, that's essentially what will eventually happen.
>
> You said 99% would go away. So "almost all".

Yes, I was simply noting that I didn't say it "in the paragraph you're quoting above."

>> And of course it's happening right now, why do you think PC sales are down 25% over the last six years, after rising for decades?  For many people, a PC was overkill but they didn't have a choice of another easier form factor and OS.  Now they do.
>
> There are other reasons for PC sales declining beyond someone just using a phone or a tablet. Some find their current PC fast enough and see no reason to upgrade as frequently as they did in the past - only a hard drive failure will trigger a PC upgrade for them.
>
> Some have cut down from a desktop and a laptop to just a laptop as the laptops got faster. Or a family replaces some combination of laptops and desktops with a combination of laptops/desktops/tablets/phones.
>
> That 25% is not indicative of 25% of homes getting rid of ALL of their PC/laptops.

Sure, there are multiple reasons that PC sales are declining and many homes still keep a residual PC to get their work done.  With the DeX dock and Sentio shell coming out this year, my prediction is that those residual PCs will get swept out over the coming 5-10 years.

But that shrinking established PC userbase is not what you should be worried about.  I've talked to multiple middle-class consumers in developing markets - they would be considered poor in the US if you converted their income to dollars - who tell me that they recently got their first smartphone for $150-200 and that it is the first time they have ever used the internet, with cheap 3G/4G plans that are only now springing up.  They don't use the web, only mobile chat or social apps.

Now, do you think these billions of new users of computing and the internet are more likely to buy a cheap laptop shell or dock for their smartphone when they someday need to do some "lean forward" work, as you call it, or spend much more on a Windows PC?  I know where my bet is.

>>> At this point, were they to do that, they would end up with a machine that has less power in most cases (there are Atom and Celeron laptops), and probably less memory and disk storage. That solution would be most attractive to Chromebook-type users and very low-end laptop users. And while people buy low-spec laptops and desktops, there are still many laptops and desktops sold with chips that aren't named Atom and Celeron or ARM. If phones and tablets try to get chips as powerful as those for desktops and laptops, they run into the chip maker's problem - the more processing power, the more electricity the chip uses. Phones and tablets don't plug into the wall, and their batteries are smaller than the ones in laptops. And in order to use a phone/tablet as a "lean forward" device (as opposed to "lean back") and do work, people will have to spend money on a "laptop shell" with a screen and keyboard and probably an SSD/HD, which will cancel out most of the cost savings from not buying a laptop.
>>
>> You seem wholly ignorant of this market and the various points I've made in this thread.  Do you know what the median Windows PC sold costs?  Around $400.  Now shop around, are you finding great high-spec devices at that price?
>
> You said 99% are going away. You need to talk about a lot more than median prices. But nevertheless, $400 laptops have better specs and performance than $400 tablets and phones. And you are good to go with a laptop. People who want to go down to the coffee shop and work on their term paper on a laptop just take the laptop. People who want to go down to the coffee shop and work on their term paper on a phone or tablet, have to bring a keyboard and monitor (phone) or a keyboard and tablet stand and squint at their screen (tablet).

No, they'll bring a Sentio-like laptop shell, which only costs $150.  Your performance and portability arguments for PCs are losers; they're not affecting this mobile trend at all.  The biggest issue is that productivity apps have historically been developed for desktop OS's and are only starting to be ported over to or cloned on mobile, like Office Mobile or Photoshop Express.

>> The high-spec market that you focus on is a tiny niche, the bulk of the PC market is easily eclipsed by mobile performance, which is why people are already turning in their PCs for mobile.
>
> I don't think that phones/tablets can compete performance-wise with $400 and up machines, which you claim is over 50% of the market.

$400 PCs are vastly over-specced for most of their owners; they won't even use most of the compute headroom on a $200 smartphone, which is why they're already shifting.  The only issues holding the remaining 75% back are the need for mobile work accessories like Dex/Sentio and some PC-only apps, both of which are changing this year.

>> Battery life on mobile is already much better than laptops, for a variety of reasons including the greater efficiency of mobile ARM chips.
>
> That is a common belief, but it is referred to as a myth in many places, including this article, which performed tests on different architectures and ends with:
>
> "An x86 chip can be more power efficient than an ARM processor, or vice versa, but it’ll be the result of other factors — not whether it’s x86 or ARM."
>
> https://www.extremetech.com/extreme/188396-the-final-isa-showdown-is-arm-x86-or-mips-intrinsically-more-power-efficient/3

I'm not making theoretical comparisons about RISC versus CISC, but pointing to actual power and battery life measurements, where mobile ARM devices like the iPad Pro come out way ahead of equivalent x86 PCs like the Surface Pro 4 (scroll down to the sections on Energy Management):

https://www.notebookcheck.net/Apple-iPad-Pro-Tablet-Review.156404.0.html
https://www.notebookcheck.net/Apple-iPad-Pro-10-5-Tablet-Review.228714.0.html

Now, I initially said that ARM efficiency is only one factor in greater battery life; no doubt iOS is much more optimized for battery life than Windows.  But benchmarks of the ARM chips alone pretty much all find the same results.  I'm not interested in theories about how CISC x86 could be just as good if Intel just tried harder, especially since they raised the white flag and exited the mobile smartphone/tablet market:

https://www.recode.net/2016/5/2/11634168/intel-10-billion-on-mobile-before-giving-up

>> And the Sentio laptop shell I already linked in this thread has a screen, keyboard, and battery but no SSD/HD, which is why it only costs $150, much less than a laptop.
>
> I see that 11.6" screen setup with the small storage of a phone as competition for $150 Chromebooks, not $400 Windows laptops. I would prefer to be on my Chromebook and take a call on my cell phone, rather than having my cellphone plugged into a docking station and have to unplug it or put it on speaker phone.

I don't know why you're so obsessed with storage when even midrange smartphones come with 32 GBs nowadays, expandable to much more with an SD card.  My tablet has only 16 GBs of storage, with only 10-12 actually accessible, but I've never had a problem building codebases that take up GBs of space with all the object files, alongside a 64 GB microSD card holding many, mostly HD, TV shows and movies.

You're right that taking calls while using the smartphone to get work done could be a pain for some; I don't see that being a big issue, however.  Maybe those people will start carrying around cheap $10-20 Bluetooth handsets to take calls when their smartphone is tied up doing work, like some rich Chinese supposedly do with their phablets: ;)

https://www.theverge.com/2013/1/25/3915700/htc-mini-tiny-phone-companion-for-your-oversized-smartphone

>>> In the case of trying to court Android development, I read that 95% of Android is done on Java (and maybe other JVM languages like the now "officially supported" Kotlin) and 5% in C or C++. But that 5% is for applications that have a need for high performance, which is mostly games. Good luck selling game developers on using D to develop for Android, when you can't supply those same game developers a top-notch development environment for the premier platform for performance critical games - Windows 64-bit.
>>
>> I don't think the numbers favor Java quite so much, especially if you look at the top mobile apps, which are mostly games.  I don't know what connection you think there is between the AAA Windows gaming market and mobile games; nobody runs Halo on their mobile device.
>
> I am assuming that game developers work in both spaces, if not concurrently, they move between the two.

I think the overlap is much less than you seem to think.

> It also may be incorrect to assume that D would be acceptable in its current incarnation for game development due to the non-deterministic activity of the garbage collector. In which case, it would have little rationale for Android development. As far as iOS, there are two native code languages with a large lead, and both use Automatic Reference Counting rather than garbage collection, which would presumably give them the advantage for games. But D could potentially compete for non-game development.

Yeah, I already went over some of this in the other dlang forum thread about mobile that I linked initially.  Most mobile games would do better if written in D, but we don't yet have the D mobile libraries needed to make that easy for them.

>> btw, the mobile gaming market is now larger than the PC gaming market, so to think that they're sitting around using tools and IDEs optimized for that outdated PC platform is silly:
>>
>> https://www.digitaltrends.com/gaming/pc-market-grew-in-2016-led-by-mobile-and-pc-gaming/
>
> Are you suggesting they are developing their games for iOS and Android devices ON those devices? Apple has XCode for developing iOS apps and it runs on macOS machines only. There is also the Xamarin IDE or IDE plug-in from Microsoft that allows C# on iOS, but it runs on macOS or Windows. For Android, there is Android Studio - "The Official IDE of Android" - which runs on Windows, macOS and Linux. There is no Android version.

Yes, of course they're still largely developing mobile games on PCs, though I'm not sure why you think that matters.  But your original claim was that they're still using PC-focused IDEs, as opposed to the newer mobile-focused IDEs like Xcode or Android Studio that you now highlight.

I don't use any IDEs, so I honestly don't care which ones D supports, but my point was that mobile game devs don't need to use outdated PC-focused tools when mobile is a bigger business and they have their own mobile-focused tools nowadays.

>>> But regardless of whether Windows is dominant, or just widely used, you haven't made predictions that Windows servers are going to die.
>>
>> I don't think about niche platforms that hardly anybody uses.
>
> It is the dominant internal IT platform. That is not niche and not something that is "hardly used". But what you could say is that given your prediction that Windows sales will decline by 99%, Microsoft will go out of business.

Yes, Windows is dominant, but dominant in a niche: internal IT.  The consumer mobile market is much larger nowadays, and Windows has almost no share of it.

As for Microsoft, Windows is not their only product; they have moved Office onto the dominant mobile platforms.  As long as they keep supporting mobile, they could eke out an existence.  Their big bet on Azure is going to end badly, though.

>>>> Your first link is actually a bad sign for Windows, as it's likely just because companies are trying to save money by having their employees run Windows apps off a virtualized Windows Server, rather than buying a ton more Windows PCs.
>>>
>>> I would say that is an unlikely scenario. Companies use virtual machines for servers because it allows for the email server and/or http server and/or database server and/or application server to be on one physical machine, and allow for the system administrator to reboot the OS or take the server offline when making an upgrade/bug fix, and not affect the applications running on the other servers.
>>
>> I see, so your claim is that process or software isolation is so weak on Windows Server that they run multiple virtualized instances of Windows Server just to provide it.  Or maybe that Windows Server needs to be patched for security so often, that this helps a little with downtime.  I doubt they are running many WinServer instances like you say, given how resource-heavy each Windows Server instance is going to be.  But regardless of how you slice it, this isn't a good sign for Windows.
>
> They use virtualization for Linux for the same reason I stated - so the application/http/email/database server can be on an OS that can be rebooted to complete upgrades or a VM can be used as an isolated "sandbox" for testing upgrades of a particular server or some in-house developed software.

It seems containerization is taking off more on Linux now for such things, though Windows is trying to get into this too, following far behind as always.

>>>>> And if desktop OSes were going to go away, the MacOS would go before Windows.
>>>>
>>>> Oh, Apple wants that to happen, one less legacy OS to support, which is why all the Mac-heads are crying, because macOS doesn't get much attention nowadays.  Do you know the last time Apple released a standalone desktop computer?  2014, when they last updated the Mac Mini.  They haven't updated the Mac Pro since 2013.
>>>
>>> Why do you think it is that they haven't come out with an iOS Mac Mini or iOS MacBook?
>>
>> The Mac Mini is easy, they're just winding down that legacy form factor, like they did with the iPod for years.  Their only entry in that market is Apple TV running tvOS, which is more iOS than macOS.
>>
>> As for the iOS Macbook, it's out, it's called the iPad Pro.  Their CEO, Tim Cook, is always boasting about how it's all he uses these days:
>>
>> https://9to5mac.com/2012/02/14/tim-cook-ipad-80-90-of-tim-cooks-work-is-on-ipad-work-and-consumption/
>> http://appleinsider.com/articles/15/11/09/apple-ceo-tim-cook-says-he-travels-with-just-an-ipad-pro-and-iphone
>>
>
> A CEO is a baby user of a PC. What would he do besides email? He has people to do his powerpoint and documents. Not a good endorsement. And the iPad Pro is twice the price of what you say is the average price of a PC laptop. You could buy a Windows laptop and an Android Zenpad tablet and still have paid less than an iPad.

Sure, but are you saying you can't do PowerPoint and docs well on an iPad Pro or a smartphone with a Sentio shell?  The iPad Pro aims for the high end of this PC-replacing mobile market, with its extremely powerful Apple-designed chip, while a $150 laptop shell combined with the smartphone you already have aims for the low end.  That basically leaves no space for a PC once all the software is ported over.

> I'd like to be there when Cook tells all Apple employees they need to turn in their MacBooks for iPads.

Heh, most would likely rejoice by then. :)

>>>> They see the writing on the wall, which is why they're lengthening their release cycles for such legacy products.
>>>>
>>>
>>> Do they want them to go away, or do they see the handwriting on the wall? The fact that they still make them, it appears that they don't want them to go away. They can stop making them at any time. And by them, I mean their entire macOS (i.e. their non-mobile) line. I think that the Mac Mini/Mac Pro pale in sales to the iMacs as far as Apple desktop sales go.
>>
>> Simple, they see the writing on the wall, ie much smaller sales than mobile, so they want the legacy product to go away, which means they can focus on the much bigger mobile market.  The only reason they still make them is to milk that market and support their legacy userbase, the same reason they were still selling the iPod Touch all these years after the iPhone came out.
>
> Why did they fund development of a new iMac Pro which is coming this December as well as the new MacBook Pros that came out this June? That's a contradiction of "milk it like an iPod".

Because their userbase was rebelling?  I take it you're not that familiar with Mac users, but they were genuinely scared that Apple was leaving them behind, since Apple wasn't refreshing the Macs and MacBooks much anymore and all its focus was on iOS:

"more and more people point to the current Mac Pro’s stagnation as proof that Apple is abandoning the Mac Pro market."
https://daringfireball.net/2017/04/the_mac_pro_lives

Apple threw them a bone, because they're long-time users who likely all buy iPhones and iPads too.  Pretty soon there will be so few of these Mac laggards left, just as with iPod users, that Apple will stop bothering.
November 03, 2017
On Friday, 3 November 2017 at 14:12:56 UTC, Joakim wrote:
> [snip]
>
> But that established PC userbase shrinking is not what you should be worried about.  I've talked to multiple middle-class consumers in developing markets- they would be considered poor in the US if you converted their income to dollars- who tell me that they recently got their first smartphone for $150-200 and that it is the first time they ever used the internet, with cheap 3G/4G plans that are only now springing up.  They don't use the web, only mobile chat or social apps.
>
> Now, do you think these billions of new users of computing and the internet are more likely to buy a cheap laptop shell or dock for their smartphone when they someday need to do some "lean forward" work, as you call it, or spend much more on a Windows PC?  I know where my bet is.
>

It's pretty clear from this and some of the other posts that your primary focus is computer users. The work you've done in getting LDC to compile programs for Android is a good example. You want to be able to compile D programs that run on a smartphone because that's where the growth in computer users is coming from. I get that. 100%.
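
As a rough illustration of that workflow, here's a minimal sketch of what cross-compiling a small D program for Android might look like.  The exact ldc2 target triple and linker setup vary by LDC and NDK version, so treat the command in the comment as a hypothetical example rather than the setup described above:

// hello.d - trivial program to sanity-check a D-to-Android cross-build.
// Hypothetical invocation, assuming an LDC build with Android support and
// an NDK-provided linker configured (flags differ between LDC versions):
//   ldc2 -mtriple=aarch64--linux-android hello.d -of=hello
// Then push the binary to the device and run it, e.g. from a Termux shell.
import std.stdio;

void main()
{
    writeln("Hello from D on Android!");
}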

I think a source of the pushback on the Windows subject is that programmers are a mere subset of all computer users. Those billions might buy a cheap laptop shell or dock, but that doesn't mean they will be programmers. Thus, it's good to be able to compile programs for that platform, but it doesn't mean that work done to improve the experience of programmers on other platforms is a waste of time.