November 07, 2017
On 07/11/2017 11:12 AM, codephantom wrote:
> On Tuesday, 7 November 2017 at 08:53:46 UTC, Joakim wrote:
>> No, the reason they don't improve is consumers don't need the performance.
>>
> 
> I don't agree. Consumers would welcome more performance - and many of us 'need' it too.
> 
> But cpu's have hit the heat barrier, and so manufacturers tend to focus on more cores, better caching algorithms, and such...
> 
> but I am sure that consumers would find a 10GHz quad core processor far more useful than a 4GHz 24-core one.
> 
> Then you have the challenges of redesigning programming languages and software development methodologies to take better advantage of the multi-core thing...
> 
> There is also the problem of no real competition against Intel, so real innovation is not occurring as rapidly as it once did.
> 
> What we really need is to get rid of that heat barrier - which means lots and lots of money (potentially billions) into new research... and without competition, why should Intel bother? They can just do a few minor tweaks here and there, increment a number, and call the tweaked i7 the i9.

Not quite, but along the right line of thinking IMO.

Speed-wise, we have well and truly hit the limit of what we can do with silicon.

The speed improvements today are not the same kind we saw 20 years ago. Today's gains come from reworking how instructions are executed, making them cheaper to run.

Consumers would most definitely benefit from a higher number of cores, even if they are slower. Why? Two reasons. First, common programs like web browsers tend to use a LOT of threads, so more cores means less context switching overall (context switches are quite expensive and slow). Second, most people do not max out their RAM, either speed-wise or quantity-wise. RAM that matches the CPU clock speed is very expensive compared to a high-end CPU, and RAM is the real bottleneck today. Most people never get close to using a CPU to its maximum capacity; it sits idle a good bit of the time.
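To make that concrete, here is a rough Python sketch (illustrative only, not a benchmark, and nothing to do with how browsers are actually built): an embarrassingly parallel sum split into roughly one chunk per core, using only the standard library.

```python
# Rough sketch: split a CPU-bound sum across all available cores.
import os
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    """Sum of squares over [lo, hi) -- one independent chunk of work."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, workers=None):
    """Split [0, n) into roughly one chunk per core and sum the pieces."""
    workers = workers or os.cpu_count() or 1
    step = max(1, n // workers)
    chunks = [(lo, min(lo + step, n)) for lo in range(0, n, step)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    # Same answer as a serial loop, but spread over every core.
    print(parallel_sum_of_squares(1_000_000))
```

The point is only that independent work scales out over however many cores you have; the clock speed stops being the only lever.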

Intel has competition, ever heard of AMD and ARM? Intel has made a lot of changes to their strategy in the last 10-30 years, e.g. becoming more energy efficient because of ARM, and adopting AMD64 (implemented with micro-ops).

I am quite surprised that Intel even created i9 actually, it just wasn't required. It's as if they took their Xeon lines, removed a bunch of features, and based it only on the higher-end ones.

Remember: Xeon = non-consumer (so you get e.g. reliability and performance along with all the new features), and the i-series = cheap consumer products.
November 07, 2017
On Tuesday, 7 November 2017 at 08:53:46 UTC, Joakim wrote:
> One is a touch-first mobile OS that heavily restricts what you can do in the background and didn't even have a file manager until this year, while the other is a classic desktop OS, so there are significant differences.

Yes, there are differences for the end user, such as the sandboxing, but that also applies to applications in the OS-X app store. I don't expect iOS to change much in that department; I think Apple will continue to get people into their iCloud…

On the API level iOS is mostly a subset, and features that were only in iOS have been made available on OS-X. The main difference is in some UI classes, but they both use the same tooling and UI design strategies.

So in terms of Xcode they are kinda similar.

> I never said they don't write apps for macOS, I said iOS is a much bigger market which many more write for.

Yes, there are more Apple developers in general. Not sure if the number of people doing OS-X development has shrunk, maybe it has.

> The same may happen to the iPhone some day, but it shows no signs of letting up.

They probably will hold that market for a while as non-techies don't want to deal with a new unfamiliar UI.

> Since they still have a ways to go to make the cameras or laptop-functionality as good as the standalone products they replaced, it would appear they can still convince their herd to stay on the upgrade cycle.

That is probably true, e.g. low light conditions.

> While I disagree that you can't commoditize the Mac, as you could just bundle most of the needed functionality into an iPhone

My point was that it is easier to commoditize the iPhone than the Mac. There is a very limited set of apps that end users must have on a phone.

> they've already significantly cut the team working on it.

Ok, didn't know that. I've only noticed that they stopped providing competitive products after Jobs died.

> No, the reason they don't improve is consumers don't need the performance.

I don't think this is the case. It is because of the monopoly they have in the top segment. Intel was slow to make progress until the Athlon bit them, too. If they felt the pressure they would put their assets into R&D. Remember that new products have to pay off their R&D before making a profit, so by pushing the same old they get a better ROI. Of course, they also have trouble with heat, and developing a new technological platform is very expensive. But if they faced stiff competition, then they certainly would push harder.

In general the software market has managed to gobble up any performance improvements for decades. As long as developers spend time optimizing their code then there is a market for faster hardware (which saves development costs).

The Intel i9-7900X sells at $1000 for just the chip. That's pretty steep, I'm sure they have nice profit margins on that one.

> You are conflating two different things, fashionable academic topics and industry projections for actual production, which is what I was talking about.

What do you mean by industry projections? It was quite obvious by early 2000s that most people with cellphones (which basically was everyone in Scandinavia) would switch to smart phones. It wasn't a surprise.

> confident in them that you bet your company on them.  Nobody other than Apple did that, which is why they're still reaping the rewards today.

Only Microsoft had a comparable starting point. iOS is closely related to OS-X. Not sure if Nokia could have succeeded by scaling up Symbian. Maybe, dunno.

November 07, 2017
On Tuesday, 24 October 2017 at 13:20:10 UTC, Andrei Alexandrescu wrote:
> A person who donated to the Foundation made a small wish list known. Allow me to relay it:
>
> * better dll support for Windows.
>
> Andrei

This would be better sent to Walter rather than posted here.
November 07, 2017
On Tuesday, 7 November 2017 at 11:31:03 UTC, rikki cattermole wrote:
> I am quite surprised that Intel even created i9 actually, it just wasn't required.

AMD Ryzen Threadripper:

https://www.cpubenchmark.net/high_end_cpus.html

November 07, 2017
On 07/11/2017 12:58 PM, Ola Fosheim Grøstad wrote:
> On Tuesday, 7 November 2017 at 11:31:03 UTC, rikki cattermole wrote:
>> I am quite surprised that Intel even created i9 actually, it just wasn't required.
> 
> AMD Ryzen Threadripper:
> 
> https://www.cpubenchmark.net/high_end_cpus.html
> 

I do not trust that benchmark.

https://www.intel.com/content/www/us/en/products/compare-products.html?productIds=126699,120496,125056

But after looking at those numbers, I have a strange feeling that Intel is pushing those i9's past 'safe' limits. Aha, they are messing with threading and CPU clock speeds via Intel Turbo Boost Max Technology 3.0. Nasty.
November 07, 2017
On Tuesday, 7 November 2017 at 13:29:19 UTC, rikki cattermole wrote:
> On 07/11/2017 12:58 PM, Ola Fosheim Grøstad wrote:
>> On Tuesday, 7 November 2017 at 11:31:03 UTC, rikki cattermole wrote:
>>> I am quite surprised that Intel even created i9 actually, it just wasn't required.
>> 
>> AMD Ryzen Threadripper:
>> 
>> https://www.cpubenchmark.net/high_end_cpus.html
>> 
>
> I do not trust that benchmark.

Well, here is another one, comparing two products at a similar price:

http://cpu.userbenchmark.com/Compare/Intel-Core-i9-7900X-vs-AMD-Ryzen-TR-1950X/3936vs3932

I think the Xeons might be aimed at overcommitted server situations: larger caches and many threads. Sometimes people are more interested in responsiveness (preventing starvation) than in maximum speed. So if you make a lot of I/O system calls you might want the ability to run many threads at the same time, and care less about number crunching, perhaps?
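As a toy illustration of that responsiveness point (Python standard library, made-up 50 ms "system calls", not a real benchmark): I/O-bound tasks spend most of their time blocked, so many threads can overlap the waits even without any extra number-crunching capacity.

```python
# Toy illustration: many threads overlapping blocking I/O-style waits.
import time
from concurrent.futures import ThreadPoolExecutor

def fake_io_call(i):
    time.sleep(0.05)  # stand-in for a blocking system call
    return i

def run(workers, n_tasks=20):
    """Run n_tasks fake I/O calls on a thread pool, timing the batch."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(fake_io_call, range(n_tasks)))
    return results, time.perf_counter() - start

serial_results, serial_time = run(workers=1)     # waits run back to back
overlap_results, overlap_time = run(workers=20)  # waits run concurrently
```

Same answers either way; the many-threads run just spends far less wall-clock time blocked, which is the responsiveness-over-raw-speed trade-off the Xeon line seems aimed at.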

> But after looking at those numbers, I have a strange feeling that Intel is pushing those i9's past 'safe' limits.

I think they just turn off cores that do not work and put those chips into the lower end, and the high end is very expensive at $2000 (so maybe low yield, or just greed :-)…


November 07, 2017
On Monday, 6 November 2017 at 08:33:16 UTC, Joakim wrote:
> Also, nobody saw mobile growing so gigantic, so fast, not even Jobs by all indications.  Mobile has really been a tidal wave over the last decade.  Funny how all you hear is bitching and whining from a bunch of devs on proggit/HN about how they missed the '80s PC boom or '90s dot.com boom and there's nothing fundamentally exciting like that now, all while the biggest boom of them all, the mobile boom, just grew and grew right in front of their faces. :D

Well, I was there in the early nineties when the Microsoft WinPad was being talked about. This was almost 20 years before the iPad came out. I remember going through the 90's with the Windows CE iterations, which eventually evolved into Windows Mobile 2003 - which is when I purchased my first 'smart phone' and learnt how to write apps for it (actually, my current phone still runs Windows Mobile 6.1 ;-).

I tried getting people around me interested in mobile devices, including the business I worked in. Nobody was really interested. They were all happy with their little push-button Nokias.

Microsoft had the vision though, and they had it earlier than perhaps anyone else. But the vision was too far ahead of its time, and, around the early 2000's, they refused to lose any more money, put it on the back burner, and competitors came in and took over - at a time when 'consumers' were just beginning to share the vision too....

But I think what really made it take off so fast and unexpectedly was the convergence of mobile devices, mobile communication technology (i.e. wifi, gps and stuff), and of course the internet... as well as the ability to find cheap labour overseas to build the products en masse.

I doubt anyone could have envisioned that convergence...but some companies were in a better position (more agile) than others, at the time, to capitalise on it.

But the vision of being mobile was certainly there, back in the early nineties - and Microsoft were leading it.

November 07, 2017
On 07/11/2017 1:48 PM, Ola Fosheim Grøstad wrote:
> On Tuesday, 7 November 2017 at 13:29:19 UTC, rikki cattermole wrote:
>> On 07/11/2017 12:58 PM, Ola Fosheim Grøstad wrote:
>>> On Tuesday, 7 November 2017 at 11:31:03 UTC, rikki cattermole wrote:
>>>> I am quite surprised that Intel even created i9 actually, it just wasn't required.
>>>
>>> AMD Ryzen Threadripper:
>>>
>>> https://www.cpubenchmark.net/high_end_cpus.html
>>>
>>
>> I do not trust that benchmark.
> 
> Well, here is another one, comparing two products at a similar price:
> 
> http://cpu.userbenchmark.com/Compare/Intel-Core-i9-7900X-vs-AMD-Ryzen-TR-1950X/3936vs3932 
> 
> 
> I think the Xeons might be aimed at overcommitted server situations: larger caches and many threads. Sometimes people are more interested in responsiveness (preventing starvation) than in maximum speed. So if you make a lot of I/O system calls you might want the ability to run many threads at the same time, and care less about number crunching, perhaps?

That sounds an awful lot like the average user too ;)

>> But after looking at those numbers, I have a strange feeling that Intel is pushing those i9's past 'safe' limits.
> 
> I think they just turn off cores that do not work and put those chips into the lower end, and the high end is very expensive at $2000 (so maybe low yield, or just greed :-)…

The way I think of it is that Xeons get all the newest and greatest features, which slowly trickle down to the i-series. Invest in the Xeon production line one generation, then in the next use it for i7s, etc. Basically the R&D costs all go on the Xeons, and once they are paid off the technology goes straight to the consumers.

But the i9 is looking like a completely different beast from the rest of the i-series, with Intel actively adding new unique features to it. Quite scary; this doesn't sound like a good move, especially when those features could very well mean those CPUs don't last very long.

Looks like they are changing tactics after the last 10 years or so. I do wonder if you're on the right track and turning a Xeon into an i9 is just a firmware upgrade...



November 07, 2017
On Tuesday, 7 November 2017 at 13:59:26 UTC, codephantom wrote:
> Microsoft had the vision though, and they had it earlier than perhaps anyone else. But the vision was too far ahead of its time, and, around the early 2000's, they refused to lose any more money, put it on the back burner, and competitors came in and took over - at a time when 'consumers' were just beginning to share the vision too....

Yes, HP had the IPAQ: https://en.wikipedia.org/wiki/IPAQ

It was kinda interesting, but a bit too clunky and a bit too expensive for personal use. I guess it was used for things like filling out forms on-site in businesses, doing measurements, and the like.

Touch screen and battery quality were considerations as well, but either way, the Apple App Store was probably a big factor in iOS's success. And the iPad was very popular with journalists, who saw it as a device for electronic newspapers and were already in the Apple fold (desktop publishing) I guess, so the iPad 1 got lots of free marketing.

So the technology has to be effortless, but social factors that drive free media coverage also come into play. If regular newspaper journalists had not been enamoured by it, then it would have faded away…

> But I think what really made it take off so fast and unexpectedly was the convergence of mobile devices, mobile communication technology (i.e. wifi, gps and stuff), and of course the internet... as well as the ability to find cheap labour overseas to build the products en masse.

You could attach lots of stuff to IPAQ, just like any laptop (Wifi, probably GPS, etc…)

November 07, 2017
On Tuesday, 7 November 2017 at 11:12:19 UTC, codephantom wrote:
> On Tuesday, 7 November 2017 at 08:53:46 UTC, Joakim wrote:
>> No, the reason they don't improve is consumers don't need the performance.
>>
>
> I don't agree. Consumers would welcome more performance - and many of us 'need' it too.

There is an easy test of this: are they running out to upgrade to the latest higher performance x86 CPUs?  No, as Tony noted earlier, "Some find their current PC fast enough and see no reason to upgrade as frequently as they did in the past," though I'd modify that "some" to most.

> But cpu's have hit the heat barrier, and so manufacturers tend to focus on more cores, better caching algorithms, and such...
>
> but I am sure that consumers would find a 10GHz quad core processor far more useful than a 4GHz 24-core one.

Right before it melted down. :)

> Then you have the challenges of redesigning programming languages and software development methodologies to take better advantage of the multi-core thing...

Since you have tons of background processes or apps running these days, even on Android, you don't really need multi-threaded apps to make good use of multi-core.

> There is also the problem of no real competition against Intel, so real innovation is not occurring as rapidly as it once did.
>
> What we really need is to get rid of that heat barrier - which means lots and lots of money (potentially billions) into new research... and without competition, why should Intel bother? They can just do a few minor tweaks here and there, increment a number, and call the tweaked i7 the i9.

Rikki answered all this: the real competition is from below from ARM and the performance gains now come from scaling out horizontally with multi-core, not vertically with faster clock speeds.

More importantly, the market has settled on cheap, power-sipping chips in mobile devices as the dominant platform.  x86 has failed miserably at fitting into that, which is why even MS is moving towards ARM:

https://www.thurrott.com/windows/windows-10/134434/arm-based-always-connected-windows-10-pcs-approach-finish-line

On Tuesday, 7 November 2017 at 11:40:21 UTC, Ola Fosheim Grøstad wrote:
> On Tuesday, 7 November 2017 at 08:53:46 UTC, Joakim wrote:
>> One is a touch-first mobile OS that heavily restricts what you can do in the background and didn't even have a file manager until this year, while the other is a classic desktop OS, so there are significant differences.
>
> Yes, there are differences for the end user, such as the sandboxing, but that also applies to applications in the OS-X app store. I don't expect iOS to change much in that department; I think Apple will continue to get people into their iCloud…
>
> On the API level iOS is mostly a subset, and features that were only in iOS have been made available on OS-X. The main difference is in some UI classes, but they both use the same tooling and UI design strategies.
>
> So in terms of Xcode they are kinda similar.

I've never programmed for Apple devices and never would. I got my first and last Apple device more than a decade ago, a Powerbook laptop, and haven't bought any of their stuff since because of their ridiculous patent stance. So I can't speak to the similarity of APIs between macOS and iOS, but obviously there are significant developer and IDE differences in targeting a mobile OS versus a desktop OS, even if iOS was initially forked from macOS.

>> I never said they don't write apps for macOS, I said iOS is a much bigger market which many more write for.
>
> Yes, there are more Apple developers in general. Not sure if the number of people doing OS-X development has shrunk, maybe it has.

Let me correct that for you: there are many more iOS developers now, because it is a _much_ bigger market.

>> The same may happen to the iPhone some day, but it shows no signs of letting up.
>
> They probably will hold that market for a while as non-techies don't want to deal with a new unfamiliar UI.
>
>> Since they still have a ways to go to make the cameras or laptop-functionality as good as the standalone products they replaced, it would appear they can still convince their herd to stay on the upgrade cycle.
>
> That is probably true, e.g. low light conditions.
>
>> While I disagree that you can't commoditize the Mac, as you could just bundle most of the needed functionality into an iPhone
>
> My point was that it is easier to commoditize the iPhone than the Mac. There is a very limited set of apps that end users must have on a phone.

Just a couple responses above, you say the iPhone UI will keep those users around.  I'd say the Mac is actually easier to commoditize, because the iPhone is such a larger market that you can use that scale to pound the Mac apps, _once_ you can drive a multi-window, large-screen GUI with your iPhone, on a monitor or 13" Sentio-like laptop shell.

I agree that very few apps are used on phones, and that they aren't as sticky as desktop apps as a result.  Hopefully that means we'll see more competition in mobile than just android/iOS in the future.

>> they've already significantly cut the team working on it.
>
> Ok, didn't know that. I've only noticed that they stopped providing competitive products after Jobs died.
>
>> No, the reason they don't improve is consumers don't need the performance.
>
> I don't think this is the case. It is because of the monopoly they have in the top segment. Intel was slow to make progress until the Athlon bit them, too. If they felt the pressure they would put their assets into R&D. Remember that new products have to pay off their R&D before making a profit, so by pushing the same old they get a better ROI. Of course, they also have trouble with heat, and developing a new technological platform is very expensive. But if they faced stiff competition, then they certainly would push harder.
>
> In general the software market has managed to gobble up any performance improvements for decades. As long as developers spend time optimizing their code then there is a market for faster hardware (which saves development costs).
>
> The Intel i9-7900X sells at $1000 for just the chip. That's pretty steep, I'm sure they have nice profit margins on that one.

Lack of competition at the high end certainly played a role, but as I noted to codephantom above, consumers not needing the performance played a much larger role, which is why Samsung, with their much weaker SoCs, just passed Intel as the largest semiconductor vendor:

http://fortune.com/2017/07/27/samsung-intel-chip-semiconductor/

>> You are conflating two different things, fashionable academic topics and industry projections for actual production, which is what I was talking about.
>
> What do you mean by industry projections? It was quite obvious by early 2000s that most people with cellphones (which basically was everyone in Scandinavia) would switch to smart phones. It wasn't a surprise.

Yes, but would that be in 2020 or 2050?  Would people who never had a cellphone get a smartphone, driving that market even larger, as is happening today in developing markets?

My point is that vague tech chatter about the potential next big thing is irrelevant, what matters is who was actually projecting hard numbers like a billion smartphones sold in 2013:

https://mobile.twitter.com/lukew/status/842397687420923904

Jobs certainly wasn't, almost nobody was.  If there were a few making wild-eyed claims, how many millions of dollars did they actually bet on it, as Jobs did?  Nobody else did that, which shows you how much they believed it.

>> confident in them that you bet your company on them.  Nobody other than Apple did that, which is why they're still reaping the rewards today.
>
> Only Microsoft had a comparable starting point. iOS is closely related to OS-X. Not sure if Nokia could have succeeded by scaling up Symbian. Maybe, dunno.

I'm not sure how the starting point matters: Google funded Android from nothing and it now ships on more smartphones than any other OS. But even the Google guys never bet the company on it, they just gave it away for free for others to build on, which is why they never made as much money as Apple either.

On Tuesday, 7 November 2017 at 13:59:26 UTC, codephantom wrote:
> On Monday, 6 November 2017 at 08:33:16 UTC, Joakim wrote:
>> Also, nobody saw mobile growing so gigantic, so fast, not even Jobs by all indications.  Mobile has really been a tidal wave over the last decade.  Funny how all you hear is bitching and whining from a bunch of devs on proggit/HN about how they missed the '80s PC boom or '90s dot.com boom and there's nothing fundamentally exciting like that now, all while the biggest boom of them all, the mobile boom, just grew and grew right in front of their faces. :D
>
> Well, I was there in the early nineties when the Microsoft WinPad was being talked about. This was almost 20 years before the iPad came out. I remember going through the 90's with the Windows CE iterations, which eventually evolved into Windows Mobile 2003 - which is when I purchased my first 'smart phone' and learnt how to write apps for it (actually, my current phone still runs Windows Mobile 6.1 ;-).
>
> I tried getting people around me interested in mobile devices, including the business I worked in. Nobody was really interested. They were all happy with their little push-button Nokias.
>
> Microsoft had the vision though, and they had it earlier than perhaps anyone else. But the vision was too far ahead of its time, and, around the early 2000's, they refused to lose any more money, put it on the back burner, and competitors came in and took over - at a time when 'consumers' were just beginning to share the vision too....

Yes, that is the impression I have too: MS got in too early, got discouraged that consumers didn't want their bulky hardware and weird software, and backed off right when the mobile market took off.

> But I think what really made it take off so fast and unexpectedly was the convergence of mobile devices, mobile communication technology (i.e. wifi, gps and stuff), and of course the internet... as well as the ability to find cheap labour overseas to build the products en masse.
>
> I doubt anyone could have envisioned that convergence...but some companies were in a better position (more agile) than others, at the time, to capitalise on it.
>
> But the vision of being mobile was certainly there, back in the early nineties - and Microsoft were leading it.

Right, a significant minority of techies saw mobile coming, but I'm talking about forecasting the giant scope, scale, and timing of the actual sales chart above. There was nothing special about the minority who thought mobile could be big; the Nokia 7710 shipped with a touchscreen years before the iPhone:

https://en.m.wikipedia.org/wiki/Nokia_7710

The N800 shipped before the iPhone:

https://en.m.wikipedia.org/wiki/Nokia_N800

Intel had been talking about their MID platform around the same time:

https://gizmodo.com/253189/intel-ultra-mobile-platform-2007-officially-announced-mids-and-menlow-to-follow

Which of them saw that giant tidal wave coming, sunk every penny into a surfboard, and swam out to ride it? Almost no one, other than Apple to some extent, and even they seem to have underestimated its size.