January 16, 2020
On Sunday, 12 January 2020 at 20:29:59 UTC, aberba wrote:
> https://tonsky.me/blog/disenchantment/
>
> Let's kill the bloat!!
>

And there is another effect of this ever-growing bloat.
I have two old iPads, an iPad 1 and an iPad 2.
Both are in perfect hardware condition, but you cannot use them
for much anymore: because of their small amount of RAM (256 and 512 MB),
the available browsers are not able to render most 'modern' webpages.

So the ever-increasing memory needed for even the simplest tasks is killing
old hardware.

The last computer whose software was optimized to the limit was probably the Commodore C64. After that, the availability of more and more resources (CPU speed and RAM) started an ever-increasing build-up of additional layers between input and output.

Just look at the result of compiling a simple - statically linked - "hello world" with DMD:
how many C64-era floppy discs (180 KByte each) would you need to store it?
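
For the curious, here is the program in question, together with a rough back-of-the-envelope calculation. The exact binary size depends on the DMD version, platform and flags, so treat the numbers below as ballpark figures rather than measurements:

// hello.d -- compile with: dmd hello.d
import std.stdio;

void main()
{
    writeln("Hello, world!");
}

On 64-bit Linux the resulting statically linked executable typically lands somewhere in the high hundreds of kilobytes. At, say, ~900 KByte that would already be five 180 KByte floppies - for a program whose C64 BASIC equivalent fits on a single line.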

I think this process will not end as long as storage and bandwidth keep getting cheaper.

But maybe I am wrong and the next generation of software engineers will bring the gains of Moore's Law to us. (And the resources needed for computing worldwide will stop increasing.)
January 16, 2020
On Thursday, 16 January 2020 at 14:03:15 UTC, Martin Tschierschke wrote:
> But maybe I am wrong and the next generation of software engineers will bring the gains of Moore's Law to us. (And the resources needed for computing worldwide will stop increasing.)

It is especially the current young generation of coders that gets socialized with HTML+JS "GUIs" (yup, scare quotes!) and "apps" that are just services on somebody else's server.

There are just so many incentives pointing the wrong way:

- Cloud providers want to lock their customers in (Google, Amazon, MS)
- Software developers see how they can squeeze juicy subscription fees out of their customers when they don't sell installable software, but run it as a service
- Commercial users see shiny presentations that tell them that not running their software in-house is so much cheaper (and it's likely true until they lose access to their data or a critical 3rd party service falls over)

I see only a single chance to get out of this particular hole: completely new devices that are more desirable than PCs, tablets, or smartphones, and for which the web as it exists today makes absolutely no sense. That could happen if everyday augmented reality matures in about 5 to 10 years - and that's still a pretty big if.
January 16, 2020
On Thu, Jan 16, 2020 at 03:08:47PM +0000, Gregor Mückl via Digitalmars-d wrote: [...]
> There are just so many incentives pointing the wrong way:
> 
> - Cloud providers want to lock their customers in (Google, Amazon, MS)

Yep, that's why I'm skeptical of this whole cloud hype. It's just like the days of Java all over again: OO, which does have its uses, being sold far beyond its scope of usefulness, and Java, which actually isn't a *bad* language in certain respects, being sold as the panacea that will solve all your programming problems and relieve you of the necessity of thought. Only today, substitute OO with "cloud" and Java with "webapps".

Cloud vendors want to lock you in, when the correct strategy is multiple redundant systems (cf. Walter's rants about Boeing design). But you can't have multiple redundant systems -- not in Walter's sense of multiple *independent* systems that aren't running on the same principles and thus liable to fail *at the same time* -- if cloud vendors refuse to let you interoperate with their competitors' systems, or allow only arbitrarily restricted interoperation, such that your redundancy is essentially crippled and you might as well not bother.


> - Software developers see how they can squeeze juicy subscription fees out of their customers when they don't sell installable software, but run it as a service

Yeah, I have a lot of ideological problems with that. First and foremost: your ability to use potentially mission-critical functionality is now dependent on the state of some remote server farm that's completely beyond your control.  Last year's AWS outage is just the tip of the iceberg of what might happen once everyone depends on the web (they already do), the web becomes fragile through reliance on a small number of points of failure (it already has: cloud providers), and something happens to one of those points of failure.

(Well, you may object, cloud providers have multiple redundant distributed servers, so they're not vulnerable to single-point-of-failure problems. Wrong: their *individual* servers can fail over transparently, but sometimes the *entire service* goes down for whatever reason -- faulty software that all servers are running copies of, for instance. Or a targeted attack on the service as a whole. Or the company goes bust suddenly; who knows. Centralizing critical services -- esp. on a 3rd party whose interests may not coincide with yours -- is not a wise move.)


> - Commercial users see shiny presentations that tell them that not running their software in-house is so much cheaper (and it's likely true until they lose access to their data or a critical 3rd party service falls over)
[...]

Yeah, this is another major ideological problem I have with this whole cloud hype. Your data doesn't belong to you anymore; it's sitting on the hard drives of some 3rd party whose interests do not necessarily coincide with yours. The accessibility of your mission-critical data is dependent upon the availability of some remote service that isn't under your control.  You're in trouble if the service goes down, or becomes unavailable for whatever reason, during the critical times when you most need your data. You're in trouble if the 3rd party gets hacked and your supposedly private data is suddenly out in the open.  Or there's a serious security flaw you were never aware of that has left data you thought was securely stored open to the whole world.  And worst of all, your data is in the hands of a 3rd party who has the power to do whatever they want with it.

How anyone could be comfortable with that idea is beyond me.


T

-- 
Why have vacation when you can work?? -- EC
January 16, 2020
On Monday, 13 January 2020 at 18:22:19 UTC, H. S. Teoh wrote:
> On Mon, Jan 13, 2020 at 05:40:08PM +0000, Arine via Digitalmars-d wrote:
>> On Monday, 13 January 2020 at 11:54:19 UTC, user5678 wrote:
>> > (https://hexus.net/tech/news/peripherals/113648-modern-computer-complexity-heavy-impact-keyboard-latency/) and other stuff too.
>> 
>> He's comparing two different technologies. If you want low input lag, get a TN panel gaming monitor with a high refresh rate. The thing is, those cost $$$. All the while, most of the devices he's testing are laptops. I'd love to see a CRT display in a laptop. Read between the lines: the author doesn't know what they're doing.
>
> You're totally missing the point.  The point is to take a step back from the current state of things and evaluate just how much it (doesn't) make sense:

It does make sense. Software back then wasn't complicated; it didn't have to be. Developer time has remained constant. Software companies failed because they were shooting for perfection. You can't create a perfect piece of software. You have to take the limited developer time you have and allocate it effectively - not spend it trying to reduce file size because some UX designer who doesn't know what he's doing or talking about rants about it on his blog.

> 4) Technologically speaking, today we have enough processing power to run AAA games that process hundreds of thousands of objects per frame running at 60 fps.  We're talking about things like *real-time raytracing* here, something completely unimaginable in the 70's.
>
> Yes, all of this can be explained, and if you lose sight of the forest for the trees, every step in the history of how this came about can be logically explained. But when you step back and look at the forest as a whole, the whole situation looks completely ridiculous.  The necessary tech is all there to make things FAR more efficient. The development methodologies are all there, and we have orders of magnitude more manpower than in the 70's.  What a word processor has to compute is peanuts compared to an AAA game with real-time raytracing running at 60 fps.

Raytracing is just a marketing buzzword; it has existed in games for decades and has been used in real time for almost as long. That's the problem when you have people like you who don't understand what they are talking about, throwing around things like "oh, we can do 'raytracing' in real time" and then comparing that as if it means something. GPUs have been doing operations like that for a long time: doing lots of simple tasks, thousands at a time, in parallel. But there's still a reason you can't run an operating system on a GPU. It's fundamentally different.

> 5) Yet a browser app of today, built with said modern technology with modern processing power, still runs just as horribly slowly as a word processor from the 70's running on ancient ultra-slow hardware, with just as horrible a lag between input keystrokes.
>
> Yet here we are, stuck with a completely insane web design philosophy building horribly slow and unreliable apps that are barely a step above an ancient word processor from the 70's.

I use VS Code and Discord (both made using Electron, btw) all the time, and there's no lag. They're probably more responsive than most bloated IDEs that weren't built with Electron. Bad programs are going to be bad.


January 16, 2020
On Thursday, 16 January 2020 at 19:38:21 UTC, Arine wrote:
> Raytracing is just a marketing buzzword; it has existed in games for decades and has been used in real time for almost as long. That's the problem when you have people like you who don't understand what they are talking about, throwing around things like "oh, we can do 'raytracing' in real time" and then comparing that as if it means something. GPUs have been doing operations like that for a long time: doing lots of simple tasks, thousands at a time, in parallel. But there's still a reason you can't run an operating system on a GPU. It's fundamentally different.

That's not true. While I believe the current trend of 'raytracing' is mostly hype built by NVidia to sell their RTX GPUs, real-time raytracing wasn't viable in the past. It only worked for simple scenes with a few cubes and spheres, and was very low resolution/noisy. Now we have the performance to do it, and we can also use machine learning to denoise the image much better than the previous algorithms did.
January 17, 2020
On Thursday, 16 January 2020 at 17:59:53 UTC, H. S. Teoh wrote:
> On Thu, Jan 16, 2020 at 03:08:47PM +0000, Gregor Mückl via Digitalmars-d wrote: [...]
>> There are just so many incentives pointing the wrong way:
>> 
>> - Cloud providers want to lock their customers in (Google, Amazon, MS)
>
> Yeah, this is another major ideological problem I have with this whole cloud hype. Your data doesn't belong to you anymore; it's sitting on the hard drives of some 3rd party whose interests do not necessarily coincide with yours. The accessibility of your mission-critical data is dependent upon the availability of some remote service that isn't under your control.  You're in trouble if the service goes down, or becomes unavailable for whatever reason, during the critical times when you most need your data. You're in trouble if the 3rd party gets hacked and your supposedly private data is suddenly out in the open.  Or there's a serious security flaw you were never aware of that has left data you thought was securely stored open to the whole world.  And worst of all, your data is in the hands of a 3rd party who has the power to do whatever they want with it.
>
> How anyone could be comfortable with that idea is beyond me.
>

It depends. The business world is more dynamic today. As a startup company you get access, in the blink of an eye, to advanced technologies you could never have dreamed of before.

Last year I started a new company. In 30 minutes I had a fully fledged e-mail system, a communication platform, a secure environment and a nice pack of development software. I uploaded my databases, opened Visual Studio, loaded the project, changed some settings in the configuration file, hit Build, hit the Publish button. Zbang, my web application was up and running in the wild. As a service, I don't even need a virtual machine for it.

The company doesn't even have a physical office; we are three partners, and all we have are three laptops, working from home. Licenses and services come to 300 EUR/month.

Now imagine the same scenario years ago: buy some servers, buy storage, buy a firewall, configure, install. Set up e-mail, set up the network, get a server room, run some cables. 30k EUR at least.

More than that: since I am working in the payroll industry, clients ask for security certifications. We cannot afford to buy the systems and services needed to meet their criteria ourselves. Instead I gave them the security certifications of the cloud provider, which are state of the art. I have access to security technologies like data leak prevention, auditing and logging without any supplementary investment.



January 17, 2020
On Thursday, 16 January 2020 at 14:03:15 UTC, Martin Tschierschke wrote:
> On Sunday, 12 January 2020 at 20:29:59 UTC, aberba wrote:
>> https://tonsky.me/blog/disenchantment/
>>
>> Let's kill the bloat!!
>>
>
> And there is another effect of this ever-growing bloat.
> I have two old iPads, an iPad 1 and an iPad 2.
> Both are in perfect hardware condition, but you cannot use them
> for much anymore: because of their small amount of RAM (256 and 512 MB),
> the available browsers are not able to render most 'modern' webpages.
>
> So the ever-increasing memory needed for even the simplest tasks is killing
> old hardware.
>
> The last computer whose software was optimized to the limit was probably the Commodore C64. After that, the availability of more and more resources (CPU speed and RAM) started an ever-increasing build-up of additional layers between input and output.
>
> Just look at the result of compiling a simple - statically linked - "hello world" with DMD:
> how many C64-era floppy discs (180 KByte each) would you need to store it?
>
> I think this process will not end as long as storage and bandwidth keep getting cheaper.
>
> But maybe I am wrong and the next generation of software engineers will bring the gains of Moore's Law to us. (And the resources needed for computing worldwide will stop increasing.)

This was already known in the '80s. It was called the hardware-software spiral or something like that. It's partly natural and partly by design, to sell hardware and software. The more powerful the hardware, the more software developers do with it (think of image and video editing); the more powerful the software, the slower the existing hardware feels, so you need to buy a new, more powerful machine. Rinse and repeat...

As regards your iPads, Apple have always been mean with RAM and storage (unless you spend like $2000+). That's also by design: if Apple gives you 256/512 MB of RAM, they know exactly what they are doing, because they know that your iPad will soon be useless given the way the internet is evolving. They are aware of the hardware-software spiral.
January 17, 2020
On Thursday, 16 January 2020 at 17:59:53 UTC, H. S. Teoh wrote:
>
> Yeah, this is another major ideological problem I have with this whole cloud hype. Your data doesn't belong to you anymore; it's sitting on the hard drives of some 3rd party whose interests do not necessarily coincide with yours. The accessibility of your mission-critical data is dependent upon the availability of some remote service that isn't under your control.  You're in trouble if the service goes down, or becomes unavailable for whatever reason, during the critical times when you most need your data. You're in trouble if the 3rd party gets hacked and your supposedly private data is suddenly out in the open.  Or there's a serious security flaw you were never aware of that has left data you thought was securely stored open to the whole world.  And worst of all, your data is in the hands of a 3rd party who has the power to do whatever they want with it.
>
> How anyone could be comfortable with that idea is beyond me.
>
>
> T

All valid points, but what do you suggest as an alternative? Create your own service from scratch? Can you guarantee your customers that your own software is secure and will not be hacked easily? All their personal data and financial transactions are at stake. The whole thing is just too big to roll your own. If you buy a car, you're "locked in", but does that mean you should build your own car? The market is about division of labor, else there wouldn't be progress. Gone are the romantic days of yore when people were farmers, thatchers and fishermen at the same time.
January 17, 2020
On Friday, 17 January 2020 at 12:19:18 UTC, Chris wrote:
> As regards your iPads, Apple have always been mean with RAM and storage (unless you spend like $2000+). That's also by design: if Apple gives you 256/512 MB of RAM, they know exactly what they are doing, because they know that your iPad will soon be useless given the way the internet is evolving. They are aware of the hardware-software spiral.

Sadly, the iPad 1 is quite capable, but it is stuck on iOS 5 and thus the browser will fail on many sites. In my experience RAM and CPU are not the main issue. I actually like the ergonomic shape of the iPad 1 more than that of later models, but planned obsolescence is what you have to live with these days...  I still use it for Wikipedia and PDFs, though :-)



January 17, 2020
On Friday, 17 January 2020 at 13:36:52 UTC, Ola Fosheim Grøstad wrote:
>
> Sadly, the iPad 1 is quite capable, but it is stuck on iOS 5 and thus the browser will fail on many sites. In my experience RAM and CPU are not the main issue. I actually like the ergonomic shape of the iPad 1 more than that of later models, but planned obsolescence is what you have to live with these days...  I still use it for Wikipedia and PDFs, though :-)

That's, of course, another trick to render devices useless. Make them un-updatable.