May 23, 2019
On 2019-05-23 09:28:59 +0000, kdevel said:

> Awesome. Compared to the video you posted some days ago there is also almost no visible aliasing.

Thanks.

>  Do you plan to create a web browser based on your framework?

No, I don't see any business model behind a web browser...

-- 
Robert M. Münch
http://www.saphirion.com
smarter | better | faster

May 23, 2019
On 2019-05-23 07:28:49 +0000, Ola Fosheim Grøstad said:

> On Thursday, 23 May 2019 at 06:07:53 UTC, Robert M. Münch wrote:
>> On 2019-05-22 17:01:39 +0000, Manu said:
>>> I mean, there are video games that render a complete screen full of zillions of high-detail things every frame!
>> 
>> Show me a game that renders this with a CPU only approach into a memory buffer, no GPU allowed. Total different use-case.
> 
> I wrote a very flexible generic scanline prototype renderer in the 90s that rendered 1024x768 using 11 bits each for red and green and 10 for blue and hardcoded alpha blending. It provided interactive framerates on the lower end for a large number of circular objects covering the screen, but it took almost all the CPU.

When doing the real-time resizing in the screencast, the CPU usage is around 5-6%.
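
(For the curious: a packing like the one Ola describes, 11 bits each for red and green and 10 for blue in a single 32-bit word with a hardcoded blend, might look roughly like this in D. This is a minimal sketch of the general technique, not his actual code; the channel layout and the 50% blend ratio are my assumptions.)

alias Pixel = uint;

// Pack 11/11/10-bit RGB into one 32-bit word.
// Channel layout is an assumption: red in bits 21..31, green in 10..20, blue in 0..9.
Pixel pack(uint r, uint g, uint b) pure
{
    return ((r & 0x7FF) << 21) | ((g & 0x7FF) << 10) | (b & 0x3FF);
}

// Hardcoded 50% blend over all three channels at once: halve both pixels,
// mask off the bits that spilled across a channel boundary, then add.
Pixel blend50(Pixel a, Pixel b) pure
{
    enum uint spill = ~((1u << 20) | (1u << 9)); // red's LSB lands in bit 20, green's in bit 9
    return ((a >> 1) & spill) + ((b >> 1) & spill);
}

unittest
{
    // Per-channel truncated averages.
    assert(blend50(pack(2047, 0, 1023), pack(1023, 2046, 0))
           == pack(1534, 1023, 511));
}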

-- 
Robert M. Münch
http://www.saphirion.com
smarter | better | faster

May 23, 2019
On Thursday, 23 May 2019 at 16:36:17 UTC, Robert M. Münch wrote:
> When doing the real-time resizing in the screencast, the CPU usage is around 5-6%.

Yeah, that leaves a lot of headroom to play with. Do you think there is a market for an x86 CPU software renderer, though?

Or do you plan on supporting CPUs where no GPU is available?

May 23, 2019
On 5/22/19 6:39 PM, Ola Fosheim Grøstad wrote:
>> There's a reason games can simulate a rich world full of dynamic data and produce hundreds of frames a second, is
> 
> Yes, it is because they cut corners and make good use of special cases... The cool kids in the demo-scene even more so. That does not make them good examples to follow for people who care about accuracy and correctness. 

Serious photographers and videographers use things like JPEG and MPEG which are *fundamentally based* on cutting imperceptible corners and trading accuracy for other benefits. The idea of a desktop GUI absolutely requiring perfect pristine accuracy in all things is patently laughable.
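
To make that concrete: one of the biggest corners JPEG and MPEG cut is 4:2:0 chroma subsampling, which throws away three quarters of the colour information before compression even starts, and viewers almost never notice. A minimal sketch in D (the function name is mine, and it assumes even dimensions):

// 4:2:0 chroma subsampling: luma stays at full resolution, but only one
// Cb/Cr sample per 2x2 block is kept, so 75% of the colour data is dropped.
// Assumes w and h are even.
ubyte[] subsample2x2(const(ubyte)[] plane, size_t w, size_t h)
{
    auto half = new ubyte[(w / 2) * (h / 2)];
    foreach (y; 0 .. h / 2)
        foreach (x; 0 .. w / 2)
        {
            // Average the 2x2 block; the +2 rounds to nearest.
            uint sum = plane[(2 * y) * w + 2 * x] + plane[(2 * y) * w + 2 * x + 1]
                     + plane[(2 * y + 1) * w + 2 * x] + plane[(2 * y + 1) * w + 2 * x + 1];
            half[y * (w / 2) + x] = cast(ubyte)((sum + 2) / 4);
        }
    return half;
}

The eye is far more sensitive to brightness than to colour, so the decoder simply upsamples the chroma back and nobody complains.
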
May 23, 2019
On Thursday, 23 May 2019 at 19:13:11 UTC, Nick Sabalausky (Abscissa) wrote:
> Serious photographers and videographers use things like JPEG and MPEG which are *fundamentally based* on cutting imperceptible corners and trading accuracy for other benefits. The idea of a desktop GUI absolutely requiring perfect pristine accuracy in all things is patently laughable.

What do you mean?

Besides, it is wrong. If you create a font editor you want accuracy. If you create an image editor you want accuracy. If you create a proofing application you want accuracy. If you create a PDF application you want accuracy.

When designing a game, you can adapt your game design to the provided engine.
Or you can design an engine to fit a type of game design.

When creating a user interface framework, you have to support everything from sound editors, oscilloscopes, image editors, vector editors, and CAD programs to spreadsheets.

You cannot really assume much about anything. What you want is max flexibility.

Most GUI frameworks fail at this, so you have to do it all yourself if you want anything with descent quality, but that is not how it should be.

Apple's libraries provide options for higher accuracy. This is good. This is what you want. You don't want to roll your own all the time because the underlying framework is just "barely passing" in terms of quality.

May 23, 2019
On 5/22/19 6:33 PM, H. S. Teoh wrote:
> On Wed, May 22, 2019 at 02:18:58PM -0700, Manu via Digitalmars-d-announce wrote:
>> On Wed, May 22, 2019 at 10:20 AM Ola Fosheim Grøstad via
>> Digitalmars-d-announce <digitalmars-d-announce@puremagic.com> wrote:
> [...]
>>> But you shouldn't design a UI framework like a game engine.
>>>
>>> Especially not if you also want to run on embedded devices
>>> addressing pixels over I2C.
>>
>> I couldn't possibly agree less; I think cool kids would design
>> literally all computer software like a game engine, if they generally
>> cared about fluid experience, perf, and battery life.
> [...]
> 
> Wait, wha...?!  Write game-engine-like code if you care about *battery
> life*??  I mean... fluid experience, sure, perf, OK, but *battery
> life*?!  Unless I've been living in the wrong universe all this time,
> that's gotta be the most incredible statement ever.  I've yet to see a
> fluid, high-perf game engine *not* drain my battery like there's no
> tomorrow, and now you're telling me that I have to write code like a
> game engine in order to extend battery life?
> 
> I think I need to sit down.

You're conflating "game engine" with "game" here. And YES, there is a very meaningful distinction:

Game engines *MUST* be *EFFICIENT* in order to facilitate the demands the games place on them. And "efficiency" *means* efficiency: it means minimizing wasted processing, and that *inherently* means *both* speed and battery.

The *games*, not the engines, then take that efficiency and use it to fill the hardware to the absolute brim, maximizing detail and data and overall lushness of the simulated world (and, in the case of indie titles, it's also increasingly used to offset sloppy game code - with engines like Unity, indie game programming is increasingly done by people with very little programming experience). THAT is what kills battery: Taking an otherwise efficient engine and using it to saturate the hardware, thus trading battery for either maximal data being processed or for lowering the programmer's barrier to entry.

Due to the very nature of "efficiency", the fundamental designs behind any good game engine could just as easily be applied to minimizing battery usage as they can be to maximizing CPU/GPU utilization.
May 23, 2019
On Thursday, 23 May 2019 at 19:29:26 UTC, Ola Fosheim Grøstad wrote:
> Most GUI frameworks fail at this, so you have to do it all yourself if you want anything with descent quality, but that is not how it should be.

I meant «decent»! *grin*


(But really, photographers and videographers use RAW format exactly because they want to be able to edit without artifacts showing up. Not really relevant in this context though.)

May 23, 2019
On Thursday, 23 May 2019 at 19:32:28 UTC, Nick Sabalausky (Abscissa) wrote:
> Game engines *MUST* be *EFFICIENT* in order to facilitate the demands the games place on them. And "efficiency" *means* efficiency: it means minimizing wasted processing, and that *inherently* means *both* speed and battery.

I think there is a slight disconnect in how different people view efficiency. You argue that this is some kind of absolute metric. I would argue that it is a relative metric: relative to flexibility and power.

This isn't specific to games.

For instance, there is no spatial data structure that is inherently better or more efficient than all other spatial data structures.

It depends on what you need to represent. It depends on how often you need to update. It depends on what kind of queries you want to do. And so on.

This is where a generic application/UI framework has to give priority to being useful in the most general sense: to flexibility and expressiveness.

A first-person shooter game engine, however, can make a lot of assumptions. That makes it more efficient for a narrow set of cases, but also completely useless in the most general sense. It also limits what you can do, quite severely.
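
To make that concrete: a uniform grid is a great spatial structure *if* you can assume a cell size tuned to your objects, which is exactly the kind of assumption a shooter engine can bake in and a generic UI framework cannot. A minimal sketch in D (the names and the fixed cell size are my own illustration):

// A uniform grid: constant-time point queries, but only because it bakes in
// an assumption (cellSize tuned to the typical object). A generic framework
// cannot know this number, so it cannot make this trade.
struct Grid(T)
{
    enum double cellSize = 64.0; // the baked-in assumption

    private T[][long] cells; // packed cell coordinate -> objects in that cell

    private static long key(double x, double y)
    {
        immutable cx = cast(int)(x / cellSize);
        immutable cy = cast(int)(y / cellSize);
        return (cast(long) cx << 32) | cast(uint) cy;
    }

    void insert(double x, double y, T obj)
    {
        cells[key(x, y)] ~= obj;
    }

    // Everything in the cell containing (x, y).
    T[] at(double x, double y)
    {
        return cells.get(key(x, y), null);
    }
}

Change the assumptions (unbounded worlds, wildly varying object sizes, range queries) and you would reach for a quadtree, a BVH, or something else entirely.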

May 23, 2019
On 5/22/19 8:34 PM, H. S. Teoh wrote:
> And this isn't just for mobile apps; even the pervasive desktop browser
> nowadays seems bent on eating up as much CPU, memory, and disk as
> physically possible -- everybody and their neighbour's dog wants ≥60fps
> hourglass / spinner animations and smooth scrolling, eating up GBs of
> memory, soaking up 99% CPU, and cluttering the disk with caches of
> useless paraphernalia like spinner animations.
> 
> Such is the result of trying to emulate game-engine-like behaviour.

No, that resource drain is BECAUSE they're trying to do game-like things WITHOUT understanding what game engine developers have learned from experience about how to do so *efficiently*.

>  And
> now you're recommending that everyone should write code like a game
> engine!


Why is it so difficult for programmers who haven't worked on games to understand the basic, fundamental notion that (e.g.) 0.1 milliseconds of actual CPU/GPU work is ALWAYS, ALWAYS, ALWAYS *both* faster *and* a lower power drain than (e.g.) 10 milliseconds of actual CPU/GPU work? And that *that* is *ALL* there is to software efficiency! Nothing more!

So yes, absolutely. If you *are* going to be animating the entire screen every frame for a desktop UI (and I agree that's not always a great thing to do, in part for battery reasons), then yes, I'd ABSOLUTELY rather it be doing so in a game-engine-like way so that it can achieve the same results with less demand on the hardware. And if you're NOT animating the entire screen every frame, then I'd STILL rather it take advantage of a game-engine-like architecture, because I'd rather my static desktop UI take 0.01% CPU utilization than 2% CPU utilization (for example).
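
To make "game-engine-like" concrete: the core of it is an explicit frame loop that knows exactly when it has work to do, so an idle UI costs essentially nothing. A minimal sketch in D (the names are mine, not from any particular framework):

import core.sync.condition : Condition;
import core.sync.mutex : Mutex;

// A frame loop that costs ~0% CPU while idle: the render thread sleeps on a
// condition variable and only draws after something has invalidated itself.
class RenderLoop
{
    private Mutex mtx;
    private Condition wake;
    private bool dirty;
    private bool running = true;

    this()
    {
        mtx = new Mutex;
        wake = new Condition(mtx);
    }

    // Widgets call this when their state changes.
    void invalidate()
    {
        synchronized (mtx)
        {
            dirty = true;
            wake.notify();
        }
    }

    void run(void delegate() renderFrame)
    {
        while (running)
        {
            synchronized (mtx)
            {
                while (!dirty && running)
                    wake.wait(); // sleeps; no polling, no spinning
                dirty = false;
            }
            renderFrame(); // the 0.1ms of actual work
        }
    }

    void stop()
    {
        synchronized (mtx)
        {
            running = false;
            wake.notify();
        }
    }
}

When nothing is invalidated the thread sleeps; when everything animates it degenerates into the steady 60fps case, which is exactly the A-versus-B trade below.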

> 
> (Once, just out of curiosity (and no small amount of frustration), I
> went into Firefox's about:config and turned off all smooth scrolling,
> animation, etc., settings.  The web suddenly sped up by at least an
> order of magnitude, probably more. Down with 60fps GUIs, I say.  Unless
> you're making a game, you don't *need* 60fps. It's squandering resources
> for trivialities where we should be leaving those extra CPU cycles for
> actual, useful work instead, or *actually* saving battery life by not
> trying to make everything emulate a ≥60fps game engine.)

Yes, this is true. There's no surprise there. Doing less work is more efficient. Period. But what I'm continually amazed the majority of non-game developers find so incredibly difficult to grasp is that NO MATTER WHAT FRAMERATE or update rate you're targeting, what is MORE efficient and what is LESS efficient...DOES NOT CHANGE!!! PERIOD.

If you ARE cursed to run a 60fps GUI desktop, which would you prefer:

A. 80% system resource utilization, *consistent* 60fps, and 2 hours of battery power. Plus the option of turning OFF animations to achieve 1% system resource utilization and 12 hours of battery.

or:

B. 100% system resource utilization, *inconsistent* 60fps with frequent drops to 30fps or lower, and 45 minutes of battery power. Plus the option of turning OFF animations to achieve 15% system resource utilization and 4 hours of battery.

Which is better? Because letting you have A instead of B is *exactly* what game engine technology does for us. This is what efficiency is all about.
May 23, 2019
On 5/23/19 3:29 PM, Ola Fosheim Grøstad wrote:
> On Thursday, 23 May 2019 at 19:13:11 UTC, Nick Sabalausky (Abscissa) wrote:
>> Serious photographers and videographers use things like JPEG and MPEG which are *fundamentally based* on cutting imperceptible corners and trading accuracy for other benefits. The idea of a desktop GUI absolutely requiring perfect pristine accuracy in all things is patently laughable.
> 
> What do you mean?
> 
> Besides, it is wrong. If you create a font editor you want accuracy. If you create an image editor you want accuracy. If you create a proofing application you want accuracy. If you create a PDF application you want accuracy.

They want accuracy TO THE EXTENT THEY (and others) CAN PERCEIVE IT. That is the key. Human perception is far more limited than most people realize.