May 25, 2019
On 25/05/2019 5:04 AM, Robert M. Münch wrote:
> On 2019-05-24 10:12:10 +0000, Ola Fosheim Grøstad said:
> 
>> I guess server rendering means that you can upgrade the software without touching the clients, so that you have a network protocol that transfers the graphics to a simple and cheap client-display. Like, for floor information in a building.
> 
> Even much simpler use-cases make sense, example: Render 3D previews of 100.000 CAD models and keep them up to date when things change. You need some CLI tool to render it, but most likely you don't have OpenGL or a GPU on the server.

Be careful with that assumption. Server motherboards made by Intel come with GPUs as standard.
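
For the batch-preview use case quoted above, here is a rough sketch of what such a CPU-only server job could look like in D; the `cadrender` CLI name, the directory, the file pattern, and the flags are all made up for illustration:

```d
// Hypothetical sketch: keep a PNG preview next to each CAD model and only
// re-render stale ones, shelling out to a made-up "cadrender" CLI that does
// the actual software rendering on the CPU.
import std.array : array;
import std.file : SpanMode, dirEntries, exists, timeLastModified;
import std.parallelism : parallel;
import std.path : setExtension;
import std.process : execute;
import std.stdio : writeln;

void main()
{
    // Placeholder directory and extension; collect entries up front so the
    // work can be spread across cores.
    auto models = dirEntries("models", "*.step", SpanMode.depth).array;

    foreach (model; parallel(models, 1))
    {
        auto preview = model.name.setExtension("png");

        // Re-render only when the preview is missing or older than the model.
        if (!preview.exists || preview.timeLastModified < model.timeLastModified)
        {
            auto result = execute(["cadrender", "--offscreen", model.name, "-o", preview]);
            if (result.status != 0)
                writeln("render failed: ", model.name);
        }
    }
}
```

Any headless software renderer with a command-line entry point could stand in for `cadrender` here.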
May 24, 2019
On Friday, 24 May 2019 at 17:19:23 UTC, rikki cattermole wrote:
> Be careful with that assumption. Server motherboards made by Intel come with GPUs as standard.

Yes, they also have CPUs with FPGAs... And NVIDIA has embedded units with crazy architectures, like this entry-level model ($99?):

https://developer.nvidia.com/embedded/buy/jetson-nano-devkit

The stagnation of CPU capabilities has led to some interesting moves.

Anyway, having a solid CPU renderer doesn't prevent one from using a GPU as well, if the architecture is right.
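
As a rough illustration of that kind of split (all type names here are made up, not taken from any real project), higher layers can code against an abstract renderer and pick a software or GPU backend at startup:

```d
// Illustrative only: scene and UI code depend on this interface,
// never on a specific backend.
interface Renderer
{
    void drawTriangles(const(float)[] vertices);
    void present();
}

final class CpuRenderer : Renderer
{
    // Software rasterisation into an in-memory framebuffer.
    void drawTriangles(const(float)[] vertices) { }
    void present() { }
}

final class GpuRenderer : Renderer
{
    // Same calls forwarded to OpenGL/Vulkan when a GPU is available.
    void drawTriangles(const(float)[] vertices) { }
    void present() { }
}

Renderer makeRenderer(bool gpuAvailable)
{
    // Headless servers fall back to the CPU path; machines with a GPU use it.
    if (gpuAvailable)
        return new GpuRenderer;
    return new CpuRenderer;
}
```

The point is only that the backend choice sits behind one seam, so nothing above it needs to know whether a GPU was present.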


May 25, 2019
On 25/05/2019 5:33 AM, Ola Fosheim Grøstad wrote:
> On Friday, 24 May 2019 at 17:19:23 UTC, rikki cattermole wrote:
>> Be careful with that assumption. Server motherboards made by Intel come with GPUs as standard.
> 
> Yes, they also have CPUs with FPGAs... And NVIDIA has embedded units with crazy architectures, like this entry-level model ($99?):
> 
> https://developer.nvidia.com/embedded/buy/jetson-nano-devkit
> 
> The stagnation of CPU capabilities has led to some interesting moves.
> 
> Anyway, having a solid CPU renderer doesn't prevent one from using a GPU as well, if the architecture is right.

Oh no, you found something that I want now.
May 24, 2019
On 5/23/19 5:01 PM, Ola Fosheim Grøstad wrote:
> On Thursday, 23 May 2019 at 20:20:52 UTC, Nick Sabalausky (Abscissa) wrote:
>> flexibility. And I think you're *SEVERELY* underestimating the flexibility of modern game engines. And I say this having personally used modern game engines. Have you?
> 
> No, I don't use them. I read about how they are organized, but I have no need for the big gaming frameworks, which seem very bloated and, frankly, limiting. I am not really interested in big static photorealistic landscapes.

Wow, you're just deliberately *trying* not to listen at this point, aren't you? Fine, forget it, then.
May 24, 2019
On Friday, 24 May 2019 at 19:32:38 UTC, Nick Sabalausky (Abscissa) wrote:
> Wow, you're just deliberately *trying* not to listen at this point, aren't you? Fine, forget it, then.

I have no problem listening. As far as I can tell, generic scenegraph frameworks like Inventor and Ogre (and I presume Horde) seem to have lost ground to more dedicated solutions.

May 24, 2019
Is the source available anywhere? It would be interesting to look through, unless it's closed source?
May 25, 2019
On 2019-05-24 23:35:18 +0000, Exil said:

> Is the source available anywhere? It would be interesting to look through, unless it's closed source?

No, not yet. It's way too early, and we can't support it in any way right now.

I see that there is quite a bit of interest in the topic, but I think we should get it to a usable point before releasing. Otherwise the noise level will be too high.

-- 
Robert M. Münch
http://www.saphirion.com
smarter | better | faster

May 25, 2019
On Thursday, 23 May 2019 at 00:34:42 UTC, H. S. Teoh wrote:

> And this isn't just for mobile apps; even the pervasive desktop browser nowadays seems bent on eating up as much CPU, memory, and disk as physically possible

This has been going on ever since the Amiga 1000, Atari 1040ST, and the 286 started edging out the C-64. If we ever break out of this anti-Moorean loop and start seeing 8 GHz or even 16 GHz CPU speeds, maybe the machines will finally manage to keep the resource hounds at bay.

May 25, 2019
On Saturday, 25 May 2019 at 19:10:44 UTC, Ron Tarrant wrote:
> On Thursday, 23 May 2019 at 00:34:42 UTC, H. S. Teoh wrote:
>
>> And this isn't just for mobile apps; even the pervasive desktop browser nowadays seems bent on eating up as much CPU, memory, and disk as physically possible
>
> This has been going on ever since the Amiga 1000, Atari 1040ST, and the 286 started edging out the C-64. If we ever break out of this anti-Moorean loop and start seeing 8 GHz or even 16 GHz CPU speeds, maybe the machines will finally manage to keep the resource hounds at bay.

I'm not sure. Maybe there's something in human nature that leads to wasting resources.
Maybe at the beginning everybody would be happy, but eventually people would start using slower, less optimized scripting languages, or simply use the existing ones for more complex tasks, and after a while the situation we know now would repeat itself.
May 25, 2019
On Saturday, 25 May 2019 at 19:35:35 UTC, user1234 wrote:
> Maybe at the beginning everybody would be happy, but eventually people would start using slower, less optimized scripting languages, or simply use the existing ones for more complex tasks, and after a while the situation we know now would repeat itself.

Haha, yup. As neural network and deep learning algorithms become available in hardware, programmers will start using them as building blocks for implementing nondeterministic solutions to problems that we would create carefully crafted deterministic algorithms for.

Meaning, they will move towards "sculpting" software whereas we are "constructing" software.

We see this already in webdev. Some devs just "sculpt" WordPress sites with plugins and tweaks, with very little insight into what the various pieces of code actually do… Actually, they might have only a very vague idea of what programming is…

Oh well.