July 29, 2015
On Wednesday, 29 July 2015 at 14:17:34 UTC, Sebastiaan Koppe wrote:
> On Wednesday, 29 July 2015 at 07:30:50 UTC, yawniek wrote:
>> In times of reactive frameworks it makes no sense anymore to render html in the backend.
>
> Nowadays, with the many client-side DOM manipulations, it is tempting to just do it all in the client. But in terms of speed it also makes sense to do an initial render on the server, and only render the deltas on the client.
>
> On Wednesday, 29 July 2015 at 07:30:50 UTC, yawniek wrote:
>> backends/services should concentrate on data transformations (and security etc).
>
> Yeah, it makes it easier for frontend development to transform data into a nice structure.
>
>> in my opinion also the REST-style APIs will come to an end, as we can easily have
>> stateless APIs these days (86.62% of browsers have WebSockets already, according to http://caniuse.com/#feat=websockets).
>
> I was a big fan of REST, but since it is hard to aggregate requests you end up making too many, so I am looking for alternatives.
> When you say stateful, what exactly do you want to keep between requests?
>
>> The whole ghetto around maintaining state over http for a web application is nuts if you think about how native applications work.
>
> Care to elaborate?

Check how Volt or Meteor work,
e.g. https://www.youtube.com/watch?v=P27EPQ4ne7o
The idea is that you don't do the whole MVC pattern twice.

The controller or similar objects reside on the server side, reacting to events from both backends AND frontend/views, then acting and pushing data out to be rendered.
I.e. the data from the backend will not be transformed[0] again within the client before being consumed by the rendering logic.


[0] Meaning that there is no datatype-specific code, or other code that can't be hidden away in a generic framework. You may still transform it from msgpack/protobuf to a JavaScript object, etc.
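
As a rough illustration of the push model (a minimal sketch only, not how Volt or Meteor are actually implemented; the payload here is made up), a vibe.d handler can hold a WebSocket open and push render-ready data whenever a server-side event fires:

import vibe.vibe;
import core.time : seconds;

// Server-push sketch: the server reacts to a backend event and pushes
// render-ready data over a WebSocket; the client only renders it.
void handleConn(scope WebSocket sock)
{
    while (sock.connected) {
        // Stand-in for a real backend event source (queue, DB trigger, ...).
        sleep(1.seconds);
        sock.send(`{"widget":"clock","text":"tick"}`);
    }
}

void main()
{
    auto settings = new HTTPServerSettings;
    settings.port = 8080;
    listenHTTP(settings, handleWebSockets(&handleConn));
    runApplication();
}

The client side is then just a WebSocket onmessage handler that hands the parsed object straight to the rendering logic, with no second model layer in between.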

July 29, 2015
On Wednesday, 29 July 2015 at 01:23:54 UTC, Etienne Cimon wrote:
> On Wednesday, 29 July 2015 at 00:12:21 UTC, Brandon Ragland wrote:
>> For actual web applications, and front-end development currently done in your more traditional languages, D could be used in a style similar to Java's JSP, JSTL, and EL, just without the notion of scripts in the pages themselves, as that would mean writing an on-the-fly interpreter, or compiling whole pages, which surely isn't an option for a compiled, performant language if we want it to be readily adopted.
>>
>> Apologies if I jumped around a lot, or misspelled anything. Typing from my phone is more difficult than I thought.
>
> Most developers nowadays are having a lot of success building web apps with an AngularJS MVC & Vibe.d, rather than rendering the page entirely from the back-end. Heck, they can even build Android or iOS native apps with this architecture (see the Ionic framework). So I think this makes more sense than rendering pages in the back-end, even if most legacy web stuff did that.

This is so very true. Unfortunately, it also assumes some things which are still not true:

* Every visitor has a speedy connection
* Every visitor is on a powerful machine (so to speak)

The problem with the first is that half the US is still on a connection slower than 6 Mbps. Downloading a whole page (such as your Facebook timeline feed, which can easily be 6+ MiB [see the Akamai link below]) takes the better part of a minute at the low end of that range. That's pathetic for any kind of real-world business-logic web app. Maybe okay for Facebook, but we could do better.
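
To put rough numbers on that (back-of-the-envelope only; the page weight and link speeds are illustrative):

import std.stdio;

void main()
{
    double bits = 6.0 * 1024 * 1024 * 8;      // a 6 MiB page, in bits
    foreach (mbps; [1.0, 6.0, 25.0])          // illustrative link speeds
        writefln("%4.0f Mbps: %5.1f s", mbps, bits / (mbps * 1_000_000));
    // Roughly: 1 Mbps -> 50 s, 6 Mbps -> 8 s, 25 Mbps -> 2 s.
}

So it's the sub-6 Mbps half of the connections, not the average ones, that pay the full-minute price.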

I can't even begin to tell you how many non-developer friends I have who complain on a daily basis that "their phone has a faster connection" or "Facebook loads slow" or "Gmail takes 2 minutes to open".

The idea of pushing *everything* to the client to render (which often means payloads of 4+ MiB) is ridiculous and unfair to over 80% of the world.

http://www.akamai.com/dl/documents/akamai_soti_q213.pdf?WT.mc_id=soti_Q213
July 30, 2015
On Wednesday, 29 July 2015 at 07:30:50 UTC, yawniek wrote:
> On Monday, 27 July 2015 at 06:10:29 UTC, Sebastiaan Koppe wrote:
>
>> For instance, for rendering pages I would rather front the D backend with some (stateless) node app that fetches the data from the D backend and uses something like React to render server/client side. If the D backend could implement the upcoming GraphQL that would be awesome.
>>
>> It has the benefit that a) the front-end devs still get their familiar language, tools and libraries; and b) all the real stuff happens in D.
>
> this is absolutely the way to go.
>
> In times of reactive frameworks it makes no sense anymore to render html in the backend.
> backends/services should concentrate on data transformations (and security etc).
>
> in my opinion also the REST-style APIs will come to an end, as we can easily have
> stateless APIs these days (86.62% of browsers have WebSockets already, according to http://caniuse.com/#feat=websockets).
> The whole ghetto around maintaining state over http for a web application is nuts if you think about how native applications work.
>
> so the whole discussion about servlets is a bit moot, or at least it's very backward-looking.
>
> while it's probably a few years too early to see frameworks that _push_ webassembly to the client, we already see frameworks where javascript (or stuff that then compiles into it) is being pushed out. voltframework.com looks quite interesting.
>
> bottom line: in my opinion it will be hard to convince java style web programmers to switch to d. it might be a better strategy to build frameworks that can be used as solid, fast and stateful backends for apps and especially for games.

I can agree that it would be difficult to convince Java-style developers to switch.

Unfortunately, Java- and JSP-based web systems are typically very business-oriented and deployed for internal use. Take 1 World Sync, for example: a huge part of it is built in Java and uses JSF. It's super speedy to load pages, but it isn't exactly what mainstream sites are doing today.

The problem with Java being business-oriented is that those businesses will be reluctant to try new systems that aren't as old, mature, or well-proven.


July 30, 2015
On Wednesday, 29 July 2015 at 17:40:30 UTC, Etienne wrote:
> On Wednesday, 29 July 2015 at 14:30:49 UTC, Sebastiaan Koppe wrote:
>> On Wednesday, 29 July 2015 at 13:22:43 UTC, Etienne Cimon wrote:
>>> I actually use the small size of a vibe.d application (2 MB) to my advantage to produce a plugin that will overload certain requests on the client's computer (via a Windows service or launchd daemon and a reverse proxy). This allows much more extensive use of local resources, which is a really untapped way of developing web applications at the moment; it really lets your imagination fly.
>>
>> That is very interesting. But how do you push those apps to the end-users without interrupting their browser experience?
>
> You have to make them download the app and agree to elevate. It's not going to be useful for content-based websites, but it definitely has potential in areas where a download would've/could've been necessary anyway, e.g. music/video/image editing, phone calls, file sharing, productivity, games, etc.
>
> It really depends on how appealing it makes your application. If your offer beats the competition by far, a download won't be regarded as disruptive.

I was hesitant to agree with you, as the whole point of a web app is to have it live *mostly* in the web browser. But one thing is surely a detriment: web apps that should have been native apps.

There's nothing like firing up a project management system online and wondering why it's slow, unresponsive, and offers few *powerful* features. Open Microsoft Project as a native app under Windows, and boom: you're in *real* business.

These kinds of apps should never have been made "web apps", but the way the world works, everybody wants everything to be a "Google search away", which is great and all, but it places serious limitations on exactly what can be done.

A downloaded plugin would be a man-in-the-middle solution. Users get their "Google search away", and developers get the native speed, flexibility, and components necessary to perform better work.

The browser is a stellar user-interface engine. Certainly better than GTK+ or MFC by a long shot. I just don't think the browser is a useful *operating system* for all these "web apps" that should be native.
July 30, 2015
On Wednesday, 29 July 2015 at 11:06:03 UTC, Ola Fosheim Grøstad wrote:
> On Wednesday, 29 July 2015 at 10:39:54 UTC, yawniek wrote:
>> sorry, typo. i meant "we now can have stateful apis".
>
>
> Ok, then I get it. ;)
>
>> and i disagree on the limited usefulness.
>>
>> do you have a REST API in native apps? i don't see much reason why we should not develop web applications the way we develop native apps.
>
> The goal should be to keep the server-side simple, robust, transactional and generic. Then push all the fickle special-casing to the client side.
>
> Why do work on the server when you can do almost everything on the client, caching data in Web Storage/IndexedDB?

Because connections are slow, and 80% of the world's "up and coming" nations are still way behind European download speeds. China barely got past 2 Mbps last year. 50% of the US is still under 6 Mbps. Africa can't get over 1.5 Mbps.

Trying to send globs of data for the user to render on each and every request is asking for users to reject your service or app as "slow and stupid".

The browser was never intended to be a "virtual machine", but rather a rendering engine capable of rendering web pages; not parsing and then interpreting JavaScript. (Or should we say booty-script? I've never encountered a more horrid language with terrible speeds. V8 runs like a 2-cylinder lawn mower.)

Apps that should be native, should be kept native. The world hasn't *really* evolved fast enough yet to actually take on the browser as the "app virtual machine" we developers want it to be.
July 30, 2015
On Thursday, 30 July 2015 at 00:08:52 UTC, Brandon Ragland wrote:
> A downloaded plugin would be a man-in-the-middle solution. Users get their "Google search away", and developers get the native speed, flexibility, and components necessary to perform better work.
>
> The browser is a stellar user-interface engine. Certainly better than GTK+ or MFC by a long shot. I just don't think the browser is a useful *operating system* for all these "web apps" that should be native.

There have been lots of improvements in the DOM; those slick CSS3 transitions are actually hardware-accelerated with OpenGL, and lots of GUI front-ends don't even have transitions in the first place. I wouldn't rely on JavaScript for crunching data, though; the "slowness" you talk about mostly stems from that. Otherwise, for display, it certainly is a great tool with lots of open-source components.

Just look at this data table with 640k rows: http://ui-grid.info/docs/#/tutorial/404_large_data_sets_and_performance

THAT is some good JavaScript, and it certainly beats what I've seen Excel do with sheets holding a tenth as much data.
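
The trick behind grids like that is row virtualization: only the rows inside the viewport are ever materialized, so total row count barely matters. A minimal sketch of the windowing arithmetic, in D for brevity (the numbers and names are illustrative, not ui-grid's actual implementation):

import std.algorithm : min;
import std.stdio;

struct Viewport { size_t scrollPx, heightPx, rowPx; }

// Half-open range of rows that intersect the viewport.
size_t[2] visibleRange(Viewport v, size_t totalRows)
{
    auto first = v.scrollPx / v.rowPx;
    auto count = v.heightPx / v.rowPx + 2;   // +2 covers partially visible rows
    return [first, min(first + count, totalRows)];
}

void main()
{
    // 640k rows of 30 px each, scrolled deep into a 600 px viewport.
    auto r = visibleRange(Viewport(9_000_000, 600, 30), 640_000);
    writefln("render rows %s .. %s of 640000", r[0], r[1]); // 300000 .. 300022
}

Only those ~22 rows get DOM nodes; scrolling just re-binds them to different data.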

While some are still blaming the hammer, I blame the person trying to hit the nail on the head with his eyes closed.
July 30, 2015
On Thursday, 30 July 2015 at 01:42:02 UTC, Etienne wrote:
> There have been lots of improvements in the DOM; those slick CSS3 transitions are actually hardware-accelerated with OpenGL, and lots of GUI front-ends don't even have transitions in the first place. I wouldn't rely on JavaScript for crunching data, though; the "slowness" you talk about mostly stems from that. Otherwise, for display, it certainly is a great tool with lots of open-source components.

In my experience, performance issues are either DOM/redraw-related or old versions of IE. JavaScript performance and download speed are non-issues if you do it right: send data in a format that compresses well and those 2 MB turn into 200 KB, add caching and background loading, etc.
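
To illustrate the compression point (a sketch only; the payload is a made-up stand-in for a typical repetitive API response):

import std.array : replicate;
import std.stdio;
import std.zlib : compress;

void main()
{
    // Repetitive JSON-ish rows deflate extremely well.
    auto raw = replicate(`{"id":1,"name":"row","visible":true},`, 50_000);
    auto packed = compress(raw);
    writefln("raw: %.1f MB, deflated: %.1f KB",
             raw.length / 1e6, packed.length / 1e3);
}

Real API responses aren't this uniform, but field names and structure repeat enough that ratios like the 2 MB-to-200 KB above are realistic.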

> While some are still blaming the hammer, I blame the person trying to hit the nail on the head with his eyes closed.

Yes. Unfortunately, many JavaScript programmers don't know how to do it right... or their superiors halt development before performance tuning, because it "works".
July 30, 2015
On Thursday, 30 July 2015 at 01:42:02 UTC, Etienne wrote:
> On Thursday, 30 July 2015 at 00:08:52 UTC, Brandon Ragland wrote:
>> A downloaded plugin would be a man-in-the-middle solution. Users get their "Google search away", and developers get the native speed, flexibility, and components necessary to perform better work.
>>
>> The browser is a stellar user-interface engine. Certainly better than GTK+ or MFC by a long shot. I just don't think the browser is a useful *operating system* for all these "web apps" that should be native.
>
> There have been lots of improvements in the DOM; those slick CSS3 transitions are actually hardware-accelerated with OpenGL, and lots of GUI front-ends don't even have transitions in the first place. I wouldn't rely on JavaScript for crunching data, though; the "slowness" you talk about mostly stems from that. Otherwise, for display, it certainly is a great tool with lots of open-source components.
>
> Just look at this data table with 640k rows: http://ui-grid.info/docs/#/tutorial/404_large_data_sets_and_performance
>
> THAT is some good JavaScript, and it certainly beats what I've seen Excel do with sheets holding a tenth as much data.
>
> While some are still blaming the hammer, I blame the person trying to hit the nail on the head with his eyes closed.

Exactly why I said the browser is an excellent user-interface engine, but it's not a good data cruncher or anything of the sort. A full-fledged 3D game in the browser using JS wouldn't run as fast as a fully native, well-coded one.

Not to mention that JavaScript is inherently one of the worst thought-up languages.[2]

Regarding JavaScript being slower, this benchmark[1] seems to indicate that, on average, JavaScript on V8 is at least 4x slower than a g++-compiled native program. It also appears to use anywhere from 2 to 4x as much memory.

As for why Excel is slow, who knows. I don't genuinely like Excel myself, but its feature set is as yet unbeatable (personal opinion).


[1] http://benchmarksgame.alioth.debian.org/u64/compare.php?lang=v8&lang2=gpp
[2] http://live.julik.nl/2013/05/javascript-is-shit

July 30, 2015
On Thursday, 30 July 2015 at 02:17:34 UTC, Ola Fosheim Grøstad wrote:
> On Thursday, 30 July 2015 at 01:42:02 UTC, Etienne wrote:
>> There have been lots of improvements in the DOM; those slick CSS3 transitions are actually hardware-accelerated with OpenGL, and lots of GUI front-ends don't even have transitions in the first place. I wouldn't rely on JavaScript for crunching data, though; the "slowness" you talk about mostly stems from that. Otherwise, for display, it certainly is a great tool with lots of open-source components.
>
> In my experience, performance issues are either DOM/redraw-related or old versions of IE. JavaScript performance and download speed are non-issues if you do it right: send data in a format that compresses well and those 2 MB turn into 200 KB, add caching and background loading, etc.
>
>> While some are still blaming the hammer, I blame the person trying to hit the nail on the head with his eyes closed.
>
> Yes. Unfortunately, many JavaScript programmers don't know how to do it right... or their superiors halt development before performance tuning, because it "works".

That's still unacceptable, and by the way, the numbers I was using were indeed compressed (DEFLATE) numbers from Facebook.

The average Facebook Timeline (compressed with DEFLATE) is roughly 4 MiB to download.

200 KiB isn't bad; that will generally load in a second or so on most American 6 Mbps connections (and faster for the other 50% with better speeds), but 4 MiB compressed is a bit too much.

If you leave that same Facebook Timeline open, lazy loading kicks in, and within 10 minutes your total download is ~10 MiB (usually pictures that get loaded in).
July 30, 2015
On Thursday, 30 July 2015 at 03:56:25 UTC, Brandon Ragland wrote:
> Regarding JavaScript being slower, this benchmark[1] seems to indicate that, on average, JavaScript on V8 is at least 4x slower than a g++-compiled native program. It also appears to use anywhere from 2 to 4x as much memory.

I think speed-optimized JavaScript is at about 50% of regular non-SIMD C. JavaScript suffers when you use high-level constructs and convenience frameworks, but so does C++ (which is why people don't use them). It will probably take a long time for JavaScript to get proper cross-browser SIMD support, though...

Once you get rid of IE9, the real bottlenecks to combat are the C++ browser runtime, reflow, "random delays", etc.

Memory usage/performance are issues you can limit by using typed arrays/free lists, but the browser imposes limitations on memory usage...
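
As a sketch of the free-list technique (written here in D for readability; in the browser you'd do the same with a pool of plain objects or slots in a typed array):

import std.stdio;

struct Node { double x, y; Node* next; }

// Minimal free list: released nodes go on a singly linked list and are
// handed back out, so steady-state churn allocates nothing new.
struct FreeList
{
    Node* head;

    Node* acquire()
    {
        if (head is null) return new Node;  // pool empty: allocate
        auto n = head;
        head = n.next;
        return n;
    }

    void release(Node* n)
    {
        n.next = head;
        head = n;
    }
}

void main()
{
    FreeList pool;
    auto a = pool.acquire();
    a.x = 1; a.y = 2;
    pool.release(a);
    auto b = pool.acquire();   // reuses the node released above
    assert(a is b);
    writeln("recycled: ", *b);
}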