March 13, 2012
On 03/13/2012 01:29 AM, James Miller wrote:
> On 13 March 2012 17:07, Ary Manzana<ary@esperanto.org.ar>  wrote:
>> On 03/12/2012 08:32 PM, Nick Sabalausky wrote:
>>>
>>> "Adam D. Ruppe"<destructionator@gmail.com>    wrote in message
>>> news:npkazdoslxiuqxiingao@forum.dlang.org...
>>>>
>>>> On Monday, 12 March 2012 at 23:23:13 UTC, Nick Sabalausky wrote:
>>>>>
>>>>> at the end of the day, you're still saying "fuck you" to millions of
>>>>> people.
>>>>
>>>>
>>>> ...for little to no reason. It's not like making 99% of
>>>> sites work without javascript takes *any* effort.
>>>>
>>>
>>> *Exactly*. And nobody can tell me otherwise because *I DO* exactly that
>>> sort
>>> of web development. Plus, it often makes for a *worse* user experience
>>> even
>>> when JS is on - look at Vladimir's D forums vs reddit. Vladimir put reddit
>>> to shame *on reddit*, for god's sake! And how many man-hours of effort do
>>> you think went into those D forums vs reddit?
>>>
>>>> Indeed, going without javascript is often desirable
>>>> anyway, since no JS sites are /much/ faster than script
>>>> heavy sites.
>>>
>>>
>>> Yup. Guess I already responded to this in the paragraph above :)
>>
>>
>> It's not about the speed. It's about behaviour.
>>
>> Imagine I build a blog site and want people to leave comments. I decide the
>> best thing for the user is to just enter the comment in a text area, press a
>> button, and have the comment turn into a text block, and say something like
>> "Comment saved!". From a UI perspective, it's the most reasonable thing to
>> do: you leave a comment, it becomes a definitive comment on the blog, that's
>> it.
>>
>> The implementation is straightforward (much more if I use something like
>> knockoutjs): I post the comment to the server via javascript and on the
>> callback, turn that "editing comment" into a definitive comment. Note that
>> only the comment contents were transferred between the client and the server.
>>
>> Now, I have to support people who don't like javascript (and that group
>> ONLY includes developers, as most people don't even know the difference
>> between google and a web browser).
>>
>> To implement that I have to check for disabled javascript, and post the
>> comment to a different url that will save the comment and redirect to the
>> same page. First, it's a strange experience for the user: navigating to
>> another page while it's really going to the same page, just with one more
>> comment (and how can I make it scroll without javascript to let the user see
>> the comment just created? Or should I implement an intermediate page saying
>> "here's your newly created comment, now go back to the post"). Second, the
>> whole page is transferred again! I can't see how in the world that is faster
>> than not transferring anything at all.
>>
>> I know, I had to transfer some javascript. But just once, since it'll be
>> cached by the browser. In fact, if the page has a static html which invokes
>> javascript that makes callbacks, that's the most efficient thing to do.
>> Because even if your comments change, the whole page remains the same:
>> elements will be rendered after *just* the comment's content (in JSON) is
>> transferred.
>>
>> Again, I don't understand how that is slower than transferring whole pages
>> the whole time.
>
> Ary, the idea is to start with the static HTML version, then
> progressively add javascript to improve the functionality. If you have
> javascript at your disposal, you can change the behavior of the
> existing page.
>
> Your example would be:
>
> 1. Start with normal POST-request comment form, make sure it works.
> (HTTP redirect back to original page)
> 2. Add javascript that listens to the submit on the comment form.
> 2a. Stop the default submit, submit the form to the same endpoint as 1
> 3. On success, do your in-page comment action.
>
> And that's about it. I'm sure you could break it down more. There's
> also more you can do, most of it server-side (check for ajax post,
> return JSON, etc.), but the idea is that the extra effort to support
> HTML-only isn't really extra effort. Since you have to submit the form
> anyway, why not let it submit by regular HTTP first?
>
> Ideally, you don't have to detect for javascript, you just have to
> *shock horror* code to web standards.
>
> --
> James Miller
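[Editor's note: the one-endpoint-for-both idea James describes could be sketched on the server side like this. The header check and the response shapes are illustrative only, not taken from any particular framework.]

```javascript
// One endpoint serves both a plain <form> POST and an ajax POST.
// XMLHttpRequest-based libraries commonly send the
// X-Requested-With header; a plain form submission does not.
function respondToComment(request, comment) {
  const isAjax =
    (request.headers["x-requested-with"] || "") === "XMLHttpRequest";

  if (isAjax) {
    // Javascript caller: return just the new comment as JSON.
    return { status: 200, body: JSON.stringify(comment) };
  }

  // Plain form post: redirect back to the page, anchored at the
  // new comment so the browser scrolls to it.
  return {
    status: 303,
    headers: {
      Location: "view-content?contentId=" + comment.contentId +
                "#c" + comment.id,
    },
  };
}
```

Either way, the comment-saving logic runs once; only the response shape differs.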

But the non-javascript version is a worse user experience, and it's less efficient. Why not make it right from the start?
March 13, 2012
On 13 March 2012 17:23, Nick Sabalausky <a@a.a> wrote:
> 1. Such animations need to be *FAST*. We're talking roughly 250ms max (probably even less, but I'd have to play around with it to refresh my memory). Most UI animations are slower than this (particularly on the web - although many DVDs are *FAR* worse), and while it's good for first-time users, for everyone else it just gets in the way of getting work done and makes the experience feel sluggish.
>
> 2. On the web, animation means JS. But not everyone is using a browser with that V8 engine or whatever it's called (the one that Chrome uses). And not everyone is using a quad-core system with 64-bit software and 16GB or whatever RAM, etc. like the well-supplied web developers are likely using. So frequently this means very choppy, sluggish animations. And that's a much worse UX than popping. This also gets in the way of being able to properly handle #1 above, *fast* animations.
>
> 3. People have also reported that such UI animations can convey a subtle (or even not-so-subtle) sense of being patronized. Especially if it's a slower animation. I can definitely relate to this.
>
> (Of course, if people just make real applications instead of web apps, then those problems would be trivially solvable.)

Slow animations are a problem, but CSS transitions are helping make this less of an issue. And as long as you aren't trying to be too over-the-top, you're normally fine. Sliding tends to be ok for most things, and fades are fast everywhere. I agree that slow animations are annoying though; I only use them for things that are loading anyway, so the slow animation doesn't actually slow down interaction. (I'm talking 1-2 second-long credit card transaction situations.)

--
James Miller
March 13, 2012
On Tuesday, 13 March 2012 at 04:07:08 UTC, Ary Manzana wrote:
> The implementation is straightforward (much more if I use something like knockoutjs): I post the comment to the server via javascript and on the callback, turn that "editing comment" into a definitive comment.

It is *equally* straightforward to do this without
javascript, though it will have a page refresh.

> Note that only the comment contents were transfered between the client and the server.

That's not relevant. Profile a web app's speed, and
you'll see there is no significant difference between
some json and a page on most internet connections.

The majority of the time is spent in javascript, then
latency between server and client. Data transfer time
is often not important.

> To implement that I have to check for disabled javascript, and post the comment to a different url that will save the comment and redirect to the same page.

You're doing this wrong. Your comment form is a form, yes?
Just set the action to the same thing you'd call in
javascript.

Zero extra work.

Now, add your javascript code to the onsubmit handler.

Using my web.d:

Element addComment(int contentId, Text commentText) {
   ensureGoodPost(); // xsrf check
   auto obj = new DataObject(db, "comments");
   obj.id = std.random.uniform(1, int.max);
   obj.content_id = contentId;
   obj.message = commentText.content;
   obj.commitChanges();
   // redirect() redirects users, not api calls, so this just works
   redirect("view-content?contentId=" ~ to!string(contentId)
      ~ "#c" ~ to!string(obj.id));
   // you need to be able to render comments on the server for a full
   // page load anyway... so just reuse that
   return renderComment(obj);
}


And you can render that in javascript or html without
hassle. You can simply let the automatically created
form handle both cases.

To render the form in javascript:

YourSite.getAutomaticForm("addComment").appendTo(this);

to call it:

YourSite.addComment(cid, "comment").appendTo(this);



Or, you can link to it:

site/add-comment?contentId=xxx

that gives the form, and POST to it will save the comment.
No JS needed, it uses standard form encoding.

Submit the form, and the redirect() sends the user
to the right place. Do the javascript submit, and the
redirect is not needed - the return value there
gets sent down.

(don't even have to write html btw!)



> I make it scroll without javascript to let the user see the comment just created?

Use an anchor in a http redirect. I don't recall if
that is quite right in IE8... but worst case, it just
ignores the anchor so the user scrolls manually. That's
graceful degradation.
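[Editor's note: a minimal sketch of what that anchored redirect looks like on the wire; status code and URL are illustrative. The fragment after `#` is the anchor the browser scrolls to after following the redirect.]

```
HTTP/1.1 302 Found
Location: /view-content?contentId=42#c12345
```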


> In fact, if the page has a static html which invokes javascript that makes callbacks, that's the most efficient thing to do.

No. The more processing you do on the server, the
faster the page will be.

Paying an extra couple milliseconds on the server
is much smaller than paying for double latency
in more server round trips.


It is just like how people combine their css, js,
and images into big single files to cut down on
http requests. Do the same thing with ajax.

Now, you could cache the ajax requests too,
and save that... but there's no need when
you can just do it on the server. (and perhaps
cache the finished product)
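[Editor's note: the "combine requests like you combine css/js files" idea could be sketched like this. The endpoint name and call shapes are made up for illustration.]

```javascript
// Client side: merge several pending ajax calls into one
// request body, so the page pays one round trip instead of N.
function batchCalls(calls) {
  // calls: [{ method: "getComments", args: [42] }, ...]
  return { url: "/api/batch", body: JSON.stringify(calls) };
}

// Server side: unpack the batch, run each call against the
// registered handlers, and return all results in one response.
function dispatchBatch(body, handlers) {
  return JSON.parse(body).map(c => handlers[c.method](...c.args));
}
```

The latency saving comes entirely from collapsing round trips; the payload itself is barely bigger than the sum of its parts.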
March 13, 2012
"Ary Manzana" <ary@esperanto.org.ar> wrote in message news:jjmhja$3a$2@digitalmars.com...
> On 03/12/2012 10:58 PM, H. S. Teoh wrote:
>>
>> The problem today is that JS is the "next cool thing", so everyone is jumping on the bandwagon, and everything from a single-page personal website to a list of links to the latest toaster oven requires JS to work, even when it's not necessary at all. That's the silliness of it all.
>>
>>
>> T
>
> It's not the next cool thing. It makes things more understandable for the user. And it makes the web transfer less content,

That gets constantly echoed throughout the web, but it's a red herring: Even if you handle it intelligently like Adam does (ie, lightweight), the amount of data transfer saved is trivial. We're talking *part* of *one* measly HTML file here. And even that can be gzipped: HTML compresses *very* well. Yes, technically it can be less transfer, but only negligibly so. And bandwidth is the *only* possible realistic improvement here, not speed, because the speed of even a few extra K during a transfer that was already going to happen anyway is easily outweighed by the overhead of things like actually making a round-trip to the server at all, plus likely querying a server-side DB, plus interpreting JS, etc.

If, OTOH, you handle it like most people do, and not like Adam does, then for brief visits you can actually be transferring *more* data just because of all that excess JS boilerplate people like to use. (And then there's the start-up cost of actually parsing all that boilerplate and then executing their initialization portions. And in many cases there's even external JS getting loaded in, etc.)

The problem with optimization is that it's not a clear-cut thing: If you're not looking at it holistically, optimizing one thing can either be an effective no-op or even cause a larger de-optimization somewhere else. So just because you've achieved the popular goal of "less data transfer" upon your user clicking a certain link, doesn't necessarily mean you've won a net gain, or even broken even.

> and leverages server processing time. It's the next step. It's not a backwards step. :-P
>

It's the *newer* step. It may be "the future", but that's irrelevant: The question here is whether it's *good*, not whether it's popular or ubiquitous. Most real-world uses of it *are*, objectively, backwards steps.

> I figure then Google people are just all a bunch of idiots who just like JS a lot...

Probably not all of them, but for the most part I frequently get that impression. Plus, keep in mind too, they have a *clear vested interest* in treating the web as a platform. Their whole business model relies on the web being treated as a platform. That, in turn, creates a (self-serving) need for them to push JS *regardless* of JS's merit. Without ubiquitous JS, the web has an even harder time competing with real platforms, and that pulls the rug out from under Google. They *are* a major corporation, never forget that.


March 13, 2012
On Tue, Mar 13, 2012 at 05:27:27AM +0100, Adam D. Ruppe wrote:
> On Tuesday, 13 March 2012 at 04:24:45 UTC, Nick Sabalausky wrote:
> >2. On the web, animation means JS.
> 
> css3 does animations that are pretty easy to use,
> degrade well, and tend to be fast. Moreover css
> is where it belongs anyway - it is pure presentation.
> 
> Far, far superior to the JS crap.

+1.


T

-- 
Real men don't take backups. They put their source on a public FTP-server and let the world mirror it. -- Linus Torvalds
March 13, 2012
On Mon, Mar 12, 2012 at 10:35:54PM -0400, Nick Sabalausky wrote:
> "Jonathan M Davis" <jmdavisProg@gmx.com> wrote in message news:mailman.572.1331601463.4860.digitalmars-d@puremagic.com...
[...]
> > All I'm saying is that if it makes sense for the web developer to use javascript given what they're trying to do, it's completely reasonable to expect that their users will have javascript enabled (since virtually everyone does). If there's a better tool for the job which is reasonably supported, then all the better. And if it's easy to provide a workaround for the lack of JS at minimal effort, then great. But given the fact that only a very small percentage of your user base is going to have JS disabled, it's not unreasonable to require it and not worry about the people who disable it if that's what you want to do.
> >
> 
> Personally, I disagree with the notion that non-JS versions are a "workaround".
[...]

Me too. To me, non-JS versions are the *baseline*, and JS versions are enhancements. To treat JS versions as baseline and non-JS versions as "workaround" is just so completely backwards.


T

-- 
There are three kinds of people in the world: those who can count, and those who can't.
March 13, 2012
On 13 March 2012 17:31, Ary Manzana <ary@esperanto.org.ar> wrote:
>>
>> Ideally, you don't have to detect for javascript, you just have to *shock horror* code to web standards.
>>
>> --
>> James Miller
>
>
> But the non-javascript version is a worse user experience, and it's less efficient. Why not make it right from the start?

Because my way works for everybody, and well for people with javascript. Your way eliminates everybody without javascript. It is barely any extra work to set it up this way (it's all logic you had to write anyway; you just have to think about it more carefully), and 99% of people get the exact same experience, and 1% still get to use your site. Everybody wins.

This isn't some JS vs NoJS debate, this is JS-only vs Progressive Enhancement. And for the record, GMail has an HTML-only version, and most of the other products work, if with reduced functionality, without javascript. I just tested search, it worked fine.

--
James Miller
March 13, 2012
"Ary Manzana" <ary@esperanto.org.ar> wrote in message news:jjmiip$2c2$1@digitalmars.com...
>
> But the non-javascript version is a worse user experience, and it's less efficient. Why not make it right from the start?

Because it's trivially easy to do, and it *is* a better experience than: a user goes to your page, tries to add a comment, finds that "This fucking thing doesn't even work, WTF? It's just a goddamn form submission! How do you screw that up?" and then if they still care, enable JS and then reload the page, possibly much more slowly this time, reenter the captcha and try again. You can argue that "everyone should just conform and keep JS on!", but that's never going to happen (and for legitimate reasons). Besides, as developers, it's *our* responsibility, not the user's, to make things "just work".


March 13, 2012
On Tue, Mar 13, 2012 at 06:13:53PM +1300, James Miller wrote: [...]
> This isn't some JS vs NoJS debate, this is JS-only vs Progressive Enhancement. And for the record, GMail has an HTML-only version, and most of the other products work, if with reduced functionality, without javascript. I just tested search, it worked fine.
[...]

Data point. After google started adding JS enhancements to their search results page and the JS keyboard shortcuts conflicted with my browser custom key bindings, I turned off JS for www.google.com (shock! horror!).

And guess what? It went back to the same behaviour it used to have before the JS enhancements. ON THE SAME HTML PAGE. No loss in functionality at all. See, now that's an example of web coding done right. The HTML provides the baseline functionality, and if the user has JS, then she gets the enhanced functions. Everybody wins. This is how web standards were designed to work, in the first place.

And this takes no extra effort at all. The HTML is supposed to express the logical structure of the page anyway, so using <form> and form elements *should* be done anyways. You get baseline functionality for free. Then layer JS on top of that to do whatever fancy effects you want -- which you wanted to do anyway. So it's the same amount of work for *much* better graceful degradation.

As opposed to writing the site with JS from the get-go, which has no graceful degradation, *and* often turns out to be much uglier (you end up with lots of JS just outputting HTML into the DOM, which should've just been put into the HTML file in the first place).


T

-- 
The state pretends to pay us a salary, and we pretend to work.
March 13, 2012
"Adam D. Ruppe" <destructionator@gmail.com> wrote in message news:oxkxtvkuybdommyerrke@forum.dlang.org...
> On Tuesday, 13 March 2012 at 04:24:45 UTC, Nick Sabalausky wrote:
>> 2. On the web, animation means JS.
>
> css3 does animations that are pretty easy to use,
> degrade well, and tend to be fast. Moreover css
> is where it belongs anyway - it is pure presentation.
>

Interesting, I had no idea! Thanks for the tip :)

> Far, far superior to the JS crap.
>

Yea, there's a lot of things that are much better done in CSS that a lot of people don't even know about. For example, most rollovers are easily doable in pure CSS. But there's a lot of stuff out there (particularly things created in Adobe's "software") that use JS for rollovers, which doesn't even work as well (even with JS on).
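[Editor's note: a minimal pure-CSS rollover of the kind described above, with no javascript involved; the class name is illustrative. The transition line is the CSS3 animation Adam mentioned, and it degrades gracefully: browsers without CSS3 support simply snap between states.]

```css
.nav-button {
  background: #336;
  color: #fff;
  transition: background 0.2s; /* css3; older browsers just snap */
}
.nav-button:hover {
  background: #77a;
}
```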

OTOH, I don't like CSS drop-down menus. Maybe it's different in CSS3, but in CSS2 the only way to make CSS menus work is for them to open upon rollover, not click. And dropdown menus opening upon rollover is just a usability mess, IMO, *and* inconsistent with pretty much any GUI OS I've ever used.