September 21, 2013
On Sat, 21 Sep 2013 03:29:59 +0200
"Adam D. Ruppe" <destructionator@gmail.com> wrote:

> On Friday, 20 September 2013 at 19:17:45 UTC, H. S. Teoh wrote:
> > I dunno, I find that my good memories of those old games are quite tainted by nostalgia.
> 
> True in some cases, but in others I find myself able to appreciate them even more now. But I avoid the taint by playing them again every few years :)
> 

I agree. I don't really get when people say that older games are just nostalgia. I mean, I'm sure it is for some people, but I genuinely do like a lot of those older games even when I actually go back and replay them (not *all* of them, of course).

Yea, there are occasionally areas where those older games are rough around the edges by today's standards. Ex: The mouse handling in Lemmings 3 is really awkward, and save systems weren't always good in the rare cases they existed at all. But most of these issues are either:

A. Obviated by save states

or B. Easier to put up with than extremely *inane* non-skippable cutscenes (*cough*Assassin's Creed, Shift 2, and the original non-blood-dragon FarCry 3*cough*), endless company logos, patronizing tutorials, endless chatter, inability to use keyboard/mouse for FPSes on systems with *actual USB ports* (ie every console FPS except CS:GO), bad framerates *despite* being on hardware several orders of magnitude more powerful (16-bit systems ran at 30 fps, there's *no* excuse for bad framerates anymore - I'm looking at you Sonic. If you're having framerate difficulties then quit being such graphics whores and tone down all the effects and poly counts for godssakes, you're not targeting 1991 hardware, you can make things run at least *that* well), and all the other shit from so many modern games.

But back to the topic: I admit there are a few "retro" games I've played and enjoyed SOOOO much over the years that I've started to just get tired of (like the NES Marios), but more often than not my reaction isn't a disappointed "Ehh, I remember it being better"; it's one of these:

A. It's been too long since I've played this. It's really nice to get back to games where you have to actually *think* and/or *try*. Man I like this game.

B. Hmmm, I wasn't into this at the time, but all of a sudden it all just "clicks" and I "get it" now. Sweet!

C. I never played/heard of this at the time, and that's a shame because this is fantastic! (Ex: I'm currently going through a fan translation of Monster World 4 on Genesis.)

Of course there *are* plenty of duds on those old systems, but that's true of modern systems, too: Last Of Us[1], anyone? No thanks. God no.

However, all that said, there *is* a lot of modern stuff I like, too. Just *some* of them off the top of my head:

- Rayman Origins/Legends
- Splinter Cell 1-3 (4's not terrible either, haven't played 5 or 6)
- 3D Dot Game Heroes
- Forza/Gran Turismo/Need for Speed: Shift 1
- Sonic Racing Transformed
- Sonic 4/Generations
- Limbo
- Hatsune Miku: Project Diva F
- BioShock (once you get past the completely worthless first ~20-30min)
- Portal 1 and 2
- Disgaea
- From Dust
- Kirby's Epic Yarn
- New Mario
- Kororinpa
- Pikmin
- Wii Sports Resort (believe it or not)
- Angry Birds in Space (believe it or not)
- Adventures of Big and Tiny (or something like that)
- Braid

So I'm not really a "new games" hater so much as an "idiotic bullcrap" hater. There *are* good games being made; it's just there are also some very irritating trends: For example, a *lot* of the modern AAA games that are actually good, are complete and total shit for the first 30-60 minutes - *then* they become worthwhile. It's as if they're *trying* to make their games leave a bad first impression.

Wanna see a *good* first impression? Play "Castlevania: Symphony of the Night". But a lot of AAA devs take what leads to good first impressions and do exactly the opposite. (BTW: No personal offense intended, Manu. I don't know how long you've been there, but I actually love the first two Max Paynes. I honestly do.)

[1] Last Of Us <https://www.youtube.com/watch?v=tTsBn36yPrg> I.e. "Let's make a game about wandering from one empty, pointless room to the next, to the next, to the next, while seeing how much inane whining you can tolerate from the NPCs, and after every so many rooms of non-gameplay force people to watch parts of a generic zombie movie with the only distinction being a protagonist who's a grown male drama queen ('What, she's infected and lied to us?! Whine whine whine, bitch bitch bitch.') and then watch the whole industry praise it as an alleged breakthrough in cinematic gaming." What horseshit. And yes, I've played it, as well as Assassin's Creed 2, which isn't any better.

Last Of Us isn't even good as a zombie *movie*. It's more like "If JJ Abrams made a zombie movie". Want a good zombie movie? Try Zombieland or Shaun of the Dead.

<pet peeve>Zombies are supernaturally reanimated corpses, not viral infections. Supernatural: Cool. Virus: Boring.</pet peeve>

> > many annoyances that have been eliminated in modern games.
> 
> Oh, modern games have their own annoyances. For example, the NES would flash or glitch. The playstation three freezes up and disconnects its own CPU with its excessive heat.
> 

So *that's* why it crashes more than any other console I've owned! (Although I think another reason for that is that Crackle apparently can't handle the clock being pushed a year ahead to kill Sony's bullshit forced-telemarketing-to-paying-customers. Not that Crackle is really worth using...)


> I'll take the NES though, at least it didn't have such ridiculously long load and boot times!
> 

I *LOVE* that about the NES. Here's what an NES game is like:

- Purchase game
- Open shrinkwrap
- Insert cart
- Push power
- (3rd party only) Wait <= 5 sec for copyright/legal screen(s)
- Wait <= 1 sec for title screen
- Press Start
- (Sometimes) Select save slot, player name
- Now playing game!!

This is what a typical AAA PS3 game is like:

- Push power
- Wait for health and safety warning to appear
- Wait for warning to go away
- Wait for system menu to appear
- Purchase game (I'll count this as one step. I think that's fair.)
- Download game
- Install game (Downloading *isn't* installing? On a *console*?)
- Wait for system menu to load list of games...one...by...one...
- Start game
- Download game update because Sony's infrastructure is too fucking
  stupid to send you the latest version in the first place.
- Wait for PS Eye/Motion health & safety warning even though nobody
  owns a PS Eye/Motion.
- Wait for first company logo to finish animating.
- Wait for second company logo to finish animating.
- Wait for third company logo to finish animating.
- ...etc...
- Wait for it to connect to network even though you're already logged
  into PSN and don't intend to play multiplayer.
- Wait for it to explain "Auto Saving" and show you what a "Saving"
  logo looks like
- Wait for it to load the title screen
- Wait for the title screen to animate in
(Are you *still* interested in playing at this point?)
- Press start
- A loading screen just to load the main menu
- Select an option
- Wait for the screen transition animation that some artist decided was
  necessary
- Select another option
...etc...
- Wait for game to load
- Surprise, it's a cutscene and you can't skip it.
- Wait for game to load for real
- Another cutscene
- Hope to heck *this* loading screen is for the first level
- The first level is a tutorial. After several lines of completely
  unnecessary dialog, some Sergeant is telling me how to push buttons.
  I can't kill him no matter how much I want to because the last three
  times I tried the tutorial restarted from the beginning (*cough*Call
  of Duty 4*cough*).
- Spend at least several minutes playing "Simon Says" with Mr.
  Chatterbox "Can't Get To The Freaking Point" Windbag. (*cough*Much,
  much more than just Call of Duty*cough*)
- Loading cutscene.
- Go to the kitchen and make a sandwich while more windbags ramble on
  about whiny bullshit I don't care about. Occasionally peek my head in
  to see if they're almost done. Ignore the controller rumbling itself
  off the edge of the table.
- More loading
- Bring my snack back to the living room
- First level starts, but by now I'm more interested in my food
- Finish eating, take control of the now-waiting player character
- Walk around a stage that doesn't have much of any gameplay, but maybe
  has some barely-interactive scripted sequences (with more
  blathering), or more interspersed cutscenes.
- Wait for next level to load
- It's now about 30-60 minutes from the first company logo (not from
  power on) and I finally get to start playing the *real* game. Or
  not...my laundry's probably done...

The bizarre thing is, I swear to god I'm *not* exaggerating any of that. And that means somebody, somewhere, actually thought all that bullshit was acceptable for the price of console + game. Yea, obviously some of it is to be expected (downloading, various purchasing steps, multi-tiered menu system), but most of it's just badly designed, badly engineered bullshit.

I've played PUO-heavy DVDs that have far less bullcrap than that.

> 
> Gameplay wise.... eh, the new games I like tend to be similar to the old games.
> 

Heh, that's often (though not always) my experience, too.

> 
> They also loop so well, I can set a video game song playing for 30 minutes straight as real life background music and not get annoyed with it.

Well, that depends on the song ;)

I can understand why parents got annoyed at us playing those games too long, relegated them to the basement, etc... <g>

Journey to Silius had good music. And of course MegaMan and Sonic.

> Sometimes that works with mp3s too, but the video game ones are specifically made for infinite looping so there's no discontinuity as it goes around again.

Yea, that is true.

September 21, 2013
On Sat, 21 Sep 2013 11:04:10 +1000
Manu <turkeyman@gmail.com> wrote:

> On 20 September 2013 22:15, H. S. Teoh <hsteoh@quickfur.ath.cx> wrote:
> >
> > There is no argument here, actually. The problem is really historical -- names like 'du' or 'grep' or 'awk' meant something back in who knows when, but they no longer mean anything to us today (well, those of us not old enough *cough*). If I were to reinvent Unix today, I'd choose better names for these things. But think about it, if the above line were instead written like this:
> >
> >         diskUsage $HOME | sort --reverse --numeric | pager
> >
> > it would make so much more sense, wouldn't it? So the "nonsensical" part is really just in the poor choice of naming, not an inherent weakness of the interface.
> >
> 
> I'd still argue that it is. It is how it is, and it's completely prohibitive to casual or new users.

So? Does everything have to be targeted at new/casual users? Can't experienced users have stuff that's made for them? Who ever said command lines are still intended for everybody? Keep in mind, a programmer is NOT a casual or new user. But in any case, please don't mistake "Windows vs Linux" as a "one size fits all" topic, because you seem to be steering things that way.
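
As an aside for anyone following along: the hypothetical "readable" pipeline quoted above maps one-for-one onto the real (terse) commands, so it really is only the names that are cryptic. A rough sketch, assuming GNU coreutils (with `head` standing in for the interactive pager so it terminates on its own):

```shell
# diskUsage -> du, --reverse --numeric -> sort -r -n, pager -> less.
# head is used here so the example runs non-interactively; swap in
# `less` for actual paging.
du --summarize "$HOME"/* 2>/dev/null \
    | sort --reverse --numeric-sort \
    | head -n 10
```

Every stage already exists; the "friendly" version is just a renaming.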

Rant: Seems to be a big trend in computing these days. Everything is all
about catering to Average Joe Numbskull and Little Miss Facebook, and to
hell anyone who has more advanced experience and needs where "usable
by anyone's grandmother" is the least of their concerns.

Average Joes need their tools, sure, but so do the rest of us.

You do realize that in the time you've spent taking a friendly OS discussion and single-handedly trying[1] to turn it into yet another ill-informed OS flamewar (congratulations, btw) you could have already learned quite a bit about using a unix command line?

[1] Don't deny it. Your intent to bait was obvious a few posts back, but due to your good standing here I've been giving you a chance.


> [...]
> > > I had a video card driver problem the other day. The bundled
> > > auto-update app failed, and totally broke my computer.
> > > I had to download kernel source, and run some scripts to compile
> > > some
> > sort
> > > of shim that made the video driver compatible with my kernel to get it working again... absolutely astounding.
> >
> > Uh... you do realize that this is because Linux actually *lets* you fix things? If something like this happened on Windows, the only real solution is to nuke the system from orbit and start from ground zero again (i.e. reinstall). One can hardly expect that repairing a broken car engine should require no thought.
> >
> 
> Nothing like that has EVER happened to me in a few decades of windows. In my experience as a linux user, these sort of problems are a daily chore.
> 

I've had stuff like that happen on Windows. Not on my own system within the last few years, but over "a few decades"? Oh hell yea.

OTOH, I don't think I've had such trouble with Linux in at least as long. I think 2002 was probably the last time.


> > Speaking of which, I managed to totally break my computer last night /
> > this morning too.
> 
> 
> No shit. Should I be surprised? ;)
> 
[...]
> 
> > but the hardy little thing just kept going. It was
> > causing subtle breakages like my printer mysteriously failing to
> > work, and when I finally figured out the problem, I downloaded a
> > new kernel and recompiled it.
> 
> 
> ... speechless ;)
> 
> 
[...]
> 
> I rest my case.
> 

Ok, now I know you're just trying to troll. But I've never seen you troll before so you should know better.

He made it perfectly clear he had been messing around with his own internals. *Plus* you know perfectly well messing around with Windows internals can also lead to problems requiring expert-skill recovery techniques, so really, you *know* that you know better, so cut the shit.

Yes, Linux sucks. And guess what? So does Windows. I use both, by choice. End of story.

> 
> I think the main difference is quality-assurance. Windows software is more likely to be released only after it's reasonably proven that it works.
> 

Like Debian.

And if you bring up some broken Linux distro, I'll bring up WinME, and then we'll all have added a whole lot of usefulness to the discussion ;)


> I'm not a mechanic, and I shouldn't have to be to drive a car.
> 

Strawman, in too many ways to be worth listing.

September 21, 2013
On Sat, 21 Sep 2013 05:05:41 -0400
Nick Sabalausky <SeeWebsiteToContactMe@semitwist.com> wrote:
> 
> You do realize that in the time you've spent taking a friendly OS discussion and single-handedly trying[1] to turn it into yet another ill-informed OS flamewar (congratulations, btw) you could have already learned quite a bit about using a unix command line?
> 
> [1] Don't deny it. Your intent to bait was obvious a few posts back, but due to your good standing here I've been giving you a chance.
> 

That came out overly-harsh and not how I intended. ("Yea, no shit,
Nick") Uhh, yea...

What I mean is just, in this section of the thread, it has been sounding as if you're simply flame-baiting or arguing for the sake of arguing.

(And then I somehow managed to awkwardly weave that into a completely different and not-terribly-important point about "time it takes", bleh, whatever...)
September 21, 2013
Am 21.09.2013 12:12, schrieb Nick Sabalausky:
> On Sat, 21 Sep 2013 05:05:41 -0400
> Nick Sabalausky <SeeWebsiteToContactMe@semitwist.com> wrote:
>>
>> You do realize that in the time you've spent taking a friendly OS
>> discussion and single-handedly trying[1] to turn it into yet another
>> ill-informed OS flamewar (congratulations, btw) you could have already
>> learned quite a bit about using a unix command line?
>>
>> [1] Don't deny it. Your intent to bait was obvious a few posts back,
>> but due to your good standing here I've been giving you a chance.
>>
>
> That came out overly-harsh and not how I intended. ("Yea, no shit,
> Nick") Uhh, yea...
>
> What I mean is just, in this section of the thread, it has been
> sounding as if you're simply flame-baiting or arguing for the sake of
> arguing.
>
> (And then I somehow managed to awkwardly weave that into a completely
> different and not-terribly-important point about "time it takes", bleh,
> whatever...)
>

Just yesterday I watched a cool talk by Rich Hickey (the Clojure designer) about design and the time it takes to learn stuff.

http://www.infoq.com/presentations/Design-Composition-Performance

Basically, one of his messages is that nothing comes for free and learning requires effort.

He remarks that only in the software industry do people seem to have the "learn in xxx days" mentality and stuff for dummies.

--
Paulo
September 21, 2013
On 21/09/13 11:05, Nick Sabalausky wrote:
> So? Does everything have to be targeted at new/casual users? Can't
> experienced users have stuff that's made for them? Who ever said
> command lines are still intended for everybody? Keep in mind, a
> programmer is NOT a casual or new user. But in any case, please don't
> mistake "Windows vs Linux" as a "one size fits all" topic, because you
> seem to be steering things that way.

There's a difference between difficulty that is inherent, versus difficulty that is unnecessary and arises out of a lack of concern for usability.

Or, in the case being discussed here, more likely it arises out of historical priorities that apply much less today.  I would imagine that back in the early days of UNIX, processing key-presses was much more expensive than it is today, and there was thus a strong functional benefit in minimizing the amount of typing.  (That still applies to an extent today if you're typing commands over a slow ssh connection, for example.)

If we were designing command-line scripting from scratch, today, we'd do something very different and it would definitely be much more user-friendly, and no one would lose from that -- both experts and novices would benefit.

> Rant: Seems to be a big trend in computing these days. Everything is all
> about catering to Average Joe Numbskull and Little Miss Facebook, and to
> hell anyone who has more advanced experience and needs where "usable
> by anyone's grandmother" is the least of their concerns.
>
> Average Joes need their tools, sure, but so do the rest of us.

Speaking as a hopefully non-average Joe ... making things usable by anyone's grandmother doesn't necessarily have to come at the cost of making things less good for experts.  Well-done usability design makes life easier for experts as well as for novices.

The problem is that because experts are as good as they are, they are much more capable of dealing with unnecessary complexity.  And, having mastered unnecessary complexity, it's then that bit more difficult to invest the time to learn a new, simpler way of doing things, because it means investing time to re-learn how to do stuff _you can already do_, and that learning curve means you'll go through a period of being less capable (because you're learning) than you are with your existing toolkit.  And then of course there's all the legacy stuff that does things using the old tools and which you know you'll have to keep using, so it's another reason to stick with what you know ... and thus lockin happens.

Case in point: C++ vs. D.  Is anyone here going to claim that, _as a language_, D is not significantly more user-friendly than C++?  Yet it's no less powerful -- in fact, the enhanced user-friendliness frees up experts to do more things better.

> You do realize that in the time you've spent taking a friendly OS
> discussion and single-handedly trying[1] to turn it into yet another
> ill-informed OS flamewar (congratulations, btw) you could have already
> learned quite a bit about using a unix command line?
>
> [1] Don't deny it. Your intent to bait was obvious a few posts back, but
> due to your good standing here I've been giving you a chance.

For what it's worth, I think you may have missed the humour of Manu's posts. Over-the-top criticism, in the right tone of voice, can be a pretty good way to get people who are used to a particular way of doing things to re-evaluate their assumptions.  It's important to occasionally engage in mockery and caricature of the things that we value, because it helps to re-engage our critical faculties about what is good and bad.

Specifically in this case: the user-friendliness of GNU/Linux distros has come a _huge_ way in the last 10 years, but there's no reason why they shouldn't be every bit as surface-friendly as (maybe even more so than) the popular commercial OS's while retaining all the power that experts need and want.  It's a terrible shame that more attention is not given to this surface-friendliness, and it's striking how resistant many old-school free software people are to usability-oriented improvements _that don't necessarily constrain them_.

** Example 1 **
I was a longstanding KDE user until, with the 12.04 release of Ubuntu, I switched over to using Unity.  I found it much more usable and effective in all sorts of ways, but initially I was frustrated because there were superficially fewer config options available.  It was striking how quickly I realized _I didn't miss them_ and that most of that configurability I'd had with KDE was a distraction rather than something that assisted me.  As someone wrote round-about that time, there's a tendency for customisability to be an excuse for lack of design.

** Example 2 **
The first GNU/Linux distro I ever installed on my own machine was Ubuntu 5.10, which I decided I should try out after seeing video of Mark Shuttleworth's talk at DebConf 2005.  Coming from Windows there was a fairly steep learning curve, but what buggered it for me was that my wireless card driver wasn't supported and getting it working involved a complicated procedure editing various config files to get the system to use a proprietary Windows driver. I just couldn't get it to work.

Then, I tried OpenSUSE 10.0, which had YaST -- a GUI config tool which could handle all the complicated under-the-bonnet stuff I needed: it just needed to be pointed to the proprietary driver and it would sort out the rest itself.

So, SUSE was what kept me using Linux, and after a period of finding all the ways to shoot myself in the foot (e.g. installing RPMs from 3rd-party repos and seeing how they'd break all my system dependencies...), and getting used to Linux-y ways of doing stuff rather than Windows-y, I got to the point where I switched back over to Ubuntu, was comfortable doing the command-line config for the wireless driver, and have never really looked back.

The point is, without a distro that catered to novices who had no clue how to use the command line, I'd have been sunk.  And as it is, I've been able to get to the point of becoming much more capable, thanks to there being a tool available that let me initially bypass the depth of complexity of the system.

** Example 3 **
... when colleagues first tried to get me to install Linux on my system, way back in 2001.  Much laughter in the office when I suggested that I thought Windows was a more effective, better-made OS.  By coincidence, the same day, we had a visitor coming to give a presentation which she had on a 1.44 MB floppy disk (I know, I know, this sounds like an archaeological dig...).  Cue much amusement on my part as all the guys in the office tried to remember the command-line instructions to mount a floppy on Red Hat.  Today, of course, it's a given that if you insert a disk, your Linux distro should auto-detect and work out how to mount it, but back then, this kind of usability issue wasn't really considered important -- even though auto-detection is just as beneficial to the expert as the novice.

>>>> I had a video card driver problem the other day. The bundled
>>>> auto-update app failed, and totally broke my computer.
>>>> I had to download kernel source, and run some scripts to compile
>>>> some
>>> sort
>>>> of shim that made the video driver compatible with my kernel to
>>>> get it working again... absolutely astounding.
>>>
>>> Uh... you do realize that this is because Linux actually *lets* you
>>> fix things? If something like this happened on Windows, the only
>>> real solution is to nuke the system from orbit and start from
>>> ground zero again (i.e. reinstall). One can hardly expect that
>>> repairing a broken car engine should require no thought.
>>>
>>
>> Nothing like that has EVER happened to me in a few decades of windows.
>> In my experience as a linux user, these sort of problems are a daily
>> chore.
>
> I've had stuff like that happen on Windows. Not on my own system within
> the last few years, but over "a few decades"? Oh hell yea.
>
> OTOH, I don't think I've had such trouble with Linux in at least as
> long. I think 2002 was probably the last time.

It's worth remembering that the ability to go under the bonnet and fix things, while in principle it's available to everyone, is not an advantage that's perceptible to many users.  To a great many users, _any_ breakage that can't be fixed through the regular OS GUI is a "take it to the experts or else just reinstall it" show-stopper.

So, if the _typical_ problems of your OS require under-the-bonnet maintenance, then this is a usability problem that it's worth trying to address.

>>> Speaking of which, I managed to totally break my computer last night /
>>> this morning too.
>>
>>
>> No shit. Should I be surprised? ;)
>>
> [...]
>>
>>> but the hardy little thing just kept going. It was
>>> causing subtle breakages like my printer mysteriously failing to
>>> work, and when I finally figured out the problem, I downloaded a
>>> new kernel and recompiled it.
>>
>> ... speechless ;)
>>
> [...]
>>
>> I rest my case.
>
> Ok, now I know you're just trying to troll. But I've never seen you
> troll before so you should know better.
>
> He made it perfectly clear he had been messing around with his own
> internals. *Plus* you know perfectly well messing around with Windows
> internals can also lead to problems requiring expert-skill recovery
> techniques, so really, you *know* that you know better, so cut the
> shit.
>
> Yes, Linux sucks. And guess what? So does Windows. I use both, by
> choice. End of story.

I think it's generally true that (these days) most of the under-the-bonnet maintenance I have to do on GNU/Linux is down to the fact that I ask more of the system than I do of Windows, and I do more risky stuff that requires me to take on more responsibility.  (In fact, I hardly use Windows at all these days, and I ask my Ubuntu setup to do all sorts of things I'd never have dreamed of doing back when Windows was my main OS.)

On the other hand, I think it's daft not to recognize the fact that Windows is in many ways better at helping the user avoid having to go beneath the surface to fix problems, and that surface-level friendliness makes a big difference in how easy it is to use in practice.  It solves _more users' problems more of the time_.

This ought to be somewhere GNU/Linux can clean up, because it ought to be possible to have that surface friendliness while also being _easier_ to go under the hood.  (Though as I discovered as a novice Linux user, that ease of going under the hood can also be a great way to screw things up for yourself.  It's a bit like how giving someone a handful of basic martial arts moves can be a great way to get them beaten up...)

>> I think the main difference is quality-assurance. Windows software is
>> more likely to be released only after it's reasonably proven that it
>> works.
>
> Like Debian.

Debian's QA is different to that of Windows.  Debian test to ensure that the software is bug-free -- they don't as a rule consider usability challenges to be bugs, and they will sometimes favour inferior technical solutions for non-technical reasons (e.g. driver licensing).

I actually think that they're right to have that strong free software focus, and that in the long term it also results in better software, but on a short-term basis it does result in more user-facing problems.

Whereas, whatever extensive criticisms one can make of Microsoft Windows, one thing that has to be acknowledged is that they have a very strong focus on minimizing user-facing problems or making it trivial to deal with them.

Bottom line: we all know that GNU/Linux is fundamentally a better OS than Windows.  We all know that many of the claims about user-friendliness are FUD, and that many of the real problems arise out of Windows lock-in in the computing space (driver support being the most obvious).  But there are certain usability issues that are particularly damaging for non-technical users, which do arise much more regularly on GNU/Linux than on Windows systems, and we shouldn't deny this or claim that it's tolerable.  It's a fair criticism that there are not enough actors in the GNU/Linux world focusing on design for usability, and this ought to change.

Best wishes,

    -- Joe
September 21, 2013
On 21/09/13 12:12, Nick Sabalausky wrote:
> That came out overly-harsh and not how I intended. ("Yea, no shit,
> Nick") Uhh, yea...

Hah, and I just managed to write a huge long "Profundus Maximus" post in response to try and negotiate the peace ... :-P

http://www.flamewarriorsguide.com/warriorshtm/profundusmaximus.htm

> What I mean is just, in this section of the thread, it has been
> sounding as if you're simply flame-baiting or arguing for the sake of
> arguing.
>
> (And then I somehow managed to awkwardly weave that into a completely
> different and not-terribly-important point about "time it takes", bleh,
> whatever...)

Honestly, I think a bit of good-humoured trolling among friends is sometimes a good thing.  Keeps everyone on their toes and thinking ... :-)

September 21, 2013
On 21/09/13 12:44, Paulo Pinto wrote:
> Basically, one of his messages is that nothing comes for free and learning
> requires effort.
>
> He makes the remark that only in the software industry people seem to have the
> "learn in xxx days" mentality and suff for dummies.

One reason I like D is because it gives you access to all the difficult concepts, but doesn't wrap them up in difficult or finicky syntax.

Sometimes highly expert developers are not good at appreciating the difference.
September 21, 2013
On 21 September 2013 19:05, Nick Sabalausky <SeeWebsiteToContactMe@semitwist.com> wrote:

> On Sat, 21 Sep 2013 11:04:10 +1000
> Manu <turkeyman@gmail.com> wrote:
>
> > On 20 September 2013 22:15, H. S. Teoh <hsteoh@quickfur.ath.cx> wrote:
> > >
> > > There is no argument here, actually. The problem is really historical -- names like 'du' or 'grep' or 'awk' meant something back in who knows when, but they no longer mean anything to us today (well, those of us not old enough *cough*). If I were to reinvent Unix today, I'd choose better names for these things. But think about it, if the above line were instead written like this:
> > >
> > >         diskUsage $HOME | sort --reverse --numeric | pager
> > >
> > > it would make so much more sense, wouldn't it? So the "nonsensical" part is really just in the poor choice of naming, not an inherent weakness of the interface.
> > >
> >
> > I'd still argue that it is. It is how it is, and it's completely prohibitive to casual or new users.
>
> So? Does everything have to be targeted at new/casual users? Can't experienced users have stuff that's made for them? Who ever said command lines are still intended for everybody? Keep in mind, a programmer is NOT a casual or new user. But in any case, please don't mistake "Windows vs Linux" as a "one size fits all" topic, because you seem to be steering things that way.
>
> Rant: Seems to be a big trend in computing these days. Everything is all
> about catering to Average Joe Numbskull and Little Miss Facebook, and to
> hell anyone who has more advanced experience and needs where "usable
> by anyone's grandmother" is the least of their concerns.
>
> Average Joes need their tools, sure, but so do the rest of us.
>
> You do realize that in the time you've spent taking a friendly OS discussion and single-handedly trying[1] to turn it into yet another ill-informed OS flamewar (congratulations, btw) you could have already learned quite a bit about using a unix command line?
>
> [1] Don't deny it. Your intent to bait was obvious a few posts back, but due to your good standing here I've been giving you a chance.
>

;)
Sure, I do like to stir the pot to some extent, I won't deny that, but my
primary position in this thread is actually just a recount of my own
experience trying to use linux _productively_.
I just reviewed my last few posts, I don't think I was being particularly
unreasonable. I'm trying to use linux, right now, and it's frustrating. If
it just worked, I wouldn't be frustrated.

I've said about 3 times already, and I'll say again, I actually *really*
like the idea of linux, I really do wish to see it succeed. I've made deals
with myself on numerous occasions to try and force myself to adapt.
The problem is, every time I make a good solid effort, I just become
frustrated with inevitable problems, and end up wasting so much time. And
the end point, even if I were somewhat more expert at fighting the OS, is
that I'm left with a suite of tools which are simply inferior and less
productive, by a huge margin, than what I'm used to. I wish this weren't
the case. (I've already acknowledged loads of cool projects that look
promising, but isn't that ALWAYS the way with linux? And I'm talking about
things that were rock solid over 10 years ago)
Philosophically, I hate windows, apple, and everything they stand for. I'd
love to be all-out linux. I'm honestly just expressing my frustration with
the offering, and that people seem to genuinely think it's okay.
I'm not flame baiting (well, maybe a little bit, just because it always
gets a good reaction), I don't think my criticisms are unrealistic. And
they're not even really baits, I'm just sharing personal experience.

I'm also not 'average-joe-numskull', at least I don't like to think I am,
but that doesn't mean I want to know how a car is built, and then in turn
how each individual part was built, and how to fix it, before I can have
confidence it will get me to Sydney in one piece.
I don't actually really care about how linux works, or any of the little
bits and pieces that form its awkward foundation... and I shouldn't need
to in order to like the premise of an open system, and want to use it on
that merit alone.
I don't actually enjoy OPERATING a computer, I enjoy the creative process
of working, and getting work done. Solving interesting problems. If the
computer gets in my way, it has failed me at some level.
That might sound strange coming from a software engineer, but I guess
that's how I see it. I just don't have the patience to mess with my OS
anymore.
I enjoyed _operating_ my Amiga 15-20 years ago. You should have seen how
pimp my desktop was!
I don't like operating computers anymore, I like using them.

I likewise don't want to know how the video driver interacts with the
kernel, and that I'm supposed to download source code to manually build
some driver shim to correct a fault of the system auto-update tool failing
to resolve a dependency correctly.
Or my re-mapped hotkeys mysteriously reverting to their default settings
every few minutes after I map them for no apparent reason (I don't even
know where to start looking for that).
Or half the software on my PC looking ugly and broken while half of it
looks fine (I tracked this down to the theme control panel not properly
applying style settings to both gnome and Qt, it managed to leave Qt
styling in an invalid state, somehow).

I can't count the number of times that linux has crapped out by applying a
large update all at once in the last 5 years.
I think the daily linux users apply a small number of updates regularly,
and it seems to work better that way. But for someone who primarily uses
windows, when I switch to linux it wants to install a crap load of updates
all at once, and it often goes wrong.
I suspect updating versions 1 -> 2 -> 3 -> 4 is the common path, but
updating 1 -> 3 -> 4 -> 9 is not so well tested, and often fails.

You can try and dismiss this all, and go "yeah well, obviously it's hard for people to find the time to work on this and that and make it as solid as what a commercial company can possibly achieve with loads of money, and everyone is disconnected little groups just trying to cooperate and communication is hard and", blah blah, but in *practice*, it doesn't make one bit of difference to me... it always breaks on me for one reason or another, and wastes my time. And there are few things more frustrating to me than having my time wasted by problems that were otherwise solved decades ago...

In a lot of ways, in this thread I'm effectively venting frustration, sorry
if you're offended by it.
I'm still using linux, right now even, trying to overcome some issues with
my opengl rendering driver; versioning problems perhaps, wrangling
extensions, trying to understand how the drivers relate to the problems,
and where the source of various pieces of the puzzle all stem, if it's
reasonable to expect various extensions to be present in conjunction or if
I absolutely need to consider every single one of them in isolation.
Why is my opengl driver reporting version 3.3? My PC is really modern...
and my OpenGL code works on windows.
I have fuck-all tools available, it's near impossible to debug. Something
that I know would take me 5 minutes in windows with that toolset has taken
me a whole day so far...
I honestly don't understand how linux users think it's okay. That's not
inflammatory, it's legitimate amazement.
The only way I can reason that people can be happy being so unproductive,
is that they don't actually know what it's like to be really productive in
the first place. (see: my comment prior about the mouse scroll-wheel)
That might sound insulting, accusing people of being unproductive when I'm
sure they feel like they are, and I'm almost certainly wrong, sure, they're
probably just adapted to a totally different flow, but I just can't see it,
and I have no other working theories.

If I had visual studio and PIX, I would take a screen capture, click on
the bad pixel, and it would immediately present a stack of every rendering
event contributing to that pixel, and the entire state of the rendering
hardware at every step of the way, and I'd find the problem in a couple of
minutes.
Here, I find myself writing heaps of code just to debug my code! Seriously?

</endrant>

(disclaimer: you caught me after dinner, and a few wines down, which may have resulted in a few extra paragraphs ;)

> [...]
> > > > I had a video card driver problem the other day. The bundled
> > > > auto-update app failed, and totally broke my computer.
> > > > I had to download kernel source, and run some scripts to compile
> > > > some
> > > sort
> > > > of shim that made the video driver compatible with my kernel to get it working again... absolutely astounding.
> > >
> > > Uh... you do realize that this is because Linux actually *lets* you fix things? If something like this happened on Windows, the only real solution is to nuke the system from orbit and start from ground zero again (i.e. reinstall). One can hardly expect that repairing a broken car engine should require no thought.
> > >
> >
> > Nothing like that has EVER happened to me in a few decades of windows. In my experience as a linux user, these sorts of problems are a daily chore.
> >
>
> I've had stuff like that happen on Windows. Not on my own system within the last few years, but over "a few decades"? Oh hell yea.
>
> OTOH, I don't think I've had such trouble with Linux in at least as long. I think 2002 was probably the last time.
>

I have a strong suspicion that linux works better if you use it daily.
I'm starting to realise a pattern emerging that things tend to fuck up
after I perform a bulk round of updates.
This tends to happen more often as an infrequent linux user. Every time I
boot it up, it wants to update loads of stuff.
I suspect the installation tools handle incremental updates better than
when you're skipping over a bunch of intermediate revisions... at least,
that's my current working theory.
That said, I don't think that excuses the situation. It's broken, and it
wastes my time. That's the bottom line. I don't have a PhD in linux, and
when it breaks, it's very time consuming for me to fix it.

> > > Speaking of which, I managed to totally break my computer last night /
> > > this morning too.
> >
> >
> > No shit. Should I be surprised? ;)
> >
> [...]
> >
> > > but the hardy little thing just kept going. It was
> > > causing subtle breakages like my printer mysteriously failing to
> > > work, and when I finally figured out the problem, I downloaded a
> > > new kernel and recompiled it.
> >
> >
> > ... speechless ;)
> >
> >
> [...]
> >
> > I rest my case.
> >
>
> Ok, now I know you're just trying to troll. But I've never seen you troll before so you should know better.
>

Those 2 comments are about 30% trolling, and 70% amazement that people
think this story is actually okay.
There's also an element of sympathy, the idea that he managed to break his
computer is not foreign to me; it's expected ;) (even if he did go slightly
out of his way to do so)

I can very happily say, I have NEVER compiled a windows kernel.

Objectively though, I'd like to think that this whole scenario described
simply shouldn't have been possible under any normal usage scenario in the
first place.
There's obviously a whole bunch of fail-safes that are totally missing for
your problem to arise in the first place.
This is possibly the core of most of my points in this thread too.

I'm going to say again, that I do understand WHY linux is how it is, I also
consider it a valiant effort/experiment.
I like it in theory, I like it in essence, but it just never manages to
deliver in practice. At some point, I have to admit that it's just
unacceptable so many years down the line.
I don't like the direction MS are taking windows at all. More than ever, I
really want to become a full-time linux user and embrace the choice that it
offers in many aspects... but first, I just want it to work!
And I shouldn't have to work to make it work.

> He made it perfectly clear he had been messing around with his own
> internals. *Plus* you know perfectly well messing around with Windows internals can also lead to problems requiring expert-skill recovery techniques, so really, you *know* that you know better, so cut the shit.
>
> Yes, Linux sucks. And guess what? So does Windows. I use both, by choice. End of story.
>

This point. This is the part I object to.
Again, I'm just recounting my experience/frustrations.
If I were flaming/trolling/baiting, I would be trying to say that linux is
shit, windows is awesome, you suck for liking linux, blah blah. I'm not.
Actually, I'm not happy with windows at all. But it has just one critical
advantage, it WORKS (at least, out of the box).

>
> > I think the main difference is quality-assurance. Windows software is more likely to be released only after it's reasonably proven that it works.
> >
>
> Like Debian.
>

I've tried to use debian, precisely for this alleged stability.
My experience was a whole bunch of software that was simply out of date.
Modern software/tools don't actually work against the outdated packages.
And the result is even less features than the cutting edge releases.
I'm sure it's great if I just want to run a webserver and vi. Sadly I'm a
user-facing software developer, and that requires being on the cutting edge
of computer technology.
Basically, my experience was that if it's not in the debian package
repository, even if you do manage to get it to work, it's likely LESS
stable than on a cutting-edge distro since it's rarely tested against such
outdated libraries.
Maybe this isn't the case anymore, but it was last time I gave it a shot.

> And if you bring up some broken Linux distro, I'll bring up WinME, and
> then we'll all have added a whole lot of usefulness to the discussion ;)
>

I don't think that's a fair comparison at all, that's like saying there was one broken version of ubuntu 10 years ago, but it's all better now...

> > I'm not a mechanic, and I shouldn't have to be to drive a car.
> >
>
> Strawman, in too many ways to be worth listing.
>

I can't agree. I just want to do my work uninterrupted.
If my car breaks down half way to Sydney, I'd better hope to god I have
phone reception, because I'll have to start calling mechanics and service
people to get me moving again, and let's just hope I wasn't pressed to make
some sort of deadline...
That seems like a pretty realistic analogy to me.


September 21, 2013
On 21 September 2013 21:27, Joseph Rushton Wakeling < joseph.wakeling@webdrake.net> wrote:

>
> Specifically in this case: the user-friendliness of GNU/Linux distros has come a _huge_ way in the last 10 years, but there's no reason why they shouldn't be every bit as surface-friendly (maybe even more so) as the popular commercial OS's while retaining all the power that experts need and want.  It's a terrible shame that more attention is not given to this surface-friendliness, and it's striking how resistant many old-school free software people are to usability-oriented improvements _that don't necessarily constrain them_.
>
> ** Example 1 **
> I was a longstanding KDE user until with the 12.04 release of Ubuntu, I
> switched over to using Unity.  I found it much more usable and effective in
> all sorts of ways, but initially I was frustrated because there were
> superficially less config options available.  It was striking how quickly I
> realized _I didn't miss them_ and that most of that configurability I'd had
> with KDE was a distraction rather than something that assisted me.  As
> someone wrote round-about that time, there's a tendency for customisability
> to be an excuse for lack of design.
>

I really like this point. It's something I think I'll definitely keep in
mind in the future. I'm certainly guilty of this myself; "surely people
would prefer the option" when I'm writing some code.
But in reality, in almost every piece of software I use myself, even as a
'power user', I tend to use it in its default configuration.

What this really highlights is that I'm a terrible UX coder myself :P


September 21, 2013
On Saturday, 21 September 2013 at 15:07:17 UTC, Manu wrote:
> I have fuck-all tools available, it's near impossible to debug. Something
> that I know would take me 5 minutes in windows with that toolset has taken
> me a whole day so far...
> I honestly don't understand how linux users think it's okay.

Well, you aren't alone. Valve recently announced that they're working on a Linux debugger. They wouldn't bother wasting time and money doing that if the debugging experience on Linux was any good. As far as I'm aware, they're not working on a debugger for Windows, and I can only assume that's because debugging on Windows is bearable.