March 10, 2012
"Alex Rønne Petersen" <xtzgzorex@gmail.com> wrote in message news:jjgb5l$94f$1@digitalmars.com...
> On 10-03-2012 20:23, Nick Sabalausky wrote:
>> "Alex Rønne Petersen"<xtzgzorex@gmail.com>  wrote in message news:jjg7dq$24q$1@digitalmars.com...
>>> On 10-03-2012 18:58, H. S. Teoh wrote:
>>>>
>>>> Then you must be running a very different Linux from the one I use. In
>>>> my experience, it's Windows that's an order of magnitude less
>>>> responsive
>>>> due to constant HD thrashing (esp. on bootup, and then periodically
>>>> thereafter) and too much eye-candy.
>>>
>>> This. On the other hand, OS X has all the eye candy and is still
>>> extremely
>>> responsive. ;)
>>>
>>
>> That's because they cram [their] hardware upgrades down your throat every couple years.
>>
>>
>
> No one forces you to upgrade.
>

That's true. They just say "You *could* stick with your ancient two-year-old machine...You'll be shit out of luck when you need to install anything...but yea, we'll *cough* 'let' *cough* you do it...hee hee hee."


March 10, 2012
"Jonathan M Davis" <jmdavisProg@gmx.com> wrote in message news:mailman.428.1331409260.4860.digitalmars-d@puremagic.com...
> On Saturday, March 10, 2012 11:49:22 H. S. Teoh wrote:
>> Yikes. That would *not* sit well with me. Before my last upgrade, my PC was at least 10 years old. (And the upgrade before that was at least 5 years prior.) Last year I finally replaced my 10 y.o. PC with a brand new AMD hexacore system. The plan being to not upgrade for at least the next 10 years, preferably more. :-)
>
> LOL. I'm the complete opposite. I seem to end up upgrading my computer every 2 or 3 years. I wouldn't be able to stand being on an older computer that long. I'm constantly annoyed by how slow my computer is no matter how new it is. Of course, I do tend to stress my machine quite a lot by having a ton of stuff open all the time and doing CPU-intensive stuff like transcoding video, and how you use your computer is a definite factor in how much value there is in upgrading.
>

With the exception of notably-expensive things like video processing, ever since CPUs hit the GHz mark (and arguably for some time before that), there has been *no* reason to blame slowness on anything other than shitty software.

My Apple IIc literally had more responsive text entry than at least half of the textarea boxes on the modern web. Slowness is *not* a hardware issue anymore, and hasn't been for a long time.

You know what *really* happens when you upgrade to a computer that's, say, twice as fast with twice as much memory? About 90% of the so-called "programmers" out there decide "Hey, now I can get away with my software being twice as slow and eat up twice as much memory! And it's all on *my user's* dime!" You're literally paying for programmer laziness.

I just stick with software that isn't bloated. I get just as much speed, but without all that cost.

(Again, there are obviously exceptions, like video processing, DNA processing, etc.)


March 10, 2012
On Sat, Mar 10, 2012 at 11:52:47AM -0800, Jonathan M Davis wrote:
> On Saturday, March 10, 2012 11:49:22 H. S. Teoh wrote:
> > Yikes. That would *not* sit well with me. Before my last upgrade, my PC was at least 10 years old. (And the upgrade before that was at least 5 years prior.) Last year I finally replaced my 10 y.o. PC with a brand new AMD hexacore system. The plan being to not upgrade for at least the next 10 years, preferably more. :-)
> 
> LOL. I'm the complete opposite. I seem to end up upgrading my computer every 2 or 3 years. I wouldn't be able to stand being on an older computer that long.  I'm constantly annoyed by how slow my computer is no matter how new it is. Of course, I do tend to stress my machine quite a lot by having a ton of stuff open all the time and doing CPU-intensive stuff like transcoding video, and how you use your computer is a definite factor in how much value there is in upgrading.
[...]

True. But I've found Linux far superior in terms of being usable on very old hardware. I can't imagine the pain of trying to run Windows 7 on, say, a 5 y.o. PC (if it will even let you install it on something that old!). I used to run CPU-intensive stuff too, by using 'at' to schedule it to run overnight. :-)

Although, I have to admit the reason for my last upgrade was because I was doing lots of povray rendering, and it was getting a bit too slow for my tastes. It's no fun at all if you had to wait 2 hours just to find out you screwed up some parameters in your test render. Imagine if you had to wait 2 hours to know the result of every 1 line code change.


T

-- 
Lottery: tax on the stupid. -- Slashdotter
March 10, 2012
On Saturday, March 10, 2012 16:08:28 Nick Sabalausky wrote:
> With the exception of notably-expensive things like video processing, ever since CPUs hit the GHz mark (and arguably for some time before that), there has been *no* reason to blame slowness on anything other than shitty software.
> 
> My Apple IIc literally had more responsive text entry than at least half of the textarea boxes on the modern web. Slowness is *not* a hardware issue anymore, and hasn't been for a long time.
> 
> You know what *really* happens when you upgrade to a computer that's, say, twice as fast with twice as much memory? About 90% of the so-called "programmers" out there decide "Hey, now I can get away with my software being twice as slow and eat up twice as much memory! And it's all on *my user's* dime!" You're literally paying for programmer laziness.
> 
> I just stick with software that isn't bloated. I get just as much speed, but without all that cost.

Yeah. CPU is not the issue. I/O and/or memory tends to be the bottleneck for most stuff - at least for me. Getting a faster CPU wouldn't make my computer any more responsive.

> (Again, there are obviously exceptions, like video processing, DNA processing, etc.)

I do plenty of that sort of thing though, so CPU really does matter quite a bit to me, even if it doesn't affect my normal computing much. When transcoding video, CPU speed makes a _huge_ difference.

- Jonathan M Davis
March 10, 2012
On Sat, Mar 10, 2012 at 04:08:28PM -0500, Nick Sabalausky wrote: [...]
> My Apple IIc literally had more responsive text entry than at least half of the textarea boxes on the modern web. Slowness is *not* a hardware issue anymore, and hasn't been for a long time.

Ugh. You remind me of the early releases of Mozilla, where loading the *UI* would slow my machine down to a crawl (if not to a literal stop). To say nothing of actually browsing. I stuck with Netscape 4 for as long as I could get away with, and then switched to Opera because it could do everything Mozilla did at 10 times the speed.

Sad to say, recent versions of Opera (and Firefox) have become massive memory and disk hogs. I still mainly use Opera because I like the interface better, but sometimes I have the misfortune of needing Firefox for some newfangled Javascript nonsense that the GUI team at my day job were arm-twisted to implement by the PTBs. Running *both* Firefox and Opera simultaneously, with some heavy-duty Javascript going on in Firefox, routinely soaks up all RAM, hogs the disk at 99% usage, and renders the PC essentially unusable. Exiting one (or preferably both) of them immediately solves the problem.

And people keep talking about web apps and the browser as a "platform". Sigh.


> You know what *really* happens when you upgrade to a computer that's, say, twice as fast with twice as much memory? About 90% of the so-called "programmers" out there decide "Hey, now I can get away with my software being twice as slow and eat up twice as much memory! And it's all on *my user's* dime!" You're literally paying for programmer laziness.

Or worse, "Hey look! We can add goofy animations to every aspect of our UI to hog all CPU and memory, because users love eye-candy and will be induced to upgrade! We get a kickback from our hardware manufacturers and we sell more software without actually adding any new features! It's a win-win situation!"


> I just stick with software that isn't bloated. I get just as much speed, but without all that cost.

I'm constantly amazed by the amount of CPU and memory needed to run a *word processor*. I mean, really?! All of that just for pushing some characters around? And I thought word-processing had been solved since the days of CP/M. Silly me.


> (Again, there are obviously exceptions, like video processing, DNA processing, etc.)
[...]

And povray rendering. :-) Or computing the convex hull of high-dimensional polytopes. Or solving the travelling salesman problem. Or inverting very large matrices.  Y'know, actual, *hard* problems. As opposed to fiddling with some pixels and pushing some characters around.


T

-- 
Tech-savvy: euphemism for nerdy.
March 10, 2012
"H. S. Teoh" <hsteoh@quickfur.ath.cx> wrote in message news:mailman.435.1331411268.4860.digitalmars-d@puremagic.com...
> On Sat, Mar 10, 2012 at 02:59:28PM -0500, Nick Sabalausky wrote:
>> "H. S. Teoh" <hsteoh@quickfur.ath.cx> wrote in message news:mailman.431.1331409456.4860.digitalmars-d@puremagic.com...
>> > On Sat, Mar 10, 2012 at 11:39:54AM -0800, Walter Bright wrote:
>> >> On 3/10/2012 11:02 AM, H. S. Teoh wrote:
>> >> >Speaking of which, how's our progress on that front? What are the major roadblocks still facing us?
>> >>
>> >> http://d.puremagic.com/issues/buglist.cgi?query_format=advanced&bug_severity=regression&bug_status=NEW&bug_status=ASSIGNED&bug_status=REOPENED
>> >
>> > Looks quite promising to me. Can we expect dmd 2.060 Real Soon Now(tm)?
>> > :-)
>> >
>>
>> No. Unfortunately, 2.059 will have to come first. ;)
> [...]
>
> Argh! I didn't realize dmd bumped its version in git immediately after a release, rather than before. At my day job, we do it the other way round (make a bunch of changes, test it, then bump the version once we decide it's ready to ship).
>

I honestly don't like it either way. For my stuff, I bump it just before *and* just after a release. If you see something of mine with a version like "vX.Y", then it's a release version. If it's "vX.Y.1", then it's a development snapshot that could be anywhere between the next and previous "vX.Y". For instance, v0.5.1 would be a dev snapshot that could be anywhere between the v0.5 and v0.6 releases.

Once I reach v1.0, then whenever I need to do a "vX.Y.Z" release, the 'Z' part will always be an even number for releases and odd for dev snapshots (unless I decide to just add an extra fourth number instead). (Prior to v1.0, I don't think there's much point in bothering with a full "vX.Y.Z": just bump the Y, since, by definition, you can always expect breaking changes prior to v1.0.)
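As a quick illustration (a hypothetical helper in Python - the function name and string format are mine, not part of any actual tool), the scheme above can be checked mechanically like this:

```python
def classify(version: str) -> str:
    """Classify a version string under the scheme described above.

    Pre-1.0:  "vX.Y" is a release; a third component (e.g. "v0.5.1")
              marks a dev snapshot between the v0.5 and v0.6 releases.
    1.0+:     in "vX.Y.Z", an even Z is a release, an odd Z a snapshot.
    """
    parts = [int(p) for p in version.lstrip("v").split(".")]
    if parts[0] < 1:
        # Pre-1.0: releases have exactly two components.
        return "release" if len(parts) == 2 else "dev snapshot"
    # 1.0 and later: even patch number means release, odd means snapshot.
    patch = parts[2] if len(parts) > 2 else 0
    return "release" if patch % 2 == 0 else "dev snapshot"
```

So classify("v0.5.1") reports a dev snapshot, while classify("v1.2.4") reports a release - you can tell at a glance which kind of build you're looking at.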

I think it's terrible for dev and release versions to share the same version number.


March 10, 2012
"Jonathan M Davis" <jmdavisProg@gmx.com> wrote in message news:mailman.438.1331414665.4860.digitalmars-d@puremagic.com...
> On Saturday, March 10, 2012 16:08:28 Nick Sabalausky wrote:
>> With the exception of notably-expensive things like video processing, ever since CPUs hit the GHz mark (and arguably for some time before that), there has been *no* reason to blame slowness on anything other than shitty software.
>>
>> My Apple IIc literally had more responsive text entry than at least half of the textarea boxes on the modern web. Slowness is *not* a hardware issue anymore, and hasn't been for a long time.
>>
>> You know what *really* happens when you upgrade to a computer that's, say, twice as fast with twice as much memory? About 90% of the so-called "programmers" out there decide "Hey, now I can get away with my software being twice as slow and eat up twice as much memory! And it's all on *my user's* dime!" You're literally paying for programmer laziness.
>>
>> I just stick with software that isn't bloated. I get just as much speed, but without all that cost.
>
> Yeah. CPU is not the issue. I/O and/or memory tends to be the bottleneck for most stuff - at least for me. Getting a faster CPU wouldn't make my computer any more responsive.
>

Well, all those buses, I/O devices, etc., are still a lot faster than they were back in, say, the 486 or Pentium 1 days, and things were plenty responsive then, too. But I do agree, like you say, it *does* depend on what you're doing. If you're doing a lot of video work, then I completely understand.


March 10, 2012
Walter:

> I'm talking about the name change. It's far and away the most common thing I have to edit when moving code from D1 <=> D2.

We need good/better ways to manage change and make it faster and less painful, instead of refusing almost all change right now.
Things like more fine-grained deprecation abilities, smarter error messages in libraries that suggest how to fix the code, tools that update the code (like Python's 2to3 or the Go language's gofix tool), things like the strange "__future__" built-in Python module, and so on.

Bye,
bearophile
March 10, 2012
H. S. Teoh:

> (Then again, I don't use graphics-heavy UIs... on Linux you can turn most of it off, and I do, but on Windows you have no choice.

In Windows there is a very easy way to disable all the eye candy and most UI sugar (set the visual effects to "Adjust for best performance" in the Performance Options dialog), giving a snappy graphics interface even on low-powered laptops - one that looks like Windows95 :-)

Bye,
bearophile
March 10, 2012
"H. S. Teoh" <hsteoh@quickfur.ath.cx> wrote in message news:mailman.439.1331415624.4860.digitalmars-d@puremagic.com...
> On Sat, Mar 10, 2012 at 04:08:28PM -0500, Nick Sabalausky wrote: [...]
>> My Apple IIc literally had more responsive text entry than at least half of the textarea boxes on the modern web. Slowness is *not* a hardware issue anymore, and hasn't been for a long time.
>
> Ugh. You remind me of the early releases of Mozilla, where loading the *UI* would slow my machine down to a crawl (if not to a literal stop). To say nothing of actually browsing. I stuck with Netscape 4 for as long as I could get away with, and then switched to Opera because it could do everything Mozilla did at 10 times the speed.
>
> Sad to say, recent versions of Opera (and Firefox) have become massive memory and disk hogs. I still mainly use Opera because I like the interface better,

I couldn't believe that Opera actually *removed* the native "skin" (even what a joke it was in the first place) in the latest versions. That's why my Opera installation is staying put at v10.62.

Which reminds me, I still need to figure out what domain it contacts to check whether or not to incessantly nag me about *cough* "upgrading" *cough*, so I can ban the damn thing via my hosts file.
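For what it's worth, the hosts-file ban itself is the easy part - the entry below uses a made-up placeholder hostname, since the real domain Opera contacts is exactly what still has to be discovered:

```
# hosts file: /etc/hosts on Unix, %SystemRoot%\System32\drivers\etc\hosts on Windows
# "update.example.invalid" is a placeholder for the as-yet-unknown update-check host.
# Mapping it to localhost makes the nag check fail quietly.
127.0.0.1    update.example.invalid
```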

> And people keep talking about web apps and the browser as a "platform". Sigh.
>

Yea. There's even an entire company dedicated to pushing that moronic agenda (*and* tracking you like Big Brother). They're called "Microsoft Mark 2"...erm...wait...I mean "Google".

>
>> You know what *really* happens when you upgrade to a computer that's, say, twice as fast with twice as much memory? About 90% of the so-called "programmers" out there decide "Hey, now I can get away with my software being twice as slow and eat up twice as much memory! And it's all on *my user's* dime!" You're literally paying for programmer laziness.
>
> Or worse, "Hey look! We can add goofy animations to every aspect of our UI to hog all CPU and memory, because users love eye-candy and will be induced to upgrade!

Yes, seriously! "And let's not bother to allow anyone to disable the moronic UI changes even though we (*cough* Mozilla) *claim* to care about being super-configurable."

> We get a kickback from our hardware manufacturers and we sell more software without actually adding any new features! It's a win-win situation!"
>

That's one of the reasons I despise the modern-day Epic and Valve: *Complete* graphics whores (not to mention Microsoft sluts, particularly in Epic's case), and I don't believe for a second that what you've described isn't the exact nature of...what does Epic call it? Some sort of "Alliance" with NVIDIA and ATI that Epic was so *publicly* proud of. Fuck Cliffy, Sweeney, "Fat Fuck" Newell, et al. Shit, and Epic actually used to be pretty good back in their "Megagames" days.

Portal's great (honestly I hate myself for how much I *like* it ;) ), but seriously, it would be *so* much better with Wii controls instead of that dual-analog bullshit. But unlike modern game devs I'm not a graphics whore, so I don't give two shits about the tradeoff in visual quality ('Course that's still no free ride for the lazy crapjobs that were done with the Wii ports of Splinter Cell 4 and FarCry - it may not be a 360/PS3, but it sure as hell is no N64 or even any sub-XBox1 machine, as the "modern" gamedevs would have me believe).

>
>> I just stick with software that isn't bloated. I get just as much speed, but without all that cost.
>
> I'm constantly amazed by the amount of CPU and memory needed to run a *word processor*. I mean, really?! All of that just for pushing some characters around? And I thought word-processing had been solved since the days of CP/M. Silly me.
>

Ditto.

>
>> (Again, there are obviously exceptions, like video processing, DNA processing, etc.)
> [...]
>
> And povray rendering. :-) Or computing the convex hull of high-dimensional polytopes. Or solving the travelling salesman problem. Or inverting very large matrices.  Y'know, actual, *hard* problems. As opposed to fiddling with some pixels and pushing some characters around.
>

Yup. Although I like to count non-realtime 3D rendering under the "video processing" category even though the details are very, very different from transcoding or AfterEffects and such.

> Or solving the travelling salesman problem.

That's already been solved. Haven't you heard of eCommerce? j/k ;)