March 12, 2012
Re: Breaking backwards compatibility
On Sun, Mar 11, 2012 at 11:38:12PM +0100, deadalnix wrote:
> Le 11/03/2012 21:06, Walter Bright a écrit :
> >On 3/11/2012 4:20 AM, Matej Nanut wrote:
> >>I've been using an EeePC for everything for the past 2.5 years and
> >>until now, I could cope.
> >
> >You're right that there's a downside to providing your developers the
> >hottest machines available - their code tends to be a dog on the
> >machines your customer has.
> >
> 
> I think a better solution is to include expected performance in the
> user stories and add it to the testing suite. Devs can enjoy a
> powerful machine without risking a resource monster as the final
> executable.
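
The suggestion quoted above, baking performance expectations into the test
suite itself, might look something like this minimal sketch (the `work`
function and the 0.5 s budget are invented for illustration, not from any
real project):

```python
import time

def work(n):
    # Stand-in for the real operation under test (hypothetical).
    return sum(i * i for i in range(n))

def test_work_within_budget():
    # Performance expectation from the user story: the operation must
    # complete within 0.5 s on the baseline test machine.
    start = time.perf_counter()
    work(100_000)
    elapsed = time.perf_counter() - start
    assert elapsed < 0.5, f"budget exceeded: {elapsed:.3f}s"

test_work_within_budget()
```

A test like this fails in CI the moment the executable becomes a resource
monster, regardless of how fast the developer's own machine is.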

Even better, have some way of running your program with artificially
reduced speed & resources, so that you can (sort of) see how your program
degrades with lower-powered systems.

Perhaps run the program inside a VM or emulator?
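
Short of a full VM, one crude approximation is to cap the resources of a
child process before launching it. A sketch for Unix-like systems (the
`resource` module and `preexec_fn` are not available on Windows, and how
strictly `RLIMIT_AS` is enforced varies by platform):

```python
import resource
import subprocess
import sys

def run_with_memory_cap(cmd, max_bytes):
    # Run `cmd` in a child process whose address space is capped at
    # `max_bytes`, to see how the program copes when memory is scarce.
    def apply_limit():
        resource.setrlimit(resource.RLIMIT_AS, (max_bytes, max_bytes))
    return subprocess.run(cmd, preexec_fn=apply_limit)

# e.g. run_with_memory_cap(["./myprogram"], 256 * 1024 * 1024)
```

Allocations beyond the cap then fail inside the child just as they would on
a genuinely memory-starved machine, while the developer's environment stays
untouched.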


T

-- 
It's bad luck to be superstitious. -- YHL
March 12, 2012
Re: Breaking backwards compatibility
On Sun, Mar 11, 2012 at 11:31:33PM +0100, deadalnix wrote:
> Le 11/03/2012 21:16, Walter Bright a écrit :
[...]
> >I'm not suggesting no naming convention. Naming conventions are good.
> >But they don't trump everything else in importance, not even close.
> >
> 
> I have to disagree. They are really important in a large codebase (let's
> say >100,000 lines of code). Otherwise people tend not to find modules
> they can reuse except by knowing the whole codebase. This has the
> indirect effect of causing needless duplication - with all its known
> drawbacks - and of making the project more dependent on documentation -
> more documentation has to be produced, which means an overhead in
> workload and more trouble when documentation is lacking, outdated, or
> buggy.

I used to be very much against verbose naming conventions. Almost all
naming conventions are verbose, and I hate verbosity with a passion (I
still do).

But after having been forced to use naming conventions in some code at
work, I'm starting to see some value to them, especially in very large
projects where there's so much code that without some sort of
convention, it quickly becomes impossible to find what you want. Without
one, a team writes code with what they feel is a consistent naming
scheme, then another team comes along and does the same, and the two
run into naming conflicts, so they rename stuff haphazardly. Repeat
this over a few iterations of the product, and you end up with 50
modules all with inconsistent naming due to historical conflicts.

Code readability thus drops dramatically, and people coming on board
later on can't understand what the code was supposed to do because it's
all so obtuse (and they don't have the time/energy to wade through 5
million lines of code to understand every little nuance). This, plus the
time pressure of impending deadlines, causes them to resort to
copy-n-pasting, second-guessing what a piece of code does without
bothering to check their assumptions (since it's so obtuse that looking
up *one* thing would cost them hours just to even begin to understand
it), and all sorts of bad code enters the system.

Compare this with a project that has naming conventions from the get go.
Name clashes are essentially non-existent if the naming convention is
consistent, and if you're looking for a function in module X to do Y,
the naming convention almost already spells out the name for you. Makes
things very easy to find, and once you understand the convention, makes
code very easy to read (most guesses at what the code means are actually
correct -- in a large project, you *know* people are just going to write
what they think is right rather than spend time reading code to
understand what it actually does). So people are more likely to actually
write correct code, which means people who come on board after them are
more likely to understand the code and not do stupid things.
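
As an illustration of that predictability (the convention and every name
below are invented for this sketch, not taken from any real codebase): with
a `<subsystem>_<noun>_<verb>` scheme, a reader who needs "the function in
the db subsystem that opens a connection" can guess the name before ever
grepping for it.

```python
# Hypothetical convention: <subsystem>_<noun>_<verb>.
def db_connection_open(path):
    # Open a (toy) connection to the database at `path`.
    return {"path": path, "open": True}

def db_connection_close(conn):
    # Close a previously opened connection.
    conn["open"] = False

# A reader looking for "open a connection, db subsystem" can predict
# `db_connection_open` from the convention alone.
conn = db_connection_open("/tmp/test.db")
db_connection_close(conn)
```

The point is not this particular scheme but that any consistent scheme makes
names derivable instead of memorized.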


[...]
> IMO, the name in itself isn't that important. The important thing is
> that things get named in a predictable and simple way.

+1.


T

-- 
First Rule of History: History doesn't repeat itself -- historians merely repeat each other.
March 12, 2012
Re: Breaking backwards compatibility
"H. S. Teoh" <hsteoh@quickfur.ath.cx> wrote in message 
news:mailman.510.1331520028.4860.digitalmars-d@puremagic.com...
>
> That is not to say the classroom is completely worthless,
> mind you;

I'd say that, and I often have ;) And I forever will.

> courses like discrete maths

Personally, I found discrete math to be the easiest class I took since 
kindergarten (*Both* of the times they made me take discrete math. Ugh. God 
that got boring.) It was almost entirely the sorts of things that any 
average coder already understands intuitively. Like De Morgan's laws: I 
hadn't known the name "De Morgan", but just from growing up writing "if" 
statements I had already grokked how they worked and how to use them. No 
doubt in my mind that *all* of us here have grokked them (even any of us 
who might not know them by name) *and* many of the coworkers I've had who 
I'd normally classify as "incompetent VB-loving imbeciles". Then there was 
the Pigeonhole principle, which was basically just obvious corollaries to 
preschool-level spatial relations. Etc. All pretty much BASIC-level stuff.
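
De Morgan's laws as they surface in everyday "if" statements, with a quick
exhaustive check over all boolean inputs:

```python
# De Morgan's laws:  not (a and b) == (not a) or (not b)
#                    not (a or b)  == (not a) and (not b)
# Exhaustively verified over every boolean combination:
for a in (False, True):
    for b in (False, True):
        assert (not (a and b)) == ((not a) or (not b))
        assert (not (a or b)) == ((not a) and (not b))
print("De Morgan holds for all boolean inputs")
```

In practice this is exactly what lets you rewrite
`if (!(ready && valid))` as `if (!ready || !valid)` without changing
behavior.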

> and programming logic did train me
> to think logically and rigorously, an indispensable requirement in the
> field.
>
> However, I also found that most big-name colleges are geared toward
> producing researchers rather than programmers in the industry.

The colleges I've seen seemed to have an identity crisis in that regard: 
Sometimes they acted like their role was teaching theory, sometimes they 
acted like their role was job training/placement, and all the time they were 
incompetent at both.

> Case in point. One of the courses I took as a grad student was taught by
> none other than Professor Cook himself (y'know, the guy behind Cook's
> Theorem). He was a pretty cool guy, and I respect him for what he does.
> But the course material was... I don't remember what the official course
> title was, but we spent the entire term proving stuff about proofs.  Let
> me say that again.  I'm not just talking about spending the entire
> semester proving math theorems (which is already questionable enough in
> a course that's listed as a *computer science* course). I'm talking
> about spending the entire semester proving things *about* math proofs.
> IOW, we were dealing with *meta-proofs*.  And most of the "proofs" we
> proved things about involved *proofs of infinite length*.
>
> Yeah.
>
> I spent the entire course repeatedly wondering if I had misread the
> course calendar and gone to the wrong class, and, when I had ruled that
> out, what any of this meta-proof stuff had to do with programming.
>
>

I once made the mistake of signing up for a class that claimed to be part of 
the CS department and was titled "Optimization Techniques". I thought it was 
obvious what it was and that it would be a great class for me to take. 
Turned out to be a class that, realistically, belonged in the Math dept and 
had nothing to do with efficient software, even in theory. Wasn't even in 
the ballpark of Big-O, etc. It was linear algebra with large numbers of 
variables. I'm sure it would be great material for the right person, but it 
wasn't remotely what I expected given the name and department of the course. 
(Actually, similar thing with my High School class of "Business Law" - 
Turned out to have *nothing* to do with business whatsoever. Never 
understood why they didn't just call the class "Law" or "Civic Law".) Kinda 
felt "baited and switched" both times.
March 12, 2012
Re: Breaking backwards compatibility
"H. S. Teoh" <hsteoh@quickfur.ath.cx> wrote in message 
news:mailman.517.1331521772.4860.digitalmars-d@puremagic.com...
> On Sun, Mar 11, 2012 at 11:38:12PM +0100, deadalnix wrote:
>>
>> I think a better solution is to include expected performance in the
>> user stories and add it to the testing suite. Devs can enjoy a
>> powerful machine without risking a resource monster as the final
>> executable.
>
> Even better, have some way of running your program with artificially
> reduced speed & resources, so that you can (sort of) see how your program
> degrades with lower-powered systems.
>
> Perhaps run the program inside a VM or emulator?
>

I don't think such things would ever truly work, except maybe in isolated 
cases. It's an issue of dogfooding. But then these "eat your cake and then 
still have it" strategies ultimately mean that you're *not* actually doing 
the dogfooding, just kinda pretending to. Instead, you'd be eating steak 
seven days a week, occasionally do a half-bite of dogfooding, and 
immediately wash it down with...I dunno, name some fancy expensive drink, I 
don't know my wines ;)
March 12, 2012
Re: Breaking backwards compatibility
On Sun, 11 Mar 2012 12:20:43 +0100, Matej Nanut <matejnanut@gmail.com>  
wrote:

> I find the point on developing on a slower computer very interesting,
> and here's my story.
>
> I've been using an EeePC for everything for the past 2.5 years and
> until now, I could cope. I'm getting a new laptop this week because I
> direly need it at the faculty (some robotics/image processing/computer
> vision — no way to run these on an EeePC realtime).
>
I recently switched from an Atom N330 to a Core i5-2400S.
While it is about 10x faster at compilation, the switch has led to more
trial-and-error compilation.
Turnaround times with dmd are a little too short.
March 12, 2012
Re: Breaking backwards compatibility
On Mon, Mar 12, 2012 at 01:36:06AM -0400, Nick Sabalausky wrote:
> "H. S. Teoh" <hsteoh@quickfur.ath.cx> wrote in message 
> news:mailman.510.1331520028.4860.digitalmars-d@puremagic.com...
[...]
> Personally, I found discrete math to be the easiest class I took since
> kindergarten (*Both* of the times they made me take discrete math.
> Ugh. God that got boring.) It was almost entirely the sorts of things
> that any average coder already understands intuitively. Like De
> Morgan's laws: I hadn't known the name "De Morgan", but just from
> growing up writing "if" statements I had already grokked how they
> worked and how to use them. No doubt in my mind that *all* of us here
> have grokked them (even any of us who might not know them by name)
> *and* many of the coworkers I've had who I'd normally classify as
> "incompetent VB-loving imbeciles".

It's not that I didn't already know most of the stuff intuitively.
Rather, I found that, in retrospect, having to learn it formally helped
to solidify my mental grasp of it, and to be able to analyse it
abstractly without being tied to intuition. This later developed into
the ability to reason about other stuff in the same way, so you could
*derive* new stuff yourself in similar ways.


> Then there was the Pigeonhole principle, which was basically just
> obvious corollaries to preschool-level spatial relations. Etc.  All
> pretty much BASIC-level stuff.

Oh reeeeaally?! Just wait till you learn how the pigeonhole principle
allows you to do arithmetic with infinite quantities... ;-)

(And before you shoot me down with "infinite quantities are not
practical in programming", I'd like to say that certain non-finite
arithmetic systems actually have real-life consequences in finite
computations. Look up "Hydra game" sometime. Or "Goodstein sequences" if
you're into that sorta thing.)
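
In its finite form, though, the principle really is one loop: put more
items into fewer boxes and a shared box is guaranteed. A toy sketch (the
hashing-into-buckets framing is just one illustration of "boxes"):

```python
def first_collision(items, n_boxes):
    # Assign each item to one of n_boxes pigeonholes (here, by hash) and
    # return the first pair that lands in the same box. With more items
    # than boxes, the pigeonhole principle guarantees such a pair exists.
    seen = {}
    for item in items:
        box = hash(item) % n_boxes
        if box in seen:
            return seen[box], item
        seen[box] = item
    return None

# 11 items into 10 boxes: some box must receive two of them.
assert first_collision(range(11), 10) is not None
```

The same one-liner argument is why any hash table with more keys than
buckets must handle collisions.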


[...]
> > However, I also found that most big-name colleges are geared toward
> > producing researchers rather than programmers in the industry.
> 
> The colleges I've seen seemed to have an identity crisis in that
> regard: Sometimes they acted like their role was teaching theory,
> sometimes they acted like their role was job training/placement, and
> all the time they were incompetent at both.

In my experience, I found that the quality of a course depends a LOT on
the attitude and teaching ability of the professor. I've had courses
which were like mind-openers every other class, where you just go "wow,
*that* is one heck of a cool algorithm!".

Unfortunately, (1) most professors can't teach; (2) they're not *paid*
to teach (they're paid to do research), so they regard it as a tedious
chore imposed upon them that takes away their time for research. This
makes them hate teaching, and so most courses suck.


[...]
> I once made the mistake of signing up for a class that claimed to be
> part of the CS department and was titled "Optimization Techniques". I
> thought it was obvious what it was and that it would be a great class
> for me to take.  Turned out to be a class that, realistically,
> belonged in the Math dept and had nothing to do with efficient
> software, even in theory. Wasn't even in the ballpark of Big-O, etc.
> It was linear algebra with large numbers of variables.

Ahhhhahahahahaha... must've been high-dimensional polytope optimization
stuff, I'll bet. That stuff *does* have its uses... but yeah, that was a
really dumb course title.

Another dumb course title that I've encountered was along the lines of
"computational theory" where 95% of the course talks about
*uncomputable* problems. You'd think they would've named it
"*un*computational theory". :-P


> I'm sure it would be great material for the right person, but it
> wasn't remotely what I expected given the name and department of the
> course.  (Actually, similar thing with my High School class of
> "Business Law" - Turned out to have *nothing* to do with business
> whatsoever. Never understood why they didn't just call the class "Law"
> or "Civic Law".) Kinda felt "baited and switched" both times.
[...]

That's why I always took the effort to read course descriptions VERY
carefully before signing up. It's like the fine print in contracts. You
skip over it at your own peril.

(Though, that didn't stop me from taking "Number Theory". Or "Set
Theory". Both of which went wayyyyyy over my head for the most part.)


T

-- 
2+2=4. 2*2=4. 2^2=4. Therefore, +, *, and ^ are the same operation.
March 12, 2012
Re: Breaking backwards compatibility
On Mon, Mar 12, 2012 at 01:48:46AM -0400, Nick Sabalausky wrote:
> "H. S. Teoh" <hsteoh@quickfur.ath.cx> wrote in message 
> news:mailman.517.1331521772.4860.digitalmars-d@puremagic.com...
> > On Sun, Mar 11, 2012 at 11:38:12PM +0100, deadalnix wrote:
> >>
> >> I think a better solution is to include expected performance in the
> >> user stories and add it to the testing suite. Devs can enjoy a
> >> powerful machine without risking a resource monster as the
> >> final executable.
> >
> > Even better, have some way of running your program with artificially
> > reduced speed & resources, so that you can (sort of) see how your
> > program degrades with lower-powered systems.
> >
> > Perhaps run the program inside a VM or emulator?
> >
> 
> I don't think such things would ever truly work, except maybe in
> isolated cases. It's an issue of dogfooding. But then these "eat your
> cake and then still have it" strategies ultimately mean that you're
> *not* actually doing the dogfooding, just kinda pretending to.
> Instead, you'd be eating steak seven days a week, occasionally do a
> half-bite of dogfooding, and immediately wash it down with...I dunno,
> name some fancy expensive drink, I don't know my wines ;)
[...]

Nah, it's like ordering extra large triple steak burger with
double-extra cheese, extra bacon, sausage on the side, extra large
french fries swimming in grease, and _diet_ coke to go with it.


T

-- 
Blunt statements really don't have a point.
March 12, 2012
Re: Breaking backwards compatibility
Am Sun, 11 Mar 2012 04:12:12 -0400
schrieb "Nick Sabalausky" <a@a.a>:

> I think it's a shame that companies hand out high-end hardware to their 
> developers like it was candy. There's no doubt in my mind that's 
> significantly contributed to the amount of bloatware out there.

But what if the developers themselves use bloated software like Eclipse, or face slow compilation processes, as with big C++ programs? For them, a fast machine is a net productivity increase. But yeah, I sometimes think about keeping some old notebook around to test on - not to use for development. Actually, sometimes you may want to debug your code with a very large data set, and then you end up at the other extreme: your computer has too little RAM to run a real-world workload of your software.

As for the article: the situation with automatic updates used to be worse than it is now - Adobe, Apple and the others have learned and added the option to disable most of the background processing. The developments in the web sector are interesting in that respect: high-quality videos and several scripting/VM languages make most older computers useless for tabbed browsing :D

-- 
Marco
March 12, 2012
Re: Breaking backwards compatibility
> I searched every inch of Opera's options screens and never 
> found *any* mention or reference to any "Disable AutoUpdate"

"Derek" <ddparnell@bigpond.com> wrote in message 
news:op.wazmllu534mv3i@red-beast...
> I found it in a minute. First I tried opera help and it directed me to 
> details about auto-update, which showed how to disable it. It is in the 
> normal UI place for such stuff.
>
>   Tools -> Preferences -> Advanced -> Security -> Auto-Update.

Am Sat, 10 Mar 2012 23:44:20 -0500
schrieb "Nick Sabalausky" <a@a.a>:
> They stuck it under "Security"? No wonder I couldn't find it. That's like 
> putting "blue" under "shapes". :/

So much for every inch ...and false accusations. You made my day! ;)

-- 
Marco
March 12, 2012
Re: Breaking backwards compatibility
"Marco Leise" <Marco.Leise@gmx.de> wrote in message 
news:20120312124959.2ef8eb86@marco-leise.homedns.org...
>> I searched every inch of Opera's options screens and never
>> found *any* mention or reference to any "Disable AutoUpdate"
>
> "Derek" <ddparnell@bigpond.com> wrote in message
> news:op.wazmllu534mv3i@red-beast...
>> I found it in a minute. First I tried opera help and it directed me to
>> details about auto-update, which showed how to disable it. It is in the
>> normal UI place for such stuff.
>>
>>   Tools -> Preferences -> Advanced -> Security -> Auto-Update.
>
> Am Sat, 10 Mar 2012 23:44:20 -0500
> schrieb "Nick Sabalausky" <a@a.a>:
>> They stuck it under "Security"? No wonder I couldn't find it. That's like
>> putting "blue" under "shapes". :/
>
> So much for every inch ...and false accusations. You made my day! ;)
>

Yup. You've got me there! (I had thought that I had, but I'm not sure if 
that works for or against me ;) )