March 11, 2012
On 11/03/2012 20:39, Nick Sabalausky wrote:
> "Nick Sabalausky"<a@a.a>  wrote in message
> news:jjiv40$2aba$1@digitalmars.com...
>> "deadalnix"<deadalnix@gmail.com>  wrote in message
>> news:jjif2l$1cdi$1@digitalmars.com...
>>>
>>> An example of managing change (even breaking change) is PHP. PHP
>>> has improved so much, if you consider what changed between v4 and v5,
>>> and then v5 and v5.3, that it is factual proof that breaking changes
>>> can be done to great benefit.
>>
>> PHP could have handled the changes MUCH better than it did, though.
>>
>
> Ermm, the transition, I mean.
>
>

The point wasn't to say that what PHP did was perfect, but that they led change and were successful at it. This clearly shows that it is possible: they DID it.
March 11, 2012
On 11/03/2012 22:59, David Nadlinger wrote:
> On Friday, 9 March 2012 at 22:41:22 UTC, Timon Gehr wrote:
>> Most bug fixes are breaking changes. I don't think we are there yet.
>
> In my opinion, this is a very interesting and important observation –
> due to the powerful meta-programming and reflection capabilities, most
> of the time the question is not whether a change is backward compatible
> or not, but rather how _likely_ it is to break code. There isn't really
> a good way to avoid that, even more so if your language allows testing
> whether a given piece of code compiles or not.
>
> A related problem is that we still don't quite have an appropriate
> language spec, so you can never be sure if your code is really »correct«
> or if you are relying on DMD implementation details – I'm sure everybody
> who had their meta-programming heavy code break due to a seemingly
> unrelated DMD bugfix knows what I'm trying to say…
>
> David

D is very tied to DMD's implementation. This may not be good, but it is too soon to maintain several compilers.
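
To make David's point concrete (an illustration of mine, not from the thread): D code can branch at compile time on whether some other piece of code compiles, so even a legitimate bugfix that changes what the compiler accepts can silently flip such a branch. A minimal sketch:

    // Branch at compile time on whether a snippet compiles. If a DMD
    // bugfix changes the snippet's validity, the branch taken changes
    // with it, without any error or warning anywhere.
    enum accepted = __traits(compiles, { int[] a; a ~= 1.5; });

    static if (accepted)
        pragma(msg, "compiler accepts the snippet");
    else
        pragma(msg, "compiler rejects the snippet");

Whether that particular snippet should compile is beside the point; the point is that any change in the compiler's answer propagates straight into the program's meaning.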
March 11, 2012
"deadalnix" <deadalnix@gmail.com> wrote in message news:jjj907$2t00$1@digitalmars.com...
> On 11/03/2012 23:12, Walter Bright wrote:
>>
>> And I did just that for "invariant". Over and over and over. People immediately get what "immutable" means, like for no other name. So consider "immutable" a labor-saving device for me.
>
> We have the same phenomenon with dur and with the return-type qualifier (i.e., why doesn't const int* fun() compile? Because const qualifies the function, not the return type).
>
> Both are recurring questions, and so should be as important as immutable was. But both fixes would be major breaking changes.

I wouldn't call dur->duration a *major* breaking change. First of all, you get a clear compile-time error, not silently changed semantics. Secondly, it's a simple search/replace: s/dur!/duration!/ (Not that I normally do search/replaces *completely* blind and unattended, but it's still trivial.)
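
Since the const question keeps coming up, a minimal D sketch may help (my own illustration, not from the thread; the struct and field are made up):

    struct S
    {
        int x;

        // Here 'const' qualifies the member function (its 'this'),
        // not the return type, so this does NOT compile: inside a
        // const member function, &x is const(int)*, not int*.
        //     const int* fun() { return &x; }

        // To qualify the return type instead, parenthesize it:
        const(int)* fun() { return &x; }   // fine: pointer to const int
    }

The parenthesized form is unambiguous, which is why it is the one usually recommended.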


March 11, 2012
"deadalnix" <deadalnix@gmail.com> wrote in message news:jjj9bm$2t00$4@digitalmars.com...
> On 11/03/2012 20:39, Nick Sabalausky wrote:
>> "Nick Sabalausky"<a@a.a>  wrote in message news:jjiv40$2aba$1@digitalmars.com...
>>> "deadalnix"<deadalnix@gmail.com>  wrote in message news:jjif2l$1cdi$1@digitalmars.com...
>>>>
>>>> An example of managing change (even breaking change) is PHP. PHP
>>>> has improved so much, if you consider what changed between v4 and
>>>> v5, and then v5 and v5.3, that it is factual proof that breaking
>>>> changes can be done to great benefit.
>>>
>>> PHP could have handled the changes MUCH better than it did, though.
>>>
>>
>> Ermm, the transition, I mean.
>>
>
> The point wasn't to say that what PHP did was perfect, but that they led change and were successful at it. This clearly shows that it is possible: they DID it.

Right, I agree.


March 12, 2012
On 3/11/2012 3:36 PM, deadalnix wrote:
> On 11/03/2012 21:07, Walter Bright wrote:
>> On 3/11/2012 12:32 PM, Nick Sabalausky wrote:
>>> I'm convinced that colleges in general produce very bad programmers. The
>>> good programmers who have degrees, for the most part (I'm sure there are
>>> rare exceptions), are the ones who learned on their own, not in a
>>> classroom.
>>
>> Often the best programmers seem to have physics degrees!
>>
>
> I saw you coming ^^.

Whether or not anyone considers me a good programmer, I don't have a degree in physics. Frankly, I wasn't good enough to consider it.
March 12, 2012
On Sun, Mar 11, 2012 at 03:20:34PM -0400, Nick Sabalausky wrote: [...]
> 'Course, I'm more than ready to give up KDE itself now and move to something like Trinity, LXDE, or XFCE.

Way ahead of you there. ;-) I'm already using a mouseless WM, and thinking of replacing ratpoison with something even more radical. Something that not only doesn't need the mouse, but *eradicates* all need for the mouse in virtually all applications. Something that maps keystrokes to geometry, so that you point using your keyboard. The screen would divide into regions mapped to certain keys, and certain key sequences would subdivide regions, so you could virtually point at anything just by hitting 3-4 keys. Furthermore, since X11's root window allows the WM to scan pixels, the region subdivisions could auto-snap to high-contrast boundaries, so you'd actually be subdividing based on visually distinct regions like text lines or buttons, rather than doing a blind coordinate subdivision (which would require an unreasonable number of keystrokes to point accurately).
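
To make the subdivision scheme concrete, here is a toy sketch in D; the 3x3 grid, the "qwe/asd/zxc" binding, and the screen size are assumptions of mine, not anything from the post:

    import std.stdio;
    import std.string : indexOf;

    struct Rect { int x, y, w, h; }

    // One refinement step: a 3x3 grid mapped to the key block
    // "qwe/asd/zxc" (hypothetical binding). Each keystroke keeps
    // one ninth of the current region.
    Rect refine(Rect r, char key)
    {
        auto i = "qweasdzxc".indexOf(key);
        if (i < 0) return r;                 // unknown key: no change
        auto col = cast(int) (i % 3);
        auto row = cast(int) (i / 3);
        return Rect(r.x + col * r.w / 3, r.y + row * r.h / 3,
                    r.w / 3, r.h / 3);
    }

    void main()
    {
        auto r = Rect(0, 0, 1920, 1080);
        foreach (k; "asd")                   // three keystrokes
            r = refine(r, k);
        // After three keystrokes the region is already ~71x40 pixels;
        // a fourth narrows it to ~23x13, about the size of a button.
        writefln("pointing at %s,%s (%sx%s)", r.x, r.y, r.w, r.h);
    }

The contrast-snapping refinement described above would replace the uniform thirds with boundaries found by scanning the root window's pixels, but the logarithmic narrowing works the same way.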

(Though at the rate I'm going, I don't know when I'm ever going to have the time to actually sit down and write a WM. So maybe this is just a really wild impractical pipe dream. :-P)

The mouse still has its place, of course, for when you *actually* need it, like drawing free-hand curves and stuff like that.


> And Debian 6. Canonical just keeps getting crazier and crazier. I don't want their new Linux-based iOS of an operating system. OTOH, Debian's "versioning" system is irritatingly moronic. Squeeze, wheezy, wtf? They don't even have any natural ordering, for god's sake! At least Ubuntu's moronic names have *that* much going for them! I don't care what Pixar character my OS is pretending to be, and I don't *want* to care.

I'm a Debian developer, actually. Though I've been so busy with other stuff that I haven't really done anything worth mentioning for the last long while. To me, Debian only ever has 3 versions, oldstable, stable, and unstable. Every Debian "release" is really just a rollover of unstable into stable, and stable into oldstable. I don't even remember those silly Pixar names or their correlation with actual version numbers (actually, I find the version numbers quite meaningless).

"Unstable" is really a misnomer... it's generally a lot more stable than, say, your typical Windows XP installation. (But YMMV... remember I don't use a real desktop environment or any of the stuff that "common people" use.) I see it more as "current release" than "unstable", actually. The *real* unstable is "experimental", which only fools who like crashing their system every other month would run. Stable is for those freaks who run mission-critical servers, whose life depends on the servers being almost impossible to crash. So really, for PCs, most people just run "unstable".

So in my mind, I never even think about "Debian 6" or "Debian 5" or whatever arbitrary number they want to call it. I run "unstable" at home and "stable" on a remote server, and it's as simple as that.


T

-- 
If it tastes good, it's probably bad for you.
March 12, 2012
On Monday, 12 March 2012 at 01:28:39 UTC, H. S. Teoh wrote:
> Something that not only doesn't need the mouse, but *eradicates* all
> need for the mouse on virtually all applications.

It isn't what you described, but in X11, if you hit shift+numlock, it toggles a mode that lets you move the cursor and click by using the numpad keys.

March 12, 2012
On Sun, Mar 11, 2012 at 03:32:39PM -0400, Nick Sabalausky wrote: [...]
> I'm convinced that colleges in general produce very bad programmers. The good programmers who have degrees, for the most part (I'm sure there are rare exceptions), are the ones who learned on their own, not in a classroom.  It's sad that society brainwashes people into believing the opposite.
[...]

I have a master's degree in computer science. About 90% (perhaps 95%) of what I do at my day job is stuff I learned on my own outside the classroom. That is not to say the classroom is completely worthless, mind you; courses like discrete maths and programming logic did train me to think logically and rigorously, an indispensable requirement in the field.

However, I also found that most big-name colleges are geared toward producing researchers rather than programmers for the industry. Now I don't know if this applies in general, but the curriculum I was in was so geared towards CS research, rather than real industry work (i.e., writing actual programs!), that we spent more time studying uncomputable problems than computable ones.

OK, so knowing what isn't computable is important so that you don't waste time trying to solve the halting problem, for example. But when *most* (all?) of your time is spent contemplating the uncomputable, wouldn't you say that you're a bit too high up in that ivory tower? I mean, this is *computer science*, not *uncomputable science* we're talking about.

Case in point. One of the courses I took as a grad student was taught by none other than Professor Cook himself (y'know, the guy behind Cook's Theorem). He was a pretty cool guy, and I respect him for what he does. But the course material was... I don't remember what the official course title was, but we spent the entire term proving stuff about proofs.  Let me say that again.  I'm not just talking about spending the entire semester proving math theorems (which is already questionable enough in a course that's listed as a *computer science* course). I'm talking about spending the entire semester proving things *about* math proofs. IOW, we were dealing with *meta-proofs*.  And most of the "proofs" we proved things about involved *proofs of infinite length*.

Yeah.

I spent the entire course repeatedly wondering if I had misread the course calendar and gone to the wrong class, and, when I had ruled that out, what any of this meta-proof stuff had to do with programming.


T

-- 
Recently, our IT department hired a bug-fix engineer. He used to work for Volkswagen.
March 12, 2012
On 12 March 2012 15:42, H. S. Teoh <hsteoh@quickfur.ath.cx> wrote:
> On Sun, Mar 11, 2012 at 03:32:39PM -0400, Nick Sabalausky wrote: [...]
>> I'm convinced that colleges in general produce very bad programmers. The good programmers who have degrees, for the most part (I'm sure there are rare exceptions), are the ones who learned on their own, not in a classroom.  It's sad that society brainwashes people into believing the opposite.
> [...]
>
> I have a master's degree in computer science. About 90% (perhaps 95%) of what I do at my day job is stuff I learned on my own outside the classroom. That is not to say the classroom is completely worthless, mind you; courses like discrete maths and programming logic did train me to think logically and rigorously, an indispensable requirement in the field.
>
> However, I also found that most big-name colleges are geared toward producing researchers rather than programmers for the industry. Now I don't know if this applies in general, but the curriculum I was in was so geared towards CS research, rather than real industry work (i.e., writing actual programs!), that we spent more time studying uncomputable problems than computable ones.
>
> OK, so knowing what isn't computable is important so that you don't waste time trying to solve the halting problem, for example. But when *most* (all?) of your time is spent contemplating the uncomputable, wouldn't you say that you're a bit too high up in that ivory tower? I mean, this is *computer science*, not *uncomputable science* we're talking about.
>
> Case in point. One of the courses I took as a grad student was taught by none other than Professor Cook himself (y'know, the guy behind Cook's Theorem). He was a pretty cool guy, and I respect him for what he does. But the course material was... I don't remember what the official course title was, but we spent the entire term proving stuff about proofs.  Let me say that again.  I'm not just talking about spending the entire semester proving math theorems (which is already questionable enough in a course that's listed as a *computer science* course). I'm talking about spending the entire semester proving things *about* math proofs. IOW, we were dealing with *meta-proofs*.  And most of the "proofs" we proved things about involved *proofs of infinite length*.
>
> Yeah.
>
> I spent the entire course repeatedly wondering if I had misread the course calendar and gone to the wrong class, and, when I had ruled that out, what any of this meta-proof stuff had to do with programming.
>
>
> T
>
> --
> Recently, our IT department hired a bug-fix engineer. He used to work for Volkswagen.

I'm entirely self-taught, and currently taking a break from university (too much debt, not enough time, too much stress). I rarely use stuff that I haven't taught myself. I realize now, though, that trying to teach people how to program is very, very hard, since I always think in terms of how to teach the stuff I already know. Ideally you'd learn everything at once and spend the next 2 years re-arranging it in your brain, but unfortunately people don't work like that...

--
James Miller
March 12, 2012
On Mon, Mar 12, 2012 at 02:31:44AM +0100, Adam D. Ruppe wrote:
> On Monday, 12 March 2012 at 01:28:39 UTC, H. S. Teoh wrote:
> >Something that not only doesn't need the mouse, but *eradicates* all need for the mouse on virtually all applications.
> 
> It isn't what you described, but in X11, if you hit shift+numlock, it toggles a mode that lets you move the cursor and click by using the numpad keys.

There is that, but that's still just a mouse in disguise. In fact, it's worse than a mouse, 'cos now you're pushing the mouse with buttons instead of just sweeping it across the mouse pad with your hand.

I'm talking about a sort of quadtree-type spatial navigation where you zero in on the target position in logarithmic leaps, rather than a "linear" cursor displacement.
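
To put numbers on those logarithmic leaps (assuming a 3x3 grid, which the posts don't specify): each keystroke divides both dimensions by 3, so a 1920x1080 screen narrows to about 1920/3^4 x 1080/3^4, roughly 24x13 pixels, after just four keystrokes. That is button-sized precision, which is where the earlier "3-4 keys" estimate comes from, whereas numpad mouse keys have to crawl across those 1920 pixels linearly.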


T

-- 
INTEL = Only half of "intelligence".