March 12, 2012
"H. S. Teoh" <hsteoh@quickfur.ath.cx> wrote in message news:mailman.531.1331533449.4860.digitalmars-d@puremagic.com...
> On Mon, Mar 12, 2012 at 01:36:06AM -0400, Nick Sabalausky wrote:
>> "H. S. Teoh" <hsteoh@quickfur.ath.cx> wrote in message news:mailman.510.1331520028.4860.digitalmars-d@puremagic.com...
> [...]
>> Personally, I found discrete math to be the easiest class I took since kindergarten (*Both* of the times they made me take discrete math. Ugh. God that got boring.) It was almost entirely the sorts of things that any average coder already understands intuitively. Like De Morgan's: I hadn't known the name "De Morgan", but just from growing up writing "if" statements I had already grokked how it worked and how to use it. No doubt in my mind that *all* of us here have grokked it (even any of us who might not know it by name) *and* many of the coworkers I've had who I'd normally classify as "incompetent VB-loving imbeciles".
>
> It's not that I didn't already know most of the stuff intuitively,

Didn't mean to imply that you didn't, of course.
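(For anyone who knows the habit but not the name: De Morgan's laws are just the rewrite every coder does instinctively when flipping an "if" condition. A throwaway Python sketch, exhaustively checking both laws over all boolean combinations:)

```python
# De Morgan's laws: not (a and b) == (not a) or (not b),
# and  not (a or b) == (not a) and (not b).
# Exhaustive check over all four boolean combinations.
for a in (False, True):
    for b in (False, True):
        assert (not (a and b)) == ((not a) or (not b))
        assert (not (a or b)) == ((not a) and (not b))

print("De Morgan's laws hold for all boolean inputs")
```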

>
>
>> Then there was the Pigeonhole principle, which was basically just obvious corollaries to preschool-level spatial relations. Etc.  All pretty much BASIC-level stuff.
>
> Oh reeeeaally?! Just wait till you learn how the pigeonhole principle allows you to do arithmetic with infinite quantities... ;-)
>

Well, the discrete math courses offered at the places I went to didn't take things that far. Just explained the principle itself.
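(The principle really is that basic: put more than n items into n boxes and some box gets two. A minimal sketch — the example and names are mine, not from any course:)

```python
def find_duplicate(values, n_boxes):
    """Pigeonhole principle in action: more values than boxes
    guarantees two values land in the same box; return that pair."""
    seen = {}
    for v in values:
        box = v % n_boxes  # assign each value to one of n_boxes "pigeonholes"
        if box in seen:
            return seen[box], v  # two values forced into the same box
        seen[box] = v
    return None

# 4 values into 3 boxes: a collision is guaranteed to exist.
print(find_duplicate([10, 4, 7, 12], 3))
```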

> (And before you shoot me down with "infinite quantities are not practical in programming", I'd like to say that certain non-finite arithmetic systems actually have real-life consequences in finite computations. Look up "Hydra game" sometime. Or "Goldstein sequences" if you're into that sorta thing.)
>

Yea, I don't doubt that. While no game programmer, for example, would be caught dead having their code crunching calculus computations, there are some computations done in games that were obtained in the first place by doing some calculus (mostly physics, IIRC). Not exactly the same thing, but I get that the applicability of theory isn't limited to what the computer is actually calculating.
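(For the curious, the sequences mentioned above are easy to compute for small starting values. A Python sketch — function names are mine; the `bump` helper rewrites n in hereditary base-b notation and replaces every b, including in the exponents, with b+1:)

```python
def bump(n, base):
    """Rewrite n in hereditary base-`base` notation, then replace every
    occurrence of `base` (including inside the exponents) with base + 1."""
    if n < base:
        return n
    e = 0
    while base ** (e + 1) <= n:
        e += 1
    d = n // base ** e
    return d * (base + 1) ** bump(e, base) + bump(n - d * base ** e, base)

def goodstein(m, max_steps=50):
    """First terms of the Goodstein sequence starting at m. Capped, since
    e.g. the sequence for 4 runs for roughly 3 * 2**402653211 steps
    before it (provably!) hits zero."""
    seq, base = [m], 2
    while m > 0 and len(seq) <= max_steps:
        m = bump(m, base) - 1
        base += 1
        seq.append(m)
    return seq

print(goodstein(3))     # terminates quickly
print(goodstein(4, 4))  # first few terms only; full sequence is astronomical
```

The punchline, of course, is that every such sequence terminates, but proving it requires ordinal arithmetic beyond Peano arithmetic.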

>
> [...]
>> > However, I also found that most big-name colleges are geared toward producing researchers rather than programmers in the industry.
>>
>> The colleges I've seen seemed to have an identity crisis in that regard: Sometimes they acted like their role was teaching theory, sometimes they acted like their role was job training/placement, and all the time they were incompetent at both.
>
> In my experience, I found that the quality of a course depends a LOT on the attitude and teaching ability of the professor. I've had courses which were like mind-openers every other class, where you just go "wow, *that* is one heck of a cool algorithm!".
>

Yea, I *have* had some good instructors. Not many. But some.

> Unfortunately, (1) most professors can't teach; (2) they're not *paid*
> to teach (they're paid to do research), so they regard it as a tedious
> chore imposed upon them that takes away their time for research. This
> makes them hate teaching, and so most courses suck.
>

#1 I definitely agree with. #2 I don't doubt for at least some colleges, although I'm uncertain how applicable it is to public party schools like BGSU. There didn't seem to be much research going on there, as far as I could tell, but I could be wrong.

>
> [...]
>> I once made the mistake of signing up for a class that claimed to be part of the CS department and was titled "Optimization Techniques". I thought it was obvious what it was and that it would be a great class for me to take.  Turned out to be a class that, realistically, belonged in the Math dept and had nothing to do with efficient software, even in theory. Wasn't even in the ballpark of Big-O, etc. It was linear algebra with large numbers of variables.
>
> Ahhhhahahahahaha... must've been high-dimensional polytope optimization stuff, I'll bet.

Sounds about right.  I think the term "linear programming" was tossed around a bit, which I do remember from high school to be an application of linear algebra rather than software.
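(A toy example of what such a course optimizes, for anyone who hasn't seen one: maximize a linear objective subject to linear constraints. Real LP solvers pivot between vertices of the feasible polytope; this sketch just brute-forces a small integer grid to show the idea — the problem itself is made up:)

```python
# Toy linear program: maximize x + 2*y subject to x + y <= 4, x >= 0, y >= 0.
# Brute force over a small integer grid (illustration only; simplex and
# interior-point methods are what real solvers use).
best = max(
    (x + 2 * y, x, y)
    for x in range(5)
    for y in range(5)
    if x + y <= 4
)
print(best)  # the optimum sits at a vertex of the feasible region
```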

> That stuff *does* have its uses...

Yea, I never doubted that. Just not what I expected. Really caught me off guard.

> but yeah, that was a
> really dumb course title.
>
> Another dumb course title that I've encountered was along the lines of "computational theory" where 95% of the course talks about *uncomputable* problems. You'd think they would've named it "*un*computational theory". :-P
>

Yea that is kinda funny.

>
>> I'm sure it would be great material for the right person, but it wasn't remotely what I expected given the name and department of the course.  (Actually, similar thing with my High School class of "Business Law" - Turned out to have *nothing* to do with business whatsoever. Never understood why they didn't just call the class "Law" or "Civic Law".) Kinda felt "baited and switched" both times.
> [...]
>
> That's why I always took the effort to read course descriptions VERY carefully before signing up. It's like the fine print in contracts. You skip over it at your own peril.
>

Our course descriptions didn't have much fine print. Just one short vaguely-worded paragraph. I probably could have asked around and gotten a syllabus from previous semesters, but I didn't learn advanced student tricks like that until a few years into college. ;) Plus, there are other concerns, like scheduling and requirements. I found that a lot of my course selections had to be dictated more by scheduling and availability than anything else.


March 12, 2012
On Mon, Mar 12, 2012 at 03:15:32PM -0400, Nick Sabalausky wrote:
> "H. S. Teoh" <hsteoh@quickfur.ath.cx> wrote in message news:mailman.531.1331533449.4860.digitalmars-d@puremagic.com...
[...]
> > (And before you shoot me down with "infinite quantities are not practical in programming", I'd like to say that certain non-finite arithmetic systems actually have real-life consequences in finite computations. Look up "Hydra game" sometime. Or "Goldstein sequences" if you're into that sorta thing.)

Argh. Epic fail on my part, it's *Goodstein* sequence, not Goldstein.


> Yea, I don't doubt that. While no game programmer, for example, would be caught dead having their code crunching calculus computations, there are some computations done in games that were obtained in the first place by doing some calculus (mostly physics, IIRC). Not exactly the same thing, but I get that the applicability of theory isn't limited to what the computer is actually calculating.

I think the bottom line is that a lot of this stuff needs someone who can explain and teach it in an engaging, interesting way. It's not that the subject matter itself is boring or stupid, but that the teacher failed at his job and so his students find the subject boring and stupid.


[...]
> Our course descriptions didn't have much fine print. Just one short vaguely-worded paragraph. I probably could have asked around and gotten a syllabus from previous semesters, but I didn't learn advanced student tricks like that until a few years into college. ;) Plus, there are other concerns, like scheduling and requirements. I found that a lot of my course selections had to be dictated more by scheduling and availability than anything else.

I guess I was lucky then. There were a couple o' useless mandatory courses I had to take, but for the most part, I got to choose what I wanted. (And then my geeky side took over and I filled up most of my electives with math courses... sigh...)


T

-- 
Some days you win; most days you lose.
March 13, 2012
On Sun, 11 Mar 2012 21:07:06 +0100, Walter Bright <newshound2@digitalmars.com> wrote:

> On 3/11/2012 12:32 PM, Nick Sabalausky wrote:
>> I'm convinced that colleges in general produce very bad programmers. The
>> good programmers who have degrees, for the most part (I'm sure there are
>> rare exceptions), are the ones who learned on their own, not in a classroom.
>
> Often the best programmers seem to have physics degrees!
>

Eugh. Physicist programmers tend to use one-letter variable names in my
experience. Makes for... interesting reading of their code.
March 13, 2012
"Simen Kjærås" <simen.kjaras@gmail.com> wrote in message news:op.wa28iobk0gpyof@biotronic.lan...
> On Sun, 11 Mar 2012 21:07:06 +0100, Walter Bright <newshound2@digitalmars.com> wrote:
>
>> On 3/11/2012 12:32 PM, Nick Sabalausky wrote:
>>> I'm convinced that colleges in general produce very bad programmers. The good programmers who have degrees, for the most part (I'm sure there are rare exceptions), are the ones who learned on their own, not in a classroom.
>>
>> Often the best programmers seem to have physics degrees!
>>
>
> Eugh. Physicist programmers tend to use one-letter variable names in my experience. Makes for... interesting reading of their code.

D is great for physics programming. Now you can have much, much more than 26 variables :)


March 13, 2012
On Tue, 13 Mar 2012 03:50:49 +0100, Nick Sabalausky <a@a.a> wrote:

> "Simen Kjærås" <simen.kjaras@gmail.com> wrote in message
> news:op.wa28iobk0gpyof@biotronic.lan...
>> On Sun, 11 Mar 2012 21:07:06 +0100, Walter Bright
>> <newshound2@digitalmars.com> wrote:
>>
>>> On 3/11/2012 12:32 PM, Nick Sabalausky wrote:
>>>> I'm convinced that colleges in general produce very bad programmers. The
>>>> good programmers who have degrees, for the most part (I'm sure there are
>>>> rare exceptions), are the ones who learned on their own, not in a
>>>> classroom.
>>>
>>> Often the best programmers seem to have physics degrees!
>>>
>>
>> Eugh. Physicist programmers tend to use one-letter variable names in my
>> experience. Makes for... interesting reading of their code.
>
> D is great for physics programming. Now you can have much, much more than 26
> variables :)

True, though mostly, you'd just change to using Greek letters, right?

Finally we can use θ for angles, alias ulong ℕ...
March 13, 2012
On 13 March 2012 16:10, Simen Kjærås <simen.kjaras@gmail.com> wrote:
> On Tue, 13 Mar 2012 03:50:49 +0100, Nick Sabalausky <a@a.a> wrote:
>
>> "Simen Kjærås" <simen.kjaras@gmail.com> wrote in message news:op.wa28iobk0gpyof@biotronic.lan...
>>>
>>> On Sun, 11 Mar 2012 21:07:06 +0100, Walter Bright <newshound2@digitalmars.com> wrote:
>>>
>>>> On 3/11/2012 12:32 PM, Nick Sabalausky wrote:
>>>>>
>>>>> I'm convinced that colleges in general produce very bad programmers.
>>>>> The
>>>>> good programmers who have degrees, for the most part (I'm sure there
>>>>> are
>>>>> rare exceptions), are the ones who learned on their own, not in a
>>>>> classroom.
>>>>
>>>>
>>>> Often the best programmers seem to have physics degrees!
>>>>
>>>
>>> Eugh. Physicist programmers tend to use one-letter variable names in my experience. Makes for... interesting reading of their code.
>>
>>
>> D is great for physics programming. Now you can have much, much more than
>> 26
>> variables :)
>
>
> True, though mostly, you'd just change to using greek letters, right?
>
> Finally we can use θ for angles, alias ulong ℕ...

That might actually make it /more/ readable in some cases.

--
James Miller
March 13, 2012
On Tue, Mar 13, 2012 at 04:10:20AM +0100, Simen Kjærås wrote:
> On Tue, 13 Mar 2012 03:50:49 +0100, Nick Sabalausky <a@a.a> wrote:
[...]
> >D is great for physics programming. Now you can have much, much more than 26 variables :)
> 
> True, though mostly, you'd just change to using greek letters, right?

And Russian. And extended Latin. And Chinese (try exhausting that one!). And a whole bunch of other stuff that you may not have known even existed.


> Finally we can use θ for angles, alias ulong ℕ...

+1.

Come to think of it, I wonder if it's possible to write a large D program using only 1-letter identifiers. After all, Unicode has enough alphabetic characters that you could go for a long, long time before you exhausted them all. (The CJK block will be especially resilient to exhaustion.) :-)

Worse yet, if you don't have fonts installed for some of the Unicode blocks, you'd just end up with functions and variables that have invisible names (or they all look like a black splotch). So it'll be a bunch of code that reads like black splotch = black splotch ( black splotch ) + black splotch. Ah, the hilarity that will ensue...
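(The joke is easy to try today. Python 3, like D, accepts Unicode letters in identifiers, so a toy sketch — names purely illustrative; note Python even NFKC-normalizes identifiers, so some letterlike symbols such as ℕ quietly collapse to their ASCII forms:)

```python
import math

# Unicode letters are legal identifiers in both D and Python 3,
# so the physicists' one-letter habit scales well past 26 names.
θ = math.pi / 6               # finally, an angle named like the math
ω = 2.0                       # angular velocity (illustrative)
v = ω * math.sin(θ)           # tangential-ish velocity component

print(round(v, 6))
```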


T

-- 
It's bad luck to be superstitious. -- YHL
March 13, 2012
On Tue, 13 Mar 2012 06:45:12 +0100, H. S. Teoh <hsteoh@quickfur.ath.cx> wrote:

> On Tue, Mar 13, 2012 at 04:10:20AM +0100, Simen Kjærås wrote:
>> On Tue, 13 Mar 2012 03:50:49 +0100, Nick Sabalausky <a@a.a> wrote:
> [...]
>> >D is great for physics programming. Now you can have much, much more
>> >than 26 variables :)
>>
>> True, though mostly, you'd just change to using Greek letters, right?
>
> And Russian. And extended Latin. And Chinese (try exhausting that one!).
> And a whole bunch of other stuff that you may not have known even
> existed.

I know Unicode covers a lot more than just Greek. I didn't know the usage
of Chinese was very common among physicists, though. :p


>> Finally we can use θ for angles, alias ulong ℕ...
>
> +1.
>
> Come to think of it, I wonder if it's possible to write a large D
> program using only 1-letter identifiers. After all, Unicode has enough
> alphabetic characters that you could go for a long, long time before you
> exhausted them all. (The CJK block will be especially resilient to
> exhaustion.) :-)

63,207[1] designated characters thus far[2]. Add in module names and other
'namespaces', and I'd say that should be no problem at all. As long as
your head doesn't explode, that is.

[1] http://unicode.org/alloc/CurrentAllocation.html

[2] Yeah, not all of those are valid identifiers.
April 01, 2012
The ABI of Linux is good enough; it's based on a mature OS: UNIX.

Forget the name of D; the name is not important.

There is no need for a replacement for C in the OS area, because C is the high-level language that best matches current CPU architecture.
Why is C++ so complex? Because the CPU architecture is not object-oriented.

Energy savings, high performance, development effectiveness: focusing on these areas is the market for D, a "half" systems programming language.
And the big point for growth: being able to call C/C++ binaries from D source.

Sorry about digressing from the subject.


2012/3/10 Walter Bright <newshound2@digitalmars.com>

> On 3/9/2012 3:09 PM, Nick Sabalausky wrote:
>
>> Keep in mind, too, that Linux has decades of legacy and millions of users. That's a *very* different situation from Phobos. Apples and oranges.
>>
>
> Linux has had a habit of not breaking existing code from decades ago. I think that is one reason why it has millions of users.
>
> Remember, every time you break existing code you reset your user base back to zero.
>
> I'm *still* regularly annoyed by the writefln => writeln change in D1 to D2, and I agreed to that change. Grrrr.
>


April 03, 2012
On Saturday, 10 March 2012 at 08:53:23 UTC, Alex Rønne Petersen wrote:
> In all fairness, a stop-the-world GC in a kernel probably *is* a horrible idea.

Doesn't the kernel always work in stop-the-world mode?