November 05, 2013
On 11/5/2013 2:05 AM, Chris wrote:
> He also told me about the rule of diminishing returns*. If a well-written C
> program gets you 90% of assembly's performance, leave it at that. If you
> want to get the remaining 10% by using assembly instead, the cost may not be
> worth the returns.

That wasn't really true for 16 bit DOS programs. There was a much greater return for ASM programs - not just speed, but a large size reduction. The latter was critical because of the tight memory constraints.

November 05, 2013
On Tuesday, 5 November 2013 at 19:53:53 UTC, Walter Bright wrote:
> On 11/5/2013 2:05 AM, Chris wrote:
>> He also told me about the rule of diminishing returns*. If a well-written C
>> program gets you 90% of assembly's performance, leave it at that. If you
>> want to get the remaining 10% by using assembly instead, the cost may not be
>> worth the returns.
>
> That wasn't really true for 16 bit DOS programs. There was a much greater return for ASM programs - not just speed, but a large size reduction. The latter was critical because of the tight memory constraints.

I have to admit that I know next to nothing about the old 16 bit DOS programs, and not much about ASM. I'm sure that size reduction was critical in 16 bit times. However, and that's what my old man was talking about, once a programmer left the company, they could bin the program, because only the author knew what was really going on. The new programmer had to start from scratch. It was not just about performance; maintenance was also an issue. And as regards performance, he told me that many ASM programmers would use the same instructions over and over again. They didn't want to use newer, smarter, more efficient instructions. This was partly down to habit/laziness and partly to fear of messing up the program. So the performance gain wasn't really there, and after a few years a good C compiler could probably create better code than ASM written with old and suboptimal instructions.

But I agree, it's good to know what's going on under the hood, how memory works. I remember a talk by a guy from YouTube. He pointed out that young programmers fresh from college are used to 2 GB of memory and Java, only to find out that YouTube has fierce memory restrictions.
November 06, 2013
On 11/5/2013 1:20 AM, H. S. Teoh wrote:
> On Mon, Nov 04, 2013 at 06:12:33PM +0100, PauloPinto wrote:
>>
>> Well I was looking at Z80 Assembly code at the age of 12.
>
> I started programming Applesoft BASIC around that age too,

Yea! Applesoft BASIC!! That was my first, too :D. Learned on the Apple IIc with the pack-in tutorial disks (which I think are *still* better designed than most training materials we have today). Later on I did a whole bunch of QBASIC, and a little GWBASIC.

I tried a bit of Apple Logo too, which was nice as an electronic Spirograph. But it never struck me as being quite as well-suited to interactive programs as BASIC, so I didn't use it much.

I've actually tried to trace back just when it was I started on Applesoft BASIC, and best I can figure I must have been around 7 or 8. I know I had already learned to read (obviously), and I also remember it was definitely before second grade (but not *immediately* before, IIRC). Not intending to compete on ages of course, it's just that when you're a kid and you're at the Grandparent's place for the evening, all the adults are watching the news, and the only thing remotely toy-like is an Apple II...well, what else you gonna do? :)


> and when I
> was 14 or so, I was programming in MOS 6502 assembly language.

Much the same here. I was about 12 or 13 (I remember I was in 7th grade). Although it wasn't so much "assembly language" as it was the machine code memory editor built into the Apple II. Didn't do a lot with it though, because very shortly after that I started *trying* to learn C...


> When
> I was 16 one of my assembly programs was sold in a bookstore. Thereafter
> I moved on to Intel 8088 assembly language. It was only years later, in
> college, that I learned C and C++.
>

Heh, again, somewhat similar: I *tried* to learn C at 13, but found it awkward. So I went back to QBASIC, and later on Visual BASIC 3. When I was around 15, maybe 16, I found this book at the local "Electronics Boutique" as part of a Starter Kit bundle:

http://www.amazon.com/Teach-Yourself-Game-Programming-Cd-Ro/dp/0672305623

That book made C (and pointers) finally "click" for me (plus the occasional 386/486/586 asm in the bottleneck sections). Then at around 16-17, I got a Breakout/Arkanoid clone into a few of those several-budget-games-on-one-CD packages that were in all the computer stores at the time. Didn't make much off it (even McDonald's paid more), but I was thrilled :)

Then in late HS/college I did a Centipede clone that didn't really take off (it wasn't anything spectacular anyway), wound up in web development, and didn't really get back to indie games until now. Sometimes I'm amazed at some of the game-dev tools we have now (like Unity3D), and at other times I feel like a dinosaur playing catch-up. ;) Which is strange - at my age, *nothing* should be making me feel old. And yet...


> I think BASIC introduced me to the concept behind imperative
> programming,

Exactly. I feel the same way about it.


> even if at the time it had almost no structured constructs
> and most programs were just GOTO spaghetti soup. Going from there to
> assembly language was actually not that much of a stretch, and with big
> performance payoffs, too.
>

Yea, I didn't realize it at the time, but years later it occurred to me just how bizarrely similar BASIC and Assembly are, considering that one is seen as "high-level" and the other is the quintessential "low-level".


> Of course, the world has moved on since those days, so nowadays we don't
> usually bother with that level of performance fine-tuning except in
> performance critical bits of code.
>

Strange thing is, I can't decide whether or not I miss it.

I certainly don't think I would want to do that sort of low-level tweaking on modern hardware. With all the performance-related complications and nuances of modern processors (even pipelining is way beyond old news now), not to mention the variation between chips, cycle counting would be a nightmare. Of course, I'm sure cycle counting itself is outdated by now anyway!


> But anyway, w.r.t. the OP, if I were to be in charge of designing a
> curriculum, I'd put assembly language as the first language to be
> learned, followed by a good high-level language like D. On this, I agree
> with Knuth's sentiments:
>
>     By understanding a machine-oriented language, the programmer
>     will tend to use a much more efficient method; it is much closer
>     to reality. -- D. Knuth
>
>     People who are more than casually interested in computers should
>     have at least some idea of what the underlying hardware is like.
>     Otherwise the programs they write will be pretty weird. -- D.
>     Knuth
>

If I were designing a Programming 101 curriculum, I honestly don't know what language I'd pick. In many ways I don't think a lot of the details really matter much. But what I do think are the most important things in a first language are instant gratification and a strong emphasis on flow of execution. Heck, I might even pick Applesoft BASIC ;)

Of course, a lot of the choice would depend on the audience. For graduate-level math students, Haskell would be a real possibility.

November 06, 2013
On 2013-11-05 19:50, deadalnix wrote:

> Well there is understanding and understanding. Yes it is quite common
> that people understand bytecode but not inheritance, from a software
> architecture point of view (not how it works). We even have quite a lot
> of instances of this in this newsgroup.

Put another way:

It seems strange that someone can do bytecode manipulation but not create a subclass and override a method.

-- 
/Jacob Carlborg
November 06, 2013
On 2013-11-05 13:36, Bienlein wrote:

> Ah, you think it seems unbelievable? I thought so too the first
> time. OOP and OOD is not on the job ads any more as earlier. They
> are now filled with things like JSP, JSF, EJBs, JNDI, JTA, JMS,
> SOAP, REST, ICEFaces, Spring, Ajax, OSGi, Spring, Axis, CXF,
> Oracle, Sybase, DB/2, Oracle, MS-SQL, MySQL, Hibernate, Quartz,
> JMeter, XSD, XSLT, JavaScript, etc.

Some of these will imply OO.

-- 
/Jacob Carlborg
November 06, 2013
On 2013-11-05 19:23, H. S. Teoh wrote:

> Well yes. My point was not to force students to write *all* their code
> in assembly, but to give them some experience in writing small(!)
> assembly programs so that they get a taste of how the machine actually
> works under the hood. Once they have that down, I'd move straight on to
> a nice high-level language like D, because in 90% of the code you write,
> you don't *need* the kind of performance direct assembly coding gives
> you.

Sometimes you need assembly not for performance, but because it gives you access to hardware you can't reach otherwise. But that will most likely be a very small part of your code as well.

-- 
/Jacob Carlborg
November 06, 2013
On 2013-11-06 04:37, Nick Sabalausky wrote:

> I've actually tried to trace back just when it was I started on
> Applesoft BASIC, and best I can figure I must have been around 7 or 8. I
> know I had already learned to read (obviously), and I also remember it
> was definitely before second grade (but not *immediately* before, IIRC).
> Not intending to compete on ages of course, it's just that when you're a
> kid and you're at the Grandparent's place for the evening, all the
> adults are watching the news, and the only thing remotely toy-like is an
> Apple II...well, what else you gonna do? :)

Disassemble the TV into small pieces so they can't watch the news :)

-- 
/Jacob Carlborg
November 06, 2013
On Tue, Nov 05, 2013 at 10:37:38PM -0500, Nick Sabalausky wrote: [...]
> I've actually tried to trace back just when it was I started on Applesoft BASIC, and best I can figure I must have been around 7 or 8. I know I had already learned to read (obviously), and I also remember it was definitely before second grade (but not *immediately* before, IIRC). Not intending to compete on ages of course, it's just that when you're a kid and you're at the Grandparent's place for the evening, all the adults are watching the news, and the only thing remotely toy-like is an Apple II...well, what else you gonna do? :)

Funny, I got an Apple II when I was 8, and was mostly just playing games on it. When I was 10 or 11, I got so sick of playing games that I decided to learn programming instead (i.e., to write my own games :P). Sadly, I never got very far on that front.


[...]
> Heh, again, somewhat similar: I *tried* to learn C at 13, but found it awkward. So I went back to QBASIC, and later on Visual BASIC 3. When I was around 15, maybe 16, I found this book at the local "Electronics Boutique" as part of a Starter Kit bundle:
> 
> http://www.amazon.com/Teach-Yourself-Game-Programming-Cd-Ro/dp/0672305623
> 
> That book made C (and pointers) finally "click" for me (plus the
> occasional 386/486/586 asm in the bottleneck sections).

Really? I understood pointers right away 'cos they were just fancy terminology for addresses in assembly language. :)


> Then at around 16-17, I got a Breakout/Arkanoid clone in a few of those several-budget-games-on-one-CD packages that were in all the computer stores at the time. Didn't make much off it, (even McDonald's paid more) but I was thrilled :)

Nice! I don't think I ever got that far in my attempts to write games at the time. I was bogged down with having too many radical ideas without the skills to actually implement them. :P


[...]
> > But anyway, w.r.t. the OP, if I were to be in charge of designing a curriculum, I'd put assembly language as the first language to be learned, followed by a good high-level language like D. On this, I agree with Knuth's sentiments:
> >
> >     By understanding a machine-oriented language, the programmer
> >     will tend to use a much more efficient method; it is much closer
> >     to reality. -- D. Knuth
> >
> >     People who are more than casually interested in computers should
> >     have at least some idea of what the underlying hardware is like.
> >     Otherwise the programs they write will be pretty weird. -- D.
> >     Knuth
> >
> 
> If I were designing a Programming 101 curriculum, I honestly don't know what language I'd pick. In many ways I don't think a lot of the details really matter much. But what I do think are the most important things in a first language are instant-gratification and a strong emphasis on flow-of-execution. Heck, I might even pick Applesoft BASIC ;)

True. For a total beginner's intro to programming, assembly is probably a bit too scary. :P  Applesoft might have been a good choice but its age is definitely showing. I dunno. Maybe D? :) At least, the simplest parts of it. But for computer science majors, I'd say dump assembly on them and if they can't handle it, let them switch majors since they won't turn out to be good programmers anyway. :P


> Of course, a lot of the choice would depend on the audience. For graduate-level math students, Haskell would be a real possibility.

IMNSHO, for graduate-level math students I would *definitely* start with assembly language. It would serve to dispel the misconception that the machine is actually capable of representing arbitrary natural numbers or real numbers, or computing the exact value of transcendental functions, etc. Mathematicians tend to think in terms of idealized entities -- infinite precision, infinite representation length, etc., all of which are impossible on an actual computer. In order to program effectively, the first order of business is to learn the limitations of the machine, and then learn how to translate idealized mathematical entities into the machine's limited representation in such a way that it would be able to compute the desired result.

In fact, now that I think of it, I think the first lecture would be to convince them of how much computers suck -- can't represent natural numbers, can't store real numbers, can't compute transcendental functions, don't have infinite speed/time so only a small subset of mathematical functions are actually computable, etc.

Then the second lecture will explain what the computer *can* do. That would be when they'd start learning assembly language, and see for themselves what the machine is actually doing (and thereby firmly dispelling any remaining illusions they may have about mathematical entities vs. what is actually representable on the machine).

Then maybe at the end of the course, after having built up a realistic notion of what the machine is capable (and incapable) of, explain how things may be put together to produce the illusion of mathematical computation, say with a functional language like Haskell.

Later on, when they learn the theory of computation, I'd emphasize the fact that even though we computer people speak of Turing completeness all the time, actually no such computer exists that is *actually* Turing-complete, because all physical computers have finite storage, and therefore are no more than glorified finite-state machines. :) Of course, that in no way makes the theory of Turing machines useless -- it's a useful idealized abstraction -- but we shouldn't be under the illusion that we actually have access to the capabilities of an actual Turing machine with an infinite tape. Some things are computable in theory, but outright infeasible in practice. Like computing the Ackermann function of Graham's number, say. :-P


T

-- 
What do you call optometrist jokes? Vitreous humor.
November 06, 2013
Somewhat of a coincidence, my first *real* programming language was C++ (though I'd used GML/Visual Basic quite extensively before that), and I learned it primarily using Bartosz Milewski's book "C++ In Action: Industrial Strength Programming"[0]. It was from his site that I first learned about D.

[0]: http://www.relisoft.com/index.htm
November 08, 2013
On 11/6/2013 1:32 PM, H. S. Teoh wrote:
> On Tue, Nov 05, 2013 at 10:37:38PM -0500, Nick Sabalausky wrote:
> [...]
>> I've actually tried to trace back just when it was I started on
>> Applesoft BASIC, and best I can figure I must have been around 7 or
>> 8. I know I had already learned to read (obviously), and I also
>> remember it was definitely before second grade (but not
>> *immediately* before, IIRC). Not intending to compete on ages of
>> course, it's just that when you're a kid and you're at the
>> Grandparent's place for the evening, all the adults are watching the
>> news, and the only thing remotely toy-like is an Apple II...well,
>> what else you gonna do? :)
>
> Funny, I got an Apple II when I was 8, and was mostly just playing games
> on it. When I was 10 or 11, I got so sick of playing games that I
> decided to learn programming instead (i.e., to write my own games :P).
> Sadly, I never got very far on that front.
>

I didn't have many games on the Apple II (It was already getting a bit dated when I was using it, so software was hard to find. Ironically, it's much easier to find software for it now thanks to the Internet and an easy-to-build PC <-> Apple II serial cable.) But my brother and sister and I loved the rabbit game that came with it on one of the tutorial disks. Later on, I did also have 2400 A.D. (I've always loved that style of graphics) plus all the BASIC games I typed in from the "how to program in BASIC" books at the library.

Initially, the tutorial disks and BASIC were pretty much all I had to do on the system, so that's what I did :)

>
> [...]
>> Heh, again, somewhat similar: I *tried* to learn C at 13, but found
>> it awkward. So I went back to QBASIC, and later on Visual BASIC 3.
>> When I was around 15, maybe 16, I found this book at the local
>> "Electronics Boutique" as part of a Starter Kit bundle:
>>
>> http://www.amazon.com/Teach-Yourself-Game-Programming-Cd-Ro/dp/0672305623
>>
>> That book made C (and pointers) finally "click" for me (plus the
>> occasional 386/486/586 asm in the bottleneck sections).
>
> Really? I understood pointers right away 'cos they were just fancy
> terminology for addresses in assembly language. :)
>

Admittedly, I hadn't gotten very far with the machine code I had done. Mainly just plotting some (giant) "pixels" to the lores screen.

IIRC the *main* thing about pointers I had trouble with was (no pun intended): What's the point? From what I had read in the intro to C books, they were always just described as another way to refer to a named variable. So I thought, "Uhh, so why not just use the actual variable instead?" Also the whole notion of "buffers" just seemed...advanced.

But then when I started to understand that arrays were nothing more than pointers, it all just "clicked". (It wasn't until many years later I realized arrays aren't *always* mere pointers depending on the language. But by then I'd already understood pointers and memory anyway.)

>
>> Then at around 16-17, I got a Breakout/Arkanoid clone in a few of
>> those several-budget-games-on-one-CD packages that were in all the
>> computer stores at the time. Didn't make much off it, (even McDonald's
>> paid more) but I was thrilled :)
>
> Nice! I don't think I ever got that far in my attempts to write games at
> the time. I was bogged down with having too many radical ideas without
> the skills to actually implement them. :P
>

Finishing is indeed one of the hardest parts. I've always had far more unfinished projects than finished. At the time, I probably never would have finished those games if it weren't for the prodding of the project's producer.

>
>> Of course, a lot of the choice would depend on the audience. For
>> graduate-level math students, Haskell would be a real possibility.
>
> IMNSHO, for graduate-level math students I would *definitely* start with
> assembly language. It would serve to dispel the misconception that the
> machine is actually capable of representing arbitrary natural numbers or
> real numbers, or computing the exact value of transcendental functions,
> etc.. Mathematicians tend to think in terms of idealized entities --
> infinite precision, infinite representation length, etc., all of which
> are impossible on an actual computer. In order to program effectively,
> the first order of business is to learn the limitations of the machine,
> and then learn how to translate idealized mathematical entities into the
> machine's limited representation in such a way that it would be able to
> compute the desired result.
>

That's actually a very good point.

> In fact, now that I think of it, I think the first lecture would be to
> convince them of how much computers suck -- can't represent natural
> numbers, can't store real numbers, can't compute transcendental
> functions, don't have infinite speed/time so only a small subset of
> mathematical functions are actually computable, etc..
>

Very good way to start a course, really. Grabs the students' attention. It was "unexpected approach" course introductions like that that always made me think "Ok, now THIS may turn out to be a pretty good class..."


> Then the second lecture will explain what the computer *can* do. That
> would be when they'd start learning assembly language, and see for
> themselves what the machine is actually doing (and thereby firmly
> dispelling any remaining illusions they may have about mathematical
> entities vs. what is actually representable on the machine).
>
> Then maybe at the end of the course, after having built up a realistic
> notion of what the machine is capable (and incapable) of, explain how
> things may be put together to produce the illusion of mathematical
> computation, say with a functional language like Haskell.
>
> Later on, when they learn the theory of computation, I'd emphasize the
> fact that even though we computer people speak of Turing completeness
> all the time, actually no such computer exists that is *actually*
> Turing-complete, because all physical computers have finite storage, and
> therefore are no more than glorified finite-state machines. :) Of
> course, that in no way makes the theory of Turing machines useless --
> it's a useful idealized abstraction -- but we shouldn't be under the
> illusion that we actually have access to the capabilities of an actual
> Turing machine with an infinite tape. Some things are computable in
> theory, but outright infeasible in practice. Like computing the
> Ackermann function of Graham's number, say. :-P
>

Yea, that'd be a really good class.