October 11, 2019
On Friday, 11 October 2019 at 10:06:46 UTC, Chris wrote:

> Now, this begs the question: To what extent do PLs influence the course of technology (e.g. C in the 80s), and to what extent does the demand / the market created by new technologies influence PLs and their use? It's a bit like the chicken and the egg, ain't it?
>


Oh, another thing: when and how did the general availability of PLs (open source and "powerful" PCs) begin, i.e. when did people (nerds) start to write software at home? This, of course, influenced the course of IT big time. The usual chicken and egg: more powerful machines (reasonably priced) mean more nerds/devs, and more nerds/devs mean more PCs, etc.

Interesting fact (Europe): it wasn't until the late 2000s that companies stopped demanding a CS degree and realized that a lot of people were literate in terms of programming simply because they would do it at home as a hobby. I was really surprised the first time I read something like "degree in CS or experience in XYZ programming". And I remember the flamewars on the internet after Apple had introduced Xcode ("Now every idiot can program, the standard of software will go down!" - didn't happen, btw). Nowadays, if you don't have something like Xcode (cf. Android Studio), you're out of the race. I.e. corporations empower people and lock them in at the same time (market share); people break out with the help of OSS and other corporations. I remember the dark days when OSS was considered the "great unwashed"; now no corporation can do without it. Apple was one of the first, before that Sun with Java? Please correct me if you know more. I don't have the whole picture.

Anyway, corporations create demands and users create demands. It's an interesting feedback loop; to me it's nowhere clearer than in software, and it could be used for courses in economics.
October 11, 2019
On Friday, 11 October 2019 at 08:06:02 UTC, Ola Fosheim Grøstad wrote:
> Keep in mind that 1986 was the heyday of 8-bit computers, low on memory and diskette for storage in a blooming small business

Expanding on this so younger people understand the difference between the 80s and now.

1981-1986 was a period where personal computing became available and small businesses were able to purchase computers (with or without a tiny hard drive). This is kinda the 8/16-bit computing era. Many of these computers shipped with BASIC available. In the beginning, when there was little software available, people could write applications in BASIC and sell them as commercial products. As the market matured, competition got harder and BASIC was no longer an option for commercial products. By the mid-80s there were already thousands of software packages available for the IBM PC, and an unknown but similarly large number for 8-bit home computers. The low memory footprint meant that programs were short and focused on a limited task, and that developers could ship new applications after only a few months of work. On 8-bit home computers, many of the early applications were written in BASIC, then a mix of BASIC and machine language, and as the competition got harder the best apps/games were written in pure machine language to get the most out of very limited hardware. Embedded programming was also typically done in machine language.

The threshold for starting up a small software company was now much lower than for the big mainframes... So a lot of programs were written, on cheap computers, using very crude tools. Some small developers would consider a macro assembler a luxury item...

The old computing manufacturers (most of them) completely missed this boat, and that left them in the dust. They relied on expensive hardware, expensive software, expensive manpower, high margins, small volume, large profits. So they viewed the low-margin, high-volume, small-computer market as something completely separate and somewhat insignificant, and thus "surveys" prior to 1990 are likely to treat the traditional market as the serious computing market, completely separate from the personal computer market.

This didn't go well: IBM evaporated, SUN died, SGI died, DEC evaporated, and so on.

1987-1994 could be viewed as the 16/32-bit era, where non-GC high-level programming also took off in the personal computing space... From 1995 onwards more memory was available, and GC-based high-level programming and web apps started to dominate... This is where D belongs.


Anyway, measuring language popularity is problematic. Programmers do not necessarily like the language they have to use at work, and don't necessarily use the same language at home. (We can assume that people who use D do so because they want to use it. Not so for Ada, which was known to be met with resistance and was most likely adopted primarily to get government contracts.)

Also, far more software is written in Java, JavaScript and PHP than can be seen on GitHub. Many web deployments are closed source.

So... these kinds of "I have statistics" claims are not as objective as they may seem. Getting conclusive results from quantitative analysis requires a very keen mindset and a lot of attention to detail and context. There is no certainty in large numbers... although people find large datasets convincing. A dangerous fallacy...

Want to do data analysis?

1. Find high quality data.
2. Then get high quantity.
3. Then limit your findings based on a solid understanding of the context of the data collection.
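As a toy illustration of these three steps (completely made-up data, just to show the shape of it): count languages only for the slice of the data whose context you actually know, and limit the claim to that context.

import std.algorithm.iteration : filter;
import std.stdio : writeln;

struct Repo { string language; bool openSource; }

void main()
{
    // Hypothetical sample; closed-source deployments are invisible on GitHub.
    auto sample = [
        Repo("JavaScript", true),
        Repo("Java", false),
        Repo("PHP", false),
        Repo("D", true),
    ];

    // Steps 1-2: keep only the part of the data whose provenance we know.
    size_t[string] counts;
    foreach (r; sample.filter!(r => r.openSource))
        counts[r.language] = counts.get(r.language, 0) + 1;

    // Step 3: the honest claim is limited to the known context.
    writeln("counts (open-source repos only): ", counts);
}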

October 11, 2019
On Friday, 11 October 2019 at 10:40:13 UTC, Ola Fosheim Grøstad wrote:
> On Friday, 11 October 2019 at 08:06:02 UTC, Ola Fosheim Grøstad wrote:
>> Keep in mind that 1986 was the heyday of 8-bit computers, low on memory and diskette for storage in a blooming small business
>
> Expanding on this so younger people understand the difference between the 80s and now.
>
> 1981-1986 was a period where personal computing became available and small businesses were able to purchase computers (with or without a tiny hard drive). This is kinda the 8/16-bit computing era. Many of these computers shipped with BASIC available. In the beginning, when there was little software available, people could write applications in BASIC and sell them as commercial products. As the market matured, competition got harder and BASIC was no longer an option for commercial products. By the mid-80s there were already thousands of software packages available for the IBM PC, and an unknown but similarly large number for 8-bit home computers. The low memory footprint meant that programs were short and focused on a limited task, and that developers could ship new applications after only a few months of work. On 8-bit home computers, many of the early applications were written in BASIC, then a mix of BASIC and machine language, and as the competition got harder the best apps/games were written in pure machine language to get the most out of very limited hardware. Embedded programming was also typically done in machine language.
>
> The threshold for starting up a small software company was now much lower than for the big mainframes... So a lot of programs were written, on cheap computers, using very crude tools. Some small developers would consider a macro assembler a luxury item...
>
> The old computing manufacturers (most of them) completely missed this boat, and that left them in the dust. They relied on expensive hardware, expensive software, expensive manpower, high margins, small volume, large profits. So they viewed the low-margin, high-volume, small-computer market as something completely separate and somewhat insignificant, and thus "surveys" prior to 1990 are likely to treat the traditional market as the serious computing market, completely separate from the personal computer market.
>
> This didn't go well: IBM evaporated, SUN died, SGI died, DEC evaporated, and so on.
>
[snip]

Care to write a book? I think you, Paulo Pinto, Walter, and others here could write a good book about it. I find it fascinating how companies like SUN etc. defeated themselves. The mechanisms are intriguing, and as I said it's a great topic for economics. The history of technology is fascinating, from the first time humans controlled fire, through the wheel, to the internet. However, software gives you the chance to study the development of technology in fast motion. Things have developed incredibly fast, but not as fast as they could have. What are the factors? Marketing strategies, narrow-mindedness, etc.
October 11, 2019
On Friday, 11 October 2019 at 11:08:55 UTC, Chris wrote:
>> the best apps/games were written in pure machine language to get the most out of very limited hardware. Embedded programming was also typically done in machine language.

This is actually an urban legend. The applications that needed the most performance in the 1980s were mostly written in C (Borland C was really popular during the 80s), with a few optimized parts done in assembler. Very few programs were done in pure assembler. There wasn't any need to write everything in assembler; only certain optimized loops needed it.

It is simple to check this: just search your old DOS .exe files for "Borland", for example, and you will be surprised how many DOS programs used C during the 80s.
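A minimal sketch of that kind of check in D (hypothetical; it assumes the .exe path is passed on the command line, and any string-search tool does the same job):

import std.algorithm.searching : canFind;
import std.file : read;
import std.stdio : writeln;

void main(string[] args)
{
    // Load the whole .exe and look for the "Borland" signature string
    // that the Borland runtime typically embeds in compiled binaries.
    auto bytes = cast(const(ubyte)[]) read(args[1]);
    auto needle = cast(const(ubyte)[]) "Borland";
    writeln(bytes.canFind(needle)
        ? "Borland signature found"
        : "no Borland signature found");
}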

I suspect, as previously mentioned, that this survey is based on large companies. Ada has a suspiciously large share during the 80s. Also, what is it based on? Per worker, per product, per company? Ada was probably big during the 80s because it was the height of the Cold War, but it still looks a bit too high, I think.
October 11, 2019
On 2019-10-10 18:32, Ethan wrote:
> https://www.youtube.com/watch?v=Og847HVwRSI
> 
> While not surprising, it was still fascinating watching Objective-C come out of nowhere to get onto the list 25 years after it was first released.

The introduction of the App Store on the iPhone, and letting third-party developers create apps for it.

-- 
/Jacob Carlborg
October 11, 2019
On Friday, 11 October 2019 at 11:23:34 UTC, IGotD- wrote:
> On Friday, 11 October 2019 at 11:08:55 UTC, Chris wrote:
>>> the best apps/games were written in pure machine language to get the most out of very limited hardware. Embedded programming was also typically done in machine language.
>
> This is actually an urban legend. The applications that needed the most performance in the 1980s were mostly written in C (Borland C was really popular during the 80s), with a few optimized parts done in assembler. Very few programs were done in pure assembler. There wasn't any need to write everything in assembler; only certain optimized loops needed it.

A lot of programming for 8-bit computers was done in pure assembler. Also, embedded CPUs like the 6800 were quite limited (256 bytes of static RAM), so assembler was in many ways suitable.

16-bit computers like the IBM PC had more RAM, but even people who wrote in Turbo Pascal in the late 80s would drop down to assembler for performance where needed.

> It is simple to check this: just search your old DOS .exe files for "Borland", for example, and you will be surprised how many DOS programs used C during the 80s.

So maybe it wasn't clear from what I wrote, but in my experience there were essentially two segments up to 1986:

1. The limited "16-bit" 8088 CPU with an 8-bit data bus, for IBM and CP/M business-oriented machines that could address more than 64KB.

2. The 8-bit small business/home user segment that often had 16/32/64/128 KB of RAM and Z80 or 6502/6510 CPUs. Although I know that some established developers in later years used cross-compilers when writing for 8-bit CPUs, that wasn't what most did in the early days.

Then later some computers shipped with more than one CPU so that they could be used both for business and for games; wasn't the Commodore 128 one of those? I believe it also had some kind of bank-switching mechanism so that it was possible to access 128KB, but I didn't use that one myself. I have heard that it was used for cross-platform development of Commodore 64 software, so developers would write software on the CBM128 and execute it on the CBM64; obviously that would make it possible to use better tools.

The first assembler I used on the C64 was written in BASIC and loaded from tape. If my machine language program crashed, I would have to reload the assembler from tape... There were better solutions around, like ROM cartridges... but either way, you had to be patient.

IIRC the best-known 8-bit game composer, Rob Hubbard, wrote his music engine in assembler and entered his tunes in hex... These players were very small, as they were added to games that were already cramped for space, and the tunes had to be several minutes long. I believe he sometimes would use RAM areas that sat behind memory-mapped areas (ROM/registers) because memory was so tight.


> I suspect, as previously mentioned, that this survey is based on large companies. Ada has a suspiciously large share during the 80s. Also, what is it based on? Per worker, per product, per company? Ada was probably big during the 80s because it was the height of the Cold War, but it still looks a bit too high, I think.

It certainly seems high if you include countries outside the US. I also strongly suspect many US software development companies in the 80s simply wanted to be able to take on Ada development as a strategic "we have consultants who know Ada" thing, since the US government required Ada; but those programmers might also do C/Pascal in their daily work on other projects...


October 11, 2019
On Friday, 11 October 2019 at 11:23:34 UTC, IGotD- wrote:
>
> This is actually an urban legend. The applications that needed the most performance in the 1980s were mostly written in C (Borland C was really popular during the 80s), with a few optimized parts done in assembler. Very few programs were done in pure assembler. There wasn't any need to write everything in assembler; only certain optimized loops needed it.
>
> It is simple to check this: just search your old DOS .exe files for "Borland", for example, and you will be surprised how many DOS programs used C during the 80s.
>
> I suspect, as previously mentioned, that this survey is based on large companies. Ada has a suspiciously large share during the 80s. Also, what is it based on? Per worker, per product, per company? Ada was probably big during the 80s because it was the height of the Cold War, but it still looks a bit too high, I think.

Big corporations still widely used assembly in the 80s (the suicide rates were highest among assembly programmers - no joke). Some people thought that C wasn't that different, so why bother? However, it soon became clear that a. if the assembly programmer left (or killed himself), nobody else could make sense of the program, and b. although C was 10% slower, squeezing out the last 10% wasn't worth it (law of diminishing returns). I have it on good authority that the civil service still uses assembler in certain areas (revenue). I wonder why?
October 11, 2019
On Friday, 11 October 2019 at 12:00:38 UTC, Chris wrote:
> returns). I have it on good authority that the civil service still uses assembler in certain areas (revenue). I wonder why?

Interesting. Maybe they use assembler because a compiler could inject malicious code?

Doesn't seem likely, but this reminds me of a fairly new research topic, «proof-carrying machine language».

But you are right, I've heard several people state that in the late 80s the Motorola 68000's machine language was so programmer-friendly that there was no real reason to write code in C...

I remember taking a course on operating system kernels where I had the choice to use Motorola 68000 assembly or C, and I did everything in assembly because it seemed both easier and more fun. Actually, I believe I used assembly instead of C on the on-paper exam as well... because it seemed easier, I suppose.

Anyway, both the 68000 instruction set and the first MIPS instruction set are very programmer-friendly. So it all depends, I guess.



October 11, 2019
On Friday, 11 October 2019 at 12:14:02 UTC, Ola Fosheim Grøstad wrote:
> On Friday, 11 October 2019 at 12:00:38 UTC, Chris wrote:
>> returns). I have it on good authority that the civil service still uses assembler in certain areas (revenue). I wonder why?
>
> Interesting. Maybe they use assembler because a compiler could inject malicious code?
>

My guess is that the civil servants who had learned how to program in assembler didn't want to change / retrain, and since they couldn't be fired they continued using assembler, but it might also be a security issue. I know that the public sector often has the oldest systems for several reasons: 1. security and stability: a new system introduces new errors / vulnerabilities and they can't afford to "not work" for a day or two, 2. reluctance of employees to learn something new, 3. old contracts etc. Then again, they have no problem accidentally deleting all your records (has happened to thousands of people). Schools are often very conservative because a. teachers don't want to learn something new (_they_ are the teachers after all, why should they learn anything?), b. the IT guy (who is often a teacher) learned how to use Internet Explorer, and Chrome or Firefox is just too much! Personally, I couldn't live without checking out new technologies.
October 11, 2019
On Friday, 11 October 2019 at 11:08:55 UTC, Chris wrote:
> Care to write a book? I think you, Paulo Pinto and Walter and others here could write a good book about it.

Surely somebody has done so already? There is at least a scientific journal about the history of computing where articles describe old systems in detail in order to record the history for future generations. (I did write an article about the first user-built graphical MUD on the Internet, though. I have to put it on the web some day.)

Actually, that would be a good theme for a YouTube channel.

> I find it fascinating how companies like SUN etc. defeated themselves.

Yeah, SUN and SGI had some great tech ideas, and I assume they also had great engineers, yet it still didn't work out. I wonder what they could have come up with if they had addressed the personal computing space.

They didn't survive networked clusters of commodity Linux PCs and fast, cheap Ethernet interconnects...

> Things have developed incredibly fast, but not as fast as they could. What are the factors? Marketing strategies, narrow-mindedness etc.

Right, and there are some recurring themes.

Like the introduction of the iPad was kinda like the 80s all over again. People got iPads, were fascinated by the hardware, and were looking high and low for applications to run on it, which in the early days were not polished. It was not obvious what they could use it for, so people created many kinds of apps, and users were looking for the next great thing to try out.

That's pretty much what the early personal computing era was like too. People had very little preconception of what was possible with their hardware and would look for new and interesting software to run on it.

Today there seems to be stagnation and lots of copying. The most profitable and marketable ideas are rehashed in 100s, if not 1000s, of variations, and new and unique ideas kind of drown in the noise. So now you not only have to create an interesting, polished app, you also need to understand marketing really well (and have money to do it).

Seems that the early days of new tech are the most interesting times; then we hit a low-creativity equilibrium. Kinda sad... so much potential that is probably overlooked.

There might be similar dynamics in relation to programming languages.
Gonna think about that some.