History of Computing — 2


It's that time of the week again. The second lecture in the History of Computing class is just about to begin. Today's lecture is on Electronic Computing: 1940–1970. It seems Steve (I'll often refer to him as SMM for brevity) will be doing most of the talking today. BTW, if you're reading this post and are generally interested in the class, you should definitely check out the [[http://cubist.cs.washington.edu/HistoryOfComputing/index.php/Main_Page|class Wiki]], in particular the discussion pages for the lectures, such as [[http://cubist.cs.washington.edu/HistoryOfComputing/index.php/Talk:Lecture_1|this one for the first Lecture]].

Brief recap from last time: the computing industry is a tipping industry — for a given application, several different technologies start out offering solutions, but eventually just one prevails. This leads to monopolies, which may drive innovation in some cases. The world of 1940 is, by our standards, a very divided world (w.r.t. technology). On one hand we have the military machines (which are looking more and more like computers), and on the other we have commercial data processing machines (which still look like punch card processors). Then we have our familiar old debates on patents vs. contracts vs. prizes vs. grants as drivers of innovation.

**Wartime**: People actually start building electronic computers. The big three themes during the time of the war:
* Organizing work the “big science way” (spearheaded by Vannevar Bush)
* Crypto drives computing (read Cryptonomicon!!!)
* Stibitz and ENIAC (didn't quite get the story here, yet) — something about the feasibility of computers

After this SMM went through the basics of vacuum tubes, boolean circuits, flip-flops and such. These are largely factual details, so I'll leave interested readers to peruse Wikipedia. Bell Labs comes up with the transistor (this is slightly disputed, I believe), which does the job of vacuum tubes much better. Talk of Stibitz, the Atanasoff–Berry computer (capacitor memory), and Konrad Zuse (programmable relays).
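As an aside, the flip-flop mentioned above is easy to illustrate in software. Here's a minimal sketch (my own, not from the lecture) of an SR latch, the simplest flip-flop, built from two cross-coupled NOR gates:

```python
def sr_latch(s, r, q=0, qn=1):
    """Simulate an SR latch: two cross-coupled NOR gates.

    s/r are the set/reset inputs; q/qn is the currently stored state.
    Iterate the gates a few times so the feedback loop settles.
    """
    for _ in range(4):
        q = int(not (r or qn))   # Q  = NOR(R, Q')
        qn = int(not (s or q))   # Q' = NOR(S, Q)
    return q, qn

# Set, then hold: with both inputs low, the latch "remembers" the bit.
q, qn = sr_latch(1, 0)         # set   -> (1, 0)
q, qn = sr_latch(0, 0, q, qn)  # hold  -> still (1, 0)
q, qn = sr_latch(0, 1, q, qn)  # reset -> (0, 1)
```

The point of the feedback loop is exactly why flip-flops matter historically: it's the bit of circuitry that lets a purely electronic machine hold state.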

Then comes ENIAC, from the efforts of John Mauchly and Presper Eckert. ENIAC took about 200,000 man-hours to build — an enormously complicated device.

Interesting anecdote: why do computers always have flashing lights in movies? It seems that in ENIAC, whenever a vacuum tube burnt out, a light would come on. So that's where the bulbs have their origin. Another interesting anecdote: ENIAC took 200,000 man-hours. The Panama Canal took 2 million man-hours. In 2004, **9 billion** man-hours were spent by people playing solitaire.

Anyway, ENIAC is followed by the idea of magnetic drums/disks. Von Neumann comes up with the idea of stored programs to drive logical operations. The ENIAC folks left school shortly after the war because the University's patent policy would not let them commercialize their machine, which they wanted to do.

**The First Computer Companies** (post war): Government has new, large data processing needs — weapon physics, crypto and intelligence, air defense. But punch cards stayed popular until 1962. So computers had to be made robust, cheap and reliable before they would see commercial acceptance, which required advances across a lot of technology: internal memory, external memory, the CPU.

Mauchly and Eckert form the Electronic Control Company (later the Eckert–Mauchly Computer Corporation), but run into problems with marketing and capital. The NSA people also realized that they didn't have adequate tech support and documentation to make computers fly, so they turned to Remington Rand. IBM, under the leadership of T. J. Watson (Sr.), decides to go all electronic. He makes it his quest to make IBM the leader in electronic computers (in part so that he could position his son, Watson Jr., for the post of CEO).

IBM comes up with the //Selective Sequence Electronic Calculator// — arguably the first stored-program computer. A lot more machines followed. The Card Programmed Electronic Calculator — the first truly programmable commercial machine. The concept of //user innovation//: Northrop asks IBM to build a machine, but IBM sells it to all its competitors as well.

**UNIVAC**: UNIVersal Automatic Computer. IBM doesn't like using the word "computer" because it associates that word with humans. UNIVAC starts selling the machines to GM, DuPont, US Steel, the US Air Force etc. They cost $1 million apiece, but there were production problems because manufacturing was still clumsy and not industrialized. Univac also ran into other problems: a small sales force, a small field staff, a perpetually changing design. The point being that IBM innovated not just technologically, but also socially — marketing, documentation, sales pitch, lock-in etc.

IBM comes out with the 701 — the "Defense Calculator" — and improves on it with the 704 and 709. The 7090 is **all transistor** — this is the late 50s, the beginning of the shrinking of the computer. Backwards-compatible software also gains prominence at this point. Also, by the 60s users were already writing a lot of code, and people decided it would make sense to share this code — so the beginnings of open source go as far back as 1955. IBM also had a Contributed Program Library (CPL). We're not talking much about programming languages here, but there was tremendous innovation happening there as well. Actually, it's astounding how much was achieved during those two decades. Fortran came out in 1957 and ushered in the era of scientific computing, API stability, user programming etc.

**Whirlwind and SAGE**: Military projects to build giant computers. The AN/FSQ-7 employed 800 programmers — 20% of the world's supply back then. It had **500,000** lines of code — terribly impressive for its time. Large real-time operating systems, multi-programming, modems — all were invented as a result. IBM gets the bid again — it's the "kingpin". Benefits to IBM include expanded manufacturing abilities and even more project bids. Other benefits include the creation of Lincoln Labs and the eventual spinoff of DEC. The sense that big government investments can lead to //permanent value// sets in at this time.

Antitrust suit against IBM (1952–1956): predatory pricing, incompatible hardware/software, buying up patents, using leases to block innovation, binding inventors to exclusive contracts. As a result of this lawsuit IBM opens up the card market, which was a Good Thing™. It fosters competition in repair, service and other affiliated fields.

Random-access storage shows up around 1956 with IBM's RAMAC. IBM put up a demo, "Ask Prof. RAMAC", where people could interact with a computer in real time. This was a big deal, made possible only by random access — earlier machines would have to spool through an entire drum. Around this time people are able to fit 10 transistors into 1 cubic inch of space. So already people knew that, moving forward, the future lay in the ability to pack more and more transistors into a given space. Note that we're now almost at the end of that era — packing in more transistors is not helping all that much any more and will soon become physically impossible.

IBM and the seven dwarfs: some smaller companies show up — NCR, Honeywell, RCA, GE, Sperry, AT&T. Integrated circuits are invented in the late fifties — "printing" out circuits like lithography on a piece of silicon and, more importantly, being able to mass-produce/replicate these circuits in a very cheap and scalable manner. IBM builds the ability to do "firehose R&D" — turn a firehose towards whatever problem comes up next. Sometimes they make mistakes, but they usually get things right.

With the System/360 series, IBM does a lot of things. First, it goes for backwards compatibility across the board — software will run on //any// IBM system. These are the first inklings of a standard hardware/software architecture. Second, it dumped its **huge** installed user base (you'll have to rewrite your software) in favor of grabbing a new one. The installed base grew from 6,000 computers in 1960 to a hundred thousand world-wide in 1970. System/360 had a million lines of code. Now XP looks small to me in comparison :)

System/360 also introduces time sharing and multi-programming under severe pressure from programmers (even though the 360 originally didn't support time sharing). The 370 comes out later, and it's all integrated circuits. Recall that around the 1970 time frame, the 370 series also supported the first ever virtual machines. Unbelievable stuff!

After the 1970s, the world is left without any large competitors to IBM. A temporary end to revolutions? Competitors start to overtake IBM in some areas, such as tapes and disk drives and eventually CPUs. In this era, commerce is slowly displacing the military as the driver of computing. IBM is still dominant, but at the cost of vigorous R&D. Integrated circuits are on the horizon, so computers are about to shrink significantly (and IBM once again leads the way with the advent of the PC in the 80s). The Department of Justice ensures that there are open standards — these are the roots of the open source movement.

At this point Ed takes over for the remainder of the class. He's going to talk about the architectural aspects of the IBM 360. A lot of interesting things:
* I/O offloaded from the CPU to special I/O “channels”
* Tape drives had about “8 feet” of buffer space
* Visually debug the machine by inspection of registers
* Supply the hex address of the “boot” loader program

Before System/360, every machine was **custom**. Pretty much everything about a PC that we take for granted today was introduced by the 360. Variable word lengths and 1-address, 2-address and 3-address instructions were common. Registers tended to be special purpose (accumulator, index, PC). But with the 360 we got the 8-bit byte (vs. 6 bits), byte-addressable memory, 32-bit words, 2's complement arithmetic, an I/O architecture, general purpose registers, a floating point architecture — many, many brilliant trade-offs. The 360 family also had a wide range of performance characteristics: a factor of 25 to 100 across memory/CPU etc. was common.
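To make the 2's complement bit concrete, here's a small sketch of my own (not from the lecture) showing how an 8-bit byte encodes signed values, with negatives wrapping around modulo 2^8:

```python
def to_twos_complement(value, bits=8):
    # Encode a signed integer: negatives wrap around modulo 2**bits.
    return value & ((1 << bits) - 1)

def from_twos_complement(raw, bits=8):
    # Decode: if the sign (high) bit is set, the value is raw - 2**bits.
    return raw - (1 << bits) if raw & (1 << (bits - 1)) else raw

print(to_twos_complement(-1))            # 255 (0b11111111)
print(from_twos_complement(0b11111111))  # -1
```

The elegance that won the day: the same adder circuit handles signed and unsigned numbers, so no separate subtraction hardware is needed.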

Since 1965, cost per byte of storage has improved by a factor of **10 million**. More importantly however, latency has ONLY improved by a factor of 6 at best. This is remarkable — some aspects of the system have improved by many orders of magnitude, while others have barely budged. It also suggests that concepts such as caching are becoming increasingly important.
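Why that latency gap makes caching so important can be seen with a back-of-the-envelope average-access-time calculation (the numbers below are made up purely for illustration):

```python
def effective_access_time(hit_rate, cache_ns, mem_ns):
    # Average access time for a single-level cache:
    # hits cost cache_ns, misses cost the full memory latency.
    return hit_rate * cache_ns + (1 - hit_rate) * mem_ns

# With a 95% hit rate, memory that is 100x slower than the cache
# is felt, on average, as only ~6x slower.
print(effective_access_time(0.95, 1.0, 100.0))  # 5.95
```

In other words, a small fast memory in front of a big slow one hides most of the latency gap — which is exactly why the cheap-but-slow storage trend makes caches indispensable.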

3 comments

  1. Diwaker Gupta

*@sg*: If you’re using a decent browser, you should be able to see the username and password in the authorization dialog itself. Just use ‘csep590a’ as the username and password (without the quotes, of course). It’s just there to deter spam bots.

  2. Corey Donovan

    “The AN/FSQ-7 employed 800 programmers – 20% of the world’s supply back then.”
    – That is utterly startling. Another interesting fact about the AN/FSQ-7 is that it is the largest computer ever, occupying almost a half-acre of space.

    I included the AN/FSQ-7 and SAGE in a recent blog entry: Ten Servers that Changed the World.

    Anyhow, thanks for posting the summaries to these lectures, it looks like a really enthralling course with great presenters.
