Floating Sun » computing — http://floatingsun.net

**History of Computing — 8** (Diwaker Gupta, Fri, 29 Jun 2007)
http://floatingsun.net/2007/06/29/history-of-computing-8/

//These are some unfinished notes from the history of computing class. They’ve been lying around in my drafts for a very long time and I thought I should just push them out. They don’t make much sense though. I’m hoping to fill in some more details later.//

Today’s speaker is Ray Ozzie. He’s going to talk about collaborative software.

Some guy whose name I couldn't catch invented the plasma display panel and also the first version of a touch screen. He was motivated by computers for education; not really a computer scientist, just an eccentric. The terminal was called PLATO IV.

Ozzie's slides are pretty interesting because they don't contain any text. Almost every slide is simply an image, while he talks through the context and the anecdote behind it.

Ted Nelson authored two (comic?) books called "Computer Lib" and "Dream Machines". He also coined the term "hypertext".

**History of Computing — 6** (Diwaker Gupta, Thu, 02 Nov 2006)
http://floatingsun.net/2006/11/01/history-of-computing-6/

As you can see, I didn't blog about the last HoC class, largely because I had to leave early for Surtaal practice, but it was a wonderful lecture and I encourage all those who are interested to look at the video/slides. Armando Fox is a great speaker — his slides were both amusing and insightful. I missed Steve's section on antitrust, but I think I've heard that material from him in the earlier two courses.

Anyways, for today's class we have none other than [[wp>Steve_wozniak|Steve Wozniak]]. He's shorter than I thought. He's not using slides, but he's speaking enthusiastically and passionately, so it's really enjoyable just listening to him.

He's basically off on an extended riff about his life, which is fair I guess, since he had a pretty big role in the development of the personal computer as we know it today (which happens to be the subject of today's class). His childhood was pretty fascinating. When he was in 5th grade, his dad took him to technology fairs — so at that early age he already knew about circuits and transistors. By the time he was 12 years old, he was already building simple circuits — for example, he built an electronic tic-tac-toe game.

By the time he was in high school, he was so advanced in electronics that his high school teacher arranged for him to take electronics classes at a nearby college. He also started programming in Fortran while still in high school — probably one of the few kids in the entire world programming a computer at that time. One of the first programs he wrote was the [[wp>Knights_tour|Knight's Tour]].
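For context, the Knight's Tour asks for a sequence of knight moves that visits every square of a board exactly once. A minimal backtracking sketch (my own illustration in Python — obviously not Woz's Fortran):

```python
# Knight's Tour by plain backtracking on a 5x5 board (small enough to
# solve quickly; an open tour from the corner exists on 5x5).
N = 5

# The eight possible knight moves.
MOVES = [(2, 1), (1, 2), (-1, 2), (-2, 1),
         (-2, -1), (-1, -2), (1, -2), (2, -1)]

def knights_tour(board, r, c, step):
    """Mark square (r, c) as visit number `step`; recurse until done."""
    board[r][c] = step
    if step == N * N - 1:          # every square visited
        return True
    for dr, dc in MOVES:
        nr, nc = r + dr, c + dc
        if 0 <= nr < N and 0 <= nc < N and board[nr][nc] == -1:
            if knights_tour(board, nr, nc, step + 1):
                return True
    board[r][c] = -1               # dead end: backtrack
    return False

board = [[-1] * N for _ in range(N)]
assert knights_tour(board, 0, 0, 0)  # a tour from the corner exists
```

Larger boards need a smarter move ordering (e.g. Warnsdorff's heuristic), but the 5x5 case falls to brute force.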

He then got his hands on the manual for the PDP-8 and, fascinated by it, started designing his own computers. Of course he could never build and test those designs, but still, think about it — a high school kid, actually //designing// a full-fledged computer!

He says that by the end of high school, he had designed and re-designed so many computers, and //optimized// so many existing designs, that he was convinced his designs were more compact and efficient than any others out there. He knew more than could be taught in any electrical engineering course in college.

He was planning to go to a tech school like MIT for college, but on a visit to Colorado he was so enamored with the snow that he decided to apply only to Colorado. He was crazy about programming, and in his first year he cost the college a lot of money in computer bills. He had to take a year off after his second year (spent at De Anza, not Colorado), working for a company to make enough money to finish college and buy a car.

Wozniak met Jobs sometime during his college years. He says Jobs was more "liberated" than he was (in the context of the Hippie movement prominent in California at the time). He went to Berkeley for his third year of college. They made "blue boxes" with which they could tap into the telephone network and make phone calls anywhere in the world without having to pay anything.

(//At this point, the "lecture" really sounds like excerpts from iWoz, not academic prose on the advent of the PC from Apple's vantage point. It'll probably fit the lecture better as things proceed, but right now it's simply an interesting story.//)

During their Berkeley days, Jobs and Wozniak thought of selling the blue box on campus. So they did some door-to-door marketing, never got caught, and sold a bunch of blue boxes.

Later Woz got his hands on one of the first legendary HP calculators (the ones that could do non-trivial math — calculus, for example). HP later hired Woz as an electronics engineer to work on calculator designs. To test his calculator designs, Woz also worked on //simulators// for the calculators. HP had one huge shared computer to run these testing and debugging routines.

He was a //hacker par excellence// — he built a LOT of stuff on his own: a pinball machine, his own electronic Pong game (complete with TV display and controls). Out of this Pong machine he designed a keyboard-video interface that was a cheaper and better replacement for the expensive teletype machines. Jobs and Woz would later sell this terminal design to a Mountain View based company.

Later he designed a one-player Pong variant (Breakout) in //4 days// that he and Jobs later sold. Recall that in those days games were not like other programs — things had to be hard-wired, with the logic coded in hardware. Very little software abstraction.

Woz was part of the "Homebrew" computer club, where people interested in microprocessors met and talked about things. Jobs came to Woz saying they should start a company to build self-contained components — a printed circuit board (PCB) with the basic functionality like keyboard input/output, video display etc. Jobs secured a $50,000 order for their first PCB (which morphed into the Apple I).

Woz designed the Apple II from scratch. It would be the first low-cost small computer with a high-resolution color display. He in fact wrote the entire BASIC interpreter //by hand// because he couldn't afford computer time to type it in and compile it. Jobs basically ran the business side of the company, while Woz led the technical side.

The first Apple ][ didn't have a floppy drive or floating-point BASIC. So they later built a floppy drive controller (in two weeks) but got floating-point BASIC from Microsoft, which Woz says was a big mistake for Apple because it was licensed from M$ for 5 years. Apple also wrote the first "home" oriented application — a checkbook manager, probably the world's first consumer finance application.

**History of Computing — 4** (Diwaker Gupta, Thu, 19 Oct 2006)
http://floatingsun.net/2006/10/18/history-of-computing-4/
Today we have the legendary [[wp>Butler_lampson|Butler Lampson]] as the guest speaker. He needs no introduction, but for those who are unfamiliar, he was one of the founding members of Xerox PARC. He's been intimately involved with the development of a lot of technologies we take for granted today — Ethernet and WYSIWYG interfaces, among a lot of other things. Xerox, which is commonly known for its photocopiers, actually did really phenomenal work back in the 70s and 80s, most of it coming out of PARC (the Palo Alto Research Center).

The talk is on "Xerox PARC, Workstations, and Distributed Computing". Dr. Lampson has started off very informally, just talking through how it all began. They wanted to build some systems but couldn't find any industrial partners, so they went out and did a startup. Now, startups in 1969 were very different from startups these days: they had two faculty members, some grad students, even one undergrad when it began.

A lot of things inspired the PARC folks: Alan Kay's Flex Machine, Sketchpad, the Arpanet and the Aloha packet radio network. They had the grand vision of "an electronic office". They were very interested in human-computer interfaces and how to improve the interaction. Around the same time, Doug Engelbart over at SRI did a lot of work on //augmenting// human intellect — improving the way humans work using computers. The predominant theme was computers as //tools// for helping people think and communicate. Note that here we start to see a paradigm shift, because before this time computers were largely seen as calculating machines — mere number crunchers.

**Personal Distributed Computing**: Again, note two really important themes that we are only now seeing unfold around us in the larger sense: personal computing (think laptops and PDAs and iPods) and networking (think Zune's WiFi, 802.11, broadband). The other revolutionary notion was of //end-users// as non-programmers. Users simply became //consumers// of the hardware-software package. This was a fundamental shift in thinking — earlier, users had largely been programmers, or scientists needing numbers crunched for them, not users in the sense we know and understand today — think Skype or Photoshop or AutoCAD.

At some point Xerox bought [[http://en.wikipedia.org/wiki/Scientific_Data_Systems|SDS]] for almost a //billion dollars// in 1969!! Lampson says that this particular deal outdoes any of the ridiculous deals of the dot-com bubble. Xerox ended up shutting down the SDS division a few years later at a loss of hundreds of millions of dollars. One of the worst deals in the history of computing.

The mouse was //not// invented at PARC — Doug Engelbart's team did the mouse. Xerox //did// spend a lot of time thinking about how to present information to users in a friendly manner. The [[http://en.wikipedia.org/wiki/Alto_%28computer%29|Alto]] was a result of this effort — in some sense the world's first really personal computer. The PARC folks also seriously believed in eating their own dog food, so they always used what they built. Also, in those days, people were fast and machines were slow. It's amazing how little interfaces have evolved in the past 3 decades. Now that machines are so much faster than people, we probably need to look into other ways of interacting with computers that can take advantage of all this computing power.

There were a bunch of largely factual slides on the Alto, Ethernet and laser printers that I won't talk about much. They also built networked printers and the first print server (printing over Ethernet). Bravo was the WYSIWYG editor that was the precursor to M$ Word and other word processors. Smalltalk pioneered the idea of overlapping windows. The Grapevine system pioneered the idea of eventual consistency in distributed systems. The Laurel email client popularized the 3-pane window for email clients.

Markup was the precursor to Paint-like bitmap-based programs (nothing like Gimp or Photoshop, though). Markup also had the equivalent of modern-day popup menus. They also did a lot of work on fonts and font editors. "Draw" was a vector-based drawing program. I'm just amazed by the sheer volume and quality of systems these folks built within a few years.

PARC did "Boca Raton" in 1976, a big show-and-tell conference for the executives. The goal was to get Xerox to make products, and as a result Xerox created the Systems Development Division. Unfortunately the execs at Xerox didn't quite "get it". Although the electronic printing business (which made them billions of dollars and was born out of the Alto) was a huge success, the Xerox Star office system was a big commercial failure.

Well, I gotta take off for Sur Taal practice now, so I’ll catch up on the rest of the lecture later.

**History of Computing — 3** (Diwaker Gupta, Thu, 12 Oct 2006)
http://floatingsun.net/2006/10/11/history-of-computing-3/
I’m a little late for today’s class and I will be leaving a little earlier as well, so the notes for this class may be just a tad bit incomplete (not like the notes for the rest of the lectures were //perfect//!). Of course, today’s class is special because we have the legendary [[wp>Gordon_Bell|Gordon Bell]] as the guest speaker.

Bell started by giving a brief overview of the four main themes of his talk: Moore’s law and Bell’s law, commoditization of technology and computer generations, DEC and minicomputers.

The first couple of slides are on DEC itself: how it was born out of MIT's Lincoln Lab with VC-like funding from American Research and Development (ARD). The business plan was to design, manufacture and sell logic modules, and use the earnings to build computers. DEC was legendary in its time for quickly adapting to rapidly changing technology. Note that in those days the //hardware// was rapidly evolving — these days companies struggle to keep up with just the //software// changes. DEC survived the coming and going of technologies — microcomputers, integrated circuits leading to VLSI, and so on. And of course DEC made the PDP (Programmed Data Processor) series, without which we probably wouldn't have had UNIX :)

(//Aside//: I don't like Mr. Bell's slides all that much. Too much text — the slides lose focus. And what he says about each slide doesn't really highlight a focal point.)

The first PDP was delivered to BBN. The PDP-6 had timesharing. DEC also built the VAX (//Virtual Address eXtension// of the PDP-11), one of the first systems with a concept of //virtual memory//. It's surprising that DEC was around as long as it was — I somehow don't remember hearing a lot about DEC. Compaq (now part of HP) acquired DEC in 1998.

The next couple of slides talk about the PDP-1, the PDP-5 etc. in a little more detail. Nothing particularly insightful here — mostly factual details and some interesting anecdotes. The PDP-1 had 18-bit words and a 4-kiloword memory. The PDP-5 had 12-bit words. So we can see that there was no sense of a common platform or architecture or instruction set at that time. The PDP-6 (the timesharing super-machine) could serve 128 terminals at once, and had a peak memory capacity of 262,144 words.

The PDP-11 was 16-bit (here we see the magic number 8 coming up) and the VAX was one of the first 32-bit computers. Recall that the IBM 360 sort of established the 8-bit byte convention; before that it was all very ad hoc.

The “mini-computer” actually denotes the //minimal//-computer: take the current technology, and combine it into a package with the smallest cost (so more like a bare-bones system in today’s parlance).

**Bell’s law of Computer Generations (or “classes”)**

Classes of computers form and evolve just like modes of transportation, restaurants etc. They are governed by a lot of economics-based laws, such as the "network effect", learning curves, the marginal cost of hardware/software etc. We've had the following classes so far: building-sized computers, mainframes, minicomputers, workstations, PCs, laptops, PDAs, sensor motes? How will future computers be built? **Scalable networks and platforms**.

Every decade a new class emerges with an order of magnitude better cost-performance ratio. New classes lead to new apps, which lead to new industries.

Alright time to take off now. I’ll be missing the VAX strategy onwards part of the class. Probably will fill in later after consulting the wiki. Cheers!

**History of Computing — 2** (Diwaker Gupta, Thu, 05 Oct 2006)
http://floatingsun.net/2006/10/04/history-of-computing-2/

It's that time of the week again — the second lecture in the History of Computing class is just about to begin. Today's lecture is on Electronic Computing: 1940 — 1970. It seems Steve (I'll often refer to him as SMM for brevity) will be doing most of the talking today. BTW, if you're reading this post and are generally interested in the class, you should definitely check out the [[http://cubist.cs.washington.edu/HistoryOfComputing/index.php/Main_Page|class Wiki]], in particular the discussion pages for the lectures, such as [[http://cubist.cs.washington.edu/HistoryOfComputing/index.php/Talk:Lecture_1|this one for the first Lecture]].

Brief recap from last time: the computing industry is a tipping industry — for a given application, several different technologies start out providing solutions, but eventually just one technology prevails. This leads to monopolies, which may drive innovation in some cases. The world of 1940 is very divided by our standards (w.r.t. technology): on one hand we have the military machines (looking more and more like computers), and on the other the commercial data processing machines (which still look like punch card processors). Then we have our familiar old debates on patents vs. contracts vs. prizes vs. grants as drivers of innovation.

**Wartime**: People actually start building electronic computers. The big three themes during the time of the war:
* Organizing work the “big science way” (spearheaded by Vannevar Bush)
* Crypto drives computing (read Cryptonomicon!!!)
* Stibitz and ENIAC (didn't quite get the story here, yet) — something about the feasibility of computers

After this, SMM went through the basics of vacuum tubes, boolean circuits, flip-flops and such. These are largely factual details, so I'll leave interested readers to peruse Wikipedia. Bell Labs comes up with the transistor (this is slightly disputed, I believe), which is a much better replacement for vacuum tubes. Talk of Stibitz, Atanasoff-Berry (condenser memory), Konrad Zuse (programmable relays).

Then comes ENIAC, from the efforts of John Mauchly and Presper Eckert. ENIAC took 200,000 man-hours to build — an enormously complicated device.

Interesting anecdote: why do computers always have flashing lights in movies? It seems that in ENIAC, whenever a vacuum tube burnt out, a light would come on — that's where the bulbs have their origin. Another interesting comparison: ENIAC took 200,000 man-hours; the Panama Canal took 2 million man-hours; in 2004, **9 billion** man-hours were spent by people playing solitaire.

Anyways, ENIAC is followed by the idea of magnetic drums/disks. Von Neumann comes up with the idea of stored programs to drive logical operations. The ENIAC folks left school shortly after the war because the University's patent policy would not let them commercialize their machine, which they wanted to do.

**The First Computer Companies** (post war): The government has new, large data processing needs — weapons physics, crypto and intelligence, air defense. But punch cards stayed popular until 1962, so computers had to be made robust, cheap and reliable before they would see commercial acceptance. Hence advances in a lot of technology to make it all happen: internal memory, external memory, CPUs.

Mauchly and Eckert founded the Eckert-Mauchly Computer Corporation, but ran into problems with marketing and capital. The NSA people also realized that they didn't have adequate tech support and documentation to make computers fly, so they turned to Remington Rand. IBM, under the leadership of T. J. Watson (Sr.), decides to go all electronic. He makes it his quest to make IBM the leader in electronic computers (in part so that he could position his son, Watson Jr., for the post of CEO).

IBM comes up with the //Selective Sequence Electronic Calculator// — an early stored-program machine. A lot more machines followed. The Card-Programmed Electronic Calculator was the first truly programmable commercial machine. The concept of //user innovation//: Northrop asks IBM to build a machine, but IBM sells it to all the competitors as well.

**UNIVAC**: UNIVersal Automatic Computer. IBM doesn't like using the word "computer" because it associates that word with humans. UNIVAC starts selling machines to GM, DuPont, US Steel, the US Air Force etc. They cost $1 million apiece, but there were production problems because manufacturing was still clumsy and not industrialized. Univac also ran into other problems: a small sales force, a small field staff, a perpetually changing design. The point being that IBM innovated not just technologically, but also socially — marketing, documentation, sales pitch, lock-in etc.

IBM comes out with the 701 — the "Defense Calculator" — and improves on it with the 704 and 709. The 7090 is **all transistor** — this is the late 50s, the beginning of the shrinking of the computer. Backwards-compatible software also gains prominence at this point. Also, by the mid-50s users were already writing a lot of code, and people decided it would make sense to share it — so the beginnings of open source go as far back as 1955. IBM also had a Contributed Program Library (CPL). We're not talking much about programming languages here, but there was tremendous innovation happening there as well. Actually, it's astounding how much was achieved during those two decades. Fortran came out in 1957 and ushered in the era of scientific computing, API stability, user programming etc.

**Whirlwind and SAGE**: Military projects to build giant computers. The AN/FSQ-7 employed 800 programmers — 20% of the world's supply back then. It had **500,000** lines of code — terribly impressive for its time. Large real-time operating systems, multiprogramming, modems — all were invented as a result. IBM gets the bid again — it's the "kingpin". Benefits to IBM include expanded manufacturing abilities and even more project bids. Other outcomes include the creation of Lincoln Labs and the eventual spinoff of DEC. The sense that big government investments can lead to //permanent value// sets in at this time.

The antitrust suit against IBM (1952–1956): predatory pricing, incompatible hardware/software, buying up patents, using leases to block innovation, binding inventors to exclusive contracts. As a result of this lawsuit IBM opened up the card market, which was a Good Thing™ — it fostered competition in repair, service and other affiliated fields.

Random access storage (IBM's RAMAC) shows up around 1956. IBM put up a demo, "Ask Prof. RAMAC", where people could interact with a computer in real time. This was a big deal, made possible only by random access — earlier machines would have to spool through the entire drum. Around this time people can fit 10 transistors in 1 cubic inch of space, so already people knew that, moving forward, the future was in the ability to pack more and more transistors into a given space. Note that now we're almost at the end of that era — dumping in more transistors is not helping all that much any more and will soon become physically impossible.
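The drum-vs-random-access difference can be sketched in a few lines (a toy model of my own, not from the lecture): sequential spooling touches every record on the way to the target, while random access jumps straight to it.

```python
# Toy model: a drum must be read sequentially, record by record,
# while random-access storage indexes straight to the target.
records = [f"record-{i}" for i in range(1000)]

def drum_read(target):
    """Sequential access: spool past every record until we hit `target`."""
    steps = 0
    for i, rec in enumerate(records):
        steps += 1
        if i == target:
            return rec, steps

def ram_read(target):
    """Random access: one direct index operation."""
    return records[target], 1

# Reading the last record costs 1000 steps sequentially, 1 step randomly.
worst_seq = drum_read(999)
worst_ram = ram_read(999)
```

Real-time demos like "Ask Prof. RAMAC" only became feasible once lookups stopped scaling with the position of the data.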

IBM and the seven dwarfs: some other smaller companies show up — NCR, Honeywell, RCA, GE, Sperry, AT&T. Integrated circuits are invented in the late fifties — "printing" out circuits like lithography on a piece of silicon and, more importantly, mass producing/replicating those circuits in a very cheap and scalable manner. IBM builds the ability to do "firehose R&D" — turn a firehose towards whatever problem comes up next. Sometimes they make mistakes, but they usually get things right.

With the System/360 series, IBM does a lot of things. First, it goes for backwards compatibility across the board — software will run on //any// IBM system. These are the first inklings of a standard hardware/software architecture. Secondly, it dumped its **huge** installed user base (you'll have to rewrite your software) in favor of grabbing a new user base. IBM grew from 6,000 computers in 1960 to a hundred thousand world-wide in 1970. System/360 had a million lines of code — now XP looks small to me in comparison :)

The System/360 also introduced time sharing and multiprogramming under severe pressure from programmers (even though the 360 originally didn't support time sharing). The 370 comes out later, and it's all integrated circuits. Recall that around the 1970 time frame the 370 series also supported the first ever virtual machines. Unbelievable stuff!

After the 1970s, the world is left without any large competitors. A temporary end to revolutions? Other companies start to overtake IBM in areas such as tapes, disk drives and eventually CPUs. In this era, commerce slowly displaces the military as the driver of computing. IBM is still dominant, but at the cost of vigorous R&D. Integrated circuits are maturing, so computers are about to shrink significantly (where IBM once again leads the way with the advent of the PC in the 80s). The Department of Justice ensures that there are open standards — these are the roots of the open source movement.

At this point Ed is taking over for the remainder of the class. He's going to talk about the architectural aspects of the IBM 360. A lot of interesting things:
* I/O offloaded from the CPU to special I/O “channels”
* Tape drives had about “8 feet” of buffer space
* Visually debug the machine by inspection of registers
* Supply the hex address of the “boot” loader program

Before the System/360, every machine was **custom**. Pretty much everything about a PC that we take for granted today was introduced by the 360. Variable word lengths and 1-address, 2-address and 3-address instructions were common. Registers tended to be special purpose (accumulator, index, PC). But with the 360 we got the 8-bit byte (vs. 6 bits), byte-addressable memory, 32-bit words, 2's complement arithmetic, an I/O architecture, general purpose registers, floating point — many, many brilliant trade-offs. The 360 family also had a wide range of performance characteristics: factors of 25 to 100 were common across memory/CPU etc.
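The 2's complement convention the 360 helped standardize is easy to illustrate (my own example, not from the lecture): an n-bit pattern with its top bit set represents the raw value minus 2^n.

```python
# 2's complement: the signed-integer encoding standardized by machines
# like the S/360. An n-bit pattern with the sign bit set represents
# raw - 2**n; encoding is just truncation to n bits.

def to_twos_complement(value, bits=8):
    """Encode a (possibly negative) integer as an n-bit raw pattern."""
    return value & ((1 << bits) - 1)   # wrap into n bits

def from_twos_complement(raw, bits=8):
    """Decode an n-bit raw pattern back into a signed integer."""
    if raw & (1 << (bits - 1)):        # sign bit set -> negative
        return raw - (1 << bits)
    return raw

assert to_twos_complement(-1) == 0b11111111   # all ones encodes -1
assert from_twos_complement(0b11111111) == -1
```

The elegance is that ordinary unsigned addition, truncated to n bits, gives the correct signed result — one adder serves both interpretations.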

Since 1965, the cost per byte of storage has improved by a factor of **10 million**. More importantly, however, latency has improved by a factor of only about 6 at best. This is remarkable — some aspects of the system have improved by many orders of magnitude, while others have barely budged. It also suggests that concepts such as caching are becoming increasingly important.
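That latency gap is exactly why caching pays off. A quick back-of-the-envelope sketch using the standard average-access-time model (the nanosecond figures below are illustrative assumptions, not numbers from the lecture):

```python
# Two-level memory model: an access either hits the fast cache or
# falls through to slow memory. Average time blends the two.

def avg_access_time(hit_rate, t_cache_ns, t_mem_ns):
    """Average access time for a given cache hit rate."""
    return hit_rate * t_cache_ns + (1 - hit_rate) * t_mem_ns

# Even a 90% hit rate collapses a 100 ns memory into ~11 ns on average.
for hit_rate in (0.0, 0.90, 0.99):
    print(f"hit rate {hit_rate:.2f}: "
          f"{avg_access_time(hit_rate, 1, 100):.2f} ns average")
```

Since raw latency barely improves, the only lever left is raising the hit rate — which is why cache hierarchies keep getting deeper.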

**History of Computing — 1** (Diwaker Gupta, Thu, 28 Sep 2006)
http://floatingsun.net/2006/09/27/history-of-computing-1/

Ed Lazowska began the class by giving a very brief overview of the class and whirlwind tour of some of the guest speakers who would be talking here. The great thing about the class is that we’ll get to hear about the history of computing from people who actually //created// a lot of the history. After some more introduction, Steve Maurer took over.

I really like listening to this guy. He talks like one writes a book: he's articulate, he's fluid, he keeps talking, and he almost always makes sense. I think that's a rare quality for a public speaker, especially if you have to talk for 2-3 hours.

In general, we want a society where the value of inventions is consistently higher than their cost — that way we always increase the net value (or some metric of //happiness//) of society. **All known incentive mechanisms are imperfect**: there's no one incentive that works for everyone in all situations, so we need to pick incentives depending on the //context//. Technology has improved, but society has learned to consume more data as well. Inventions happen when a //consumer// (someone who is facing the problem) also has the technical know-how to put things together and come up with a solution. Rarely do these things occur in isolation.

Again, a recurring theme is that the incremental cost of technology has decreased, driving innovation and competition. 60 years ago, making a second computer was just as expensive as making the first one. But the ability to cheaply replicate components (think integrated circuits) is what led to the proliferation of the personal computer.

What was the value of "computing" in prehistoric times? Government uses, data for the military. Ancient libraries came up as a research tool, to enable knowledge transfer. Then we have the dark ages, where a data-loving society collapses into one where no records are kept, followed by the renaissance and a revival in warfare and commerce. Tycho Brahe and Kepler got a machine built to compute trigonometric functions for their astronomical needs.

Then comes Pascal, a mathematical prodigy. Interesting anecdote: he was deformed at birth, and his father, who was a tax official, thought some client he hadn't served well had bewitched his son. Pascal, as everyone knows, built the first mechanical calculator. (Did you know that calculations for the atom bomb were also done on mechanical calculators?) Fifty "Pascalines" were built through 1652 and sold for 100 "livres" each. The society of the time wasn't "ready" to use such a technology.

In the late 19th century, the largest and most complex data processing consumers were the banks and the railways. Society's data consumption was **exploding**. This was also the time of the rise of big businesses (precursors to today's corporations) like Nabisco and Travelers Insurance. William Burroughs of the Burroughs Adding Machine Company made a lot of money from these calculating machines — they had 58 models, one for each line of business, with fast and huge market penetration and significant demand.

//Ex post// efficiency says that once the invention exists, the idea should be free for all. //Ex ante// efficiency, on the other hand, is about motivating people to invent in the first place by letting them retain a monopoly once the invention has been made.

Punch cards were motivated by musical organs (which used raised pins — holes are just the opposite). The target market was to improve handlooms and automate the process of weaving using punch cards. Patents are not always good — toys can often lead to significant inventions, and if all toys were patented we would never make progress. There was also a debate between the British and the French on the subject of patents vs. prizes. Photography was born out of a prize, and at the time it was being stifled by patents; the prize route ended up setting industry standards for photography technology.

Then the era of Charles Babbage and Ada, Countess of Lovelace. Babbage designed the difference engine; a prototype was only built after Babbage’s time. Babbage lost interest in the difference engine because he came up with the //analytical engine// — a steam-powered, **programmable machine**. It’s almost like the first von Neumann, stored-program kind of device.

Herman Hollerith and the age of IBM. Driven by the census challenge of the 1880s. He taught at MIT, and built the Hollerith machine to “read” punch cards which held census data. Interestingly, Hollerith had a //service model// with the Government for the Census: he “rented” machines for $1000/year, with $10 for downtime — in part because they didn’t trust the technology that much. He did two censuses (1880 and 1890). Around the 1900 census, people started grumbling about the monopoly of Hollerith machines.

Then Hollerith went around selling the idea to corporations (e.g., the New York Central railways), adding functionality to the machine as demands showed up (only counting was needed for the Census). He spun off the “Tabulating Company” (venture capitalism?) out of this effort, and eventually Hollerith machines ended up all over the world. They started building peripheral businesses around it — accumulators, keypunches, sorters, etc. This led to the definition of interfaces, standards, modularity, and all the other concepts that we take for granted.

Hollerith institutionalized this ability to //find and discover// potential use cases and customers within his company. Later years saw the decline of Hollerith’s market leadership, followed by a merger. Interesting discussions are borne out of this turn of events: does monopoly fund innovation? How do monopolies and innovation stack up against one another? Joseph Schumpeter came up with the notion of Schumpeterian competition, where capitalism leads to monopolies, and monopolies innovate but only last until the next technological revolution. And therefore, since in the long run we value technological progress more, monopolies are not only inevitable, they’re actually OK so long as there is competition at the innovation level.

Finally we have Vannevar Bush and the Memex. Vannevar Bush made the first power amplifier (it was mechanical!). He also built a differential analyzer, which was later improved upon by Claude Shannon. One thing to note is that throughout the past 100–200 years, a lot of the innovation came from grant money and universities. This is a key point, one that we shall have occasion to revisit in the future.

Back to the Hollerith thread. The CTR company became IBM in 1924, with T.J. Watson as the CEO. Patent wars started as early as 1921. Practices such as customer lock-in and competing standards also raised their heads during this time. IBM used its patent position to force Remington-Rand into joint monopoly pricing. We see this kind of thing again in the 1980s, when IBM let independent vendors build the IBM-PC.

Competition spurred a lot of R&D, primarily in scientific calculators. At about the same time (the 1930s), HP was born. Electronic computers didn’t become part of IBM’s portfolio until the 1960s: mechanical contraptions led to electromechanical machines, which led to electronic ones. In the early days, the computer industry was a capital-intensive market. These days the dynamics have //completely// changed. Software is probably the least capital-intensive industry, and therefore the most volatile. When markets are small, there is no long tail — the winner takes all. But when markets are big and there is a lot of choice, long-tail effects come into play and being small can actually be profitable.

An interesting point just came up: why do some big companies make long-term, research-oriented investments (IBM, AT&T, M$) while others don’t (Cisco, Dell)? One point Steve mentioned is that a lot of these companies head into these long-term investments because they’re not operating in a competitive economy. AT&T was insulated from all market forces, as were IBM and M$ in their heyday. Of course, a lot of competition also drives research, but even there, small companies really can’t afford such investments.

Howard Aiken built the Mark I (Babbage’s dream come true). Estimated price: $100,000 (1939). Final price tag: $200,000 (1943). And this was only 60 years ago!

]]>