4004 plus 40

Missed this by a couple of months…November 15, 2011, was the 40th anniversary of the Intel 4004, the world’s first microprocessor. The history of this extremely influential device provides an interesting case study in innovation.

Early computers were constructed out of discrete components, first vacuum tubes and later transistors. Early work on transistors was done at Bell Labs…one of the inventors, William Shockley, became dissatisfied with Bell’s management and left to start his own company, which he located in Palo Alto to be near his mother’s house. (If Shockley’s mom had lived in Roanoke, would the term “Silicon Valley” now refer to the Shenandoah Valley?!)

Eight of the new company’s employees (“the traitorous eight”) in turn became unhappy with the way Shockley was running things, and left in 1957 to form Fairchild Semiconductor as a division of Fairchild Camera and Instrument. The integrated circuit, which allowed several transistors to be placed on a single chip, was independently invented at Fairchild and at Texas Instruments. Large numbers of these chips still had to be interconnected to form the central processing unit of a computer.

By 1968, several of the “fairchildren” were less than thrilled with the way things were going at Fairchild, and left to start their own companies. Two of these, Gordon Moore and Robert Noyce, founded Intel with initial capital of $2.5 million, which was provided by pioneering venture capitalist Arthur Rock. Intel’s original focus was memory components – their initial product could store 64 whole bits on a single chip!

In 1969, a Japanese company called Busicom approached Intel to make some custom chips for a new calculator it was developing. Intel employee Ted Hoff was not impressed with Busicom’s design and suggested an alternative: make the device internally programmable and reduce its complexity by implementing the calculator functionality as computer code rather than as electronic hardware. Federico Faggin, who joined Intel in 1970, achieved the difficult task of designing a rudimentary but complete CPU that would fit on a single chip. Masatoshi Shima, a Busicom engineer, also contributed to this work.

Intel astutely decided to buy back the general design and marketing rights to the 4004 from Busicom, and the latter company, which had been interested in the project only as a source of components for its calculator, sold the rights in exchange for a fee of $60,000 and a lower price for the chips.

The initial applications of the 4004 were not computers in any recognizable sense, but rather things like arcade games, voting machines, and a rudimentary word processor. Later versions of Intel microprocessors were also heavily used for embedded-control applications, including controllers for traffic lights. The first microprocessor-based personal computer was the Altair 8800, a $400 kit introduced in 1975 by a company called MITS. With the introduction of the VisiCalc spreadsheet program for the Apple II in 1979, and the launch of the IBM personal computer in 1981, the PC rapidly moved into the position of an essential business tool.

The introduction of the microprocessor had a huge impact on the structure of the computer industry. The CPU itself was now cleanly separable from the rest of the computer system, and moreover its production was subject to huge economies of scale. Traditional minicomputer companies such as Digital Equipment were crippled, while new companies such as Dell and Apple, which made computers based on off-the-shelf microprocessors, could now be started at relatively low cost. With multiple companies producing computers using the same CPU family, the outlook for independent software companies became much rosier: Microsoft was the preeminent exploiter of this opportunity. And beyond the computer industry per se, microprocessors have provided the intelligence for a considerable realm of devices, ranging from home appliances to machine tools to children’s toys.

It’s interesting to note that both Intel AND Microsoft developed/acquired their key products (the 4004 and PC DOS) as for-hire projects from other companies, but neither Busicom nor IBM was astute enough to avoid signing away the rights to these products in exchange for, at best, quite small amounts of money.

22 thoughts on “4004 plus 40”

  1. It would be interesting to know what the Busicom and IBM executives concerned were thinking. For Busicom/Intel, maybe they didn’t see the microprocessor as a generically valuable product – I would guess that they were so wrapped up in trying to make their own product a success that they didn’t think much about broader applications of the technology.

    In the case of IBM/Microsoft, I’d guess they were so self-confident in the impending market dominance of their own PC that they thought DOS opportunities outside of their own ecosystem would be minimal.

  2. It’s fascinating how microcomputers went from zero to bankrupting older industries in a matter of about 15 years, just as digital cameras have done. There may well be a new invention tomorrow that has the same effect on some other industry by 2030 or sooner. Interesting times.

  3. Actually, the statement “Traditional minicomputer companies such as Digital Equipment were crippled” does, as Robin says, compress time quite a bit. DEC’s glory years were still to come. If the Altair was introduced in 1975, then that was 2 years before the VAX, which was DEC’s cash cow for a decade.

  4. YARA…”Actually, the statement “Traditional minicomputer companies such as Digital Equipment were crippled” does, as Robin says, compress time quite a bit. DEC’s glory years were still to come.”

    True. It’s the Wile E. Coyote moment, when he’s able to walk on air for a while before the fall happens.

  5. Nicholas…”Amazing to consider that it was designed manually, in comparison to today’s processor designs which would be impossible without CAD.”

    This article reproduces the metal layout which was used for the 4004, signed with the designer’s initials FF.

    The article also suggests that there may have been a couple of microprocessor designs which slightly predated the 4004, one from Texas Instruments and the other from Four-Phase Systems, but that the 4004 was the first to be actually commercialized.

  6. In San Jose they have a computer museum that I keep meaning to see – as you said, David, when the microprocessor first came into being, making a small “computer” wasn’t on the design list.

    A good friend of mine programmed for Xerox in the late 70s – they were integrating the microprocessor into some of their large printers.

    Putting himself in the same position Steve Wozniak was in at HP, Larry tried to talk Xerox into designing a PC (and he had the same success as “The Woz” did at HP).

    Amazing thing, the microprocessor.

    Wonder how long “Moore’s Law” will continue.

    “The number of transistors incorporated in a chip will approximately double every 24 months.”
    – Gordon Moore, Intel Co-Founder
    (from intel.com)
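
    Out of curiosity I tried that doubling rule as a quick back-of-the-envelope in Python – just a sketch, using the commonly cited figure of roughly 2,300 transistors for the 4004 (my assumption, not part of Intel’s quote):

      # Moore's law as compound doubling: count doubles every 24 months,
      # starting from the 4004's ~2,300 transistors in 1971.
      def transistors(year, base=2300, base_year=1971, doubling_years=2):
          return base * 2 ** ((year - base_year) / doubling_years)

      for y in (1971, 1981, 1991, 2001, 2011):
          print(y, f"{transistors(y):,.0f}")
      # 2011 -> ~2.4 billion, which is in fact the right order of
      # magnitude for a high-end CPU of that year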

    Funny thing in the “more things change” dept….

    Giant mainframes from the 50s-60s were so large – and generated so much heat – they had to be liquid cooled, much like a car’s radiator.

    Processors got smaller – ran cooler – to the point they could be cooled by fans and air (although the heat sink and fan on my AMD are kind of comical – the size dwarfs the actual CPU) – but processors are getting so fast we are starting to see some liquid cooling again – although of course on a much smaller scale.

  7. Bill…”Giant mainframes from the 50s-60s were so large – and generated so much heat – they had to be liquid cooled, much like a car’s radiator.”

    Their very size, as well as their newness, surely contributed to their psychological effect. I’m in the early stages of developing a post on technology and employment, and was just reading a book published by Fortune in 1966 called “The Myths of Automation,” which gently mocks some of the more extreme hype about technological change. They quote the writer Jacques Ellul as asserting that the decision to recall General MacArthur was made by an “electronic brain” called EAC which had been developed by the National Bureau of Standards…that EAC had solved equations containing “all the strategic and economic variables of (MacArthur’s) plan” and determined that it was unfeasible.

    The Fortune book notes that the correct name of the computer at NBS was actually SEAC…the total memory capacity of this machine was about 3 kilobytes. But it sure did look a lot more impressive than your laptop does.

  8. David – there was something awfully impressive about walking into a 60s-70s computer room with its false flooring (to hide all the cabling), own cooling system (they were usually pretty cool rooms), 6-10 giant boxes all humming, assorted tape drives spinning….

    The people who worked in that environment looked rather disdainfully at the new microprocessors – I had a boss years ago who worked at Aerojet – designer of some of the Apollo lunar rockets – and he thought that nothing could come of the then-new PCs.

    “Toy Computers”, he called them….

    Having straddled both eras – I learned on a small IBM mainframe, a 370 (one of the smaller models) – I understood where he was coming from. Programming people from that era were only half-jokingly described as a “priesthood” – now with $100 compilers (or free ones with a free Linux system) programming has become mainstream.

    But the capability of the micros now…. I am going to install Linux on my little PC – actually have a dual-boot system – Linux on the primary 500GB drive, my trusty Win XP on a new terabyte drive (I refuse to “upgrade” to Win 7, thinking it intrusive in the way it handles file security) – anyway – with 2GB of memory….

    To think that the Bank of America in the 80s had a 370/168 controlling 1000s of ATMs and talking to 1000 branches … all on 128MB of main memory. Disk drives in those days were measured in megabytes and cost 100s of thousands of dollars.

    Now I just buy a new terabyte drive for $130 – it would have been far cheaper if Thailand hadn’t had the flood –

    Granted the GUI – Graphical User Interface – the mouse and all that – is a tremendous memory & disk hog – but still – my little Linux machine is really a powerful little mainframe – the B of A people would have been amazed…

  9. I learned on a small IBM mainframe – a 370

    If you learned on the 7094, there were no small 370s. Real men could program in 16K. Pretty soon I’ll need a cane.

  10. I’m a lurker not a commenter, but I can’t resist setting the record straight about Microsoft & IBM. My info source was a PBS history of Silicon Valley show I saw about 20 years ago, but Wikipedia agrees with my memory (for what it’s worth).

    IBM never had any rights to DOS (which is what I consider MS’ key product). When IBM decided to get into the small computer market, they decided to go entirely ‘off the shelf’ instead of developing everything internally because they wanted to get something out quickly (plus IBM was under govt anti-trust supervision & terrified of doing anything that looked monopolistic). So they went to MS to license BASIC (their top product at the time) & said btw, we want to license your computer operating system too. But MS didn’t have one & had to direct IBM to Gary Kildall, the father of CP/M. But when IBM couldn’t come to an agreement with Kildall, MS went to a company where an employee had reverse-engineered a form of CP/M called QDOS (quick & dirty OS) & bought all the rights (from Seattle Computer Products, according to Wikipedia) & never looked back. IBM’s big mistake was not getting an exclusive license for DOS, which allowed the PC-clone industry to develop in the first place, when MS started licensing it to any computer maker willing to pay. That was Bill Gates’ one true stroke of genius in my opinion.

    badgerwx

  11. Mrs Davis – You are right – the smallest 370 was still pretty impressive! I think mine was a 370/135. It had something like 64K memory – semiconductors were just coming out and they had an auxiliary cabinet holding more memory – a 6′ tower holding 256K – not MB but KB!

    It also had those neat disk drives – in fact I think IBM developed the disk drive – but these had their own heads in a removable disk pack.

    Badgerwx – the history of DOS is even more interesting – short version – IBM had gone to Monterey to see Gary Kildall (remember CP/M?) – he was out and his wife was in the office – she was afraid to sign a non-disclosure agreement (IBM was going to make the new PC) – so in exasperation the IBM execs phoned Armonk to see what to do.

    They were told of a guy in Seattle – known for making a BASIC interpreter for micros – and maybe he could help

    Well Bill Gates recognized an opportunity when it came – he didn’t have an OS but knew who did – a product called QDOS – for Quick and Dirty Operating System – he bought the rights to it from – I think – Seattle Computer – for $50,000 – cleaned it up a bit, and it became what we know as MS-DOS.

    (sorry I am replicating some of your story – I am in a hurry – reread it – but IBM’s biggest mistake IMO was not insisting on a good piece of Microsoft in exchange for their using MS-DOS)

    Now to load Linux onto my PC!

  12. badgerwx…I’m not seeing any contradiction in what we’re saying. IBM paid Microsoft to acquire nonexclusive rights to an operating system; Microsoft paid Seattle Computer Products to acquire what was apparently an exclusive license to their product, which MS then cleaned up and enhanced and made available to IBM on the aforesaid nonexclusive basis.

    If IBM had demanded say, a 20% perpetual royalty on all MS sales of DOS and derivative products, I doubt that Gates would have turned them down. Expensive omission.

  13. Man, some of those posts bring back memories. I started programming Fortran on an IBM 1620. It didn’t have disk drives – too expensive – so everything was done using punch cards. You loaded the operating system from a deck of cards. Then you loaded the Fortran compiler from a deck of cards, followed by your program on a deck of cards. The compiler would punch out a new bunch of cards that represented the machine-language version of your program. I went from that to programming on an IBM 360/30, a real mainframe with 2311 disk drives the size of washing machines that held 7.25 megabytes. In the 80s I worked on minicomputers and then transitioned to PC-based client-server apps. These days, it’s all C#, HTML5, JavaScript and jQuery.

    I just bought a 16GB iPod Nano. It is the size of a watch and I wear it in a watchband. I calculated that it would take a stack of punch cards 27 miles high, or a stack 148 IBM 2311 disk drives high, to store 16GB.
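
    For anyone who wants to redo the punch-card arithmetic, here is a rough Python sketch – the 80-bytes-per-card and 0.007-inch thickness figures are my assumptions, and the exact mileage depends on the card stock you assume:

      # How tall is 16 GB punched onto 80-column cards?
      BYTES = 16 * 10**9           # iPod Nano capacity
      BYTES_PER_CARD = 80          # one character per column
      CARD_THICKNESS_IN = 0.007    # typical card stock, in inches

      cards = BYTES / BYTES_PER_CARD
      miles = cards * CARD_THICKNESS_IN / (12 * 5280)
      print(f"{cards:,.0f} cards, a stack ~{miles:.0f} miles tall")
      # -> 200,000,000 cards, about 22 miles; slightly thicker
      #    card stock gets you to the 27-mile figure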

  14. Before we move on from this thread – wanted to add a few things. While I think DEC can be credited with inventing the first minicomputer (an archaic term now), Burroughs was close behind with their B700. My Dad bought one of the first – around 1974 – it had 40K of core memory – for an extra $8,000 or so (1974 dollars!) you could get an additional 8K.

    The disk drives were removable platters of 2MB. The heads were constantly crashing because the platters were removable and the heads – when not in the platter – would be exposed to the elements.

    When the head crashed you actually heard it – like a record scratching.

    We didn’t have a console but a teletype – although later B700s had a console.

    It took the entire night to compile a COBOL program.

    Hard to believe these days.

    It was not a multitasking OS – input was limited to the console.

    However we had a paper-punch machine – it would punch coded holes in paper tape – not cards – and you could thus prepare more input “off line”. A tape reader was attached to the CPU.

    If I recall correctly (I am learning to preface things this way on Chicago Boyz as there are so many knowledgeable posters!), Burroughs was known as a software innovator while IBM seemed more hardware-oriented. Perhaps that is not even accurate to say about IBM, as at one time Forbes listed the 50 largest computer companies – and IBM, of course, was #1 – and it took the other 49 – Honeywell, Burroughs, Data General (remember them?), DEC, etc. – to equal in revenue mighty IBM.

    I think it was Burroughs who invented virtual memory – something commonplace today.

    IBM invented the disk drive (as far as I know!)

    Mrs Davis was remarking how real programmers used to know how to use every byte in memory – quite true – and they wrote in assembler (after, I guess, binary!) – with gigabytes of memory programmers can afford to be sloppy (and OSes can too – I was thinking of Windows).

    During my flying days I used to have a friend at the FBO (for general aviation the FBO is kind of an all-service shop – usually refueling, sales, service, training) – anyway this fellow was the most unassuming person you could meet – short, stocky, crew cut.

    Turns out he had a PhD from MIT and developed some of the first compilers – on some of the earliest computers (I think ENIAC) – but he later worked for Honeywell (yes, they made mainframes at one time).

    He told me of the times where you would account for every byte in precious main memory – making sure you freed it when it was no longer needed for an instruction so it would be available.

    Finally I can’t finish this without telling you of my meeting the legendary Grace Hopper – the driving force behind COBOL.

    This was the first language that was designed to be independent of the computer – the Navy asked her to design it so software rewriting across platforms would be at a minimum.

    I attended a dinner with her – well, me and 1,000 others – in San Diego in the early 80s.

    She was a character – elderly by this time but still on active duty in a Navy admiral’s uniform – said in hotels people would mistake her (in uniform) for the bellhop. She was a visionary – saying that in the future when you wanted more processing power you would simply hook up more computers.

    She said “If you have a horse-drawn wagon and need more power, you don’t get a bigger horse – you get more horses!”

    If she could see network computing now – like Google. I heard they have something like 100,000 microcomputers all working in tandem (as far as I know!)

    She also would ask people if they knew “what a nanosecond looked like”

    She’d then produce a piece of string about 1-2 feet long – saying “this is a nanosecond” – the distance electricity travels in that time.
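
    (A quick check of her arithmetic in Python – this sketch uses the vacuum speed of light; signals in real wire travel somewhat slower, so her foot-or-so of string was about right:)

      # How far does light travel in one nanosecond?
      C = 299_792_458                 # speed of light, meters per second
      inches = C * 1e-9 / 0.0254      # distance covered in 1 ns, in inches
      print(f"~{inches:.1f} inches per nanosecond")   # ~11.8 inches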

    She was a true visionary.

  15. I had forgotten her story about the first bug, David (in the list of quotes) – she did tell that story at the dinner too! If I am not mistaken (the dinner was almost 30 years ago!), she was somehow involved with that Harvard computer.

    I think that original “bug” is in the Smithsonian!

    Man I hope I am thinking that young in my 80s – it really was a memorable dinner and I was fortunate to go.

    There are plenty of people half her age who think far older than she did – she was always looking forward to the future…

    Thanks for bringing up those quotes – I had to bookmark them.

  16. When I was a first year Physics undergraduate at Imperial in 1976 we had a 4004 evaluation kit in the labs on which we could try writing ludicrously crude programs by keying them in at the “front panel” (a board with switches on it, as if it were a minicomputer). But we could see where it would lead.
