The Microprocessor is 50

Andy Kessler, writing in the WSJ, notes that the Intel 4004 microprocessor was introduced to the market on November 15, 1971.

Here’s a history of the project, and here’s an article which describes some of the related projects that were going on at other companies.

Busicom’s decision to give Intel the rights to sell the 4004 for non-calculator applications in exchange for a break on prices – rather than royalties or equity or warrants in Intel – has to rank as one of the most expensive business mistakes ever.


8 thoughts on “The Microprocessor is 50”

  1. Back in the 80s, we had the 2nd largest computer club in the country, after Boston. In those days they were full of microprocessor enthusiasts. In fact I think Apple’s co-founders were in the Homebrew club in San Jose.

    Anyway, among the monthly speakers coming to our venue were Bill Gates and Andy Grove of Intel. I should have recorded their talks…

    As far as bad business decisions, who can say what would have happened had Busicom retained the rights? Would they have seen – or acted on – the same opportunities? It is easy to think of alternative scenarios in a linear fashion and assume that things would have turned out the same way they did.

    I have in the last few months pondered what our relationship with Communist China would be today had Nixon and Kissinger not pushed for trade. Would a desperate China have caused a nuclear war (they got The Bomb in 1964), or would they even be communist today?

    For a few years nobody could see the purpose of a personal computer. It was IBM who really made it respectable for businesses. When Tandy came out with the TRS-80, the joke was “What do you do with it?” And the answer was “keep recipes”. And the answer to that was “so can a metal box”.

    The story of how Microsoft got the contract to develop the OS is legendary. It involves the owner of the then-dominant OS, CP/M, being on a sailboat out of reach and IBM reps then calling a young Bill Gates, who had no OS but knew someone who did (for $50,000).

    What if IBM had demanded that Microsoft give them a half interest?

  2. There are so many decisions from that time that reverberate to this day. What if IBM had chosen Motorola instead of Intel, with its complicated (from a programming standpoint) memory segmenting? (There’s a sketch of that segment arithmetic just below.) The reason was mostly cost, plus the assumption that nobody would ever need more than 640K. Then there was the decision to include GW-BASIC at a time when compilers ran on minicomputers, not microcomputers. It took Apple several years to come up with a self-hosted programming environment for the Mac, so you didn’t have to buy a very expensive Lisa to write a program.

    The 4004 was a dead end as far as computers were concerned, but it and its descendants and clones lived on in countless VCRs and microwaves.
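    For anyone who never had to program it, here is a minimal sketch in C of the 8086 real-mode segment arithmetic referred to above. The shift-and-add rule is the documented 8086 behavior; the helper function and the sample addresses are just for illustration.

        #include <stdint.h>
        #include <stdio.h>

        /* 8086 real mode: a 16-bit segment register is shifted left 4 bits
           and added to a 16-bit offset, yielding a 20-bit physical address
           (the sum wraps at 1 MB on the original 8086). */
        static uint32_t physical(uint16_t segment, uint16_t offset)
        {
            return (((uint32_t)segment << 4) + offset) & 0xFFFFF;
        }

        int main(void)
        {
            /* Two different segment:offset pairs naming the same byte.
               This aliasing, plus the 64K ceiling on any one segment, is
               what made pointers complicated from a programming standpoint. */
            printf("%05X\n", (unsigned)physical(0xB800, 0x0000)); /* B8000 */
            printf("%05X\n", (unsigned)physical(0xB000, 0x8000)); /* B8000 */
            return 0;
        }

    It is also why DOS-era C compilers sprouted near/far/huge pointer “memory models”, and why no single data object could exceed 64K without contortions.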

  3. Bill…I didn’t mean Busicom retaining the rights and marketing the microprocessor themselves, rather, getting compensated for the $ they had put into the development in form of an ongoing financial interest in the microprocessor product’s success, rather than as a price reduction on the chips for their calculator. Similar to what IBM could have done with the Microsoft OS.

  4. @MCS

    Don Estridge made the 8088 vs. 68000 decision for the 5150 purely for schedule reasons. No other. The original hardware spec was for the vastly superior 68000/68008 from Motorola. The lead motherboard designer had a working motherboard from another project based on the 8085, so he was able to knock about 8 to 12 weeks off the schedule it would have taken to design a 68000 motherboard from scratch, even though the Motorola support chips were clean and well behaved, unlike the 8086/88 support chips, which were a typical mess.

    But despite all that, the final decider was that Motorola was honest in giving ramp-up times for delivery of the 68000 in 100K quantities, whereas Intel as usual promised immediate delivery of the 8088 in quantity and, as usual, was both late and slow in ramping up volume delivery. So in the end there was little time difference from the original Motorola guaranteed delivery dates.

    IBM going with the 8088 was a terrible technology decision. Catastrophic. The 8086 architecture was already obsolete in 1965, let alone in 1978. But without that decision by Estridge, Intel would have been just another failed RAM maker with a small sideline in mostly mediocre processors.

    The 68000 was the model that pretty much all subsequent processors followed. It was clean, consistent, and easy to learn. And the delivered CPU was bug-free from the get-go. Over the last 35-plus years I have tried to avoid lower-level x86 at all costs. Programming it is very frustrating and beyond stupid. I recently had to write a large chunk of x86 assembly language for a compiler, and on several occasions I just wanted to go down to Santa Clara with a baseball bat to express my appreciation for their technical incompetence. Very straightforward code sequences that I wrote in 68K assembler in 1984 (and 6502 in 1978) were still impossible in x86 (but could be finessed in x64). The AMD people at least are technically competent.

    Intel has always been just like Microsoft: purveyors of technical mediocrity. But as it’s a low-information market with very high lock-in costs, the mediocre usually wins. Although the mobile platform has worked out well so far, with both ARM and Android winning big and Intel and MS losing big time.

    As for Mac-hosted development in the early days: the first MacOS shrink-wrap product I worked on, in 1984, was cross-compiled C from a VAX with a custom downloader stub on a 128K Mac. Got a 512K later on for QA. Native work was done in MDS assembler from Apple. No Lisa Workshop involved.

    The next project, in 1985, was initially a cross-compiled / Lisa Workshop hybrid but was converted to MPW in late 1985 using early betas on a prototype MacPlus and HD20. Did the last-ever build on a Lisa in early 1986, then threw it in a dumpster, taking a hammer to the ProFile hard drive. Hated them both. By the fall of 1986 I was building on a prototype Mac II with 3 monitors attached, with a contiguous desktop, something that only worked (mostly) 15 years later on Windows 2000. But that happy world of technical competence and innovation (and really great people to deal with) from Apple ended in 1997, so no more good stuff since then. I’ve shipped both MacOS X and iOS products since then, and it is just as gruesome and frustrating as the WinTel world.

  5. It will be interesting to see how Intel does under Pat Gelsinger’s leadership. I have to wonder how many sharp/creative people are left in the company for him to work with…at all levels.

  6. It was June or July of 1971 when I drove south from Saratoga, CA to Pasadena to help my elder (by 4 years) brother vacate his dorm room at Blacker Hall at Cal Tech and move up to Santa Clara to begin his new job at Intel. He had earned his BS in Physics and MS in Solid State Physics there, and been hired into the Intel Engineering Dept. He told me at the time that Intel sales were running $12MM per year. I don’t know squat about semiconductors, and my bro and I were never particularly close, but in ’73 or so he gave me a Microma watch after Intel bought the company. They were peddling them at the time for $500 per. Soon, Japanese makers had digital watches at much lower price points. I remember my brother stating that the Japanese makers were “pissing in the soup.” After a couple of years in Santa Clara, Intel built their plant in Beaverton, OR, and transferred my brother there as (I believe) Chief Engineer in Production. By 1977, my brother had concluded that the Japanese were going to end up dominating the semi space, and he quit Intel. Using the capital gains on his Intel stock options, he financed 10 years of medical school, internship, and Orthopedic Surgery residency at UC San Francisco, plus living expenses in San Francisco for him, his wife, and child.

    Back about 2016 or so, I sold off my Intel shares and bought Apple. I don’t have much confidence that Intel can turn things around, but would be tickled if they did.

    I will add one interesting anecdote. On a visit to my brother at Cal Tech, probably also in 1971 (I travelled up from La Jolla, where I was a sophomore @ UCSD), he took me to the Solid State Physics Lab late at night. He opened up a vacuum metallization chamber and set up an incandescent light bulb with some gold wire wrapped around the nichrome wire heating element. He pumped the chamber down as far as it would go and heated the nichrome wire, and the gold wire vaporized and ended up on the glass globe. When the chamber was opened up, I saw a nicely gold-colored light bulb. Screwed into a light socket and plugged in, it gave off a red cast of light. It lasted about 2 seconds, then burned out. The chamber vacuum was so high that it caused the socket-to-globe seal to fail, and the filament burned out. Oh well.

  7. Intel, now, is in the seemingly enviable position of selling anything they can build. What they build is CPUs for PCs, which is a declining market, and for servers, where there is increasing competition, especially along the $/MFLOP axis. All their initiatives toward things like phones and graphics processors have either died without a trace (phones) or are seemingly stuck at integrated graphics. They lost their embedded business.

    The Japanese competition in the CPU space never really materialized, and it looks like they will trade dollars with AMD as long as the PC thing keeps going.
