Who actually invented the single-chip microprocessor?
In trying to answer this question, we stumble once again into the same story as with the inventions of the integrated circuit, the transistor, and many other gadgets reviewed on this site. Several people had the idea at almost the same time, but only one got all the glory: the engineer Ted Hoff (together with co-inventors Stan Mazor and Federico Faggin) at Intel Corp., based in Santa Clara, California.
In the 1950s and 1960s, computer CPUs (central processing units) were built with many chips, or with a few LSI (large-scale integration) chips. In the late 1960s, many articles discussed the possibility of a computer on a chip, but all concluded that integrated circuit technology was not yet ready. Ted Hoff was probably the first to recognize that Intel's new silicon-gate MOS technology might make a single-chip CPU possible, if a sufficiently simple architecture could be developed.
In 1990, another U.S. engineer and inventor—Gilbert Hyatt of Los Angeles—announced, after a 20-year battle with the patent office, that he had finally received a certificate of intellectual ownership for a single-chip microprocessor that he said he had invented in 1968, at least a year before Intel started (see U.S. patent №4942516). Hyatt asserted that he had put together the requisite technology at his short-lived company, Micro Computer Inc., whose major investors included Intel's founders, Robert Noyce and Gordon Moore. Micro Computer developed a digital computer that controlled machine tools, then fell apart in 1971 after a dispute between Hyatt and his venture-capital partners over sharing the rights to that invention. Noyce and Moore went on to develop Intel into one of the world's largest chip manufacturers. "This will set history straight," proclaimed Hyatt. "And this will encourage inventors to stick to their inventions when they're up against the big companies." In the end, however, nothing came of Hyatt's claims to priority or of his demands for licensing fees from computer manufacturers.
In 1969, Four-Phase Systems, a company just established by several former Fairchild engineers led by Lee Boysel, designed the AL1—an 8-bit bit-slice chip containing eight registers and an ALU (see the photo below). At the time it formed part of a nine-chip, 24-bit CPU built with three AL1s. The AL1 was not actually called a microprocessor until much later (in the 1990s), when, in response to litigation by Texas Instruments, a courtroom demonstration system was constructed in which a single AL1, together with an input-output device, RAM, and ROM, formed a complete computer. The AL1 shipped in the company's data terminals as early as 1969.
The Four-Phase Systems AL1 processor (source: www.computerhistory.org)
At almost the same time as Intel, single-chip processor design work began at Texas Instruments (engineers Gary Boone and Michael Cochran), and in 1971 their chip was ready. The result of their work was the TMS 1000, which went commercial in 1974.
In 1968, the company Garrett AiResearch, with designers Ray Holt and Steve Geller, was invited to produce a digital computer to compete with electromechanical systems then under development for the main flight control computer in the US Navy's new F-14 Tomcat fighter. The design was complete by 1970 and used a MOS-based chipset as the core CPU. It was significantly (approximately 20 times) smaller and much more reliable than the mechanical systems it competed against, and was used in all of the early Tomcat models. The system contained "a 20-bit, pipelined, parallel multi-microprocessor". It was considered so advanced, however, that the Navy refused to allow publication of the design until 1997. For this reason the CADC (Central Air Data Computer), and the MP944 chipset it used, remain fairly unknown even today.
In the same year, 1968, a former NASA engineer from San Antonio, Texas—Austin O. "Gus" Roche, who was obsessed with making a personal computer—designed a single-chip processor. In early 1970 Roche, by then an owner of Computer Terminal Corporation (CTC), which was working on the design of one of the first personal computers, the Datapoint 2200, met with Bob Noyce, head of Intel—then a start-up devoted to making memory chips—to try to get Intel to produce the CPU chip. Roche presented the proposed chip as a potentially revolutionary development and suggested that Intel develop it at its own expense and then sell it to all companies.
Noyce said it was an intriguing idea, and that Intel could do it, but it would be a dumb move. He said that if you have a computer chip, you can only sell one chip per computer, while with memory, you can sell hundreds of chips per computer. Nevertheless, Noyce agreed to grant a $50,000 development contract to Roche.
At the beginning of 1969, Busicom, a trading name of a now-defunct Japanese company called Electro-Technical Industries (ETI), was planning a range of next-generation programmable calculators. Busicom contacted a then new and unknown company—Intel—and in April 1969 a contract was signed to develop a custom chipset. Busicom's engineers made an initial proposal for a set of seven custom LSI chips: the Japanese wanted separate chips for program control, decimal arithmetic, ROM, binary arithmetic, printer control, output ports, and timing logic. Intel assigned its manager of applications, Marcian (Ted) Hoff, and later the engineers Federico Faggin and Stan Mazor, to work with Busicom engineer Masatoshi Shima. Hoff (see the nearby photo) soon realized that the 7-chip design could not meet the cost objectives for the project and would be too complex, so he proposed that Intel make four chips to do the work of seven. Intel and Busicom agreed and funded the new programmable, general-purpose logic chip. Thus Intel was committed to the first single-chip CPU, the 4004. Initially the 4000 chipset consisted of four chips—the 4004 (CPU), 4001 (ROM), 4002 (RAM), and 4003 (shift register).
Federico Faggin headed the design team along with Ted Hoff and Stan Mazor, who wrote the software for the new chip. Nine months later, a revolution was born. The team created a chip of 2300 P-MOS (metal-oxide-semiconductor) transistors in an area of about 3 by 4 millimeters, manufactured on a 10 µm process. The baby chip had as much computing power as the ENIAC, which had filled 3000 cubic feet with 18000 vacuum tubes. The Intel 4004 performed binary-coded decimal arithmetic on 4-bit words, executed about 100000 instructions per second, and offered a 46-instruction set with basic addition and subtraction. It required a 15 VDC power supply and a peculiar clock source to operate. Its memory bus architecture is of the Harvard type, but can be fooled into operating as a von Neumann machine.
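Binary-coded decimal arithmetic, mentioned above, packs one decimal digit into each 4-bit word—a natural fit for a calculator chip. A minimal Python sketch (an illustration of the principle, not the 4004's actual microcode) of adding two BCD digits with a decimal-adjust step:

```python
# Binary-coded decimal (BCD) stores one decimal digit (0-9) per
# 4-bit word. Adding two BCD digits needs a "decimal adjust":
# if the binary sum exceeds 9, add 6 to skip the unused codes 0xA-0xF.

def bcd_add_digit(a: int, b: int, carry_in: int = 0):
    s = a + b + carry_in          # plain binary addition
    if s > 9:
        s += 6                    # decimal adjust
    return s & 0xF, int(s > 0xF)  # (result digit, carry out)

print(bcd_add_digit(7, 5))  # (2, 1): 7 + 5 = 12 -> digit 2, carry 1
```

Chaining the carry from digit to digit lets a 4-bit ALU add decimal numbers of any length, which is exactly what a calculator needs.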
The 4004 is part of a family of four LSI components (the MCS-4 family), which can be used to build digital computers with varying amounts of memory. The other components of the MCS-4 family are memories and input/output circuits—not considered part of a CPU in any computer classification, but necessary to implement a complete computer. Specifically: the 4001 is a ROM (read-only memory) with 4 lines of I/O (input/output); the 4002 is a RAM (random-access memory) with 4 lines of output; the 4003 is a static shift register used to expand the I/O lines, for example for keyboard scanning or for controlling a printer. Four other support chips (the 4008, 4009, 4269, and 4289) were released later.
The initial clock speed of the 4004 was 108 kHz, and the maximum clock speed is 740 kHz. The memory bus architecture is of the Harvard type (the Harvard architecture is a computer architecture with physically separate storage and signal pathways for instructions and data; the term originated with the Harvard Mark I relay-based computer, which stored instructions on punched tape and data in electromechanical counters). The 4004 used a single 4-bit bus, multiplexed over successive clock cycles, to transfer its 12-bit addresses, 8-bit instructions, and 4-bit data words.
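To see how a 4-bit bus can carry a 12-bit address, here is an illustrative Python sketch that splits an address into three 4-bit nibbles, one per bus cycle (the exact cycle ordering on the real chip is a timing detail this sketch glosses over):

```python
# A 12-bit address fits in three 4-bit pieces (nibbles), which a
# 4-bit bus can transfer over three successive cycles.

def split_address(addr: int) -> list[int]:
    """Split a 12-bit address into three 4-bit nibbles, low first."""
    return [(addr >> shift) & 0xF for shift in (0, 4, 8)]

print(split_address(0xABC))  # [12, 11, 10] - low, middle, high nibble
```

The same multiplexing trick lets the chip fetch its 8-bit instructions in two bus cycles, keeping the pin count (and package cost) low.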
The instruction set contains 46 instructions, and the register set contains 16 registers of 4 bits each. The chip had a 3-level-deep stack (stacks in computer architecture are regions of memory where data is added or removed in LIFO—last-in-first-out—order).
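As an illustration of the LIFO discipline and the 3-level depth limit, here is a toy Python model of such a return-address stack (the overflow handling is a deliberate simplification for illustration, not the documented hardware behavior):

```python
# Toy model of a 3-level-deep return-address stack, illustrating
# LIFO order and the depth limit of subroutine nesting.

class TinyStack:
    def __init__(self, depth=3):
        self.depth = depth
        self.items = []

    def push(self, addr):
        if len(self.items) == self.depth:
            self.items.pop(0)  # simplification: silently drop the oldest entry
        self.items.append(addr)

    def pop(self):
        return self.items.pop()  # last in, first out

stack = TinyStack()
for return_addr in (0x120, 0x250, 0x3A0):
    stack.push(return_addr)

print(hex(stack.pop()))  # 0x3a0 - the most recent call returns first
```

In practice the 3-level limit meant a 4004 program could nest subroutine calls only three deep before losing a return address.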
The first working CPU (see the nearby photo) was delivered to Busicom in February 1971. It was called a "microcomputer on a chip" (the word microprocessor wasn't used until 1972). The first known advertisement for the 4004, dated November 1971, appeared in Electronic News. The first commercial product to use a microprocessor was the Busicom calculator 141-PF; after Intel delivered the four chips, Busicom eventually sold some 100000 calculators. Cleverly, Intel decided to buy back the design and marketing rights to the 4004 from Busicom for $60,000, and then followed a marketing plan that encouraged the development of applications for the chip, leading to its widespread use within months.
The functional elements integrated in the 4004 are:
16 general-purpose registers (in computer architecture, a processor register is a small amount of storage available on the CPU whose contents can be accessed more quickly than storage available elsewhere)
Program counter (the program counter, or PC, is a processor register that indicates where the computer is in its instruction sequence; depending on the details of the particular computer, the PC holds either the address of the instruction being executed or the address of the next instruction to be executed)
ALU—arithmetic logic unit (an arithmetic logic unit is a digital circuit that performs arithmetic and logical operations; the ALU is a fundamental building block of the central processing unit of a computer, and even the simplest microprocessors contain one)
Instruction decoder (the instruction decoder translates each fetched instruction code into the internal control signals that carry the instruction out)
Generation of timing signal for the CPU and for the rest of the MCS-4 family
Control of the external bus for the memory and for the I/O functions.
The 4004 also included the control functions for the memory and the I/O, which are not normally handled by a microprocessor. The 4004, therefore, is not only a complete CPU but also carries additional functionality not usually considered part of a CPU—making it closer to a microcontroller.
Stan Mazor, Ted Hoff, and Federico Faggin (seated, at right) answer questions from high school students
In 1973, Marcian Edward (Ted) Hoff (born October 28, 1937), Stanley Mazor (born October 22, 1941), and Federico Faggin (born December 1, 1941) applied for a patent as assignors of Intel. The patent (see patent №3821715) of Hoff, Mazor, and Faggin was granted the following year, 1974.
A computer-on-a-chip is a variation of a microprocessor that combines the microprocessor core (CPU), some memory, and I/O (input/output) lines on one chip; it is also called a microcontroller. The computer-on-a-chip patent, called the "microcomputer patent" at the time—U.S. Patent №4074351—was awarded to Gary Boone and Michael J. Cochran of TI. Aside from this patent, the standard meaning of microcomputer is a computer using one or more microprocessors as its CPU(s), while the concept defined in the patent is perhaps more akin to a microcontroller.
The first 8-bit microprocessor was again manufactured by Intel, this time under contract to another company—Computer Terminal Corporation (CTC), later called Datapoint, of San Antonio, TX. CTC wanted a chip for a terminal it was designing, but rejected the result because it required many support chips, so Intel marketed the chip itself as the 8008 in April 1972—the world's first 8-bit microprocessor. In April 1974, Intel announced its successor, the world-famous 8080, which opened up the microprocessor component marketplace. With the ability to execute 290000 instructions per second and 64K bytes of addressable memory, the 8080 was the first microprocessor with the speed, power, and efficiency to become a key tool for designers. Development labs set up by Hamilton/Avnet, Intel's first microprocessor distributor, showcased the 8080 and provided a broad customer base that contributed to its becoming the industry standard. A key factor in the 8080's success was its role in the January 1975 introduction of the MITS Altair 8800, the first successful personal computer. The Altair used the powerful 8080 microprocessor and established the precedent that personal computers must be easy to expand. With its sophistication, expandability, and an incredibly low price of $395, the Altair 8800 proved the viability of home computers.
MITS Altair 8800, the first successful personal computer
Admittedly, Intel was the first, but not the only, company making microprocessors (see the Timeline of Intel's Microprocessors). The competing Motorola 6800 was released in August 1974, the similar MOS Technology 6502 in 1975, and the Zilog Z80 in 1976.
The first multi-chip 16-bit microprocessor was the National Semiconductor IMP-16, introduced in early 1973. An 8-bit version of the chipset was introduced in 1974 as the IMP-8. During the same year, National introduced the first 16-bit single-chip microprocessor, the PACE (see the nearby photo), which was later followed by an NMOS version, the INS8900. Another early single-chip 16-bit microprocessor was TI's TMS 9900, introduced in 1976, which was compatible with their TI-990 line of minicomputers. Intel produced its first 16-bit processor, the 8086, in 1978. It was source-compatible with the 8080 and 8085 (an 8080 derivative). This chip has probably had more effect on the present-day computer market than any other, although whether this is justified is debatable: to remain compatible with the four-year-old 8080, it used an unusual scheme of overlapping segment registers to address a full 1 megabyte of memory.
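The 8086's segmentation scheme is simple enough to express in a few lines: a 20-bit physical address is formed as segment × 16 + offset, so segments begin every 16 bytes and overlap heavily. A small Python sketch of the address calculation:

```python
# The 8086 forms a 20-bit physical address from a 16-bit segment
# register and a 16-bit offset: physical = segment * 16 + offset.
# Because segments start every 16 bytes, many different
# segment:offset pairs name the same physical byte.

def physical_address(segment: int, offset: int) -> int:
    return ((segment << 4) + offset) & 0xFFFFF  # wraps within 1 MB

# Two different segment:offset pairs, one physical byte:
print(hex(physical_address(0x1234, 0x0010)))  # 0x12350
print(hex(physical_address(0x1235, 0x0000)))  # 0x12350
```

This is how a chip with only 16-bit registers reached 2^20 = 1 megabyte of memory while keeping 8080-style 16-bit pointers within any one segment.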
The most significant of the 32-bit designs is the Motorola MC68000, introduced in 1979. The 68K, as it was widely known, had 32-bit registers but used 16-bit internal data paths and a 16-bit external data bus to reduce pin count, and supported only 24-bit addresses. Motorola generally described it as a 16-bit processor, though it clearly has a 32-bit architecture. The combination of high performance, a large (16-megabyte, 2^24-byte) memory space, and fairly low cost made it the most popular CPU design of its class. The Apple Lisa and Macintosh designs made use of the 68000, as did a host of other designs in the mid-1980s, including the Atari ST and Commodore Amiga.
The world's first single-chip fully 32-bit microprocessor, featuring 32-bit data paths, 32-bit buses, and 32-bit addresses, was the AT&T Bell Labs BELLMAC-32A, with first samples in 1980 and general production in 1982.