Within a year, Intel developed its first product, the 3101 Schottky bipolar 64-bit static random access memory (SRAM), which was followed soon after by the 1101. This chip was a 256-bit SRAM and had been developed on Intel's new "silicon gate metal-oxide semiconductor (MOS) process," which would become the "industry's process technology of choice." With these first two products, the young company - which had started in 1968 with 12 employees and net revenues of $2,672 - had already gained the technological lead in the field of memory chips.
Intel's first really successful product was the 1103 dynamic random access memory (DRAM), which was manufactured in the MOS process. Introduced in 1970, this chip was the "first merchant market LSI (large-scale integrated) DRAM," and it gained broad acceptance because it was superior to magnetic core memories. By the end of 1971, the 1103 had become "the world's largest-selling semiconductor device" and provided the capital for Intel's early growth.
To this day, semiconductors have "adhered to Moore's Law," which was formulated by Gordon Moore, the "cofounder of Fairchild and Intel," when the first commercial DRAMs appeared in the early 1970s. This law predicts that the price per bit (the smallest unit of memory) drops by 30 percent every year. It implies that you receive 30 percent more power (speed/capacity) at the same price, or that the "price of a certain power is 30 percent less."
Moore's Law applies to both memory chips and microprocessors and illustrates the unprecedentedly rapid progress in microelectronics. This "astonishing ratio" has never before occurred in "the history of manufacturing." Applied to automobiles, it means that "a Cadillac would have a top speed of 500 miles per hour, get two hundred miles to a gallon of gas and cost less than a dollar" - almost incredible.
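The compound effect of that 30 percent annual price decline can be sketched with a short calculation. This is a minimal illustration of the figures quoted above (a normalized starting price of 1.0 is an assumption for the example, not a historical price):

```python
def price_per_bit(initial_price: float, years: int) -> float:
    """Price per bit after a number of years, assuming Moore's Law
    in the form quoted above: a 30 percent drop (a factor of 0.7)
    every year, compounded annually."""
    return initial_price * 0.7 ** years

# Starting from a normalized price of 1.0:
print(price_per_bit(1.0, 1))             # after one year: 0.7
print(round(price_per_bit(1.0, 10), 4))  # after a decade: 0.0282
```

In other words, under this rule a decade of progress leaves the price per bit at under 3 percent of its original value, which is why the automobile comparison above sounds so extreme.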
1971 was a crucial year at Intel. The company's revenues surpassed operating expenses for the first time, and the company went public, raising $6.8 million.
Moreover, the company introduced a new memory chip - the first erasable, programmable read-only memory (EPROM). Invented by Intel's Dov Frohman, the new memory could store data permanently like existing ROMs, but in addition it could be erased simply by a beam of ultraviolet light and used again. The EPROM was initially viewed as a "prototyping device" for R&D. The invention of the microprocessor in the same year, however, revealed the real significance of the EPROM, which could be used by original equipment manufacturer (OEM) customers (who build the end products) to store microprocessor programs in a "flexible and low-cost way." The "unexpected synergy" between the EPROM and the microprocessor resulted in a growing market for both chips and contributed a great deal to Intel's early success.
"Ted" Hoff's first microprocessor
The invention of the microprocessor marked a turning point in Intel's history. This development "changed not only the future of the company, but much of the industrial world."
The story of this technological breakthrough began in 1969, when a Japanese calculator manufacturer called Busicom asked Intel to design a set of chips for a family of programmable calculators. Marcian "Ted" Hoff, a young and "very bright ex-Stanford research associate" who had joined Intel as employee number 12, was put in charge of this project. However, he did not like the Japanese design, which called for 12 custom chips, each assigned a distinct task. Hoff thought that designing so many different chips would make the calculators as expensive as minicomputers such as DEC's PDP-8, although they could be used only for calculation. His idea was to develop a four-chip set with a general-purpose logic device at its center, which could be programmed by instructions stored on a semiconductor memory chip. This was the theory behind the first microprocessor.
With the help of new employee Stan Mazor, Hoff perfected the design of what would become the 4004 arithmetic chip. After Busicom had accepted Hoff's chip set, Federico Faggin, one of the best chip design experts, who had been hired recently, began transforming the design into silicon. The 4004 microprocessor, a 4-bit chip (it processes 4 bits - a string of four ones or zeroes - of information at a time), contained 2,300 MOS transistors and was as powerful as the legendary first electronic computer, ENIAC.
Soon after the first 4004s had been delivered to Busicom, Intel realized the market potential of the chip and successfully renegotiated with the Japanese to regain the exclusive rights, which had been sold to Busicom.
In November 1971, Intel introduced the 4004 to the public in an Electronic News ad. It announced not just a new product, but "a new era of integrated electronics [...], a micro programmable computer on a chip." The microprocessor is - as Gordon Moore calls it - "one of the most revolutionary products in the history of mankind," and it ranked as one of 12 milestones of American technology in a 1982 survey by U.S. News and World Report. This chip is the actual computer itself: it is the central processing unit (CPU) - the computer's brains. The microprocessor made the microcomputer possible, which is "as big as it is only to accommodate us." For "we'd have a hard time getting information into or out of a microprocessor without a keyboard, a printer and a terminal," as Th. Mahon puts it.
However significant Hoff's invention was, it was hardly noticed by the public until early 1973. The microprocessor had its own instruction set and had to be programmed in order to execute specific tasks. So Ted Hoff had to inform the public and the engineers about the capabilities of the new device and how to program it.