Structure of the microprocessor
The microprocessor owes its phenomenal success to a paradox created by the combination of technology and economics. Thanks to techniques that squeeze roughly twice as many circuits onto silicon every 18 months or so by decreasing line widths, increasing wafer diameters, and adding layers, each new generation eventually comes to market at around the same price as the last, but with twice the power. More compact circuitry makes microprocessors faster because electrons have less distance to travel. As chips get smaller, more of them can be etched onto a silicon wafer of the same diameter by improved fabrication equipment that today handles multiple-layer, eight-inch wafers as easily as it handled two- and three-inch, single-layer wafers ten years ago. Consequently, microprocessors have taken over functions that once required warehouses of discrete components, whetting a seemingly limitless appetite for increasingly affordable, higher-performing chips.
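To make the doubling arithmetic concrete, the short Python sketch below estimates how circuit counts grow under the rule of thumb described above. It is only an illustration: the 18-month doubling period is the article's rough figure, while the function name and the roughly 2,300-transistor starting point (a commonly cited figure for the first microprocessor) are assumptions added for the example.

    # Illustrative sketch of the "twice as many circuits every 18 months" rule.
    # Assumptions not taken from the article: the starting count of 2,300
    # transistors and the exact 18-month doubling period.

    def transistor_estimate(start_count, start_year, target_year, doubling_months=18):
        """Estimate circuit count after repeated doublings since start_year."""
        months_elapsed = (target_year - start_year) * 12
        doublings = months_elapsed / doubling_months
        return start_count * 2 ** doublings

    for year in (1971, 1980, 1990, 1995):
        print(year, round(transistor_estimate(2300, 1971, year)))

The doubling period is the key knob: stretch it to two years and the estimates drop by an order of magnitude by the mid-1990s, which is why small changes in process cadence compound so dramatically.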
Over the last decade, these advances in pricing and processing power have made the personal computer the largest consumer of microprocessors. At the same time, microprocessors have transformed the ubiquitous PC from a stand-alone office workhorse doing word processing and spreadsheets into a widely connected information machine that can send faxes and e-mail, access on-line services, or provide a video link for meetings. Today, Pentium processors and clones are driving the PC into untapped new frontiers of mass-market communications and interactive multimedia home computing. By the turn of the century, when high-volume chips are capable of executing more than a billion instructions per second, doors will open to brave new worlds we can only begin to imagine, such as holographic videoconferencing and personal digital assistants that beep your cardiologist when your stock portfolio slides.
The Pentium microprocessor (actual size)
HOW MICROPROCESSORS WORK
Microprocessor---A microprocessor, also called a CPU, is a tiny, enormously powerful, high-speed electronic brain etched on a single silicon semiconductor chip, which contains the basic logic, storage, and arithmetic functions of a computer. It thinks for the computer and, like a traffic cop, coordinates its operations. It receives and decodes instructions from input devices like keyboards, disks, or cassettes, then sends them over a bus system consisting of microscopic etched conductive "wiring" to be processed by its arithmetic and logic unit. The results are temporarily stored in memory cells and released in a timed sequence through the bus system to output devices such as CRT screens, networks, or printers.
The first microprocessor, the 4-bit Intel 4004 (1971), measured just 1/8" by 1/16", yet it was as powerful as the first electronic computer built 25 years earlier (1946), which weighed 30 tons, used 18,000 vacuum tubes, and required so much electricity that the lights of West Philadelphia are said to have dimmed each time it was turned on. Today, DEC's 64-bit Alpha microprocessor is more than 550 times as powerful as the 4004, with speeds comparable to yesterday's mainframes.
Programmability---CPUs can be programmed by the chip manufacturer, the distributor, or the computer manufacturer. Programs for a single-purpose product, like a calculator or a video game, are generally written by the OEM and entered into memory by the CPU manufacturer or distributor. For PCs, where the CPU must perform a wide range of tasks, the computer's manufacturer generally does the programming. The user merely inserts a prerecorded cassette tape, cartridge, or floppy disk containing instructions for each application into the computer, and the CPU carries out the instructions.
Key Components---A microprocessor has five key components: an arithmetic and logic unit (ALU), which calculates and thinks logically; registers, which are memory cells that store information temporarily for the ALU; a control unit, which decodes input instructions and acts as a traffic cop; bus systems, which are submicron wiring routes connecting the entire system; and a clock, which times the sequential release of the processed data.
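A toy model can show how these five parts cooperate. Everything in the Python sketch below is invented for illustration---the three-operation instruction set, the register names, and the use of a simple counter as the clock are assumptions, not a description of any real chip: the control unit decodes each instruction, the ALU calculates, registers hold intermediate values, a list stands in for the bus, and the counter sequences the results.

    # Purely illustrative toy machine; the instruction format, register names,
    # and "clock" are invented for this example, not taken from a real CPU.

    class ToyCPU:
        def __init__(self):
            self.registers = {"A": 0, "B": 0}   # memory cells for temporary values
            self.clock = 0                      # counts timed steps

        def alu(self, op, x, y):
            # Arithmetic and logic unit: calculates and compares.
            return {"ADD": x + y, "SUB": x - y, "AND": x & y}[op]

        def run(self, program):
            output = []                          # stands in for the output bus
            for instruction in program:          # instructions arrive over the "bus"
                op, reg, value = instruction     # control unit decodes the instruction
                self.registers[reg] = self.alu(op, self.registers[reg], value)
                self.clock += 1                  # clock sequences each step
                output.append((self.clock, reg, self.registers[reg]))
            return output

    print(ToyCPU().run([("ADD", "A", 5), ("SUB", "A", 2), ("AND", "A", 6)]))

Running it prints each result tagged with the clock tick at which it was released, mirroring the timed, bus-by-bus flow described above.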
Computer Interface---In addition to a CPU, a computer requires memory and ports for connecting it to input/output devices. A device that includes a CPU, memory, and input/output ports on a single chip is called a microcontroller. The two basic types of memory are RAM (Random Access Memory) and ROM (Read Only Memory). RAM stores modifiable programs and data that the user needs to perform immediate tasks. ROM stores unalterable programs that govern the computer's standard operations. Input devices like keyboards, mice, and programs on cassette tape, cartridges, disks, and CD-ROM enable us to communicate with the computer. Output devices like monitors, printers, and modems enable the computer to communicate with us and with other computers.
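The difference between the two memory types can also be sketched in a few lines. In this illustrative Python model (the class names and sizes are assumptions, not a real memory controller), ROM contents are fixed when the part is built and can only be read, while RAM can be read and rewritten by the running program.

    # Minimal sketch of the RAM/ROM distinction; not a real memory controller.

    class ROM:
        def __init__(self, contents):
            self._contents = tuple(contents)    # unalterable once built
        def read(self, address):
            return self._contents[address]

    class RAM:
        def __init__(self, size):
            self._cells = [0] * size
        def read(self, address):
            return self._cells[address]
        def write(self, address, value):        # modifiable by the running program
            self._cells[address] = value

    boot = ROM([0x01, 0x02, 0x03])   # fixed startup routine
    work = RAM(16)                   # scratch space for programs and data
    work.write(0, boot.read(2))
    print(boot.read(2), work.read(0))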
It is astonishing how thoroughly electronic computer technology has transformed our world, considering that it is only 50 years old and that the microprocessor, which revolutionized computers, is less than 25 years old. Its development involved the convergence of three evolving technologies: the calculator, the computer, and the transistor.
The first "digital calculators"--fingers and thumbs--are still in use, as is the 5,000 year old abacus. Calculators go back to 17th century inventions, including Schickard's Calculating Clock (1623), Pascal's Pascaline (1642), and Leibriz's Stepped Reckoner (1673)--machine that used hand-made gears and wheels to do arithmetic.
The next breakthrough came out of the Industrial Revolution, during the first half of the 19th century, with Charles Babbage's steam-powered mechanical computing and printing machines: the Difference Engine and the Analytical Engine. Although neither was ever completed, their designs included a "mill" to carry out arithmetic processes, a "store" to hold data, a "barrel" to control the operation, and "punch cards" to provide the external program--all fundamentals of modern computers.
Business calculators appeared next. In 1884, Felt invented the key-driven mechanical calculator. In 1893, Steiger introduced a mass-produced, automated version of Leibniz's machine, widely used for scientific calculations.
Twentieth Century---At the turn of the century, a flood of immigration created a major problem for U.S. census takers. To hand-sort the 1890 census information would have taken a decade, rendering the data virtually useless. But in 1889, Herman Hollerith came up with a solution: the first electromechanical machine for recording and tabulating digital information.
In the 1920's and 30's, the use of punch
...