1. Look through the text and divide it into logical parts. Say what each part deals with.
2. Read the text and compare it with Text II. Find out what information you already knew and what additional information you can get.
BABBAGE'S DREAM COME TRUE
(1) The Harvard Mark I. A hundred years passed before a machine like the one Babbage conceived was actually built. This occurred in 1944, when Howard Aiken of Harvard University completed the Harvard Mark I Automatic Sequence Controlled Calculator.
(2) Aiken was not familiar with the Analytical Engine when he designed the Mark I. Later, after people had pointed out Babbage's work to him, he was amazed to learn how many of his ideas Babbage had anticipated.
(3) The Mark I is the closest thing to the Analytical Engine that has ever been built or ever will be. It was controlled by a punched paper tape, which played the same role as Babbage's punched cards. Like the Analytical Engine, it was basically mechanical. However, it was driven by electricity instead of steam. Electricity also served to transmit information from one part of the machine to another, replacing the complex mechanical linkages that Babbage had proposed. Using electricity (which had only been a laboratory curiosity in Babbage's time) made the difference between success and failure.
(4) But, along with several other electromechanical computers built at about the same time, the Mark I was scarcely finished before it was obsolete. The electromechanical machines simply were not fast enough. Their speed was seriously limited by the time required for mechanical parts to move from one position to another. For instance, the Mark I took six seconds for a multiplication and twelve for a division; this was only five or six times faster than what a human with an old desk calculator could do.
(5) ENIAC. What was needed was a machine whose computing, control, and memory elements were completely electrical. Then the speed of operation would be limited not by the speed of mechanical moving parts but by the much greater speed of moving electrons.
(6) In the late 1930s, John V. Atanasoff of Iowa State College demonstrated the elements of an electronic computer. Though his work did not become widely known, it did influence the thinking of John W. Mauchly, one of the designers of ENIAC.
(7) ENIAC — Electronic Numerical Integrator and Computer — was the machine that rendered the electromechanical computers obsolete. ENIAC used vacuum tubes for computing and memory. For control, it used an electrical plug board, like a telephone switchboard. The connections on the plug board specified the sequence of operations ENIAC would carry out.
(8) ENIAC was 500 times as fast as the best electromechanical computer. A problem that took one minute to solve on ENIAC would require eight to ten hours on an electromechanical machine. After ENIAC, all computers would be electronic.
(9) ENIAC was the first of many computers with acronyms for names. The same tradition gave us EDVAC, UNIVAC, JOHNIAC, ILLIAC, and even MANIAC.
(10) EDVAC. The Electronic Discrete Variable Computer — EDVAC — was constructed at about the same time as ENIAC. But EDVAC, influenced by the ideas of the brilliant Hungarian-American mathematician John von Neumann, was by far the more advanced of the two machines. Two innovations that first appeared in EDVAC have been incorporated in almost every computer since.
(11) First, EDVAC used binary notation to represent numbers inside the machine. Binary notation is a system for writing numbers that uses only two digits (0 and 1), instead of the ten digits (0-9) used in the conventional decimal notation. Binary notation is now recognized as the simplest way of representing numbers in an electronic machine.
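To make the idea concrete, here is a small sketch (not from the text) of how a decimal number can be rewritten in the two-digit binary notation that EDVAC used internally, by repeatedly dividing by two and collecting the remainders:

```python
def to_binary(n):
    """Return the binary digits of a non-negative integer as a string."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # the remainder is the lowest binary digit
        n //= 2                    # shift to the next binary place
    return "".join(reversed(digits))

print(to_binary(13))  # → "1101", i.e. 8 + 4 + 0 + 1
```

Each binary digit stands for a power of two, just as each decimal digit stands for a power of ten; an electronic machine only has to distinguish two states (0 and 1) rather than ten.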
(12) Second, EDVAC's program was stored in the machine's memory, just like the data. Previous computers had stored the program externally on punched tapes or plug boards. Since the programs were stored the same way the data were, one program could manipulate another program as if it were data. We will see that such program-manipulating programs play a crucial role in modern computer systems.
(13) A stored-program computer — one whose program is stored in memory in the same form as its data — is usually called a von Neumann machine in honor of the originator of the stored-program concept.
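The stored-program idea can be illustrated with a hypothetical toy interpreter; the instruction names (LOAD, ADD, HALT) are inventions for this sketch, not EDVAC's actual instruction set. Because the program occupies ordinary memory cells, a second piece of code can rewrite it just as it would rewrite data:

```python
# Toy memory: instructions and data live side by side in one list.
memory = [
    ("LOAD", 5),    # program: load the constant 5
    ("ADD", 3),     # add 3
    ("HALT", None), # stop
    0,              # data cell used for the result
]

def run(mem):
    """Interpret the tiny instruction list stored in mem; result goes in mem[3]."""
    acc = 0
    for op, arg in (cell for cell in mem if isinstance(cell, tuple)):
        if op == "LOAD":
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "HALT":
            break
    mem[3] = acc

run(memory)
print(memory[3])  # → 8

# Because the program is just data, another program can modify it:
memory[1] = ("ADD", 10)  # patch the second instruction in place
run(memory)
print(memory[3])  # → 15
```

The patching step at the end is exactly the kind of program-manipulating program the text refers to: on a machine with an external plug board or punched tape, no running program could have rewritten its own instructions this way.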
(14) From the 1940s to the present, the technology used to build computers has gone through several revolutions. People sometimes speak of different generations of computers, with each generation using a different technology.
(15) The First Generation. First-generation computers prevailed in the 1940s and for much of the 1950s. They used vacuum tubes for calculation, control, and sometimes for memory as well. First-generation machines used several other ingenious devices for memory. In one, for instance, information was stored as sound waves circulating in a column of mercury. Since all these first-generation memories are now obsolete, no further mention will be made of them.
(16) Vacuum tubes are bulky, unreliable, and energy-consuming, and they generate large amounts of heat. As long as computers were tied to vacuum-tube technology, they could only be bulky, cumbersome, and expensive.
(17) The Second Generation. In the late 1950s, the transistor became available to replace the vacuum tube. A transistor, which is only slightly larger than a kernel of corn, generates little heat and enjoys long life.
(18) At about the same time, the magnetic-core memory was introduced. This consisted of a latticework of wires on which were strung tiny, doughnut-shaped beads called cores. Electric currents flowing in the wires stored information by magnetizing the cores. Information could be stored in core memory or retrieved from it in about a millionth of a second.
(19) Core memory dominated the high-speed memory scene for much of the second and third generations. To programmers during this period, core and high-speed memory were synonymous.
(20) The Third Generation. The early 1960s saw the introduction of integrated circuits, which incorporated hundreds of transistors on a single silicon chip. The chip itself was small enough to fit on the end of your finger; after being mounted in a protective package, it still would fit in the palm of your hand. With integrated circuits, computers could be made even smaller, less expensive, and more reliable.
(21) Integrated circuits made possible minicomputers, tabletop computers small enough and inexpensive enough to find a place in the classroom and the scientific laboratory.
(22) In the late 1960s, integrated circuits began to be used for high-speed memory, providing some competition for magnetic-core memory. The trend toward integrated-circuit memory has continued until today, when it has largely replaced magnetic-core memory.
(23) The most recent jump in computer technology came with the introduction of large-scale integrated circuits, often referred to simply as chips. Whereas the older integrated circuits contained hundreds of transistors, the new ones contain thousands or tens of thousands.
(24) It is the large-scale integrated circuits that make possible the microprocessors and microcomputers. They also make possible compact, inexpensive, high-speed, high-capacity integrated-circuit memory.
(25) All these recent developments have resulted in a microprocessor revolution, which began in the middle 1970s and for which there is no end in sight.
(26) The Fourth Generation. In addition to the common applications of digital watches, pocket calculators, and personal computers, you can find microprocessors, the general-purpose processor-on-a-chip, in virtually every machine in the home or business: microwave ovens, cars, copy machines, TV sets, and so on. Computers today are hundreds of times smaller than those of the first generation, and a single chip is far more powerful than ENIAC.
(27) The Fifth Generation. The term was coined by the Japanese to describe the powerful, intelligent computers they wanted to build by the mid-1990s. Since then it has become an umbrella term encompassing many research fields in the computer industry. Key areas of ongoing research are artificial intelligence (AI), expert systems, and natural language.