The list of geographical names


Список географических названий

 

Azores [əˈzo:z] - Азорские острова
The Appalachians [ˌæpəˈleitʃjənz] - Аппалачи
The Arctic Ocean [ˈa:ktik] - Северный Ледовитый океан
Asia [ˈeiʃə] - Азия
The Atlantic Ocean [ətˈlæntik ˈəuʃən] - Атлантический океан
The Black Sea [ˈblæk ˈsi:] - Черное море
Ben Nevis [ˈben ˈnevis] - Бен-Невис (гора)
Boston [ˈbostən] - Бостон
The British Isles [ˈbritiʃ ˈailz] - Британские острова
Birmingham [ˈbə:miŋəm] - г. Бирмингем
Cambridge [ˈkeimbridʒ] - г. Кембридж
Cardiff [ˈka:dif] - г. Кардифф
California [ˌkæliˈfo:njə] - Калифорния
Chicago [ʃiˈka:gəu] - г. Чикаго
Columbia [kəˈlʌmbiə] - Колумбия
The Caucasus [ˈko:kəsəs] - Кавказ
The Caspian Sea [ˈkæspiən ˈsi:] - Каспийское море
Edinburgh [ˈedinbərə] - г. Эдинбург
England [ˈiŋglənd] - Англия
The English Channel [ˈiŋgliʃ ˈtʃænl] - пролив Ла-Манш
Europe [ˈjuərəp] - Европа
France [ˈfra:ns] - Франция
Glasgow [ˈgla:sgəu] - г. Глазго
Greece [ˈgri:s] - Греция
The Great Lakes [ˈgreit ˈleiks] - Великие озера
Ireland [ˈaiələnd] - Ирландия
Italy [ˈitəli] - Италия
The Irish Sea [ˈaiəriʃ ˈsi:] - Ирландское море
Izhevsk [iˈʒevsk] - г. Ижевск
London [ˈlʌndən] - г. Лондон
Los Angeles [losˈændʒili:z] - г. Лос-Анджелес
Loch Ness [ˈlok ˈnes] - оз. Лох-Несс
Manchester [ˈmæntʃistə] - г. Манчестер
The Mississippi [ˌmisiˈsipi] - р. Миссисипи
The Missouri [miˈzuəri] - р. Миссури
Moscow [ˈmoskəu] - г. Москва
Murmansk [ˈmurmansk] - г. Мурманск
The North Sea [ˈno:θ ˈsi:] - Северное море
North America [ˈno:θ əˈmerikə] - Северная Америка
New York [ˈnju: ˈjo:k] - г. Нью-Йорк
The Ohio [əuˈhaiəu] - р. Огайо
Oxford [ˈoksfəd] - г. Оксфорд
The Pacific Ocean [pəˈsifik ˈəuʃən] - Тихий океан
Pennsylvania [ˌpensilˈveinjə] - Пенсильвания
Philadelphia [ˌfiləˈdelfjə] - г. Филадельфия
The Potomac [pəˈtəumək] - р. Потомак
Poland [ˈpəulənd] - Польша
Russia [ˈrʌʃə] - Россия
The Russian Federation [ˈrʌʃən ˌfedəˈreiʃn] - Российская Федерация
The Sayans [ˈsajanz] - Саяны (горы)
San Francisco [ˌsænfrənˈsiskəu] - г. Сан-Франциско
Scotland [ˈskotlənd] - Шотландия
Sheffield [ˈʃefi:ld] - г. Шеффилд
St. Petersburg [snt ˈpi:təzbə:g] - г. Санкт-Петербург
The Thames [temz] - р. Темза
Texas [ˈteksəs] - шт. Техас
The USA - the United States of America - Соединенные Штаты Америки
The United Kingdom of Great Britain and Northern Ireland [juˈnaitid ˈkiŋdəm əv ˈgreit ˈbritn ənd ˈno:θən ˈaiələnd] - Соединенное Королевство Великобритании и Северной Ирландии
The Urals [ˈjuərəlz] - Уральские горы
Virginia [vəˈdʒinjə] - шт. Виргиния
Vladivostok [ˌvlædiˈvostok] - г. Владивосток
Wales [weilz] - Уэльс
Washington [ˈwoʃiŋtən] - г. Вашингтон

 

БИБЛИОГРАФИЧЕСКИЙ СПИСОК

 

2. Андрианова Л.Н. Книга для чтения к учебнику английского языка для заочных неязыковых вузов. - М.: Высш. шк., 1973.

3. Аракин В.Д. Практический курс английского языка. - М.: Высш. шк., 1987.

4. Гмирянская В.А. Учебник английского языка. - Киев: Выща. шк., 1991.

5. Новицкая Т.М. Учебник английского языка. - М.: Высш. шк., 1983.

6. Новицкая Т.М. Книга для чтения по английскому языку. - М.: Высш. шк., 1983.

7. Рогова Г.В. Английский язык за два года. - М.: Просвещение, 1993.

8. Рожкова Ф.М., Русанова С.В. Пособие по английскому языку для неязыковых вузов. Экскурсия по Москве. - М.: Высш. шк., 1980.

9. Синявская Е.В. Книга для чтения по английскому языку. - М.: Высш. шк., 1993.

10. Синявская Е.В., Гальперина Л.А., Ананьева Н.Ю. Пособие по развитию навыков устной речи на английском языке. - М.: Высш. шк., 1982.

11. Тенсон И.А. Обычаи и традиции Великобритании и США. - М.: Международные отношения, 1978.

12. Токарева Н.Д. Страницы истории Великобритании и США. - М., 1985.

13. Хведченя Л.В. Английский язык для поступающих в вузы. - Минск: Вышэйш. шк., 1994.

 

Английский язык. 600 устных тем для школьников и поступающих в вузы / И.Ю. Баканова, Н.В. Береговая, Н.Г. Брюсова и др. – М.: Дрофа, 1999. – 608 с.

CONTENTS

BIOGRAPHY
INSTITUTE
VOLGODONSK
RUSSIA
MOSCOW
GREAT BRITAIN
LONDON
THE USA
WASHINGTON
THE LIST OF PROPER NAMES
THE LIST OF GEOGRAPHICAL NAMES
БИБЛИОГРАФИЧЕСКИЙ СПИСОК

 

 

20. PROCESS CONTROL

The control of processes in general is a wider extension of the principles used in numerical control of machine tools. Instead of monitoring and controlling solely movement, other parameters, such as temperature, time, gas flow, etc., are monitored and controlled. The possibilities are endless, provided suitable transducers exist for the parameters to be controlled. In this case, the more complex the process, the more suitable it is for microcomputer control.

Efficient operation of furnaces is an example where energy savings can be substantial when the process is properly controlled. A microprocessor-based system can monitor signals from thermocouples, air flow meters, fuel flow meters and gas analysers, and on the basis of heat loss calculation and furnace efficiency optimise the fuel/air ratio.
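The sketch below illustrates the kind of control loop described above. It is only an illustration under invented assumptions: the gas-analyser reading is simulated with random numbers, and the target oxygen level, the gain and the starting air/fuel ratio are hypothetical values, not figures from the text.

```python
# Illustrative sketch only: a simplified furnace control step of the kind
# described above. The sensor reading is simulated and all setpoints are assumed.
import random
import time

TARGET_O2 = 3.0   # desired excess oxygen in the flue gas, % (assumed setpoint)
GAIN = 0.02       # proportional gain for the air/fuel correction (assumed)

def read_flue_gas_o2():
    """Stand-in for a gas-analyser reading (random values for the demo)."""
    return random.uniform(1.0, 6.0)

def control_step(air_fuel_ratio):
    """Compare the measured O2 with the target and nudge the air/fuel ratio."""
    o2 = read_flue_gas_o2()
    error = o2 - TARGET_O2            # too much O2 means too much air
    new_ratio = air_fuel_ratio - GAIN * error
    print(f"O2 = {o2:4.1f} %  air/fuel ratio {air_fuel_ratio:.2f} -> {new_ratio:.2f}")
    return new_ratio

ratio = 15.0                          # starting air/fuel ratio (assumed)
for _ in range(5):
    ratio = control_step(ratio)
    time.sleep(0.1)
```

A real system would of course combine several sensors and a proper heat-loss calculation; the point here is only the monitor-compare-adjust cycle.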

In an application such as this, it is also possible to collect information on the furnace performance over time. An analysis of this information provides a valuable guide to damage and wear and to establishing the time for appropriate corrective maintenance.

Another heat dependent process is injection moulding. A microprocessor can monitor melt temperature, die temperature, pressure, cooling time, etc. to control the cycle in accordance with the specification of the material being used.
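A minimal sketch of such a cycle check follows, assuming an invented specification table and invented readings; it shows only the idea of comparing monitored values against the specification of the material being used.

```python
# Illustrative sketch only: checking one moulding cycle against a material
# specification. The specification windows and the readings are invented.
SPEC = {
    "melt_temp_C": (220, 240),    # allowed melt-temperature window
    "die_temp_C": (40, 60),       # allowed die-temperature window
    "cooling_time_s": (8, 12),    # allowed cooling time
}

def check_cycle(readings):
    """Return the parameters that fall outside the specification."""
    out_of_spec = []
    for name, (low, high) in SPEC.items():
        value = readings[name]
        if not (low <= value <= high):
            out_of_spec.append(f"{name}={value} (allowed {low}..{high})")
    return out_of_spec

cycle = {"melt_temp_C": 245, "die_temp_C": 55, "cooling_time_s": 9}
problems = check_cycle(cycle)
print("cycle OK" if not problems else "adjust: " + ", ".join(problems))
```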

In practice, despite theoretical laws, many industrial process parameters are chosen and varied according to accumulated industry data and operator judgement. This can lead to erratic production and quality problems. With microcomputer control systems, this data can be stored in and drawn upon from computer memory, leading to greater uniformity of output.

The calculation of optimum tool life from theoretical laws, for example, is not practical because of the variations in the properties of the actual workpiece. Optimum tool life should more realistically be based upon actual experience. It is feasible nowadays to monitor and analyse data to recalculate optimum tool life continuously.
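One common way to make this concrete (the text itself does not name it) is Taylor's empirical tool-life equation V·T^n = C. The sketch below fits n and C to logged pairs of cutting speed and observed tool life and then predicts the life to expect at another speed; all data points are invented.

```python
# Illustrative sketch only. Tool life is modelled with Taylor's empirical
# equation V * T**n = C (not named in the text); the logged pairs of
# (cutting speed in m/min, observed tool life in minutes) are invented.
import math

log_data = [(250, 18.0), (300, 11.5), (350, 8.0), (400, 5.9)]

# Fit n and C by least squares on log V = log C - n * log T
xs = [math.log(t) for _, t in log_data]
ys = [math.log(v) for v, _ in log_data]
x_mean = sum(xs) / len(xs)
y_mean = sum(ys) / len(ys)
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / \
        sum((x - x_mean) ** 2 for x in xs)
n = -slope
C = math.exp(y_mean + n * x_mean)

def predicted_life(speed):
    """Expected tool life (minutes) at a given cutting speed, from the fit."""
    return (C / speed) ** (1.0 / n)

print(f"fitted n = {n:.2f}, C = {C:.0f}")
print(f"predicted life at 320 m/min: {predicted_life(320):.1f} min")
```

As new speed/life measurements are logged, the fit can simply be repeated, which is the continuous recalculation the paragraph refers to.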

Continuous monitoring of vibration in machinery allows the vibration pattern to be analysed. Any abnormal wear or breakdown of bearings will show up as a dramatic change in the pattern of vibration.
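A small sketch of that idea, using synthetic signals: the current vibration spectrum is compared with a stored baseline, and any frequency whose amplitude has grown past a chosen threshold is reported. The sampling rate, the signals and the threshold are all assumptions made for the example.

```python
# Illustrative sketch only: flagging a change in the vibration pattern by
# comparing the current spectrum with a stored baseline. Signals are synthetic.
import numpy as np

FS = 1000                        # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / FS)

def spectrum(signal):
    """Magnitude spectrum of a one-second vibration record."""
    return np.abs(np.fft.rfft(signal)) / len(signal)

# Baseline: a healthy machine vibrating mainly at 50 Hz.
baseline = spectrum(np.sin(2 * np.pi * 50 * t))
# Current record: an extra 120 Hz component has appeared (simulated bearing wear).
current = spectrum(np.sin(2 * np.pi * 50 * t) + 0.8 * np.sin(2 * np.pi * 120 * t))

freqs = np.fft.rfftfreq(len(t), 1 / FS)
for f, change in zip(freqs, current - baseline):
    if change > 0.1:             # threshold for a "dramatic change" (assumed)
        print(f"new vibration component near {f:.0f} Hz (amplitude +{change:.2f})")
```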

Notes:

provided — при условии что, в случае если

provides a valuable guide to damage and wear and to establishing the time for appropriate corrective maintenance — дает ценные сведения о повреждениях, износе и о времени, необходимом для профилактического ремонта

to erratic production and quality problems — к неустойчивости производства и снижению качества продукции

 

21. INSPECTION AND MEASUREMENT

In industrial situations, the ability to inspect and, if necessary, reject quickly is desirable if further errors are to be prevented. The value of microcomputer-controlled inspection equipment lies in moving probes at high speed or using several probes simultaneously, and in analysing the readings obtained to produce a final result quickly and with consistent accuracy. For example, in checking turbine blades twenty transducers might be used simultaneously. Immediate indication of "oversize" or "undersize" is given (red, orange, green), and a printout is available for permanent record.
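A sketch of such a classification step is given below, with an invented nominal size, invented tolerance bands and invented probe readings; the red/orange/green labels follow the traffic-light indication mentioned above.

```python
# Illustrative sketch only: classifying probe readings against a tolerance
# band with a traffic-light verdict. All dimensions and readings are invented.
NOMINAL = 25.00        # nominal size, mm
TOLERANCE = 0.05       # reject limit, mm
WARNING = 0.04         # the "orange" band starts here, mm (assumed)

def classify(reading):
    deviation = reading - NOMINAL
    if abs(deviation) > TOLERANCE:
        return "oversize (red)" if deviation > 0 else "undersize (red)"
    if abs(deviation) > WARNING:
        return "near limit (orange)"
    return "within tolerance (green)"

readings = [25.01, 24.96, 25.06, 24.94, 25.045]    # e.g. from several probes
for i, r in enumerate(readings, 1):
    print(f"probe {i}: {r:.3f} mm -> {classify(r)}")
```

With twenty transducers read simultaneously, the same classification is simply applied to every channel and the results printed for the permanent record.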

The individual readings can be conveniently stored to allow trends, etc. to be identified. This often enables a situation to be altered before faulty work is produced.

For precision engineering an important measurement is surface quality, expressed by about twenty parameters (e. g. roundness). The calculation of these parameters is tedious when working from a trace of the surface. Traditional surface measurement instruments provide analog output into, say, chart recorders. By interfacing a microcomputer to the output of the instrument, the analysis can be done directly. In this type of application, therefore, there are microprocessor-based surface measurement instruments and also "add-on" systems.
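As an illustration of what doing the analysis directly can mean, the sketch below computes two of the simplest surface parameters, Ra (the arithmetic mean deviation) and Rq (the root-mean-square deviation), from a digitised trace; the sampled profile heights are invented.

```python
# Illustrative sketch only: computing two simple surface parameters from a
# digitised trace of the surface. The sampled heights (micrometres) are invented.
import math

profile = [0.8, -0.3, 1.1, -0.9, 0.2, -0.6, 0.7, -0.4]

mean = sum(profile) / len(profile)
deviations = [h - mean for h in profile]

ra = sum(abs(d) for d in deviations) / len(deviations)             # Ra
rq = math.sqrt(sum(d * d for d in deviations) / len(deviations))   # Rq

print(f"Ra = {ra:.2f} um, Rq = {rq:.2f} um")
```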

Notes:

turbine blade — лопасть турбины

trend — тенденция

precision engineering — точное машиностроение

trace of the surface — проекция поверхности (на экран, с помощью микроскопа и т.д.)

chart recorder — самопишущий прибор

"add-on" systems — дополнительные устройства

 

Early Computing Machines and Inventors

The abacus, which emerged about 5,000 years ago in Asia Minor and is still in use today, may be considered the first computer. This device allows users to make computations using a system of sliding beads arranged on a rack. Early merchants used the abacus to keep trading transactions. But as the use of paper and pencil spread, particularly in Europe, the abacus lost its importance. It took nearly 12 centuries, however, for the next significant advance in computing devices to emerge. In 1642, Blaise Pascal (1623-1662), the 18-year-old son of a French tax collector, invented what he called a numerical wheel calculator to help his father with his duties. This brass rectangular box, also called a Pascaline, used eight movable dials to add sums up to eight figures long. Pascal's device used a base of ten to accomplish this. For example, as one dial moved ten notches, or one complete revolution, it moved the next dial - which represented the ten's column - one place. When the ten's dial moved one revolution, the dial representing the hundred's place moved one notch, and so on.
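To make the carry mechanism concrete, here is a tiny simulation of decimal dials in which a full revolution of one dial advances the next dial one place; it illustrates the principle only, not Pascal's actual mechanism.

```python
# Illustrative sketch only: simulating the carry behaviour described above.
def add_on_dials(dials, amount):
    """dials[0] is the units dial, dials[1] the tens dial, and so on."""
    dials = dials[:]              # work on a copy
    carry = amount
    for i in range(len(dials)):
        total = dials[i] + carry
        dials[i] = total % 10     # position the dial stops at
        carry = total // 10       # full revolutions passed on to the next dial
        if carry == 0:
            break
    return dials

dials = [9, 9, 0, 0]              # the Pascaline had eight dials; four shown here
print(add_on_dials(dials, 5))     # -> [4, 0, 1, 0], i.e. 99 + 5 = 104
```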

The drawback to the Pascaline, of course, was its limitation to addition.

In 1694, a German mathematician and philosopher, Gottfried Wilhelm von Leibniz (1646-1716), improved the Pascaline by creating a machine that could also multiply. Like its predecessor, Leibniz's mechanical multiplier worked by a system of gears and dials. Partly by studying Pascal's original notes and drawings, Leibniz was able to refine his machine. The centerpiece of the machine was its stepped-drum gear design, which offered an elongated version of the simple flat gear. It wasn't until 1820, however, that mechanical calculators gained widespread use. Charles Xavier Thomas de Colmar, a Frenchman, invented a machine that could perform the four basic arithmetic functions. Colmar's mechanical calculator, the arithmometer, presented a more practical approach to computing because it could add, subtract, multiply and divide. With its enhanced versatility, the arithmometer was widely used up until the First World War. Although later inventors refined Colmar's calculator, together with fellow inventors Pascal and Leibniz, he helped define the age of mechanical computation. The real beginnings of computers as we know them today, however, lay with an English mathematics professor, Charles Babbage (1791-1871). Frustrated at the many errors he found while examining calculations for the Royal Astronomical Society, Babbage declared, "I wish to God these calculations had been performed by steam!" With those words, the automation of computers had begun. By 1812, Babbage noticed a natural harmony between machines and mathematics: machines were best at performing tasks repeatedly without mistake, while mathematics, particularly the production of mathematical tables, often required the simple repetition of steps. The problem centered on applying the ability of machines to the needs of mathematics. Babbage's first attempt at solving this problem came in 1822, when he proposed a machine to perform differential equations, called a Difference Engine. Powered by steam and as large as a locomotive, the machine would have a stored program and could perform calculations and print the results automatically. After working on the Difference Engine for 10 years, Babbage was suddenly inspired to begin work on the first general-purpose computer, which he called the Analytical Engine. Babbage's assistant, Augusta Ada King, Countess of Lovelace (1815-1852) and daughter of English poet Lord Byron, was instrumental in the machine's design. One of the few people who understood the Engine's design as well as Babbage, she helped revise plans, secure funding from the British government, and communicate the specifics of the Analytical Engine to the public. Also, Lady Lovelace's fine understanding of the machine allowed her to create the instruction routines to be fed into the computer, making her the first female computer programmer. In the 1980's, the U.S. Defense Department named a programming language ADA in her honor.

Babbage's steam-powered Engine, although ultimately never constructed, may seem primitive by today's standards. However, it outlined the basic elements of a modern general purpose computer and was a breakthrough concept. Consisting of over 50,000 components, the basic design of the Analytical Engine included input devices in the form of perforated cards containing operating instructions and a "store" for memory of 1,000 numbers of up to 50 decimal digits long. It also contained a "mill" with a control unit that allowed processing instructions in any sequence, and output devices to produce printed results. Babbage borrowed the idea of punch cards to encode the machine's instructions from the Jacquard loom. The loom, produced in 1820 and named after its inventor, Joseph-Marie Jacquard, used punched boards that controlled the patterns to be woven.

In 1889, an American inventor, Herman Hollerith (1860-1929), also applied the Jacquard loom concept to computing. His first task was to find a faster way to compute the U.S. census. The previous census in 1880 had taken nearly seven years to count, and with an expanding population, the bureau feared it would take 10 years to count the latest census. Unlike Babbage's idea of using perforated cards to instruct the machine, Hollerith's method used cards to store data information which he fed into a machine that compiled the results mechanically. Each punch on a card represented one number, and combinations of two punches represented one letter. As many as 80 variables could be stored on a single card. Instead of ten years, census takers compiled their results in just six weeks with Hollerith's machine. In addition to their speed, the punch cards served as a storage method for data and they helped reduce computational errors. Hollerith brought his punch card reader into the business world, founding Tabulating Machine Company in 1896, later to become International Business Machines (IBM) in 1924 after a series of mergers. Other companies such as Remington Rand and Burroughs also manufactured punch readers for business use. Both business and government used punch cards for data processing until the 1960's.

In the ensuing years, several engineers made other significant advances. Vannevar Bush (1890-1974) developed a calculator for solving differential equations in 1931. The machine could solve complex differential equations that had long left scientists and mathematicians baffled. The machine was cumbersome because hundreds of gears and shafts were required to represent numbers and their various relationships to each other. To eliminate this bulkiness, John V. Atanasoff, a professor at Iowa State College (now called Iowa State University), and his graduate student, Clifford Berry, envisioned an all-electronic computer that applied Boolean algebra to computer circuitry. This approach was based on the mid-19th century work of George Boole (1815-1864), who clarified the binary system of algebra, which stated that any mathematical equation could be stated simply as either true or false. By extending this concept to electronic circuits in the form of on or off, Atanasoff and Berry had developed the first all-electronic computer by 1940. Their project, however, lost its funding and their work was overshadowed by similar developments by other scientists.

 

2. Speaking about a computer we usually associate it with the PCs we use at home, in the office or at the university. Everybody who deals with information needs a computer. But present day PCs greatly differ from the first calculating machines.

Read the text and:

- point out the main reasons for the development of computers in the 19th century and in the 20th century;
- point out the main stages of the development of computers;
- describe areas where computers are used today.

 

Development of Computers

In the 19th century the need for rapid calculation expanded throughout the industrial world. Governments taxed and policed larger populations than ever before. Commerce expanded so that there were more money transactions than ever before.

Armies of clerks were employed to calculate and record the mass of transactions conducted by business houses, banks and insurance companies. Scientists and engineers required ever more extensive tables of figures.

To meet the demand, new designs of calculating machine were devised.

In the 20th century electricity was harnessed to drive a variety of calculating machines. But the first general-purpose computing machine that was fully electronic was ENIAC (Electronic Numerical Integrator and Calculator), completed at the University of Pennsylvania in 1945. It employed more than 18,000 thermionic valves, weighed 30 tons and occupied 1,500 sq. ft of floor space.

In the post-war years more computers were built, generally in university research departments. The term 'electronic brain' was coined.

The first part of the economy in which computers became important was finance. In banks and finance houses information began to be recorded directly in machine-readable form by operators at keyboard machines. At first numbers were recorded on punched paper tape or cards; later these were supplanted by magnetic tape and discs. The numbers of clerical staff did not fall, but their productivity rose as the number of transactions they could process swelled. In the early 1980s, for instance, the National Westminster Bank in Britain processed some 2 million cheques and 650,000 credits in each working day.

Large companies computerised their payrolls. Shops and stores kept track of goods with the aid of computers and cut their reserve stocks; hence they could reduce their warehouse costs and free space for a wider variety of goods.

Complex industrial processes such as oil refining and steel-rolling were handed over to the control of the computer. Industrial design depended more and more on the computer. It would be impossible to design a new car or jet airliner with a reasonable expenditure of time and money without computers to carry out the enormous number of calculations involved.

The mammoth American company IBM dominated these developments. When delivery of Univac II, announced by IBM's rival Remington Rand in 1955, was delayed until 1957 by production difficulties, IBM captured the market in large computers.

IBM maintained its lead when the 'second generation' of computers appeared around 1960. These employed transistors in place of valves and were more powerful than their predecessors, yet more compact, reliable and economical of energy. They could be housed in a few cabinets, rather than filling a large air-conditioned room.

The trend towards smallness and cheapness was enormously accelerated when the 'third generation' of computers, based on the silicon chip, appeared around 1965. Electronic components, such as transistors, could now be made in large numbers on a thin square of silicon, typically 1/4 in. square. By 1971 the first microprocessor had appeared in America: the microprocessor was the heart of a computer - the part that does the actual calculating - on a single chip. Other chips could provide memory stores.

When input/output devices, such as a keyboard and printing machine, were added, a complete computing system was obtained that could fit onto a desktop. Such a unit can store about 2 1/2 million characters - letters or numbers - of information. Calculations are completed in seconds and the print-out runs at between 80 and 120 characters a second.

A visual display unit - a TV screen that could display text punched in by means of a keyboard, together with the computer's replies - permitted an operator to put instructions and questions to the computer and receive responses.

The computer, now smaller, cheaper and more accessible to ordinary people than ever before, has appeared in the office, on the factory floor and in the home. Computer terminals are seen at airline and theatre reservations desks, in stockbrokers' offices, in factory stockrooms, in power-station control rooms and in banks.

Even the toy departments in large stores sell computers: some create video games on home TV sets; others play chess and draughts - sometimes with the machine speaking its moves. But the increasing power of the computer and its 'software' - its programming - has transformed daily life in ways that can pass unnoticed. Computer-fed weather forecasts are more accurate and range further ahead. Greater volumes of road traffic are handled with less delay, by computerised traffic-light systems responding to information about the flow of vehicles from automatic sensors.

Some cars are now equipped with a microcomputer that continuously controls the fuel mixture and ignition timing, which optimises performance and economises on fuel. There are also 'trip computers' which display details of average speed and fuel consumption since the beginning of the journey.

The defensive networks of the major powers are co-ordinated by computers. The dangers of present-day reliance on them were vividly illustrated in June 1980, when a single micro-chip in a North American Air Defence Command computer developed a fault. Twice in three days a false warning of enemy missile attack was flashed to US military forces around the world. On both occasions the American war machine was on full alert for three minutes.

The jobs of many skilled workers are threatened by the computer. For example, engineering draughtsmen and machine-tool operators can be bypassed in the production of mechanical components such as gear boxes or car engine blocks. A designer draws rough diagrams on a TV screen linked to a computer. The computer straightens the lines, smooths the curves, alters the diagram to show different views of the component, and revises it according to the designer's instructions. When the design is finished, the computer can produce magnetic tapes that will control a machine tool. The tool will shape the part with an accuracy superior to that of a human operator.

Libraries of information now exist in electronic form, and are called data banks. A typical magnetic disc can store a million words of text - the equivalent of ten long novels. Computers retrieve information from them and analyse it in the same way in which they deal with numbers in calculations.

The extension and pooling of data banks pose a threat to the individual's right to privacy. It could happen, for example, that information about a patient's episode of depression could be stored in a data bank serving a network of hospitals. Years later an officer in a government department considering that individual for employment might gain access to the information at the touch of a button and could hardly avoid being prejudiced by it.

Certain companies specialise in providing information about individuals' credit-worthiness to finance and hire-purchase companies. These computerised files are now so comprehensive that a high proportion of the population is listed in at least one such data bank, without being aware of it. Inaccurate information can lead to a person being denied credit without his knowing why.

In the field of crime, the computer can be used in the recognition and apprehension of criminals. Detailed information about a suspect's background may be obtained from a computer and sent by radio to a policeman on the spot - resulting in a speedy arrest. It can also bring together and analyse scattered items of information and so rapidly detect social patterns and trends.

 

3. The 20th century can be considered the Computer Age. The remarkable thing about the Computer Age is that so much has happened in such a short time. Mankind "leapfrogged" through four generations of technology in just over 30 years. It is a span of time whose events are within memories of many people.

Read the text "Five Generations of Modern Computers" and name the period of time for each generation and the technological development with which each generation was connected.

 

Five Generations of Modern Computers

 

First Generation Computers (1945-1956)

With the onset of the Second World War, governments sought to develop computers to exploit their potential strategic importance. This increased funding for computer development projects hastened technical progress. By 1941 German engineer Konrad Zuse had developed a computer, the Z3, to design airplanes and missiles. The Allied forces, however, made greater strides in developing powerful computers. In 1943, the British completed a secret code-breaking computer called Colossus to decode German messages. The Colossus's impact on the development of the computer industry was rather limited for two important reasons. First, Colossus was not a general-purpose computer; it was only designed to decode secret messages. Second, the existence of the machine was kept secret until decades after the war.

American efforts produced a broader achievement. Howard H. Aiken (1900-1973), a Harvard engineer working with IBM, succeeded in producing an all-electronic calculator by 1944. The purpose of the computer was to create ballistic charts for the U.S. Navy. It was about half as long as a football field and contained about 500 miles of wiring. The Harvard-IBM Automatic Sequence Controlled Calculator, or Mark I for short, was an electronic relay computer. It used electromagnetic signals to move mechanical parts. The machine was slow (taking 3-5 seconds per calculation) and inflexible (in that sequences of calculations could not change); but it could perform basic arithmetic as well as more complex equations.

Another computer development spurred by the war was the Electronic Numerical Integrator and Computer (ENIAC), produced by a partnership between the U.S. government and the University of Pennsylvania. Consisting of 18,000 vacuum tubes, 70,000 resistors and 5 million soldered joints, the computer was such a massive piece of machinery that it consumed 160 kilowatts of electrical power, enough energy to dim the lights in an entire section of Philadelphia. Developed by John Presper Eckert (1919-1995) and John W. Mauchly (1907-1980), ENIAC, unlike the Colossus and Mark I, was a general-purpose computer that computed at speeds 1,000 times faster than Mark I.

In the mid-1940's John von Neumann (1903-1957) joined the University of Pennsylvania team, initiating concepts in computer design that remained central to computer engineering for the next 40 years. Von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC) in 1945 with a memory to hold both a stored program and data. This "stored memory" technique, as well as the "conditional control transfer" that allowed the computer to be stopped at any point and then resumed, allowed for greater versatility in computer programming. The key element of the von Neumann architecture was the central processing unit, which allowed all computer functions to be coordinated through a single source. In 1951, the UNIVAC I (Universal Automatic Computer), built by Remington Rand, became one of the first commercially available computers to take advantage of these advances. Both the U.S. Census Bureau and General Electric owned UNIVACs. One of UNIVAC's impressive early achievements was predicting the winner of the 1952 presidential election, Dwight D. Eisenhower.

First generation computers were characterized by the fact that operating instructions were made-to-order for the specific task for which the computer was to be used. Each computer had a different binary-coded program called a machine language that told it how to operate. This made the computer difficult to program and limited its versatility and speed. Other distinctive features of first generation computers were the use of vacuum tubes (responsible for their breathtaking size) and magnetic drums for data storage.

Second Generation Computers (1956-1963)

By 1948, the invention of the transistor greatly changed the computer's development. The transistor replaced the large, cumbersome vacuum tube in televisions, radios and computers. As a result, the size of electronic machinery has been shrinking ever since. The transistor was at work in the computer by 1956. Coupled with early advances in magnetic-core memory, transistors led to second generation computers that were smaller, faster, more reliable and more energy-efficient than their predecessors. The first large-scale machines to take advantage of this transistor technology were early supercomputers, Stretch by IBM and LARC by Sperry-Rand. These computers, both developed for atomic energy laboratories, could handle an enormous amount of data, a capability much in demand by atomic scientists. The machines were costly, however, and tended to be too powerful for the business sector's computing needs, thereby limiting their attractiveness. Only two LARCs were ever installed: one in the Lawrence Radiation Labs in Livermore, California, for which the computer was named (Livermore Atomic Research Computer), and the other at the U.S. Navy Research and Development Center in Washington, D.C. Second generation computers replaced machine language with assembly language, allowing abbreviated programming codes to replace long, difficult binary codes.

Throughout the early 1960's, there were a number of commercially successful second generation computers used in business, universities, and government from companies such as Burroughs, Control Data, Honeywell, IBM, Sperry-Rand, and others. These second generation computers were also of solid state design, and contained transistors in place of vacuum tubes. They also contained all the components we associate with the modern day computer: printers, tape storage, disk storage, memory, operating systems, and stored programs. One important example was the IBM 1401, which was universally accepted throughout industry, and is considered by many to be the Model T of the computer industry. By 1965, most large businesses routinely processed financial information using second generation computers.

It was the stored program and programming language that gave computers the flexibility to finally be cost effective and productive for business use. The stored program concept meant that instructions to run a computer for a specific function (known as a program) were held inside the computer's memory, and could quickly be replaced by a different set of instructions for a different function. A computer could print customer invoices and minutes later design products or calculate paychecks. More sophisticated high-level languages such as COBOL (Common Business-Oriented Language) and FORTRAN (Formula Translator) came into common use during this time, and have expanded to the current day. These languages replaced cryptic binary machine code with words, sentences, and mathematical formulas, making it much easier to program a computer. New types of careers (programmer, analyst, and computer systems expert) and the entire software industry began with second generation computers.

Third Generation Computers (1964-1971)

Though transistors were clearly an improvement over the vacuum tube, they still generated a great deal of heat, which damaged the computer's sensitive internal parts. The quartz rock eliminated this problem. Jack Kilby, an engineer with Texas Instruments, developed the integrated circuit (IC) in 1958. The IC combined three electronic components onto a small silicon disc, which was made from quartz. Scientists later managed to fit even more components on a single chip, called a semiconductor. As a result, computers became ever smaller as more components were squeezed onto the chip. Another third-generation development included the use of an operating system that allowed machines to run many different programs at once with a central program that monitored and coordinated the computer's memory.

