Abstract on computer science: "History of the development of computer technology." History of the development of computer technology History of the development of computer technology articles

Municipal educational institution

«Secondary General Education School No. 2035»

Abstract on computer science

«History of the Development of Computer Technology»

The work was prepared by:

7th grade student

Belyakov Nikita

Checked by:

IT teacher

Dubova E.V.

Moscow, 2015

Introduction

As human society developed, it mastered not only matter and energy, but also information. With the advent and widespread distribution of computers, people received a powerful tool for the effective use of information resources and for enhancing their intellectual activity. From that moment (the mid-20th century), the transition from an industrial society to an information society began, in which information becomes the main resource.

The ability of members of society to use complete, timely and reliable information largely depends on the degree of development and mastery of new information technologies, the basis of which are computers. Let's consider the main milestones in the history of their development.

Beginning of an era

The first computer, ENIAC, was created at the end of 1945 in the USA.

The basic ideas on which computer technology developed over many years were formulated in 1946 by the American mathematician John von Neumann. They were called von Neumann architecture.

In 1949, the first computer with von Neumann architecture was built - the English EDSAC machine. A year later, the American EDVAC computer appeared.

In our country, the first computer was created in 1951. It was called MESM, the Small Electronic Calculating Machine. Its designer was Sergei Alekseevich Lebedev.

Serial production of computers began in the 50s of the 20th century.

Electronic computer technology is usually divided into generations associated with changes in the element base. In addition, machines of different generations differ in logical architecture and software, speed, RAM capacity, methods of input and output of information, and so on.

S.A. Lebedev was born in Nizhny Novgorod into the family of the teacher and writer Alexei Ivanovich Lebedev and the teacher Anastasia Petrovna (nee Mavrina), a noblewoman. He was the third child in the family; his older sister was the artist Tatyana Mavrina. In 1920, the family moved to Moscow.

In April 1928 he graduated from the Bauman Higher Technical School with a degree in electrical engineering.

First generation of computers

The first generation of computers were the tube machines of the 1950s. The counting speed of the fastest first-generation machines reached 20 thousand operations per second. Punched tapes and punched cards were used to enter programs and data. Since the internal memory of these machines was small (it could hold several thousand numbers and program commands), they were mainly used for engineering and scientific calculations not related to the processing of large volumes of data. These were rather bulky structures, containing thousands of vacuum tubes, sometimes occupying hundreds of square meters and consuming hundreds of kilowatts of electricity. Programs for such machines were written in machine command languages, so programming in those days was accessible to few.

Second generation of computers

In 1947, the first semiconductor device capable of replacing a vacuum tube was created: the transistor. In the 1960s, transistors became the element base of second-generation computers. The transition to semiconductor elements improved the quality of computers in all respects: they became more compact, more reliable, and less energy-intensive. The speed of most machines reached tens and hundreds of thousands of operations per second. The volume of internal memory increased hundreds of times compared with first-generation computers. External (magnetic) memory devices developed rapidly: magnetic drums and magnetic tape drives. Thanks to this, it became possible to create information, reference, and search systems on a computer (driven by the need to store large amounts of information on magnetic media for a long time). During the second generation, high-level programming languages began to develop actively; the first of them were FORTRAN, ALGOL and COBOL. Programming as an element of literacy became widespread, mainly among people with higher education.

Third generation of computers

The third generation of computers was created on a new element base: on a small wafer of semiconductor material with an area of less than 1 cm², entire electronic circuits were placed. They were called integrated circuits (ICs). The first ICs contained dozens, then hundreds of elements (transistors, resistors, etc.). When the degree of integration (the number of elements) approached a thousand, they came to be called large-scale integrated circuits (LSI); later, very-large-scale integrated circuits (VLSI) appeared. Third-generation computers began to be produced in the second half of the 1960s, when the American company IBM started production of the IBM System/360 family of machines. In the Soviet Union, production of the ES series (Unified System of Computers) began in the 1970s. The transition to the third generation brought significant changes in computer architecture. It became possible to run several programs simultaneously on one machine; this mode of operation is called multiprogramming. The operating speed of the most powerful computer models reached several million operations per second. On third-generation machines a new type of external storage device appeared, the magnetic disk, and new types of input/output devices came into wide use: displays and plotters. During this period, the areas of application of computers expanded significantly. Databases, the first artificial intelligence systems, computer-aided design (CAD) and automated control systems began to be created. In the 1970s, the line of small (mini) computers developed powerfully.

Fourth generation of computers

Another revolutionary event in electronics occurred in 1971, when the American company Intel announced the creation of a microprocessor. A microprocessor is a large-scale integrated circuit capable of performing the functions of the main unit of a computer, the processor. Initially, microprocessors were built into various technical devices: machine tools, cars, airplanes. Connecting a microprocessor with input-output devices and external memory produced a new type of computer: the microcomputer. Microcomputers are fourth-generation machines. A significant difference between microcomputers and their predecessors is their small size (about the size of a household TV) and comparatively low cost. This was the first type of computer to appear in retail sale.

The most popular type of computer today is the personal computer (PC). The first PC was born in 1976 in the USA. Since the early 1980s, the American company IBM has been the trendsetter in the PC market. Its designers managed to create an architecture that effectively became the international standard for professional PCs. The machines of this series were called the IBM PC (Personal Computer). The emergence and spread of the personal computer is comparable in its significance for social development to the advent of printing. It was PCs that made computer literacy a mass phenomenon. With the development of this type of machine, the concept of "information technology" appeared, without which most areas of human activity can no longer manage.

Another line in the development of fourth-generation computers is the supercomputer. Machines of this class have speeds of hundreds of millions and billions of operations per second. A supercomputer is a multiprocessor computing complex.

Conclusion

Developments in the field of computer technology continue. Fifth-generation computers are the machines of the near future. Their main quality should be a high intellectual level. They will allow voice input, voice communication, machine "vision," and machine "touch."

Fifth-generation machines will implement elements of artificial intelligence.

http://otvet.mail.ru/question/73952848

One of the first devices (V-IV centuries BC), from which the history of the development of computers can be considered to have begun, was a special board, later called the abacus. Calculations on it were carried out by moving pebbles or beads in the recesses of boards made of bronze, stone, ivory and the like. In Greece, the abacus already existed in the 5th century BC; the Japanese called it the "soroban", the Chinese the "suanpan". In Ancient Rus', a similar device, the "counting board", was used; in the 17th century it took the form of the familiar Russian abacus.

Abacus (V-IV centuries BC)

The French mathematician and philosopher Blaise Pascal created the first adding machine in 1642; it was named the Pascaline in honor of its creator. A mechanical device in the form of a box with many gears, it performed subtraction as well as addition. Data was entered by turning dials that corresponded to the digits 0 to 9, and the answer appeared at the top of the metal case.


Pascalina

In 1673, Gottfried Wilhelm Leibniz created a mechanical calculating device (the Leibniz calculator, or stepped reckoner), which for the first time could not only add and subtract but also multiply, divide and extract square roots. Subsequently, the Leibniz wheel became the prototype for mass-produced calculating instruments: adding machines.


Leibniz step calculator model

The English mathematician Charles Babbage developed a device that not only performed arithmetic operations but also immediately printed the results. In 1832, a one-tenth scale model was built from two thousand brass parts; it weighed three tons but was capable of performing arithmetic operations accurate to the sixth decimal place and calculating second-order differences. This device became the prototype of real computers; it was called the Difference Engine.

Difference Engine

A summing apparatus with continuous transmission of tens was created by the Russian mathematician and mechanic Pafnuty Lvovich Chebyshev. This device achieved automation of all arithmetic operations. In 1881, an attachment to the adding machine for multiplication and division was created. The principle of continuous transmission of tens has since been widely used in various counters and computers.


Chebyshev summing apparatus

Automated data processing appeared at the end of the 19th century in the USA. Herman Hollerith created a device, the Hollerith tabulator, in which information punched on cards was read by means of electric current.

Hollerith tabulator

In 1936, a young Cambridge scientist, Alan Turing, conceived a calculating machine that existed only on paper. His "smart machine" operated according to a specified algorithm; depending on the algorithm, the imaginary machine could be used for a wide variety of purposes. At the time these were purely theoretical considerations and schemes, but they served as the prototype of the programmable computer: a computing device that processes data according to a defined sequence of commands.

Information revolutions in history

In the history of the development of civilization, several information revolutions have occurred: transformations of social relations due to changes in the processing, storage and transmission of information.

The first revolution is associated with the invention of writing, which led to a gigantic qualitative and quantitative leap in civilization. It became possible to pass knowledge from generation to generation.

The second revolution (mid-16th century) was caused by the invention of printing, which radically changed industrial society, culture, and the organization of activity.

The third revolution (end of the 19th century) is associated with discoveries in the field of electricity, thanks to which the telegraph, the telephone and the radio appeared: devices that made it possible to quickly transmit and accumulate information in any volume.

The fourth revolution (since the 1970s) is associated with the invention of microprocessor technology and the advent of the personal computer. Computers and data transmission systems (information communications) are created using microprocessors and integrated circuits.

This period is characterized by three fundamental innovations:

  • transition from mechanical and electrical means of information conversion to electronic ones;
  • miniaturization of all components, devices, instruments, machines;
  • creation of software-controlled devices and processes.

History of the development of computer technology

The need for storing, transforming and transmitting information appeared in humans much earlier than the creation of the telegraph apparatus, the first telephone exchange and the electronic computer. In fact, all the experience and knowledge accumulated by humanity contributed, one way or another, to the emergence of computer technology. The history of the creation of computers, the general name for electronic machines that perform calculations, begins far in the past and is associated with the development of almost all aspects of human life and activity. For as long as human civilization has existed, some automation of calculation has been in use.

The history of the development of computer technology spans about seven decades. During this time, several generations of computers have succeeded one another. Each subsequent generation was distinguished by new elements (vacuum tubes, transistors, integrated circuits) whose manufacturing technology was fundamentally different. Currently, there is a generally accepted classification of computer generations:

  • First generation (1946 - early 1950s). Element base: vacuum tubes. Computers were distinguished by large dimensions, high energy consumption, low speed, low reliability, and programming in machine codes.
  • Second generation (late 1950s - early 1960s). Element base: semiconductors. Almost all technical specifications improved compared with the previous generation. Algorithmic languages were used for programming.
  • Third generation (late 1960s - late 1970s). Element base: integrated circuits, multilayer printed circuit boards. A sharp reduction in the size of computers, increased reliability, increased performance. Access from remote terminals.
  • Fourth generation (mid-1970s to late 1980s). Element base: microprocessors, large-scale integrated circuits. Technical characteristics improved further. Mass production of personal computers. Directions of development: powerful multiprocessor computing systems with high performance, creation of cheap microcomputers.
  • Fifth generation (since the mid-1980s). Development of intelligent computers began, so far without full success. Introduction of computer networks into all areas and their integration, use of distributed data processing, widespread use of computer information technologies.

Along with the change of generations of computers, the nature of their use also changed. If at first they were created and used mainly to solve computational problems, then later the scope of their application expanded. This includes information processing, automation of control of production, technological and scientific processes, and much more.

Principles of operation of computers by Konrad Zuse

The idea of the possibility of building an automated calculating apparatus occurred to the German engineer Konrad Zuse, and in 1934 Zuse formulated the basic principles on which future computers should work:

  • binary number system;
  • use of devices operating on the "yes/no" principle (logical 1/0);
  • fully automated operation of the computer;
  • software control of the computation process;
  • support for floating-point arithmetic;
  • use of large-capacity memory.

Zuse was the first in the world to determine that data processing begins with the bit (he called the bit the "yes/no status" and the formulas of binary algebra "conditional propositions"), the first to introduce the term "machine word" (Word), and the first to combine arithmetic and logical operations in a single unit, noting that "the elementary operation of a computer is testing two binary numbers for equality. The result will also be a binary number with two values (equal, not equal)."
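
For illustration, Zuse's "elementary operation" can be sketched in a few lines of modern Python (a toy example of ours, of course, not historical code): two binary words, represented as tuples of yes/no bits, are compared bit by bit, and the result is itself a single yes/no value.

    # A toy illustration of Zuse's elementary operation: testing two
    # binary numbers for equality. Each word is a tuple of bits,
    # i.e. a row of "yes/no statuses".
    def bits_equal(a, b):
        """Return 1 ("equal") or 0 ("not equal")."""
        if len(a) != len(b):
            return 0
        for bit_a, bit_b in zip(a, b):
            if bit_a != bit_b:  # one yes/no decision per bit position
                return 0
        return 1

    word1 = (1, 0, 1, 1)  # binary 1011
    word2 = (1, 0, 1, 1)
    word3 = (1, 1, 0, 1)
    print(bits_equal(word1, word2))  # 1 -> "equal"
    print(bits_equal(word1, word3))  # 0 -> "not equal"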

First generation - computers with vacuum tubes

Colossus I was the first tube-based computer, created by the British in 1943 to break German military ciphers; it contained 1,800 vacuum tubes and was one of the first programmable electronic digital computers.

ENIAC was created to calculate artillery ballistics tables. This computer weighed 30 tons, occupied about 300 m² of floor space and consumed up to 174 kW of electricity. It contained 17,468 vacuum tubes of sixteen types, 7,200 crystal diodes and 4,100 magnetic elements, housed in cabinets with a total volume of about 100 m³. ENIAC performed 5,000 operations per second. The total cost of the machine was $750,000.


ENIAC - a device for calculating artillery ballistics tables

Another representative of the first generation of computers worth noting is EDVAC (Electronic Discrete Variable Computer). EDVAC is interesting because it attempted to record programs electronically in so-called "ultrasonic delay lines" using mercury tubes. In 126 such lines it was possible to store 1,024 four-digit binary numbers; this was the "fast" memory. The "slow" memory was supposed to record numbers and commands on magnetic wire, but this method proved unreliable, and teletype tapes had to be used instead. EDVAC was faster than its predecessor, adding in 1 µs and dividing in 3 µs. It contained only 3.5 thousand vacuum tubes and occupied 13 m² of floor area.

UNIVAC (Universal Automatic Computer) was an electronic device with programs stored in memory, which were entered there not from punched cards, but using magnetic tape; this ensured high speed of reading and writing information, and, consequently, higher performance of the machine as a whole. One tape could contain a million characters, written in binary form. Tapes could store both programs and intermediate data.


Representatives of the first generation of computers: 1) Electronic Discrete Variable Computer; 2) Universal Automatic Computer

Second generation - computers with transistors

Transistors replaced vacuum tubes in the early 1960s. A transistor acts like an electrical switch: it consumes less power, generates less heat, and takes up less space. Combining several transistor circuits on a single semiconductor plate produces an integrated circuit (a "chip", literally a small plate). Transistors work as binary elements: they record two states, the presence or absence of current, and thereby process information presented in exactly this binary form.
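
To see how switches that merely pass or block current can add numbers, here is a small Python sketch (a modern illustration of the principle, not of any real circuit): a NAND gate is modeled as two switches in series, every other gate is assembled from NAND, and the gates together form a half-adder, the simplest circuit that "counts" in binary.

    # Each "transistor" is a switch: 1 = current present, 0 = current absent.
    def nand(a, b):
        """Two switches in series: the output drops to 0 only if both conduct."""
        return 0 if (a == 1 and b == 1) else 1

    # Every other gate can be built from NAND alone:
    def not_(a):
        return nand(a, a)
    def and_(a, b):
        return not_(nand(a, b))
    def or_(a, b):
        return nand(not_(a), not_(b))
    def xor_(a, b):
        return and_(or_(a, b), nand(a, b))

    # A half-adder: the smallest circuit that adds in binary.
    def half_adder(a, b):
        return xor_(a, b), and_(a, b)  # (sum bit, carry bit)

    print(half_adder(1, 1))  # (0, 1): in binary, 1 + 1 = 10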

In 1951, William Shockley created the junction (p-n) transistor. The transistor replaced the vacuum tube while operating at a higher speed, producing very little heat and consuming almost no electricity. Simultaneously with the replacement of vacuum tubes by transistors, methods of storing information improved: magnetic cores and magnetic drums began to be used as memory devices, and by the 1960s the storage of information on disks had become widespread.

One of the first transistor computers, the Atlas Guidance Computer, was launched in 1957 and was used to control the launch of the Atlas rocket.

Created in 1957, the RAMAC was a low-cost computer with modular external disk memory and combined magnetic-core and drum random access memory. Although this computer was not yet fully transistorized, it was distinguished by high performance and ease of maintenance and was in great demand in the office automation market. A "large" RAMAC (the IBM 305) was therefore urgently released for corporate customers; to accommodate 5 MB of data, the RAMAC system needed 50 disks with a diameter of 24 inches. An information system based on this model flawlessly processed arrays of requests in 10 languages.

In 1959, IBM created its first all-transistor large mainframe, the 7090, capable of 229,000 operations per second: a true transistorized mainframe. In 1964, the SABRE automated ticket sales and booking system of American Airlines, built on two 7090 mainframes, went into service in 65 cities around the world.

In 1960, DEC introduced the world's first minicomputer, the PDP-1 (Programmed Data Processor), a computer with a monitor and keyboard that became one of the most notable products on the market. It was capable of performing 100,000 operations per second, and the machine itself occupied only 1.5 m² of floor space. The PDP-1 became, in effect, the world's first gaming platform thanks to MIT student Steve Russell, who wrote the computer game Spacewar! for it.


Representatives of the second generation of computers: 1) RAMAC; 2) PDP-1

In 1965, Digital launched the first serially produced minicomputer, the PDP-8: its price was about $10,000, and the machine was the size of a refrigerator. This PDP-8 model could be purchased by laboratories, universities and small businesses.

Domestic computers of that time can be characterized as follows: in architectural, circuit and functional solutions they corresponded to their time, but their capabilities were limited by the imperfection of the production and element base. The most popular machines were those of the BESM series. Serial production, quite modest in scale, began with the release of the Ural-2 computer (1958), followed by the BESM-2, Minsk-1 and Ural-3 (all 1959). In 1960, the M-20 and Ural-4 series went into production. The highest-performance machine at the end of 1960 was the M-20 (4,500 tubes, 35 thousand semiconductor diodes, memory of 4,096 cells), at 20 thousand operations per second. The first computers based on semiconductor elements (Razdan-2, Minsk-2, M-220 and Dnepr) were still under development.

Third generation - small-sized computers based on integrated circuits

In the 1950s and 1960s, assembling electronic equipment was a labor-intensive process slowed by the increasing complexity of electronic circuits. For example, the CDC 1604 computer (1960, Control Data Corp.) contained about 100 thousand diodes and 25 thousand transistors.

In 1959, the Americans Jack St. Clair Kilby (Texas Instruments) and Robert N. Noyce (Fairchild Semiconductor) independently invented the integrated circuit (IC): a set of transistors and other components placed on a single silicon chip.

The production of computers using ICs (they were later called microcircuits) was much cheaper than using transistors. Thanks to this, many organizations were able to purchase and use such machines. And this, in turn, led to an increase in demand for general-purpose computers designed to solve various problems. During these years, computer production acquired an industrial scale.

At the same time, semiconductor memory appeared, which is still used in personal computers to this day.


Representative of the third generation of computers - ES-1022

Fourth generation - personal computers based on microprocessors

The forerunners of the IBM PC were the Apple II, Radio Shack TRS-80, Atari 400 and 800, Commodore 64 and Commodore PET.

The birth of personal computers (PCs) is rightfully associated with Intel processors. The corporation was founded in mid-June 1968. Since then, Intel has grown into the world's largest manufacturer of microprocessors, with more than 64 thousand employees. Intel's initial goal was to create semiconductor memory, and in order to survive the company also took third-party orders for the development of semiconductor devices.

In 1971, Intel received an order to develop a set of 12 chips for programmable microcalculators, but its engineers found the creation of 12 specialized chips cumbersome and inefficient. The problem of reducing the range of microcircuits was solved by creating a "pair" consisting of semiconductor memory and an execution unit capable of operating on commands stored in that memory. It was a breakthrough in computing philosophy: a universal logic unit in the form of a 4-bit central processing unit, the i4004, later called the first microprocessor. It was a set of 4 chips, including one chip controlled by commands stored in the semiconductor internal memory.

As a commercial product, the microprocessor appeared on the market on November 11, 1971 under the name 4004: 4-bit, containing 2,300 transistors, clocked at 60 kHz, and costing $200. In 1972, Intel released the eight-bit 8008 microprocessor, and in 1974 its improved version, the Intel 8080, which by the end of the 1970s had become the standard for the microcomputer industry. Already in 1973, the first computer based on the 8008 processor, the Micral, appeared in France. For various reasons, Intel's processor was not immediately successful in America (in the Soviet Union the 8080 was copied and produced for a long time under the name KR580VM80). At the same time, a group of engineers left Intel and formed Zilog. Its most high-profile product was the Z80, which had an extended 8080 instruction set and, crucially for its commercial success in household devices, made do with a single 5 V supply voltage. On its basis, in particular, the ZX Spectrum computer was created (sometimes called after its creator, Sinclair), which practically became the prototype of the home PC of the mid-1980s. In 1978, Intel released the 16-bit 8086 processor, followed by the 8088, an analogue of the 8086 except for its external 8-bit data bus (all peripherals were still 8-bit at the time).

A competitor to the IBM PC, the Apple II computer was distinguished by the fact that it was not a completely closed device: some freedom was left for modification directly by the user, who could install additional interface boards, memory boards, and so on. It was this feature, later called "open architecture," that became its main advantage. The success of the Apple II was also promoted by two innovations: inexpensive floppy disk storage (1978) and the first commercial spreadsheet program, VisiCalc (1979).

The Altair-8800 computer, built on the Intel 8080 processor, was very popular in the 1970s. Although the Altair's capabilities were quite limited (the RAM was only 4 KB, and there was no keyboard or screen), its appearance was greeted with great enthusiasm. It was launched on the market in 1975, and several thousand kits were sold in the first months.


Representatives of the IV generation of computers: a) Micral; b) Apple II

This computer, developed by MITS, was sold by mail as a kit of parts for self-assembly. The entire assembly kit cost $397, while the Intel processor alone sold for $360.

The spread of PCs by the end of the 1970s led to a slight decline in demand for large computers and minicomputers; in response, IBM set about developing its own personal computer based on the 8088 processor. The software that existed in the early 1980s was focused on word processing and simple spreadsheets, and the very idea that a "microcomputer" could become a familiar and necessary device at work and at home seemed incredible.

On August 12, 1981, IBM introduced the Personal Computer (PC), which, in combination with software from Microsoft, became the standard for the entire PC fleet of the modern world. The price of an IBM PC model with a monochrome display was about $3,000; with a color display, $6,000. The IBM PC configuration: an Intel 8088 processor at 4.77 MHz with 29 thousand transistors, 64 KB of RAM, one 160 KB floppy drive, and an ordinary built-in speaker. At that time, launching and working with applications was a real pain: because there was no hard drive, one had to constantly swap floppy disks; there was no "mouse," no graphical windowed user interface, and no exact correspondence between the image on the screen and the final result (WYSIWYG). Color graphics were extremely primitive, and three-dimensional animation or photo processing was out of the question, but the history of personal computers began with this model.

In 1984, IBM introduced two more new products. First, a model for home users was released, called the PCjr, based on the 8088 processor, which was equipped with perhaps the first wireless keyboard, but this model did not achieve success in the market.

The second new product was the IBM PC AT. Its most important feature was the transition to higher-level microprocessors (the 80286 with the 80287 math coprocessor) while maintaining compatibility with previous models. This computer set the standard for many years to come in a number of respects: it was the first to introduce the 16-bit expansion bus (which remained the standard for many years) and EGA graphics adapters with a resolution of 640x350 and 16 colors.

In 1984, the first Macintosh computers were released, with a graphical interface, a mouse, and many other user interface attributes now essential to modern desktop computers. The new interface did not leave users indifferent, but the revolutionary computer was not compatible with previous programs or hardware components. In the corporations of that time, WordPerfect and Lotus 1-2-3 had already become normal working tools, and users had grown accustomed to the DOS character interface; from their point of view, the Macintosh even looked somehow frivolous.

Fifth generation of computers (from 1985 to the present time)

Distinctive features of the V generation:

  1. New production technologies.
  2. A move away from traditional programming languages such as COBOL and FORTRAN in favor of languages with richer facilities for manipulating symbols and with elements of logic programming (Prolog and Lisp).
  3. Emphasis on new architectures (e.g., dataflow architecture).
  4. New user-friendly input/output methods (e.g., speech and image recognition, speech synthesis, natural language message processing).
  5. Artificial intelligence (that is, automation of problem solving, inference, and knowledge manipulation).

It was at the turn of the 1980s and 1990s that the Windows-Intel alliance was formed. When Intel released the 486 microprocessor in early 1989, computer makers did not wait for IBM or Compaq to lead the way; a race began that dozens of companies joined. But all the new computers were extremely similar to each other: they were united by compatibility with Windows and processors from Intel.

In 1989, the i486 processor was released. It had a built-in math coprocessor, an instruction pipeline, and an on-chip L1 cache.

Directions of computer development

Neurocomputers can be classified as the sixth generation of computers. Although the real use of neural networks began relatively recently, neurocomputing as a scientific direction is now in its seventh decade: the first neurocomputer was built in 1958. Its developer was Frank Rosenblatt, who named his brainchild the Mark I.

The theory of neural networks was first outlined in the work of McCulloch and Pitts in 1943: any arithmetic or logical function can be implemented by a simple neural network. Interest in neurocomputing reignited in the early 1980s, fueled by new work on multilayer perceptrons and parallel computing.
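
The McCulloch-Pitts result is easy to demonstrate with a small sketch (a modern Python illustration; the weights and thresholds here are chosen by hand for the example): a single artificial neuron fires when the weighted sum of its binary inputs reaches a threshold, and with suitable parameters such a neuron computes the logical functions AND and OR.

    # A McCulloch-Pitts neuron: output 1 if the weighted sum of the
    # binary inputs reaches the threshold, otherwise 0.
    def mp_neuron(inputs, weights, threshold):
        total = sum(x * w for x, w in zip(inputs, weights))
        return 1 if total >= threshold else 0

    def logical_and(x1, x2):
        return mp_neuron([x1, x2], weights=[1, 1], threshold=2)

    def logical_or(x1, x2):
        return mp_neuron([x1, x2], weights=[1, 1], threshold=1)

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "AND:", logical_and(a, b), "OR:", logical_or(a, b))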

Neurocomputers are computers consisting of many simple computing elements, called neurons, working in parallel. The neurons form so-called neural networks. The high performance of neurocomputers is achieved precisely through the huge number of neurons. Neurocomputers are built on a biological principle: the human nervous system consists of individual cells, neurons, of which the brain contains about 10¹², even though the response time of a single neuron is about 3 ms. Each neuron performs fairly simple functions, but since each is connected on average to 1-10 thousand other neurons, together they successfully ensure the functioning of the human brain.

Representative of the VI generation of computers - Mark I

In optoelectronic computers, the information carrier is the light flux. Electrical signals are converted into optical signals and back. Optical radiation as an information carrier has a number of potential advantages over electrical signals:

  • Light fluxes, unlike electrical ones, can intersect one another;
  • Light fluxes can be confined in the transverse direction to nanometer dimensions and transmitted through free space;
  • The interaction of light fluxes with nonlinear media is distributed throughout the medium, which gives new degrees of freedom in organizing communication and creating parallel architectures.

Currently, development work is under way to create computers consisting entirely of optical information processing devices. Today this is one of the most interesting directions.

An optical computer would have unprecedented performance and a completely different architecture from an electronic one: in one clock cycle lasting less than 1 nanosecond (corresponding to a clock frequency of more than 1000 MHz), an optical computer could process a data array of about 1 megabyte or more. To date, individual components of optical computers have been created and optimized.
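
A back-of-the-envelope check of this claim (our own arithmetic, for illustration only): one megabyte per cycle at one cycle per nanosecond implies a throughput on the order of 10¹⁵ bytes per second.

    # Rough throughput implied by "about 1 MB per clock cycle of under 1 ns".
    cycle_time = 1e-9        # seconds: one clock cycle of 1 ns
    data_per_cycle = 1e6     # bytes: about 1 MB

    frequency = 1 / cycle_time               # 1e9 Hz = 1000 MHz
    throughput = data_per_cycle * frequency  # bytes per second

    print(frequency / 1e6, "MHz")  # 1000.0 MHz
    print(throughput, "bytes/s")   # 1e+15 bytes per second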

An optical computer the size of a laptop could give the user the ability to store almost all the accumulated information about the world, while the computer itself would be able to solve problems of any complexity.

Biological computers are ordinary PCs in purpose, only based on DNA computing. There are so few truly demonstrative works in this area that it is too early to speak of significant results.

Molecular computers are PCs whose operating principle is based on changes in the properties of molecules during the process of photosynthesis. In this process a molecule takes on different states, so scientists need only assign a logical value, "0" or "1," to each state. Using certain molecules, scientists have determined that their photocycle consists of only two states, which can be "switched" by changing the acid-base balance of the environment; the latter is very easy to do with an electrical signal. Modern technologies already make it possible to create whole chains of molecules organized in this way. Thus, it is quite possible that molecular computers await us "just around the corner."

The history of computer development is not over yet; in addition to improving old ones, completely new technologies are being developed. An example of this is quantum computers - devices that operate on the basis of quantum mechanics. A full-scale quantum computer is a hypothetical device, the possibility of building which is associated with the serious development of quantum theory in the field of many particles and complex experiments; this work lies at the cutting edge of modern physics. Experimental quantum computers already exist; elements of quantum computers can be used to increase the efficiency of calculations on existing instrumentation.

  • 1623 The first "counting machine" was created by Wilhelm Schickard. This rather cumbersome apparatus could perform simple arithmetic operations (addition, subtraction) on 7-digit numbers.
  • 1644 Blaise Pascal's "Calculator" was the first truly popular counting machine that performed arithmetic operations on 5-digit numbers.
  • 1668 Sir Samuel Morland's calculating machine, intended for financial transactions.
  • 1674 Gottfried Wilhelm von Leibniz designed a mechanical counting machine that could perform not only addition and subtraction but also multiplication!

  • 1820 The first mass-produced calculator: the Arithmometer of Charles de Colmar. It lasted on the market (with some improvements) for 90 years!
  • 1834 Charles Babbage's famous Analytical Engine: the first design for a programmable computer, using primitive programs on punched cards.
  • 1871 Babbage created a prototype of one section of the Analytical Engine together with a printing device, a printer.
  • 1886 Dorr Felt created the Comptometer, the first key-based data entry device.
  • 1890 A census was carried out in the United States; for the first time, a "counting machine" created by Herman Hollerith took part in it.
  • 1935 IBM (International Business Machines) began mass production of the IBM 601 calculating machine.
  • 1937 Mathematician Alan Turing created a "mathematical model" of a computer, called the "Turing Machine".
  • 1938 Konrad Zuse, a friend and colleague of the famous Wernher von Braun, created in Berlin one of the first computers, the V1 (later known as the Z1).
  • 1943 Howard Aiken created the ASCC Mark I, a machine considered the grandfather of modern computers. It weighed more than 7 tons and consisted of 750,000 parts. The machine was used for military purposes: calculating artillery tables.
  • 1945 John von Neumann developed a theoretical model of a computer: the world's first description of a computer that used externally loaded programs. In the same year, Mauchly and Eckert created ENIAC, the most ambitious and powerful tube computer of the era. It weighed about 30 tons and contained almost 18 thousand vacuum tubes. Its clock frequency did not exceed 100 kHz (from several hundred to several thousand operations per second, depending on the operation).
  • 1956 The first transistor-based computer was created at the Massachusetts Institute of Technology. In the same year, IBM created the first mass information storage device, a hard drive prototype: the RAMAC 305.
  • 1958-1959 J. Kilby and R. Noyce created a circuit of logic elements on the surface of a silicon crystal, connected by aluminum contacts: the integrated circuit, the first forerunner of the microprocessor.

  • 1960 AT&T developed the first modem.
  • 1963 Douglas Engelbart invented the manipulator known as the "mouse," for which he later received a patent.
  • 1968 Founding of Intel by Robert Noyce and Gordon Moore.
  • 1969 Intel introduces the first 1 KB RAM chip. In the same year, Xerox created laser image-copying technology, which many years later would form the basis of laser printing: the first "copiers" appeared.
  • 1971 Commissioned by the Japanese microcalculator manufacturer Busicom, the Intel development team led by Ted Hoff creates the first 4-bit microprocessor, the Intel 4004. Processor speed: 60 thousand operations per second. In the same year, a team of researchers at IBM's San Jose laboratory created the first 8-inch floppy disk.
  • 1972 The new microprocessor from Intel: the 8-bit Intel 8008. At Xerox, the Dynabook concept is proposed: a microcomputer slightly larger than a notebook.
  • 1973 A prototype of the first personal computer was created at the Xerox research center. The first character to appear on its screen was Cookie Monster, a character from the children's television series Sesame Street. In the same year, Scelbi Computer Consulting Company launched the first ready-made personal computer, equipped with an Intel 8008 processor and 1 KB of RAM. Also that year, IBM introduced the IBM 3340 hard disk, whose two 30 MB data modules earned it the nickname "Winchester" (after the famous 30/30 rifle). And in the same year, Bob Metcalfe invented the computer communication system called Ethernet.
  • 1974 The new processor from Intel: the 8-bit Intel 8080, with a speed of 640 thousand operations per second. An inexpensive Altair computer based on this processor, running the CP/M operating system, would soon appear on the market. In the same year, Intel's main competitor of the 1970s, Zilog, was founded.
  • 1975 IBM releases its first portable computer. The first musical composition reproduced by a computer was the melody of the Beatles song "Fool on the Hill."
  • 1976 Advanced Micro Devices (AMD) obtains the right to copy the instructions and microcode of Intel processors: the beginning of the "processor wars." In the same year, Steve Wozniak and Steve Jobs assemble the first Apple computer in their garage workshop, and on April 1 Apple Computer is born. The Apple I goes on sale with a telling figure on the price tag: $666.66.
  • 1977 Commodore and Apple II computers go on mass sale. The Apple II is equipped with 4 KB of RAM, 16 KB of permanent memory, a keyboard and a display, all for $1,300. The Apple II soon receives a fashionable addition: a floppy disk drive.
  • 1978 Intel introduces a new microprocessor: the 16-bit Intel 8086, operating at 4.77 MHz (330 thousand operations per second). The Hayes company, the future leader in modem production, is founded. Commodore launches the first models of dot-matrix printers on the market.
  • 1979 The appearance of the Intel-8088 processor, as well as the first video games and computer consoles for them. The Japanese company NEC produces the first microprocessor in this country. Hayes releases the first 300 baud modem for the new Apple computer.
  • 1980 The Atari computer becomes the most popular computer of the year. Seagate Technology introduces the first hard drive for personal computers, a drive with 5.25-inch platters.
  • 1981 The Apple III computer appears. Intel introduces the first coprocessor. The company Creative Technology (Singapore) was founded - the creator of the first sound card. The first mass-produced hard drive with a capacity of 5 MB and a cost of $1,700 appears on sale.
  • 1982 The first clones of the IBM PC appear on the market. Intel introduces the 16-bit 80286 processor, with an operating frequency of 6 MHz (1.5 million operations per second). Hercules introduces the first monochrome video card, the Hercules Graphics Adapter (HGA).
  • 1983 Commodore releases the first portable computer with a color display (5 colors). Computer weight 10 kg, price $1600. IBM introduces the IBM PC XT, equipped with a 10 MB hard drive, a 360 KB floppy drive and 128 (later 768) KB of RAM. The price of the computer was $5,000. The millionth Apple II computer was released. The first SIMM memory modules appear. Philips and Sony introduce CD-ROM technology to the world.
  • 1984 Apple releases a 1200 baud modem. Hewlett-Packard releases the first laser printer of the LaserJet series, with a resolution of up to 300 dpi. Philips releases the first CD-ROM drive. IBM introduces the first EGA monitors and video adapters (16 colors, resolution 640x350 pixels), as well as professional 14-inch monitors supporting 256 colors and a resolution of 640x480 pixels.
  • 1985 A new processor from Intel: the 32-bit 80386DX. Operating frequency 16 MHz, speed about 5 million operations per second. The first modem from U.S. Robotics: the Courier 2400 baud.
  • 1986 The first animated video with sound effects is shown on the Amiga computer. The birth of multimedia technology. The birth of the SCSI (Small Computer System Interface) standard.
  • 1987 Intel introduces a new version of the 80386DX processor with an operating frequency of 20 MHz. The Swedish National Institute for Testing and Measurement approves the first standard for acceptable levels of monitor radiation. U.S. Robotics introduces the Courier HST 9600 modem.
  • 1988 Compaq releases the first computer with 640 KB of RAM, the standard memory for all subsequent generations of DOS. Hewlett-Packard releases the first DeskJet inkjet printer. Steve Jobs and the company he founded, NeXT, release the first workstation equipped with a new Motorola processor, a fantastic amount of memory for the time (8 MB), a 17-inch monitor and a 256 MB hard drive. The price of the computer: $6,500.
  • 1989 Creative Labs introduces Sound Blaster 1.0, an 8-bit mono sound card. The birth of the SuperVGA standard (resolution 800x600 pixels with support for 16 thousand colors).
  • 1990 The birth of the World Wide Web. Intel introduces a new processor, the 32-bit 80486SX, with a speed of 27 million operations per second. IBM introduces a new video card standard, XGA, as a replacement for traditional VGA (resolution 1024x768 pixels with support for 65 thousand colors).
  • 1991 Apple introduces the first monochrome handheld scanner. AMD presents improved "clones" of Intel processors - 386DX with a clock frequency of 40 MHz and 486SX with a frequency of 20 MHz. The first stereo music card is the 8-bit Sound Blaster Pro.
  • 1992 NEC releases the first double-speed (2x) CD-ROM drive.
  • 1993 Intel introduces a new bus and slot standard for expansion cards: PCI. The first processor of Intel's new generation appears: the 32-bit Pentium, with an operating frequency starting at 60 MHz and a speed of 100 million operations per second and up. Microsoft and Intel, together with the largest PC manufacturers, develop Plug&Play technology, allowing the computer to automatically recognize and configure new devices.
  • 1994 Iomega introduces ZIP and JAZ disks and drives, an alternative to the existing 1.44 MB floppy disks. US Robotics releases the first 28,800 baud modem.

  • 1995 The standard for new laser disc media - DVD - has been announced. AMD releases the latest 486 generation processor, the AMD 486DX-120. Intel introduces the Pentium Pro processor, designed for powerful workstations. 3dfx produces the Voodoo chipset, which formed the basis of the first 3D graphics accelerators for home PCs. The first “virtual reality” glasses and helmets for home PCs.
  • 1996 The birth of the USB bus. Intel releases the Pentium MMX processor with support for new multimedia instructions. Beginning of mass production of liquid crystal monitors for home PCs.
  • 1997 The emergence of Pentium II processors and alternative AMD K6 processors. The first DVD drives. Release of the first PCI sound cards. New AGP graphics port.
  • 1998 Apple releases the new iMac, not only powerful but also stunning in design. Release of Celeron processors with reduced L2 cache. The "3D revolution": a dozen new models of 3D accelerators integrated into conventional video cards appear on the market. During the year, production of video cards without 3D accelerators is discontinued.
  • 1999 Release of new Pentium III processors.
  • 2000-2003 Fierce competition between Intel and AMD leads to the creation of processors with a staggering clock speed of 3200 MHz, driving in turn the growth of RAM capacity, hard drive volumes, video card performance, and so on.

Most people seem to consider the terms "computer" and "computer hardware" synonymous and associate them with physical hardware: the microprocessor, display, disks, printers and other devices that attract attention when a person sees a computer. While these devices are important, they are only the tip of the iceberg. At the initial stage of using a modern computer, we are dealing not with the computer itself but with a set of rules, called programming languages, in which the actions the computer must perform are specified.

The importance of a programming language is emphasized by the fact that the computer itself can be considered a hardware interpreter of a specific language, called machine language. Machine languages are designed for the efficient operation of the machine, and using them directly presents certain difficulties for humans. Most users are spared these inconveniences by one or more languages designed to improve human-machine communication. The flexibility of a computer shows in the fact that it can execute translator programs (generally called compilers or interpreters) to convert programs from user-oriented languages into machine-language programs. (In turn, programs, games, and system shells are themselves built on fairly simple translators, which as they run use their commands to access the computer's internals, translating those commands into machine language, and all of this happens in real time.)
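
As a toy illustration of this idea (everything here, including the instruction names and the "machine" itself, is invented for the example), a tiny translator in Python can compile a "high-level" arithmetic expression into instructions for an imaginary stack machine, which a second function then executes, just as a real compiler translates a user-oriented language into machine language:

    # Translator: compiles expressions like "2 + 3 * 4" (evaluated
    # strictly left to right) into instructions for a toy stack machine.
    def translate(expr):
        tokens = expr.split()
        program = [("PUSH", int(tokens[0]))]
        for op, value in zip(tokens[1::2], tokens[2::2]):
            program.append(("PUSH", int(value)))
            program.append(("ADD",) if op == "+" else ("MUL",))
        return program

    # The "hardware": executes the machine-level instructions on a stack.
    def run(program):
        stack = []
        for instr in program:
            if instr[0] == "PUSH":
                stack.append(instr[1])
            elif instr[0] == "ADD":
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif instr[0] == "MUL":
                b, a = stack.pop(), stack.pop()
                stack.append(a * b)
        return stack.pop()

    code = translate("2 + 3 * 4")  # [("PUSH", 2), ("PUSH", 3), ("ADD",), ...]
    print(run(code))               # 20: left to right, (2 + 3) * 4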

Human life in the twenty-first century is directly related to artificial intelligence. Knowledge of the main milestones in the creation of computers is an indicator of an educated person. The development of computers is usually divided into 5 stages - it is customary to talk about five generations.

1946-1954 - first generation computers

It is worth saying that the first generation of computers was tube-based. Scientists at the University of Pennsylvania (USA) developed ENIAC, the world's first computer. The day it was officially put into operation was February 15, 1946. Assembling the device required 18 thousand vacuum tubes. By today's standards the computer was colossal, with an area of 135 square meters and a weight of 30 tons. Its electricity needs were also high: 150 kW.

It is a well-known fact that this electronic machine was used directly to help solve the most difficult problems of creating the atomic bomb. The USSR was rapidly catching up: in December 1951, under the leadership and with the direct participation of Academician S.A. Lebedev, the fastest computer in Europe was presented to the world. It bore the abbreviation MESM (Small Electronic Calculating Machine). This device could perform about 50 operations per second.

1954 - 1964 - second generation computers

The next step in development was the creation of computers running on transistors. A transistor is a device made of semiconductor material that allows one to control the current flowing in a circuit. The first stable working transistor was created in America in 1947 by a team of physicists including William Shockley, John Bardeen and Walter Brattain.

In terms of speed, these computers differed significantly from their predecessors: speeds reached hundreds of thousands of operations per second. Dimensions decreased and electricity consumption fell, while the scope of use grew significantly, owing to the rapid development of software. The best Soviet computer, the BESM-6, had a record speed of 1,000,000 operations per second; it was developed in 1965 under the leadership of chief designer S.A. Lebedev.

1964 - 1971 - third generation computers

The main difference of this period is the beginning of the use of microcircuits with a low degree of integration. Using sophisticated technologies, scientists were able to place complex electronic circuits on a small semiconductor wafer with an area of less than 1 square centimeter. The integrated circuit was invented in 1958; the inventor was Jack Kilby. The use of this revolutionary invention made it possible to improve all parameters: dimensions shrank to approximately the size of a refrigerator, while performance and reliability increased.

This stage in the development of computers is characterized by the use of a new storage device - a magnetic disk. The PDP-8 minicomputer was first introduced in 1965.

In the USSR, similar machines appeared much later, in 1972, and were analogues of models presented on the American market.

1971 - modern times - fourth generation computers

The innovation of fourth-generation computers is the use of microprocessors. A microprocessor is an arithmetic logic unit (ALU) together with control circuitry placed on a single chip with a high degree of integration, which means the circuits take up even less space. In other words, a microprocessor is a small brain performing millions of operations per second according to the program embedded in it. Size, weight and power consumption were reduced dramatically, and performance reached record highs. And that is when Intel entered the game.

The first microprocessor was the Intel 4004, assembled in 1971. It had a 4-bit capacity, but at the time it was a gigantic technological breakthrough. Two years later, Intel presented the eight-bit Intel 8008 to the world, and in 1975 the Altair-8800, the first personal computer, based on the Intel 8080, was born.

This was the beginning of an entire era of personal computers. The machine began to be used everywhere, for completely different purposes. A year later, Apple entered the game. The project had great success, and Steve Jobs became one of the most famous and richest people on Earth.

The IBM PC became the undisputed standard among computers. It was released in 1981 with 64 KB of RAM.

It is noteworthy that today IBM-compatible computers account for approximately ninety percent of all computers produced. We also cannot fail to mention the Pentium: the development of Intel's first processor with an integrated coprocessor succeeded in 1989, and today the brand holds undisputed authority in the development and use of microprocessors on the computer market.

If we talk about prospects, they are, of course, the development and implementation of the latest technologies: very-large-scale integrated circuits, magneto-optical elements, and even elements of artificial intelligence.

Self-learning electronic systems, referred to as the fifth generation in the development of computers, are the foreseeable future.

A person strives to erase the barrier in communicating with a computer. Japan worked on this for a very long time and, unfortunately, without success, but that is the topic of a completely different article. At the moment, all such projects are only in development; at the current pace, however, this is the near future. The present is the time when history is made!


Fifth generation computers. The development of the next generations of computers is based on very-large-scale integrated circuits of increased integration density and on the use of optoelectronic principles (lasers, holography). Development is also moving along the path of "intellectualization" of computers, removing the barrier between human and machine: they will be able to take in information from handwritten or printed text, from forms, and from the human voice, to recognize the user by voice, and to translate from one language to another. In fifth-generation computers there will be a qualitative transition from processing data to processing knowledge.

The architecture of future generations of computers will contain two main blocks. One of them is the traditional computer, now relieved of direct communication with the user. That communication is carried out by a block called the intelligent interface, whose task is to understand text written in natural language containing the statement of a problem and to translate it into a working program for the computer. The problem of decentralizing computation will be solved by means of computer networks: both large networks of machines located at considerable distances from each other and miniature computers placed on a single semiconductor chip.
