A computer (English computer, IPA: [kəmˈpjuː.tə(ɹ)], "calculator"; from the Latin computare, "to count, to calculate") is a term that came into Russian from foreign (mainly English) sources and is one of the names for an electronic computing machine. It is used in this sense in the Russian literary language and in scientific and popular-science literature.

The use of computers for various purposes is described by the terms automated (for example, automated control), machine (for example, computer graphics), and computing (for example, computer technology).

A computer system is any device or group of interconnected or related devices, one or more of which, acting in accordance with a program, carries out automated data processing.

The word computer is derived from the English words to compute and computer, translated as "to calculate" and "calculator" (the English word, in turn, comes from the Latin computāre, "to calculate"). Originally, in English, the word referred to a person who performed arithmetic calculations, with or without the aid of mechanical devices. Later the meaning was transferred to the machines themselves, although modern computers perform many tasks not directly related to mathematics.

A definition of the word computer first appeared in the Oxford English Dictionary in 1897; its compilers then understood a computer to be a mechanical calculating device. In 1946 the dictionary was supplemented with additions distinguishing the concepts of a digital, an analog, and an electronic computer.

The concept of a computer should be distinguished from that of an electronic computing machine (EVM); the latter is one way of implementing a computer. An electronic computing machine implies the use of electronic components as its functional units, whereas a computer may be built on other principles - it can be mechanical, biological, optical, quantum, and so on - operating through the movement of mechanical parts, the movement of electrons or photons, or the effects of other physical phenomena. In addition, by type of operation a computing machine may be digital (a digital computing machine) or analog (an analog computing machine, AVM). On the other hand, the term "computer" implies the possibility of changing the program being executed (reprogramming), which is not possible for all types of computing machines.

Currently the term electronic computing machine, as referring more to questions of the concrete physical implementation of a computer, has been almost entirely supplanted from everyday usage. It is employed mainly by digital electronics engineers, as a legal term in legal documents, and in a historical sense - to refer to the computing equipment of the 1940s-1980s and to large computing devices, as opposed to personal ones.

Exponential development of computer technology

After the invention of the integrated circuit, the development of computer technology accelerated dramatically. This empirical trend, noticed in 1965 by Intel co-founder Gordon E. Moore, came to be called Moore's Law. The miniaturization of computers has proceeded just as rapidly. The first electronic computers (such as ENIAC, built in 1946) were huge devices that weighed tons, occupied entire rooms, and required large maintenance staffs to function. They were so expensive that only governments and large research organizations could afford them, and they seemed so exotic that a small handful of such systems appeared able to meet any future need. In contrast, modern computers - far more powerful and compact, and far less expensive - have become truly ubiquitous.
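Moore's observation is often summarized as a doubling of transistor counts roughly every two years. A minimal arithmetic sketch of that rule of thumb, assuming an illustrative starting point (the figures below are common approximations, not data from this article):

```python
# A minimal sketch of Moore's law as arithmetic, assuming the commonly cited
# doubling period of about two years; the starting figure (roughly the Intel
# 4004 of 1971) is illustrative.
transistors, year = 2_300, 1971
while year < 2001:
    transistors *= 2    # one doubling ...
    year += 2           # ... about every two years
print(f"{year}: ~{transistors:,} transistors")  # ~75 million after 30 years
```

Fifteen doublings in thirty years multiply the count by 2^15 = 32768, which is what makes the growth exponential rather than merely fast.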

Architecture and structure
The architecture of a computer can vary depending on the type of problems being solved. The architecture is optimized so as to model the physical (or other) phenomena under study as realistically as possible. Thus, flows of electrons can be used as models of water flows in computer simulations of dams and dikes, or of blood flow in the human brain. Analog computers designed along these lines were common in the 1960s but have since become quite rare.

Quantum computers

A quantum computer is a computing device that uses the phenomena of quantum superposition and quantum entanglement to transmit and process data. A quantum computer operates not with bits but with qubits. As a result, it can in effect process many possible states simultaneously, achieving an enormous advantage over conventional computers on a number of algorithms.
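A minimal sketch of what a qubit's superposition looks like numerically, assuming only NumPy (the gate and state names follow standard quantum-computing notation, not anything specific to this article):

```python
# A minimal sketch of qubit superposition, assuming NumPy. A qubit's state is
# a unit vector in C^2; gates are unitary matrices acting on that vector.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)        # basis state |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

state = H @ ket0                  # equal superposition of |0> and |1>
probs = np.abs(state) ** 2        # Born rule: probabilities of measuring 0 or 1
print(probs)                      # [0.5 0.5]
```

With n qubits the state vector has 2^n amplitudes, which is the source of the potential advantage mentioned above - and also of the difficulty of simulating quantum computers classically.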

A full-fledged quantum computer is still a hypothetical device, the very possibility of building which depends on serious further development of quantum theory. Work in this area draws on the latest discoveries and achievements of modern physics. So far, only isolated experimental systems have been implemented, each executing a fixed algorithm of low complexity.

Design features

Modern computers use the full range of design solutions worked out over the entire history of computing technology. These solutions, as a rule, do not depend on the physical implementation of the computer, but are themselves the foundation on which developers build. Below are the most important questions facing the creators of computers:

Digital or analog
The fundamental decision in designing a computer is the choice between a digital and an analog system. Whereas digital computers work with discrete numerical or symbolic variables, analog computers are designed to process continuous streams of incoming data. Today digital computers have a far wider range of application, although their analog counterparts are still used for certain special purposes. Other approaches are possible as well - for example, pulse and quantum computing - but so far they remain either highly specialized or experimental solutions.

Among the simplest discrete calculators is the abacus, or ordinary counting frame; the most complex system of this kind is the supercomputer.

Number system
An example of a computer based on the decimal numeral system is the first American computer, the Mark I.

The most important step in the development of computing technology was the transition to the internal representation of numbers in binary form. This greatly simplified the design of computing devices and peripheral equipment. Taking the binary number system as the basis made it possible to implement arithmetic functions and logical operations more simply.
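The simplification is easy to see in miniature: once numbers are binary, adding two bits reduces to the logical operations XOR and AND. A minimal sketch, assuming only standard Python (this is the textbook half adder, not anything specific to this article):

```python
# A minimal sketch of a half adder: with binary digits, the sum bit is the
# XOR of the inputs and the carry bit is their AND.
def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two bits; return (sum_bit, carry_bit)."""
    return a ^ b, a & b

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum {s}, carry {c}")
```

Chaining such adders bit by bit yields arithmetic on numbers of any width, which is exactly the simplification that the binary representation buys.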

However, the transition to binary logic was neither instantaneous nor unconditional. Many designers tried to build computers around the decimal numeral system, which is more familiar to humans. Other design solutions were employed as well.

Under the guidance of Academician Y. A. Khetagurov, a "highly reliable and secure microprocessor with a non-binary coding system for real-time devices" was developed, using a 1-of-4 encoding with an active zero.
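The details of that scheme are beyond this article, but the general 1-of-N (one-hot) idea can be sketched briefly: each value activates exactly one of N lines, so many single-line faults are immediately detectable. A minimal illustration, assuming standard Python only (this is the generic one-hot form, not Khetagurov's actual system):

```python
# A minimal sketch of generic 1-of-N (one-hot) coding; it does not reproduce
# Khetagurov's 1-of-4 "active zero" scheme.
def encode_1_of_4(value: int) -> list[int]:
    """Encode a two-bit value as four lines with exactly one line active."""
    assert 0 <= value < 4
    return [1 if i == value else 0 for i in range(4)]

def is_valid(lines: list[int]) -> bool:
    """A legal code word has exactly one active line; anything else is a fault."""
    return sum(lines) == 1

word = encode_1_of_4(2)
print(word, is_valid(word))   # [0, 0, 1, 0] True; a flipped line fails the check
```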

In general, however, the choice of the internal system for representing data does not change the basic principles of a computer's operation: any computer can emulate any other.

Storage of programs and data
In the course of computation it is often necessary to save intermediate data for later use. The performance of many computers is largely determined by the speed with which they can read and write values in memory and by the memory's total capacity. Initially, computer memory was used only to store intermediate values, but it was soon proposed to store program code in the same memory as the data (the von Neumann, or "Princeton," architecture). This solution is used in most computer systems today. However, for controllers (microcomputers) and signal processors, a scheme in which data and programs are stored in separate sections of memory (the Harvard architecture) has proved more convenient.
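The shared-memory idea can be shown in miniature. The following sketch, assuming only standard Python and an instruction set invented purely for illustration, keeps a tiny program and its data in one memory array and runs a fetch-execute loop over it:

```python
# A minimal sketch of the von Neumann idea: instructions and data live in one
# shared memory, and a fetch-execute loop walks through it. The opcodes here
# are invented for illustration.
memory = [
    ("LOAD", 6),    # 0: acc = memory[6]
    ("ADD", 7),     # 1: acc += memory[7]
    ("STORE", 8),   # 2: memory[8] = acc
    ("JNZ", 5),     # 3: jump to 5 if acc != 0 - control flow depends on data
    ("HALT", 0),    # 4: reached only if the sum was zero
    ("HALT", 0),    # 5: normal stop
    2, 3, 0,        # 6-8: data cells, stored alongside the code
]

acc, pc = 0, 0                      # accumulator and program counter
while True:
    op, arg = memory[pc]            # fetch the next instruction from memory
    pc += 1
    if op == "LOAD":
        acc = memory[arg]
    elif op == "ADD":
        acc += memory[arg]
    elif op == "STORE":
        memory[arg] = acc           # data writes go into the same memory
    elif op == "JNZ":
        if acc != 0:
            pc = arg
    elif op == "HALT":
        break

print(memory[8])                    # 5
```

Because instructions and data share one address space, a program could even modify its own code by writing into the cells it executes from - precisely the flexibility (and the hazard) of the von Neumann design, which the Harvard architecture avoids by separating the two memories.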

Programming

John von Neumann, one of the founders of the architecture of modern computers.

The ability of a machine to execute a given, changeable set of instructions (a program) without the need for physical reconfiguration is a fundamental feature of computers. This feature was developed further when machines gained the ability to control the course of program execution dynamically, so that a computer can change the order in which program instructions are executed depending on the state of the data. The first actually working programmable computer was built by the German engineer Konrad Zuse in 1941.

By means of computation, a computer is able to process information according to some algorithm. For a computer, the solution of any problem is a sequence of computations.

In most modern computers the problem is first described in a form the machine can understand - as a rule, all information is represented in binary form, as ones and zeros, although a computer can also be built on other bases, both integer (for example, a ternary computer) and non-integer - after which the operations for processing it are reduced to the simple algebra of logic. A sufficiently fast electronic computer can be used to solve most mathematical problems, as well as most information-processing problems that can be reduced to mathematical ones.
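As a brief illustration of a non-binary base, the sketch below converts an integer to balanced ternary (digits -1, 0, +1), the representation used by ternary machines such as the Soviet Setun; it assumes only standard Python:

```python
# A minimal sketch of balanced ternary (digits -1, 0, +1) - an example of the
# non-binary integer bases mentioned above.
def to_balanced_ternary(n: int) -> list[int]:
    """Return the digits of n in balanced ternary, least significant first."""
    digits = []
    while n != 0:
        r = n % 3
        if r == 2:           # a digit of 2 becomes -1 plus a carry
            r = -1
        digits.append(r)
        n = (n - r) // 3
    return digits or [0]

print(to_balanced_ternary(10))  # [1, 0, 1]: 1*1 + 0*3 + 1*9 = 10
```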

It has been shown that computers cannot solve every mathematical problem. Problems that cannot be solved with the help of computers were first described by the English mathematician Alan Turing.
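The classic example is Turing's halting problem. The sketch below outlines the diagonal argument in Python: the halts function is hypothetical (no correct implementation can exist), and the program built on top of it shows why:

```python
# A minimal sketch of Turing's diagonal argument. halts() is hypothetical:
# suppose it could always decide whether f(x) eventually stops.
def halts(f, x) -> bool:
    """Hypothetical universal halting decider; cannot actually be written."""
    raise NotImplementedError

def paradox(f):
    if halts(f, f):       # if f(f) would halt ...
        while True:       # ... then loop forever;
            pass
    return "halted"       # otherwise halt immediately

# paradox(paradox) halts if and only if it does not halt - a contradiction,
# so no program halts() can decide halting for all programs and inputs.
```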

Application

The first computers were created exclusively for calculation (which is reflected in the names "computer" and "computing machine"). Even the most primitive computers are many times superior to humans in this field (leaving aside a few unique human calculators). It is no coincidence that the first high-level programming language, Fortran, was designed exclusively for performing mathematical calculations.

The second major application was databases, needed first of all by governments and banks. Databases require more sophisticated computers with developed input-output and storage systems. The Cobol language was developed for this purpose. Later, database management systems with their own programming languages appeared.

The third application was the control of all kinds of devices. Here development moved from highly specialized devices (often analog) toward the gradual introduction of standard computer systems running control programs. In addition, more and more equipment now includes a control computer.

Fourth, computers have developed to the point where they have become the main information tool both in the office and at home. Almost any work with information is now carried out through a computer, whether typing text or watching films. This applies both to the storage of information and to its transmission over communication channels. The main applications of modern home computers are Internet navigation and games.

Fifth, modern supercomputers are used for computer modeling of complex physical, biological, meteorological, and other processes, and for solving applied problems - for example, modeling nuclear reactions or climate change. Some projects are carried out by means of distributed computing, in which a large number of relatively weak computers simultaneously work on small parts of a common task, together forming one very powerful computer.
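The splitting idea can be sketched on a single machine, with the standard library's process pool standing in for the many weak computers (a minimal sketch, not a real distributed system):

```python
# A minimal sketch of the distributed-computing idea: split one big job into
# chunks, let independent workers process them, then combine the results.
from multiprocessing import Pool

def partial_sum(bounds: tuple[int, int]) -> int:
    lo, hi = bounds
    return sum(range(lo, hi))

if __name__ == "__main__":
    n, workers = 10_000_000, 4
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(workers) as pool:
        total = sum(pool.map(partial_sum, chunks))  # combine partial results
    print(total == sum(range(n)))                   # True
```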

The most complex and still underdeveloped application of computers is artificial intelligence - the use of computers to solve problems for which no well-defined, more or less simple algorithm exists. Examples of such tasks are games, machine translation of texts, and expert systems.