The history of information technology dates back to long before the emergence of the modern discipline of computer science in the 20th century. Information technology (IT) is concerned with the study of methods and means of collecting, processing, and transmitting data in order to obtain qualitatively new information about the state of an object, process, or phenomenon.

As humanity's need to process ever greater volumes of data grew, the means of computation improved, from the earliest mechanical inventions to modern computers. Alongside this, related mathematical theories were developed within information technology, and they now form the modern concepts of the discipline.

Early history

The earliest known use of a computing device dates back to 2700-2300 B.C., when the abacus was in common use in ancient Sumer. It consisted of a board with lines drawn to separate the successive orders of magnitude of the number system. The original form of the Sumerian abacus was a set of lines drawn in sand on which pebbles were placed. Modified forms of the abacus remained in use as calculating aids into the modern era.

The Antikythera mechanism is considered the earliest known mechanical analog computer. It was designed to calculate astronomical positions. The mechanism was discovered in 1901 in a shipwreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to about 100 B.C. Technological artifacts of this complexity did not appear again until the 14th century, when mechanical astronomical clocks were invented in Europe.

Mechanical analog computing devices appeared again hundreds of years later in the medieval Islamic world. Examples of devices from this period are the equatorium of the inventor al-Zarqali, the mechanical geared astrolabe of Abu Rayhan al-Biruni, and the torquetum of Jabir ibn Aflah. Muslim engineers built a number of automatons, including music machines, that could be "programmed" to play a variety of musical compositions; such devices were developed by the brothers Banu Musa and by Al-Jazari. Muslim mathematicians also made important advances in cryptography and cryptanalysis, including Al-Kindi's development of frequency analysis.

After John Napier discovered logarithms for computational purposes in the early 17th century, there followed a period of significant progress among inventors and scientists in creating calculating tools. In 1623, Wilhelm Schickard designed a calculating machine, but abandoned the project when the prototype he had begun building was destroyed by fire in 1624. Around 1640, Blaise Pascal, a leading French mathematician, built the first mechanical adding device, whose design drew on a description by the Greek mathematician Hero of Alexandria. Then, in 1672, Gottfried Wilhelm Leibniz invented the stepped reckoner, which he completed in 1694.

In 1837, Charles Babbage first described his Analytical Engine, which is considered the earliest design for a modern computer. The Analytical Engine had expandable memory, an arithmetic unit, and logic circuitry capable of interpreting a programming language with loops and conditional branching. Although it was never built, the design was studied extensively and is recognized as embodying the idea of Turing completeness. The Analytical Engine would have had less than 1 kilobyte of memory and a clock speed of under 10 Hz.

Before the first modern computers could be created, significant further advances in mathematical theory and electronics were still required.

Binary logic
In 1703, Gottfried Wilhelm Leibniz developed formal logic, whose mathematical meaning, described in his writings, consists in reducing logic to a binary number system. In it, ones and zeros formally represent the values true and false, or the on and off states of an element that can be in one of two states. These works long preceded those of George Boole, who published his results in 1854. A new impetus to the development of Boolean algebra was given by Claude Shannon, who in 1937 showed that the states of relay switching circuits, and the transitions between them, could be formally described in terms of Boolean algebra, and that the mathematical apparatus of Boolean algebra, already well developed by that time, was suitable for their analysis and synthesis. Boolean algebra is now the basis for the logical design of processors, graphics cards, and many other binary logic systems and devices.
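
As a small illustration of Shannon's observation, a relay circuit can be written down as a Boolean function and analyzed by enumerating its truth table. The sketch below uses a made-up two-switch circuit with a bypass contact; it is an illustrative example, not one of Shannon's own circuits.

    # A relay circuit as a Boolean function (illustrative example):
    # two switches in series (AND) with a bypass contact in parallel (OR).
    from itertools import product

    def lamp(a: bool, b: bool, bypass: bool) -> bool:
        """The lamp lights if switches a AND b are closed, OR the bypass is closed."""
        return (a and b) or bypass

    # Enumerating the truth table is exactly the Boolean-algebra style of
    # analysis Shannon proposed for relay switching circuits.
    for a, b, bypass in product([False, True], repeat=3):
        print(f"a={a!s:5} b={b!s:5} bypass={bypass!s:5} -> lamp={lamp(a, b, bypass)}")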

By this time, the first mechanical devices controlled by a binary scheme had been invented. The Industrial Revolution gave impetus to the mechanization of many tasks, including weaving. Punched cards controlled the operation of Joseph Marie Jacquard's looms, where a perforated hole on the card meant a binary one and an unperforated spot meant a binary zero. Thanks to punched cards, the looms were able to reproduce the most complex patterns. Jacquard's loom was far from being a computer, but it showed that a binary system could be used to control machinery.

Establishment of the discipline

Pioneers of computing

Charles Babbage is considered a pioneer of computing. Babbage had a clear understanding of the mechanical calculation of numbers and tables. Starting in the 1810s, he began to turn his ideas into reality by developing a calculating machine able to compute numbers to 8 decimal places. Building on this idea, Babbage worked on a machine that could compute numbers to 20 decimal places. By 1830, Babbage had devised a plan for a machine that could use punched cards to perform arithmetic operations. The machine was to store numbers in blocks of memory and include a form of sequential control, meaning that operations would be carried out in sequence, with the machine returning a result in the form of success or failure. This machine became known as the Analytical Engine, the first true prototype of the modern computer. Much later, on January 21, 1888, the Analytical Engine, partially built by his son, underwent a partial test, in which pi was successfully calculated to 29 digits.

The pioneer of computer programming is Ada Lovelace. Lovelace began working with Charles Babbage as an assistant while he was developing the Analytical Engine. During this time she developed the first computer algorithm, one that could compute Bernoulli numbers. In addition, her work with Babbage led her to predict that computers would not only perform mathematical calculations but would also manipulate symbols of other kinds. She never saw the results of her work, as the Analytical Engine was not built during her lifetime, but from the 1940s onward her contributions received due recognition.
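
As a rough modern illustration only (not Lovelace's original program, which was expressed in the Analytical Engine's own operations), the same Bernoulli numbers can be obtained from the standard recurrence B_m = -1/(m+1) * sum_{j=0}^{m-1} C(m+1, j) * B_j:

    # Bernoulli numbers via the standard recurrence (illustrative sketch,
    # not Lovelace's program for the Analytical Engine).
    from fractions import Fraction
    from math import comb

    def bernoulli(n):
        """Return B_0 .. B_n exactly, using the B_1 = -1/2 convention."""
        B = [Fraction(1)]
        for m in range(1, n + 1):
            s = sum(comb(m + 1, j) * B[j] for j in range(m))
            B.append(-s / (m + 1))
        return B

    print([str(b) for b in bernoulli(8)])
    # ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']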

Until the 1920s, "computers" were people: clerks who performed calculations. Many thousands of these human computers were employed in commerce, government, and research institutions. Most were women with specialized training; some performed astronomical calculations for calendars.

The mathematical foundations of modern computer science were laid by Kurt Gödel in his incompleteness theorem (1931). In this theorem, he showed that there are limits to what can be proved and disproved by a formal system. This led to the definition and description of formal systems by Gödel and others, including the definition of concepts such as the μ-recursive function and λ-definable functions.

1936 was a pivotal year for computer science. Working in parallel, Alan Turing and Alonzo Church introduced a formalization of the notion of algorithm, defined the limits of what can be computed, and described a "purely mechanical" model of computation.

Alan Turing and the Turing machine

After the 1920s, the term "computing machine" referred to any machine that performed the work of a human computer, especially one designed in accordance with the effective methods of the Church-Turing thesis. The thesis is formulated as follows: "Any algorithm can be given in the form of a corresponding Turing machine or a partially recursive definition, and the class of computable functions coincides with the class of partially recursive functions and with the class of functions computable on Turing machines." In other words, the Church-Turing thesis is a hypothesis about the nature of mechanical computing devices, such as electronic computers: any possible computation can be carried out on a computer, provided it has enough time and storage space.

Mechanisms that computed with continuous quantities became known as the analog type. Values in such mechanisms were represented by continuous physical quantities, such as the angle of rotation of a shaft or an electric potential difference.

Unlike analog machines, digital machines represented numerical values digit by digit, storing each digit separately. Before the invention of random-access memory (RAM), digital machines used various counting mechanisms or relays for this purpose.

From the 1940s onward, the word "computer" gradually came to mean the machine rather than the human clerk. Those machines were able to perform the calculations that the clerks had previously done. Once values no longer depended on physical characteristics (as in analog machines), a logical computer based on digital hardware was able to do anything that could be described as a purely mechanical system.

In 1937, Alan Turing presented his idea of what is now called the Turing machine. The Turing machine was a hypothetical device, theorized in order to study the properties of computation itself. Anticipating modern stored-program computers, he described what became known as the universal Turing machine.

Turing machines were designed to determine, formally and mathematically, what can be computed given limits on computational ability. If a Turing machine can carry out a task, then the problem is considered Turing-computable. Turing focused mainly on designing a machine that could determine what can be calculated. He concluded that as long as there is a Turing machine that can compute an approximation of a number, that value is computable. In addition, a Turing machine can interpret logical operators such as AND, OR, XOR, NOT, and If-Then-Else to determine whether a function is computable.
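
To make this concrete, a Turing machine can be simulated in a few lines. The sketch below is a minimal illustration; the tape alphabet and the transition table (a machine that inverts a binary string) are invented for the example and are not Turing's own constructions.

    # Minimal single-tape Turing machine simulator (illustrative sketch).
    # transitions maps (state, symbol) -> (new_state, written_symbol, move),
    # where move is -1 (left), +1 (right) or 0 (stay).
    def run_turing_machine(transitions, tape, start="q0", halt="halt", blank="_"):
        cells = dict(enumerate(tape))   # sparse tape: position -> symbol
        head, state = 0, start
        while state != halt:
            symbol = cells.get(head, blank)
            state, written, move = transitions[(state, symbol)]
            cells[head] = written
            head += move
        return "".join(cells[i] for i in sorted(cells))

    # Hypothetical example machine: invert every bit of a binary input, then halt.
    flip_bits = {
        ("q0", "0"): ("q0", "1", +1),
        ("q0", "1"): ("q0", "0", +1),
        ("q0", "_"): ("halt", "_", 0),
    }
    print(run_turing_machine(flip_bits, "1011"))   # -> 0100_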

At the Symposium on Large-Scale Digital Calculating Machinery in Cambridge, Turing said: "We're trying to build a machine to do things just by programming, not by adding more equipment."

Shannon and information theory

Before and during the 1930s, electrical engineers were able to construct electronic circuits to solve mathematical and logical problems, but most did so in an ad hoc way, without any theoretical rigor. That changed with Claude Elwood Shannon's 1937 master's thesis, A Symbolic Analysis of Relay and Switching Circuits. Shannon, influenced by Boole's work, recognized that Boolean algebra could be used to arrange electromechanical relays (then used in telephone switchboards) to solve logic problems. This concept, using the properties of electrical switches to implement logic, underlies all electronic digital computers.

Shannon went on to found a new branch of computer science, information theory. In 1948, he published the paper A Mathematical Theory of Communication. It applies probability theory to the problem of how best to encode the information that a sender wants to transmit. This work is one of the theoretical foundations of many fields of research, including data compression and cryptography.
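
The central quantity of that paper is the entropy of a source, H = -sum_i p_i log2 p_i, the average number of bits per symbol that an optimal code needs. The short sketch below (modern notation, not Shannon's own) computes it from symbol frequencies.

    # Shannon entropy of the symbol frequencies in a message (illustrative sketch).
    import math
    from collections import Counter

    def shannon_entropy(message: str) -> float:
        """Average number of bits per symbol needed by an optimal encoding."""
        counts = Counter(message)
        total = len(message)
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    print(shannon_entropy("aaaaaaaa"))  # 0.0 -- a perfectly predictable source
    print(shannon_entropy("abababab"))  # 1.0 -- one bit per symbol
    print(shannon_entropy("abcdabcd"))  # 2.0 -- two bits per symbol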

Wiener and cybernetics

From experiments with anti-aircraft systems that interpreted radar images to detect enemy aircraft, Norbert Wiener coined the term cybernetics, from the Ancient Greek κυβερνητική, "the art of governing". He published the book Cybernetics in 1948, which influenced the emergence of artificial intelligence. Wiener also compared computation, computing machines, memory devices, and other cognitively similar concepts with the analysis of brain waves.

John von Neumann and the von Neumann architecture

In 1946, a model of computer architecture was created that became known as the von Neumann architecture. From 1950 onward, the von Neumann model ensured uniformity in the designs of subsequent computers. The von Neumann architecture was considered innovative because it introduced the idea of storing machine instructions in memory and allocating memory areas for them. The von Neumann model consists of three main parts: the arithmetic logic unit (ALU), memory, and the control unit.

The design of the von Neumann machine uses a reduced instruction set (RISC), meaning that a set of 21 instructions sufficed to perform all tasks. Unlike RISC, a complex instruction set (CISC) offers a larger number of instructions to choose from. An instruction set consisted of addresses, operations, and data types. In the von Neumann architecture, the RAM together with the accumulator (the register that holds the result of arithmetic and logical operations) constitute the two addressable blocks of memory.

Operations can take the form of simple arithmetic expressions (performed by the ALU: addition, subtraction, multiplication, and division), conditional jumps, and transfers of control between different parts of the machine (now more commonly expressed as "if" statements, "while" loops, or "goto" jumps). The von Neumann architecture accepts fractions and instructions as data types. Finally, just as the von Neumann architecture is simple, its registers are equally simple to manage. The architecture uses a set of seven registers to manipulate and interpret the data and instructions received. These registers include the IR (instruction register), IBR (instruction buffer register), MQ (multiplier/quotient register), MAR (memory address register), and MDR (memory data register). The architecture also uses a program counter (PC) to keep track of where the program is in its execution.
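
To illustrate how such a stored-program machine works, the sketch below runs a toy fetch-decode-execute loop: a single memory holds both instructions and data, a program counter steps through it, and an accumulator holds intermediate results. The instruction set is invented for this example and is not the IAS machine's actual encoding.

    # Toy stored-program machine in the von Neumann spirit (illustrative only).
    def run(memory):
        pc, ac = 0, 0                      # program counter and accumulator
        while True:
            op, arg = memory[pc]           # fetch
            pc += 1
            if op == "LOAD":               # decode + execute
                ac = memory[arg]
            elif op == "ADD":
                ac += memory[arg]
            elif op == "STORE":
                memory[arg] = ac
            elif op == "JUMP_IF_NEG":      # conditional branch
                if ac < 0:
                    pc = arg
            elif op == "HALT":
                return memory

    # Program and data share one memory: compute memory[8] = memory[6] + memory[7].
    memory = [
        ("LOAD", 6),     # AC <- memory[6]
        ("ADD", 7),      # AC <- AC + memory[7]
        ("STORE", 8),    # memory[8] <- AC
        ("HALT", 0),
        None, None,      # unused cells
        2, 3, 0,         # data
    ]
    print(run(memory)[8])   # -> 5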

Hardware

First and second generations of computers

In 1941, Konrad Zuse developed the world's first functional program-controlled, Turing-complete computer, the Z3. Zuse noted that the earlier Z2 is considered the first computer with a controlled process. Also in 1941, he founded one of the first computer companies, which produced the Z4, the world's first commercial computer. In 1946, he developed the first high-level programming language, Plankalkül. In 1969, Zuse proposed the concept of digital physics in his book Rechnender Raum (Calculating Space).

In 1944, the Harvard Mark I, the first American programmable computer, was launched. And in 1948 the Manchester Baby was built, the first practical computer based on the Turing machine model and capable of running stored programs.

On September 9, 1947, scientists at Harvard University testing the Mark II Aiken Relay Calculator found a moth stuck between the contacts of an electromechanical relay. The insect was taped into the technical logbook with the accompanying caption: "First actual case of bug being found."

The term "bug" is often, but erroneously, attributed to Grace Hopper, the future U.S. Navy rear admiral who allegedly found the "bug" on September 9, 1945.

The first electronic computer is usually considered to be ENIAC (Electronic Numerical Integrator and Computer), developed under the leadership of John Mauchly and J. Presper Eckert and completed in 1946, although the priority of Mauchly and Eckert was challenged: a 1973 court decision recognized the earlier work of John Atanasoff. The ENIAC machine was installed at the University of Pennsylvania. It contained 18,000 vacuum tubes and 1,500 relays and consumed about 150 kW of electricity. Program control of the sequence of operations was carried out with plugboards and switch panels, as in tabulating machines. Configuring ENIAC for a task meant manually rearranging the connections of 6,000 wires, and all these wires had to be re-plugged whenever a different problem was to be solved. On October 2, 1955, ENIAC was shut down.

In 1950, the National Physical Laboratory (UK) completed the Pilot ACE, a small-scale programmable computer based on a model of the Turing machine.

Among other significant developments, IBM introduced the first hard disk drive, the 5-megabyte RAMAC, on September 13, 1956, and on September 12, 1958, Texas Instruments demonstrated the first integrated circuit (Jack Kilby and Robert Noyce, one of the founders of Intel, are considered the inventors of the microchip).

Third and subsequent generations of computers

In 1985, Intel introduced a new processor, the 80386, with an operating frequency of 12 MHz.

On April 3, 1986, IBM announced the release of its first laptop computer, the IBM 5140, or IBM PC Convertible, based on the Intel 8088 processor. In the same year, Compaq launched the first computer based on the 80386 processor.

In 1987, IBM released the IBM PS/2 series of computers, which, however, did not repeat the success of its predecessor, the IBM PC. The entry-level Model 30 was an analogue of the IBM PC and was equipped with an 8086 processor running at 8 MHz, 640 KB of RAM, a 20 MB hard drive, and a 3.5-inch floppy drive with a capacity of 720 KB. Some models ran the first version of the OS/2 operating system, developed jointly by IBM and Microsoft. The Swedish National Board for Measurement and Testing approved the MPR standard, the first standard for permissible levels of monitor emissions. U.S. Robotics introduced the Courier HST 9600 modem (9600 baud).

In 1988, Compaq released the first computer with 640 KB of RAM, the standard memory size for all subsequent generations of DOS. Intel introduced a "stripped-down" version of the 386-class processor, the 80386SX, with a 16-bit external data bus. Operating frequencies were 16-33 MHz, with a performance of 2-3 million operations per second. In the same year, Hewlett-Packard released the first inkjet printer of the DeskJet series, and Tandy announced a rewritable CD technology. NeXT launched the first NeXT workstation, equipped with a new Motorola processor, a then-remarkable 8 MB of RAM, a 17-inch monitor, and a 256 MB hard drive. The first version of the NeXTStep operating system was installed on these computers. The second generation of NeXT workstations appeared in 1990. The new lineup included a redesigned NeXT computer, called the NeXTcube, and the NeXTstation, nicknamed "the slab," which used a "pizza box" form factor.

Then, in 1989, Creative Labs introduced the Sound Blaster 1.0, an 8-bit mono sound card for the PC. Intel launched the first model of the 486DX processor family (1.25 million transistors, rising to 1.6 million in later models) with a frequency of 20 MHz and a computing speed of 20 million operations per second. IBM released the first hard drive with more than a gigabyte of capacity, the Model 3380, which weighed more than 250 kg and cost $40,000. The SuperVGA standard was born (800×600 resolution with support for 16,000 colors).

And in 1990, Intel introduced a new processor, the 32-bit 80486SX, with a speed of 27 million operations per second. In the same year, MS-DOS 4.01 and Windows 3.0 appeared. IBM introduced a new video card standard, XGA, as a replacement for traditional VGA (1024×768 pixels with support for 65,000 colors). A specification for the SCSI-2 interface standard was developed.

In 1991, Apple introduced the first monochrome handheld scanner, AMD released improved 40 MHz clones of Intel's 386DX processor, and Intel released the 20 MHz 486SX (about 900,000 transistors). The first multimedia computer standard, MPC, created by Microsoft in cooperation with a number of major PC manufacturers, was approved. The first stereo sound card, the 8-bit Sound Blaster Pro, appeared. IBM introduced the first laptop with a screen based on an active-matrix color liquid crystal display, the ThinkPad 700C.

In 1992, NEC launched the first double-speed CD-ROM drive. Intel introduced the 486DX2/40 processor, whose internal clock ran at twice the system bus frequency (1.25 million transistors), with a speed of 41 million operations per second. At the same time, Cyrix released a "stripped-down" 486SLC processor (without a floating-point coprocessor).

Software development

Operating systems

In 1964, Bell Labs, together with General Electric and researchers at the Massachusetts Institute of Technology, began the Multics OS project. Because of problems with the organization and complexity of the system, Bell Labs soon withdrew from the project. In 1969, Ken Thompson, joined by Dennis Ritchie, began work on a simpler system, which Brian Kernighan jokingly named UNICS by analogy with Multics; the name was soon shortened to UNIX. The operating system was written in assembly language. In November 1971, the first edition of UNIX was published. The first commercial version, UNIX System III (based on the seventh edition of the system), was released in 1982.

IBM commissioned Microsoft to work on an operating system for its new IBM PC personal computers. At the end of 1981, the first version of the new operating system, PC DOS 1.0, was released. From then on, PC DOS was used only in IBM's own computers, while Microsoft retained its own modification, MS-DOS. In 1982, PC DOS and MS-DOS version 1.1 appeared simultaneously, with some added and extended features. The two operating systems were later unified and, up until the sixth version, differed little from each other. The principles laid down in MS-DOS were carried over into later Microsoft operating systems.

The first version of Mac OS was released in 1984 together with Apple's first Macintosh personal computer. Combining existing developments with their own ideas, Apple's programmers created Mac OS, the first mass-market graphical operating system. On March 24, 2001, Apple's new CEO, Steve Jobs, unveiled Mac OS X 10.0, which was far more stable than its predecessor, Mac OS 9.

The first version of Windows, announced in 1983, differed from its contemporaries first of all in its graphical interface (at the time only Mac OS had one), as well as in the ability to run several programs at once and switch between them. In November 1985, Windows 1.0 was released, followed by versions 2.0 and 3.0 and by Windows NT 3.5, which had built-in support for local area networking at the system level. August 24, 1995 is the official release date of Windows 95. A little later, the new Windows NT was released; while Windows 95 was aimed more at home users' computers, NT was used more in the corporate environment. In 1998, Windows 98 came out with built-in Internet Explorer 4.0 and Outlook Express, and with the ability to place web pages (Active Desktop) and active channels on the desktop, a forerunner of modern RSS. Today, the most widespread versions are Windows XP, 7, 8, and 10.

Mobile operating systems are also gaining popularity. These are operating systems that run on smartphones, tablets, PDAs, and other mobile digital devices. Modern mobile operating systems combine the features of a personal computer operating system with support for touch screens, cellular communication, Bluetooth, Wi-Fi, GPS navigation, cameras, speech recognition, voice recording, media playback, NFC, and infrared.

Mobile devices with cellular communication capabilities (such as smartphones) actually contain two operating systems. The software platform visible to the user is complemented by a second, low-level, proprietary real-time operating system that drives the radio and other hardware. The most common mobile operating systems are Android, Asha, BlackBerry OS, iOS, Windows Phone, Firefox OS, Sailfish OS, Tizen, and Ubuntu Touch.