The History of Computers in Brief

Updated: 30.06.2024

In 1822 Charles Babbage, professor of mathematics at Cambridge University in England, designed the "Difference Engine", a mechanical calculator that could automatically produce mathematical tables, a tedious and error-prone manual task in those days. Babbage conceived of a large-scale, steam-driven (!) model that could perform a wide range of computational tasks. The model was never completed, as the revolving shafts and gears could not be manufactured with the crude industrial technology of the day.

By the 1880s manufacturing technology had improved to the point that practical mechanical calculators, including versions of Babbage's Analytical engine, could be produced. The new technology achieved worldwide fame in tabulating the US Census of 1890. The Census Bureau turned to a new tabulating machine invented by Herman Hollerith, which reduced personal data to holes punched in paper cards.

Tiny mechanical fingers "felt" the holes and closed an electrical circuit that in turn advanced the mechanical counter. Hollerith's invention eventually became the foundation on which the International Business Machines Corporation (IBM) was built.

Analog and digital calculators with electromechanical components appeared in a variety of military and intelligence applications in the 1930s. Many people credit the invention of the first electronic computer to John Vincent Atanasoff. He produced working models of computer memory and data-processing units at the University of Iowa in 1939, although he never assembled a complete working computer.

World War II prompted the development of the first working all-electronic digital computer, Colossus, which the British secret service designed to crack Nazi codes. Similarly, the need to calculate detailed mathematical tables to help aim cannons and missiles led to the creation of the first general-purpose computer, the Electronic Numerical Integrator and Calculator (ENIAC), at the University of Pennsylvania in 1946.

After leaving the university amid a dispute over patent rights, the developers of ENIAC, J. Presper Eckert and John Mauchly, turned to business pursuits. They also had an ugly falling-out with an academic colleague, John von Neumann, whom they accused of unfairly leaving their names off the scientific paper that first described the computer, which allowed von Neumann to claim that he had invented it. Eckert and Mauchly went on to create UNIVAC for the Remington Rand Corporation, an early leader in the computer industry. UNIVAC was the first successful commercial computer, and the first model was sold to the US Census Bureau in 1951.

The history of computing predates silicon chips and processors by hundreds of years. The modern computers we are all familiar with can trace their roots back to simple calculating machines that seem far removed from what we think of as a computer today.



Simple Computers

By definition, a computer is any device capable of performing mathematical equations or calculations. Many simple devices, such as the abacus (dating back to at least 300 BC) or the slide rule (first appearing in England in the 1630s), are therefore predecessors of today's computers.


The Harvard Mark I

In 1944 the Harvard Mark I computer was completed. It was closer to a modern computer, but it was really just a large calculator driven by a camshaft, with no stored programs. This massive machine filled a large room and was a joint effort of Harvard University and IBM.

ENIAC

Another important computer dating from the 1940s is ENIAC, built between 1943 and 1945. ENIAC (Electronic Numerical Integrator and Calculator) was a University of Pennsylvania project; it spanned several rooms and used nearly 20,000 vacuum tubes.

The Dawn of the Microprocessor

A microprocessor is a chip that essentially contains an entire computer (or at least a 1940s-era computer) on a single integrated circuit. The first microprocessor was made by Intel in 1971. With the advent of this technology, the home computer became possible.

The First PC

The first personal computer for home use was the Altair 8800, which contained an Intel 8080 microprocessor. However, this computer had to be assembled by the person who bought it.


Historical Development of Computers

We are living in the computer age. Most of our day-to-day jobs are influenced by the use of computers, which are used increasingly in every field of our lives. In the areas of science and technology, improvements cannot be achieved without the use of computers. Hence it has become necessary to have a basic knowledge of computers.

Strictly speaking, a computer is a calculating device with certain important characteristics such as speed, storage capacity and accuracy. But nowadays it is used for many more applications than computing alone; it has become an indispensable tool in the field of communications.

History of Computers:

Historians start the history of calculation with the abacus, a wooden frame with balls or beads strung on parallel wires. But the first machine built on the principles of today's computing machines was developed by Charles Babbage in the nineteenth century, and it embodied some of the basic ideas of stored computer programs. Babbage devised such a machine in 1822 and called it the Difference Engine; it was used to perform the simple arithmetic computations needed for setting up trigonometric and logarithmic tables. Around 1871 he further developed the Analytical Engine, which was a prototype computer.


Meanwhile an important theoretical development occurred around 1850, when George Boole, a mathematician, developed an algebraic system now called Boolean algebra. This system is used to represent quantities as binary numbers, i.e. 0s and 1s, and to represent and manipulate logical expressions.
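As a minimal illustration of the idea (the function names here are our own, not Boole's notation), Boolean algebra manipulates the values 0 and 1 with the operations AND, OR and NOT, which are exactly the operations a computer's logic gates later came to perform on binary digits:

```python
# Boolean algebra on binary digits: 0 = false, 1 = true.

def AND(a, b):
    return a & b  # 1 only when both inputs are 1

def OR(a, b):
    return a | b  # 1 when at least one input is 1

def NOT(a):
    return 1 - a  # inverts a single binary digit

# Logical expressions can be manipulated algebraically; for example,
# De Morgan's law holds for every combination of inputs:
for a in (0, 1):
    for b in (0, 1):
        assert NOT(AND(a, b)) == OR(NOT(a), NOT(b))
```

Laws like this are what let engineers simplify circuits on paper before building them, which is why Boole's system became so important once electronic computers arrived.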

The significance of Boolean algebra was not appreciated at the time. Late in the nineteenth century, around 1880, Hollerith developed techniques and machines that had a significant impact on the future design of computers. He designed a machine in which data was represented in the form of holes punched into paper cards. It could handle 50-80 punched cards per minute; the cards contained 80 columns with rectangular punches. These machines were called tabulators, and they were also used for the semiautomatic selection and sorting of cards. Hollerith set up his own company, which through mergers became the Computing-Tabulating-Recording Company and eventually the International Business Machines Corporation (IBM). Today, IBM is one of the largest companies in the computer world.
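The punched-card scheme described above can be sketched in a few lines. This is a simplified model, not Hollerith's actual code: here each of the 80 columns carries one decimal digit, encoded as a hole in the row matching that digit, and "tabulating" means sensing the holes column by column, just as the machine's mechanical fingers closed a circuit wherever a hole was found.

```python
# A simplified model of punched-card data (not Hollerith's real encoding).
CARD_COLUMNS = 80
DIGIT_ROWS = 10  # rows 0-9; a hole in row d encodes the digit d

def punch(digits):
    """Encode a sequence of digits as a card: a list of 80 columns,
    each column holding the set of punched row positions."""
    card = [set() for _ in range(CARD_COLUMNS)]
    for col, d in enumerate(digits):
        card[col].add(d)  # punch one hole in this column
    return card

def tabulate(card):
    """'Feel' the holes column by column and read the digits back."""
    out = []
    for column in card:
        for row in range(DIGIT_ROWS):
            if row in column:  # a circuit closes where a hole exists
                out.append(row)
    return out

card = punch([1, 8, 9, 0])
print(tabulate(card))  # -> [1, 8, 9, 0]
```

The key design point, which survives into the unit record machines mentioned later, is that one card holds one unit of data in a fixed 80-column layout, so cards can be sorted and counted mechanically without interpreting their contents.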
Early Computers: In 1937, Howard Aiken of Harvard University designed a huge mechanical calculator called MARK I, with a large number of switches, mechanical relays and cards. Its size was 15 m × 2.4 m × 0.6 m. This was the immediate predecessor of automatic electronic computers. ENIAC (Electronic Numerical Integrator and Calculator), designed in 1946, was the first electronic calculator. It occupied a room of 15 m × 9 m and weighed 30 tons. It was water-cooled and much faster than MARK I.

Around 1950, a computer named EDVAC (Electronic Discrete Variable Automatic Computer) was designed, based on the ideas of John von Neumann (frequently referred to as the father of the modern computer), who was the first to use the stored-program concept in computers. The storage capacity of EDVAC was 1024 words of 44 bits each, and it also had an auxiliary storage of 20,000 words.

First Generation of Computers (1946-55):

The computers manufactured between 1946 and 1955 are called first-generation computers. They were extremely large, with vacuum tubes in their circuitry which generated considerable heat; special air-conditioning arrangements were required to dissipate it.

They were extremely slow, and their storage capacity was very small compared to today's computers. Data was entered into these computers on punched cards: cards with rectangular holes punched in them by punching devices. UNIVAC I was the first commercially available computer, built in 1951 by the Remington Rand Company. It had a storage capacity of about 2000 words and was used mostly for payroll, billing and some mathematical computing.

Second Generation Computers (1956-1965) :

The computers in which vacuum tubes were replaced by transistors, made from semiconductors, are called second-generation computers. The use of transistors reduced the heat generated during operation, decreased the size and increased the storage capacity. These computers required less power and were much faster than first-generation computers. Magnetic media came into use as auxiliary data storage, and high-level languages, namely FORTRAN and COBOL, were used for writing programs.

Third Generation Computers (1966-1976):

The third generation started in 1966 with the incorporation of integrated circuits (ICs) into computer circuitry. An IC is a monolithic circuit, equivalent to tens of transistors, fabricated on a single small semiconductor chip with a number of pins for external circuit connections.
The IBM 360 series computers of this generation also provided facilities for time sharing and multiprogramming.

These were smaller and more cost-effective computers than the second generation, and their storage capacity and speed increased manyfold. They also introduced user-friendly packaged programs, word processing and remote terminals, which could use central computer facilities and get results almost instantaneously.

Fourth Generation Computers:

Fourth-generation computers were introduced after 1976. In these computers, electronic components were further miniaturized through Large Scale Integration (LSI) techniques; microprocessors, which are programmable ICs fabricated using LSI techniques, are used in these computers. Microcomputers were developed by combining a microprocessor with other LSI chips, giving compact size, increased speed and increased storage capacity. More recently, ICs fabricated using VLSI (Very Large Scale Integration) techniques have been used in computers. Through these techniques the storage capacity has increased manyfold, and the speed of these computers is also very high compared to earlier machines.

During the 1980s, computers called supercomputers were introduced to the market. These computers perform operations at exceptionally high speed (approximately 100 million operations per second). This speed is attained by employing a number of microprocessors, and consequently their cost is also very high. They are normally used for very complex applications such as artificial intelligence.

By the early part of the 20th century electromechanical machines had been developed and were used for business data processing. Dr. Herman Hollerith, a young statistician from the US Census Bureau, successfully tabulated the 1890 census. Hollerith invented a means of coding the data by punching holes into cards. He built one machine to punch the holes and others to tabulate the collected data. Later Hollerith left the Census Bureau and established his own tabulating machine company; through a series of mergers the company eventually became the IBM Corporation.

Until the middle of the 20th century, machines designed to manipulate punched-card data were widely used for business data processing. These early electromechanical data processors were called unit record machines because each punched card contained a unit of data.

In the mid-1940s electronic computers were developed to perform calculations for military and scientific purposes. By the end of the 1960s commercial models of these computers were widely used for both scientific computation and business data processing. Initially these computers accepted their input data from punched cards; by the late 1970s punched cards had been almost universally replaced by keyboard terminals. Since that time advances in science have led to the proliferation of computers throughout our society, and the past is but the prologue that gives us a glimpse of the future.

Let us take a look at the history of the computers that we know today. The very first calculating device used was the ten fingers of a man’s hands. This, in fact, is why today we still count in tens and multiples of tens. Then the abacus was invented, a bead frame in which the beads are moved from left to right. People went on using some form of abacus well into the 16th century, and it is still being used in some parts of the world because it can be understood without knowing how to read.

During the 17th and 18th centuries many people tried to find easy ways of calculating. J. Napier, a Scotsman, devised a mechanical way of multiplying and dividing, which is how the modern slide rule works. Henry Briggs used Napier’s ideas to produce logarithm tables which all mathematicians use today.
Calculus, another branch of mathematics, was independently invented by both Sir Isaac Newton, an Englishman, and Leibniz, a German mathematician.
The first real calculating machine appeared in 1820 as the result of several people's experiments. This type of machine, which saves a great deal of time and reduces the possibility of making mistakes, depends on a series of ten-toothed gear wheels. In 1830 Charles Babbage, an Englishman, designed a machine that was called ‘The Analytical Engine’. This machine, which Babbage showed at the Paris Exhibition in 1855, was an attempt to cut out the human being altogether, except for providing the machine with the necessary facts about the problem to be solved. He never finished this work, but many of his ideas were the basis for building today's computers.

In 1930, the first analog computer was built by an American named Vannevar Bush. This device was used in World War II to help aim guns. Mark I, the name given to the first digital computer, was completed in 1944. The men responsible for this invention were Professor Howard Aiken and some people from IBM. This was the first machine that could figure out long lists of mathematical problems, all at a very fast rate. In 1946 two engineers at the University of Pennsylvania, J. Eckert and J. Mauchly, built the first digital computer using parts called vacuum tubes. They named their new invention ENIAC. Another important advancement in computers came in 1947, when John von Neumann developed the idea of keeping instructions for the computer inside the computer's memory.

The first generation of computers, which used vacuum tubes, came out in 1950. UNIVAC I is an example of these computers, which could perform thousands of calculations per second. In 1960 the second generation of computers was developed; these could work ten times faster than their predecessors. The reason for this extra speed was the use of transistors instead of vacuum tubes. Second-generation computers were smaller, faster and more dependable than first-generation computers. The third-generation computers appeared on the market in 1965. These computers could do a million calculations a second, 1000 times as many as first-generation computers. Unlike second-generation computers, they are controlled by tiny integrated circuits and are consequently smaller and more dependable.

Fourth-generation computers have now arrived, and the integrated circuits being developed have been greatly reduced in size. This is due to microminiaturization, which means that the circuits are much smaller than before; as many as 1000 tiny circuits now fit onto a single chip. A chip is a square or rectangular piece of silicon, usually a fraction of an inch across, upon which several layers of an integrated circuit are etched or imprinted, after which the circuit is encapsulated in plastic, ceramic or metal. Fourth-generation computers are 50 times faster than third-generation computers and can complete approximately 1,000,000 instructions per second.

