EUROPEAN INTERNATIONAL JOURNAL OF MULTIDISCIPLINARY RESEARCH AND MANAGEMENT STUDIES
ISSN: 2750-8587
VOLUME 04, ISSUE 12, Pages 93-96
https://doi.org/10.55640/eijmrms-04-12-17
THE HISTORY OF THE COMPUTER
O’tkurov Ibrokhim
Target International School, Uzbekistan
ABOUT ARTICLE
Key words: history of computing, abacus, Pascaline, Charles Babbage, artificial intelligence, quantum computing, technological advancement.
Received: 01.12.2024
Accepted: 05.12.2024
Published: 10.12.2024
Abstract:
This article explores the development of computing from its early beginnings to the major breakthroughs that have shaped today's technology. It covers key moments in the history of computing, such as the creation of the abacus, the Pascaline, and Charles Babbage's engines, as well as recent innovations in areas like AI and quantum computing. By examining the work of key figures and significant advancements, the article offers a detailed overview of how the field of computing has progressed and how it continues to influence the modern world.
INTRODUCTION
The history of
computers goes back a long way, with early inventions laying the groundwork for today’s
technology. In ancient times, cultures like the Egyptians and Greeks created tools for counting, such as
the abacus around 2400 BCE and the Antikythera mechanism around 100 BCE. These early devices
were some of the first to perform simple calculations. In the 17th century, Blaise Pascal invented the
Pascaline, a machine that could do basic math, which was a big step forward. However, it wasn’t until
the 19th century that Charles Babbage designed the Difference Engine and the Analytical Engine; the Analytical Engine was the first machine designed to be fully programmable, setting the stage for modern computers.
Early Beginnings of Computing:
The history of the computer is rich and multifaceted, with roots extending back to ancient civilizations. Early devices and concepts laid the groundwork for the advanced technologies we use today. The journey from simple mechanical devices to advanced electronic computers stretched across centuries and
involved many pivotal moments. Ancient civilizations, including the Egyptians, Babylonians, and Greeks, developed counting and calculating tools. 1) The abacus (c. 2400 BCE): one of the first calculating tools, used for addition, subtraction, multiplication, and division; it is still used in some parts of Asia and the Middle East. 2) The Antikythera mechanism (c. 100 BCE): one of the earliest known mechanical devices, created in ancient Greece and used to predict astronomical positions and eclipses.
The Renaissance and early modern period: in this era devices became more sophisticated, and inventors built mechanical calculators that could automate calculation. In 1642 Blaise Pascal invented the Pascaline, considered the first mechanical calculator capable of addition and subtraction; with the help of gears and wheels it could represent numbers and carry out operations.
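As a rough modern analogy rather than a reconstruction of Pascal's actual mechanism, the short Python sketch below mimics the idea behind those wheels: each decimal wheel stores one digit, and turning a wheel past 9 nudges the next wheel by one, which is the carry the Pascaline mechanised. The function name, wheel count, and example values are illustrative assumptions.

```python
# Illustrative analogy of the Pascaline's decimal wheels and carry propagation.

def add_on_wheels(wheels, amount, position=0):
    """wheels[0] is the units wheel, wheels[1] the tens wheel, and so on.
    Turning a wheel past 9 advances the next wheel by one (the carry)."""
    wheels = list(wheels)
    wheels[position] += amount
    for i in range(position, len(wheels)):
        carry, wheels[i] = divmod(wheels[i], 10)
        if carry and i + 1 < len(wheels):
            wheels[i + 1] += carry
    return wheels

# Represent 278 on four wheels (units digit first), then add 345 digit by digit.
state = [8, 7, 2, 0]                     # 278
for pos, digit in enumerate([5, 4, 3]):  # 345, least significant digit first
    state = add_on_wheels(state, digit, pos)
print(state)  # [3, 2, 6, 0] -> reads as 0623, i.e. 278 + 345 = 623
```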
The birth of modern computing began with Charles Babbage, who is widely regarded as the father of the computer. He designed two pioneering computing machines. 1) His first major creation was the Difference Engine, designed in 1822 to avoid human error by automatically calculating and printing mathematical tables (its working principle is sketched after this paragraph). It marked a significant step in the development of automatic computation, even though he could not finish it because of financial and technical problems. 2) The second was the Analytical Engine, designed in 1837, which surpassed the Difference Engine in sophistication. It was designed to be fully programmable and capable of carrying out any mathematical calculation, since it featured the key elements of a modern computer: a central processing unit, a memory unit, a punched-card input system, and conditional branching.
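For a sense of the principle behind the Difference Engine, the minimal Python sketch below tabulates a polynomial using the method of finite differences, so that every new table entry is produced by additions alone, which is exactly the kind of work the engine was meant to mechanise. The example polynomial and the number of entries are assumptions chosen only for illustration.

```python
# The Difference Engine's principle: once the initial value and its finite
# differences are set, every further table entry needs only additions.

def tabulate(initial_differences, count):
    """initial_differences = [p(x0), first difference, second difference, ...];
    returns the first `count` table values using additions only."""
    regs = list(initial_differences)
    table = []
    for _ in range(count):
        table.append(regs[0])
        # Add each difference column into the one above it, like the engine's wheels.
        for i in range(len(regs) - 1):
            regs[i] += regs[i + 1]
    return table

# Example polynomial (an assumption): p(x) = x**2 + x + 41 at x = 0, 1, 2, ...
# p(0) = 41, first difference = 2, second difference = 2 (constant for a quadratic).
print(tabulate([41, 2, 2], 6))             # [41, 43, 47, 53, 61, 71]
print([x * x + x + 41 for x in range(6)])  # matches the direct computation
```

Once the starting value and its differences are loaded, no multiplication is ever needed, which is why the scheme suited a purely mechanical adding machine.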
The Birth of Modern Electronic Computing:
Innovations such as ENIAC, the transistor, and high-level programming languages marked the birth of modern electronic computing and laid the foundation for today's digital revolution. ENIAC, the Electronic Numerical Integrator and Computer, was built by John Presper Eckert and John Mauchly; it was the first fully electronic general-purpose digital computer and was so fast that it could perform thousands of calculations per second. The transistor was invented on December 23, 1947 by Walter Brattain, John Bardeen, and William Shockley, a breakthrough for which they won the Nobel Prize in Physics. Bell Laboratories in the USA, the research arm of the world's leading telephone company, recognized that transistors could be highly beneficial for applications far removed from telecommunications. Another development was high-level programming languages. Before them, early computers were programmed in machine code, which
was extremely difficult, since it consisted of raw binary instructions. Three creations in the development of high-level programming in the 1950s made computing far more accessible. 1) FORTRAN, which stands for Formula Translation, was created by IBM in 1957 and was one of the first high-level programming languages; it was designed to help scientists and engineers write programs for mathematical and scientific calculations. It also let programmers use symbolic names for operations, making code easier to write and read than raw machine code (see the short sketch after this paragraph). 2) The second creation was LISP, developed in 1958 by John McCarthy. Its name refers to list processing; it was the first programming language built around recursion and symbolic expressions, and it was specially designed for artificial intelligence work. 3) The final invention was COBOL, which stands for Common Business-Oriented Language; it emerged in 1959 to cater to the specific demands of business data processing. Its syntax is notably similar to English, making it much easier for individuals without a programming background to grasp, which was a significant factor in encouraging businesses to adopt computing technology.
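To make the contrast concrete, here is a minimal sketch in Python, used purely as a modern stand-in since reproducing period FORTRAN or LISP is beyond the scope of this article. It writes the same small calculation first in a deliberately low-level, step-by-step style and then with the symbolic names and recursion that high-level languages introduced; the function names and sample values are assumptions for illustration only.

```python
# Illustrative contrast only: Python standing in for early high-level languages.

# Low-level flavour: explicit "storage cells" updated step by step, close to
# how machine-code programmers had to think about a computation.
def factorial_table_low_level(n):
    result = []
    acc = 1          # accumulator cell
    i = 1            # counter cell
    while i <= n:
        acc = acc * i
        result.append(acc)
        i = i + 1
    return result

# High-level flavour: a symbolic, recursive definition in the spirit of LISP,
# readable almost like the mathematical definition of the factorial.
def factorial(k):
    return 1 if k <= 1 else k * factorial(k - 1)

def factorial_table_high_level(n):
    return [factorial(k) for k in range(1, n + 1)]

print(factorial_table_low_level(5))   # [1, 2, 6, 24, 120]
print(factorial_table_high_level(5))  # [1, 2, 6, 24, 120]
```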
Modern Advancements in Computing:
Modern advancements in computing, including artificial intelligence, quantum computing, and cloud
technologies, are transforming industries and redefining the limits of technology in unprecedented
ways. To begin with artificial intelligence: AI is a transformative technology driving many of these advances. Goodfellow, Bengio, and Courville describe AI as enabling machines to carry out various tasks that typically require human intelligence. Systems such as ChatGPT have showcased this potential, taking on complicated tasks and making precise predictions, which is especially visible in fields like healthcare and self-driving cars (Russell & Norvig, 2021). Furthermore, companies such as IBM, Google, and Microsoft are building quantum computers based on quantum bits (qubits), which promise to solve certain complex problems far faster than classical computers (a brief sketch of a single qubit follows below). Moreover, companies like Amazon Web Services (AWS), Microsoft Azure, and Google Cloud have changed how businesses work with data: cloud computing helps companies save money, scale up easily when they need to, and collaborate more effectively even when teams are far apart.
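As a purely illustrative aside, and not any vendor's actual hardware or software interface, the tiny Python/NumPy sketch below shows what a quantum bit means mathematically: a qubit's state is a two-component vector, and a Hadamard gate places it in an equal superposition of 0 and 1, one ingredient behind the speed-ups mentioned above. The variable names and the choice of gate are assumptions made for the example.

```python
import numpy as np

# A qubit state is a length-2 complex vector; |0> is represented as (1, 0).
ket0 = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0
probabilities = np.abs(state) ** 2  # Born rule: probabilities of measuring 0 or 1

print(state)          # approximately [0.707+0j, 0.707+0j]
print(probabilities)  # [0.5, 0.5] -> equal chance of reading 0 or 1
```

Unlike a classical bit, the state carries amplitudes for both outcomes at once until it is measured, which is the property quantum algorithms exploit.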
CONCLUSION
Overall, this article has traced the journey of computers from simple mechanical devices to the advanced machines we use today. Inventions such as the abacus, the Pascaline, and Babbage's engines laid the foundations of modern calculation. Innovations like the transistor, ENIAC, and high-level programming languages made computers faster and more accessible. More recent developments like
artificial intelligence, quantum computing, and cloud technology show that computing will continue to
grow, impacting our lives in new and exciting ways.
REFERENCES
1. Barsalou, Lawrence W. "Grounded Cognition." Annual Review of Psychology, vol. 59, no. 1, Jan. 2008, pp. 617-645, https://doi.org/10.1146/annurev.psych.59.103006.093639. Accessed 29 Apr. 2019.
2. Jones, Anne, and Mandy Curnow. "Lego Antikythera Mechanism." Nature, 9 Dec. 2012, https://doi.org/10.1038/d41586-019-00474-6.
3. Ceruzzi, Paul E. A History of Modern Computing. Cambridge, Mass.; London, MIT Press, 2003.
4. Kavis, Michael. Architecting the Cloud: Design Decisions for Cloud Computing Service Models (SaaS, PaaS, and IaaS). Hoboken, New Jersey, John Wiley & Sons, Inc., 2014.
5. Mcluhan, Marshall. Understanding Media: The Extensions of Man. Createspace Independent Publishing Platform, 1964.
6. Nielsen, Michael A., and Isaac L. Chuang. Quantum Computation and Quantum Information. Cambridge University Press, 9 Dec. 2010.
7. Russell, Stuart J., and Peter Norvig. Artificial Intelligence: A Modern Approach. 4th ed., London, Pearson, 2021.
8. Swade, Doron. The Difference Engine. Penguin Group, 2002.
9. Xavier, Cyrus. Hardware Evolution. Publifye AS, 30 Sept. 2024.
10. Ericsson. "The Transistor – an Invention Ahead of Its Time." Ericsson, 29 Aug. 2016, www.ericsson.com/en/about-us/history/products/other-products/the-transistor--an-invention-ahead-of-its-time.
