History of computer science

Source: Wikipedia, the free encyclopedia.

The history of computer science began long before the modern discipline of computer science, usually appearing in forms like mathematics or physics. Developments in previous centuries alluded to the discipline that we now know as computer science.[1] This progression, from mechanical inventions and mathematical theories towards modern computer concepts and machines, led to the development of a major academic field, massive technological advancement across the Western world, and the basis of a massive worldwide trade and culture.[2]

Prehistory


The earliest known tool for use in computation was the abacus, developed in ancient Mesopotamia; abaci of more modern design are still used as calculation tools today.[5]

In the 5th century BC in ancient India, the grammarian Pāṇini formulated the grammar of Sanskrit in 3959 rules known as the Ashtadhyayi, which was highly systematized and technical. Panini used metarules, transformations and recursions.[6]

The Antikythera mechanism is believed to be an early mechanical analog computer.[7] It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to circa 100 BC.[7]

Mechanical analog computer devices appeared again a thousand years later in the medieval Islamic world, developed by Muslim astronomers and engineers.[8] Muslim mathematicians also made important advances in cryptography, such as the development of cryptanalysis and frequency analysis by Al-Kindi.[9][10][11] Programmable machines were also invented by Muslim engineers, such as the automatic flute player of the Banū Mūsā brothers.[12]

Technological artifacts of similar complexity appeared in 14th century Europe, with mechanical astronomical clocks.[13]

When John Napier discovered logarithms for computational purposes in the early 17th century, there followed a period of considerable progress by inventors and scientists in making calculating tools. In 1623 Wilhelm Schickard designed a calculating machine, but abandoned the project when the prototype he had started building was destroyed by a fire in 1624.[14] Around 1640, Blaise Pascal, a leading French mathematician, constructed a mechanical adding device based on a design described by the Greek mathematician Hero of Alexandria.[15] Then in 1672 Gottfried Wilhelm Leibniz invented the Stepped Reckoner which he completed in 1694.[17]

In 1837 Charles Babbage first described his Analytical Engine, which is accepted as the first design for a modern computer. The Analytical Engine had expandable memory, an arithmetic unit, and logic processing capabilities that enabled it to interpret a programming language with loops and conditional branching. Although never built, the design has been studied extensively and is understood to be Turing equivalent. The analytical engine would have had a memory capacity of less than 1 kilobyte of memory and a clock speed of less than 10 Hertz.[18]

Considerable advancement in mathematics and electronics theory was required before the first modern computers could be designed.

Binary logic

Gottfried Wilhelm Leibniz

Gottfried Wilhelm Leibniz (1646–1716) developed logic in a binary number system and has been called the "founder of computer science".[19]

In 1702, Leibniz developed logic in a formal, mathematical sense with his writings on the binary numeral system, in which the ones and zeros also represent true and false values or on and off states. But it took more than a century before George Boole published his Boolean algebra in 1854 with a complete system that allowed computational processes to be mathematically modeled.[23]

By this time, the first mechanical devices driven by a binary pattern had been invented. The Industrial Revolution had driven forward the mechanization of many tasks, and this included weaving. Punched cards controlled Joseph Marie Jacquard's loom in 1801, where a hole punched in the card indicated a binary one and an unpunched spot indicated a binary zero. Jacquard's loom was far from being a computer, but it did illustrate that machines could be driven by binary systems.[23]
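To make the binary idea concrete, here is a minimal Python sketch (illustrative only; the card symbols and layout are invented for the example) that reads one row of a punched card as a binary pattern, treating a hole as a one and an unpunched spot as a zero.

    # Illustrative only: a hole (here '*') is read as 1, an unpunched spot ('.') as 0.
    def read_card_row(row):
        return [1 if spot == "*" else 0 for spot in row]

    print(read_card_row("*..*.**."))  # [1, 0, 0, 1, 0, 1, 1, 0]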

Emergence of a discipline

Charles Babbage (1791–1871), one of the pioneers of computing

Charles Babbage and Ada Lovelace

Charles Babbage is often regarded as one of the first pioneers of computing. Beginning in the 1810s, Babbage had a vision of mechanically computing numbers and tables. Putting this into reality, Babbage designed a calculator to compute numbers up to 8 decimal places long. Continuing with the success of this idea, Babbage worked to develop a machine that could compute numbers with up to 20 decimal places. By the 1830s, Babbage had devised a plan to develop a machine that could use punched cards to perform arithmetical operations. The machine would store numbers in memory units, and there would be a form of sequential control. This means that one operation would be carried out before another in such a way that the machine would produce an answer and not fail. This machine was to be known as the "Analytical Engine", the first true precursor of the modern computer.[24]

Ada Lovelace (1815–1852) predicted the use of computers in symbolic manipulation

Ada Lovelace (Augusta Ada Byron) is credited as the pioneer of computer programming and is regarded as a mathematical genius. Lovelace began working with Charles Babbage as an assistant while Babbage was working on his "Analytical Engine", the first mechanical computer.[25] During her work with Babbage, Ada Lovelace became the designer of the first computer algorithm, which could compute Bernoulli numbers,[26] although this is arguable, as Charles was the first to design the difference engine and consequently its corresponding difference-based algorithms, making him the first computer algorithm designer. Moreover, Lovelace's work with Babbage resulted in her prediction of future computers to not only perform mathematical calculations but also manipulate symbols, mathematical or not.[27] While she was never able to see the results of her work, as the "Analytical Engine" was not created in her lifetime, her efforts in later years, beginning in the 1840s, did not go unnoticed.[28]
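Lovelace's Note G program computed Bernoulli numbers on the Analytical Engine. The short Python sketch below is a modern reconstruction of that task using a standard recurrence, not a transcription of Lovelace's actual procedure.

    # Bernoulli numbers via the classical recurrence
    # sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1, with B_0 = 1 (B_1 = -1/2 convention).
    from fractions import Fraction
    from math import comb

    def bernoulli(n):
        """Return the exact Bernoulli numbers B_0 .. B_n as fractions."""
        B = [Fraction(1)]
        for m in range(1, n + 1):
            acc = sum(comb(m + 1, k) * B[k] for k in range(m))
            B.append(-acc / (m + 1))
        return B

    print([str(b) for b in bernoulli(8)])
    # ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']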

Early post-Analytical Engine designs

Leonardo Torres Quevedo (1852–1936) proposed a consistent manner to store floating-point numbers

Following Babbage, although at first unaware of his earlier work, was Percy Ludgate, a clerk to a corn merchant in Dublin, Ireland. He independently designed a programmable mechanical computer, which he described in a work that was published in 1909.[29][30]

Two other inventors, Leonardo Torres Quevedo and Vannevar Bush, also did follow-on research based on Babbage's work. In his Essays on Automatics (1914), Torres Quevedo designed a Babbage-type analytical machine that used electromechanical parts and introduced floating-point arithmetic.[31][32] In 1920, to celebrate the 100th anniversary of the invention of the arithmometer, he presented in Paris the Electromechanical Arithmometer, which consisted of an arithmetic unit connected to a (possibly remote) typewriter, on which commands could be typed and the results printed automatically.[34] Bush's paper Instrumental Analysis (1936) discussed using existing IBM punch card machines to implement Babbage's design. In the same year he started the Rapid Arithmetical Machine project to investigate the problems of constructing an electronic digital computer.[35]

Charles Sanders Peirce and electrical switching circuits

Charles Sanders Peirce (1839–1914) described how logical operations could be carried out by electrical switching circuits

In an 1886 letter, Charles Sanders Peirce described how logical operations could be carried out by electrical switching circuits.[36] During 1880–81 he showed that NOR gates alone (or alternatively NAND gates alone) can be used to reproduce the functions of all the other logic gates, but this work remained unpublished until 1933.[37] The first published proof was by Henry M. Sheffer in 1913, so the NAND logical operation is sometimes called the Sheffer stroke; the logical NOR is sometimes called Peirce's arrow.[38] Consequently, these gates are sometimes called universal logic gates.[39]
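A small Python sketch can make the universality claim concrete: every gate below is built only from NAND, and an exhaustive check over both truth values confirms the constructions (the function names are ours, chosen for clarity).

    # Every other Boolean gate here is composed solely of NAND.
    def NAND(a, b):
        return not (a and b)

    def NOT(a):      return NAND(a, a)
    def AND(a, b):   return NAND(NAND(a, b), NAND(a, b))
    def OR(a, b):    return NAND(NAND(a, a), NAND(b, b))
    def NOR(a, b):   return NOT(OR(a, b))
    def XOR(a, b):
        c = NAND(a, b)
        return NAND(NAND(a, c), NAND(b, c))

    # Exhaustive check against Python's own Boolean operators.
    for a in (False, True):
        for b in (False, True):
            assert AND(a, b) == (a and b)
            assert OR(a, b) == (a or b)
            assert XOR(a, b) == (a != b)
            assert NOR(a, b) == (not (a or b))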

Eventually, vacuum tubes replaced relays for logic operations. Lee De Forest's modification, in 1907, of the Fleming valve can be used as a logic gate. Ludwig Wittgenstein introduced a version of the 16-row truth table as proposition 5.101 of Tractatus Logico-Philosophicus (1921). Walther Bothe, inventor of the coincidence circuit, received part of the 1954 Nobel Prize in Physics for the first modern electronic AND gate in 1924. Konrad Zuse designed and built electromechanical logic gates for his computer Z1 (from 1935 to 1938).

Up to and during the 1930s, electrical engineers were able to build electronic circuits to solve mathematical and logic problems, but most did so in an ad hoc manner, lacking any theoretical rigor. This changed with NEC engineer Akira Nakashima's switching circuit theory in the 1930s. From 1934 to 1936, Nakashima published a series of papers showing that the two-valued Boolean algebra, which he discovered independently, can describe the operation of switching circuits.[40][41][42] This concept, of utilizing the properties of electrical switches to do logic, is the basic concept that underlies all electronic digital computers. Switching circuit theory provided the mathematical foundations and tools for digital system design in almost all areas of modern technology.[43]

While taking an undergraduate philosophy class, Claude Shannon had been exposed to Boole's work, and recognized that it could be used to arrange electromechanical relays (then used in telephone routing switches) to solve logic problems. His thesis became the foundation of practical digital circuit design when it became widely known among the electrical engineering community during and after World War II.[44]
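The following Python sketch illustrates the core of this insight in a simplified form (the function names and the example circuit are ours, not Shannon's notation): switches wired in series behave like AND, switches in parallel behave like OR, and a normally closed contact behaves like NOT.

    # Relay-network view of Boolean algebra, in miniature.
    def series(*switches):    # current flows only if every switch is closed
        return all(switches)

    def parallel(*switches):  # current flows if any switch is closed
        return any(switches)

    def normally_closed(switch):  # contact that opens when the relay is energized
        return not switch

    # A relay network computing "(A AND B) OR (NOT C)":
    def circuit(a, b, c):
        return parallel(series(a, b), normally_closed(c))

    print(circuit(True, True, False))   # True: both paths conduct
    print(circuit(False, False, True))  # False: neither path conducts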

Alan Turing and the Turing machine

Alan Turing, English computer scientist, mathematician, logician, and cryptanalyst. (circa 1930)

Before the 1920s, computers (sometimes computors) were human clerks that performed computations. They were usually under the lead of a physicist. Many thousands of computers were employed in commerce, government, and research establishments. Many of these clerks who served as human computers were women.[45][46][47][48] Some performed astronomical calculations for calendars, others ballistic tables for the military.[49]

After the 1920s, the expression computing machine referred to any machine that performed the work of a human computer, especially those in accordance with effective methods of the Church–Turing thesis. The thesis states that a mathematical method is effective if it could be set out as a list of instructions able to be followed by a human clerk with paper and pencil, for as long as necessary, and without ingenuity or insight.

Machines that computed with continuous values became known as the analog kind. They used machinery that represented continuous numeric quantities, like the angle of a shaft rotation or difference in electrical potential.

Digital machinery, in contrast to analog, was able to render the state of a numeric value and store each individual digit. Digital machinery used difference engines or relays before the invention of faster memory devices.

The phrase computing machine gradually gave way, after the late 1940s, to just computer as electronic digital machinery became common. These computers were able to perform the calculations that had previously been performed by human clerks.

Since the values stored by digital machines were not bound to physical properties the way analog devices were, a logical computer, based on digital equipment, was able to do anything that could be described as "purely mechanical". The theoretical Turing machine, created by Alan Turing, is a hypothetical device theorized in order to study the properties of such hardware.

The mathematical foundations of modern computer science began to be laid by Kurt Gödel with his incompleteness theorem (1931). In this theorem, he showed that there were limits to what could be proved and disproved within a formal system. This led to work by Gödel and others to define and describe these formal systems, including concepts such as mu-recursive functions and lambda-definable functions.[50]

In 1936 Alan Turing and Alonzo Church independently, and also together, introduced the formalization of an algorithm, with limits on what can be computed, and a "purely mechanical" model for computing.[51] This became the Church–Turing thesis, a hypothesis about the nature of mechanical calculation devices, such as electronic computers. The thesis states that any calculation that is possible can be performed by an algorithm running on a computer, provided that sufficient time and storage space are available.[51]

In 1936, Alan Turing also published his seminal work on the Turing machines, an abstract digital computing machine which is now simply referred to as the Universal Turing machine. This machine invented the principle of the modern computer and was the birthplace of the stored program concept that almost all modern-day computers use.[52] These hypothetical machines were designed to formally determine, mathematically, what can be computed, taking into account limitations on computing ability. If a Turing machine can complete the task, it is considered Turing computable.[53]
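As a rough illustration of the idea, the Python sketch below simulates a one-tape Turing machine driven by a finite rule table; the particular machine and its state names are invented for the example and simply invert a binary string before halting.

    # A minimal one-tape Turing machine: a table of (state, symbol) -> (write, move, next state).
    def run_turing_machine(tape, rules, state="start", blank="_"):
        tape = list(tape)
        head = 0
        while state != "halt":
            symbol = tape[head] if head < len(tape) else blank
            write, move, state = rules[(state, symbol)]
            if head < len(tape):
                tape[head] = write
            else:
                tape.append(write)
            head += 1 if move == "R" else -1
        return "".join(tape)

    # This rule table flips every bit, then halts at the first blank cell.
    invert_rules = {
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
        ("start", "_"): ("_", "R", "halt"),
    }

    print(run_turing_machine("100110", invert_rules))  # 011001_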

The Los Alamos physicist Stanley Frankel described John von Neumann's view of the fundamental importance of Turing's 1936 paper in a letter:[52]

I know that in or about 1943 or ‘44 von Neumann was well aware of the fundamental importance of Turing's paper of 1936… Von Neumann introduced me to that paper and at his urging I studied it with care. Many people have acclaimed von Neumann as the "father of the computer" (in a modern sense of the term) but I am sure that he would never have made that mistake himself. He might well be called the midwife, perhaps, but he firmly emphasized to me, and to others I am sure, that the fundamental conception is owing to Turing...

John V. Atanasoff (1903–1995) created the first electric digital computer, known as the Atanasoff–Berry computer

Kathleen Booth and the first assembly language

Kathleen Booth wrote the first assembly language and designed the assembler and autocode for the Automatic Relay Calculator (ARC) at Birkbeck College, University of London.[54] She helped design three different machines including the ARC, SEC (Simple Electronic Computer), and APE(X)C.
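To give a general sense of what an assembler does, here is a hypothetical Python sketch that translates mnemonic instructions into numeric machine words; the three-instruction set and its opcodes are invented for the example and are not the ARC's actual instruction set or Booth's notation.

    # Hypothetical example: a 2-bit opcode packed with a 6-bit operand address.
    OPCODES = {"LOAD": 0b01, "ADD": 0b10, "STORE": 0b11}

    def assemble(source):
        words = []
        for line in source.strip().splitlines():
            mnemonic, operand = line.split()
            words.append((OPCODES[mnemonic] << 6) | int(operand))
        return words

    program = """
    LOAD 12
    ADD 13
    STORE 14
    """
    print([format(w, "08b") for w in assemble(program)])
    # ['01001100', '10001101', '11001110']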

Early computer hardware

The world's first electronic digital computer, the Atanasoff–Berry computer, was built on the Iowa State campus from 1939 through 1942 by John V. Atanasoff, a professor of physics and mathematics, and Clifford Berry, an engineering graduate student.

In 1941, Konrad Zuse developed the world's first functional program-controlled computer, the Z3; in 1998 it was shown to be Turing-complete in principle. Zuse also developed the S2 computing machine, considered the first process control computer. He founded one of the earliest computer businesses in 1941, producing the Z4, which became the world's first commercial computer. In 1946, he designed the first high-level programming language, Plankalkül.[57]

In 1948, the first practical computer that could run stored programs, based on the Turing machine model, was built: the Manchester Baby. Max Newman's familiarity with Turing's 1936 paper on Turing Machines, and Turing's logico-mathematical contributions to the project, were both crucial to the successful development of the Baby.[52]

In 1950, Britain's National Physical Laboratory completed Pilot ACE, a small-scale programmable computer based on Turing's philosophy. With an operating speed of 1 MHz, the Pilot Model ACE was for some time the fastest computer in the world.[52][58] Turing's design for ACE had much in common with today's RISC architectures, and it called for a high-speed memory of roughly the same capacity as an early Macintosh computer, which was enormous by the standards of his day.[52] Had Turing's ACE been built as planned and in full, it would have been in a different league from the other early computers.[52]

Later in the 1950s, the first operating system, GMOS, was created for General Motors by IBM; it allowed batches of programs to be queued and run one after another on the same machine without manual intervention.

In 1969, research teams at UCLA and Stanford conducted an experiment to create a network connection between two computers. Although the system crashed during the initial attempt to connect to the other computer, the experiment was a huge step towards the Internet.

Claude Shannon (1916–2001) helped create the field of information theory

The first actual computer bug was a moth, found stuck between the relays of the Harvard Mark II.[59] The invention of the term 'bug' is often but erroneously attributed to Grace Hopper, a future rear admiral in the U.S. Navy, who supposedly logged the "bug" on September 9, 1945; most other accounts conflict at least with these details. According to these accounts, the actual date was September 9, 1947, when operators filed this 'incident' along with the insect and the notation "First actual case of bug being found" (see software bug for details).[59]

Shannon and information theory

Claude Shannon went on to found the field of information theory with his 1948 paper titled A Mathematical Theory of Communication, which applied probability theory to the problem of how to best encode the information a sender wants to transmit. This work is one of the theoretical foundations for many areas of study, including data compression and cryptography.[60]
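Shannon's entropy formula, H = -Σ p_i log2 p_i, gives the minimum average number of bits needed per symbol and so sets the limit for lossless compression. The short Python sketch below is a worked example on two arbitrary messages chosen for illustration.

    # Entropy of the empirical symbol distribution of a message, in bits per symbol.
    from math import log2
    from collections import Counter

    def entropy(message):
        counts = Counter(message)
        total = len(message)
        return -sum((c / total) * log2(c / total) for c in counts.values())

    print(round(entropy("AAAAAAAB"), 3))  # 0.544: highly predictable, very compressible
    print(round(entropy("ABCDABCD"), 3))  # 2.0: four equally likely symbols need 2 bits each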

Norbert Wiener (1894–1964) created the term cybernetics

Wiener and cybernetics

From experiments with anti-aircraft systems that interpreted radar images to detect enemy planes, Norbert Wiener coined the term cybernetics from the Greek word for "steersman." He published "Cybernetics" in 1948, which influenced artificial intelligence. Wiener also compared computation, computing machinery, memory devices, and other cognitive similarities with his analysis of brain waves.[61]

John von Neumann (1903–1957) introduced the computer architecture known as Von Neumann architecture

John von Neumann and the von Neumann architecture

In 1946, a model for computer architecture was introduced and became known as Von Neumann architecture. Since 1950, the von Neumann model provided uniformity in subsequent computer designs. The von Neumann architecture was considered innovative as it introduced an idea of allowing machine instructions and data to share memory space.[62] The von Neumann model is composed of three major parts, the arithmetic logic unit (ALU), the memory, and the instruction processing unit (IPU). In von Neumann machine design, the IPU passes addresses to memory, and memory, in turn, is routed either back to the IPU if an instruction is being fetched or to the ALU if data is being fetched.[63]

Von Neumann's machine design uses a RISC (reduced instruction set computing) architecture, which means the instruction set uses a total of 21 instructions to perform all tasks. (This is in contrast to CISC, complex instruction set computing, instruction sets which have more instructions from which to choose.) With von Neumann architecture, main memory along with the accumulator (the register that holds the result of logical operations)[64] are the two memories that are addressed. Operations can be carried out as simple arithmetic (these are performed by the ALU and include addition, subtraction, multiplication and division), conditional branches (these are more commonly seen now as if statements or while loops; the branches serve as go to statements), and logical moves between the different components of the machine, i.e., a move from the accumulator to memory or vice versa. Von Neumann architecture accepts fractions and instructions as data types.

Finally, as the von Neumann architecture is a simple one, its register management is also simple. The architecture uses a set of seven registers to manipulate and interpret fetched data and instructions. These registers include the "IR" (instruction register), "IBR" (instruction buffer register), "MQ" (multiplier quotient register), "MAR" (memory address register), and "MDR" (memory data register).[63] The architecture also uses a program counter ("PC") to keep track of where in the program the machine is.[63]
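A minimal Python sketch of the fetch-decode-execute cycle may help; the toy instruction set (LOAD, ADD, STORE, HALT) and its encoding are invented for illustration and are not the IAS machine's actual instruction set, while the variable names echo the registers described above (PC, MAR, MDR, IR, and an accumulator). Note how instructions and data sit in the same memory, the defining trait of the von Neumann model.

    # Toy von Neumann cycle: fetch from memory, bump the program counter, decode, execute.
    def run(memory):
        pc, ac = 0, 0                      # program counter and accumulator
        while True:
            mar = pc                       # memory address register
            mdr = memory[mar]              # memory data register holds the fetched word
            ir = mdr                       # instruction register
            pc += 1
            op, addr = ir
            if op == "LOAD":
                ac = memory[addr]
            elif op == "ADD":
                ac += memory[addr]
            elif op == "STORE":
                memory[addr] = ac
            elif op == "HALT":
                return memory

    # Instructions and data share one memory, as in the von Neumann model.
    program = [
        ("LOAD", 5),   # 0: AC <- mem[5]
        ("ADD", 6),    # 1: AC <- AC + mem[6]
        ("STORE", 7),  # 2: mem[7] <- AC
        ("HALT", 0),   # 3
        None,          # 4: unused
        20,            # 5: operand
        22,            # 6: operand
        0,             # 7: result
    ]
    print(run(program)[7])  # 42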

John McCarthy (1927–2011) is considered one of the founding fathers of artificial intelligence

John McCarthy, Marvin Minsky and artificial intelligence

The term artificial intelligence was credited by John McCarthy to explain the research that they were doing for a proposal for the Dartmouth Summer Research Project. The naming of artificial intelligence also led to the birth of a new field in computer science.[65] On August 31, 1955, a research project was proposed consisting of John McCarthy, Marvin L. Minsky, Nathaniel Rochester, and Claude E. Shannon. The official project began in 1956 and consisted of several significant parts that they felt would help them better understand artificial intelligence's makeup.

McCarthy and his colleagues' idea behind automatic computers was that if a machine is capable of completing a task, then the same should be confirmed with a computer by compiling a program to produce the desired results. They also discovered that the human brain was too complex to replicate, not by the machine itself but by the program. The knowledge needed to produce a program that sophisticated did not yet exist.

The concept behind this was to look at how humans understand our own language and the structure of how we form sentences, giving different meanings and rule sets, and to compare them to a machine process. The way computers can understand is at the hardware level: the machine's language is written in binary (1s and 0s), and it has to be written in a specific format that gives the computer the rule set to run a particular piece of hardware.[66]
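As a small illustration of this point (the example values are arbitrary), the same bit pattern carries no meaning by itself; only a convention makes it a number or a character:

    # One pattern of bits, two readings.
    n = 65
    print(format(n, "08b"))    # 01000001 : the integer 65 written in eight bits
    print(chr(n))              # 'A'      : the same pattern read as an ASCII character
    print(int("01000001", 2))  # 65       : converting the bit string back to an integer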

Minsky's process determined how these artificial neural networks could be arranged to have similar qualities to the human brain. However, he could only produce partial results and needed to further the research into this idea.
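For a sense of what such an arrangement looks like, here is a minimal sketch of a single artificial neuron in Python, in the later perceptron style rather than Minsky's own SNARC hardware: weighted inputs are summed and compared with a threshold, loosely analogous to a biological neuron firing.

    # A single threshold neuron (perceptron-style); weights and threshold chosen by hand.
    def perceptron(inputs, weights, threshold):
        activation = sum(x * w for x, w in zip(inputs, weights))
        return 1 if activation >= threshold else 0

    # With these weights and threshold the neuron computes logical AND.
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", perceptron([a, b], weights=[1, 1], threshold=2))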

McCarthy and Shannon's idea behind this theory was to develop a way to use complex problems to determine and measure the machine's efficiency through mathematical theory and computations.[67] However, they only received partial test results.

The idea behind self-improvement is how a machine would use self-modifying code to make itself smarter. This would allow for a machine to grow in intelligence and increase calculation speeds.[68] The group believed they could study this if a machine could improve upon the process of completing a task in the abstractions part of their research.

The group thought that research in this category could be broken down into smaller groups. This would consist of sensory and other forms of information about artificial intelligence. Abstractions in computer science can refer to mathematics and programming language.[69]

Their idea of computational creativity is how a program or a machine can be seen as having ways of thinking similar to humans.[70] They wanted to see if a machine could take a piece of incomplete information and improve upon it to fill in the missing details, as the human mind can do. If a machine could do this, they needed to think about how the machine determined the outcome.

References

  1. ^ Tedre, Matti (2014). The Science of Computing: Shaping a Discipline. Chapman Hall.
  2. ^ "History of Computer Science". uwaterloo.ca.
  3. .
  4. .
  5. ^ Bellos, Alex (2012-10-25). "Abacus adds up to number joy in Japan". The Guardian. London. Retrieved 2013-06-25.
  6. .
  7. ^ a b "Project Overview". The Antikythera Mechanism Research Project. Retrieved 2023-07-06.
  8. ^ "Islam, Knowledge, and Science". Islamic Web. Retrieved 2017-11-05.
  9. ^ Simon Singh, The Code Book, pp. 14–20
  10. ^ "Al-Kindi, Cryptography, Codebreaking and Ciphers". 9 June 2003. Retrieved 2023-08-25.
  11. ..
  12. .
  13. .
  14. ^ "1.6 Shickard's Calculating Clock | Bit by Bit". Retrieved 2021-03-17.
  15. ^ "History of Computing Science: The First Mechanical Calculator". eingang.org.
  16. ^ Kidwell, Peggy Aldritch; Williams, Michael R. (1992). The Calculating Machines: Their history and development. MIT Press., p.38-42, translated and edited from Martin, Ernst (1925). Die Rechenmaschinen und ihre Entwicklungsgeschichte. Germany: Pappenheim.
  17. ^ "CS History". everythingcomputerscience.com. Retrieved 2020-05-01.
  18. ^ "2021: 375th birthday of Leibniz, father of computer science". people.idsia.ch.
  19. ^ Lande, Daniel. "Development of the Binary Number System and the Foundations of Computer Science". The Mathematics Enthusiast: 513–540.
  20. ^ Wiener, N., Cybernetics (2nd edition with revisions and two additional chapters), The MIT Press and Wiley, New York, 1961, p. 12.
  21. S2CID 28452205. Archived from the original on 23 July 2021. Retrieved 23 July 2021.
  22. ^ a b Tedre, Matti (2014). The Science of Computing: Shaping a Discipline. CRC Press.
  23. ^ "Charles Babbage". Encyclopædia Britannica Online Academic Edition. Encyclopædia Britannica In. 3 July 2023. Retrieved 2023-07-06.
  24. ^ Evans 2018, p. 16.
  25. ^ Evans 2018, p. 21.
  26. ^ Evans 2018, p. 20.
  27. ^ Isaacson, Betsy (2012-12-10). "Ada Lovelace, World's First Computer Programmer, Celebrated With Google Doodle". The Huffington Post. Retrieved 2013-02-20.
  28. ^ "The John Gabriel Byrne Computer Science Collection" (PDF). Archived from the original on 2019-04-16. Retrieved 2019-08-08.
  29. ^ "1907: was the first portable computer design Irish?". Ingenious Ireland. 17 October 2012.
  30. ^ L. Torres Quevedo (1914). "Ensayos sobre Automática – Su definicion. Extension teórica de sus aplicaciones". Revista de la Academia de Ciencias Exacta, Revista 12: 391–418.
  31. ^ Torres Quevedo, Leonardo (19 November 1914). "Automática: Complemento de la Teoría de las Máquinas" (PDF). Revista de Obras Públicas. LXII (2043): 575–583.
  32. .
  33. .
  34. ^ Randell, Brian. "From Analytical Engine to Electronic Digital Computer: The Contributions of Ludgate, Torres, and Bush" (PDF). Archived from the original (PDF) on 21 September 2013. Retrieved 9 September 2013.
  35. Burks, Arthur W., "Review: Charles S. Peirce, The new elements of mathematics", Bulletin of the American Mathematical Society v. 84, n. 5 (1978), pp. 913–918, see 917.
  36. ^ Peirce, C. S. (manuscript winter of 1880–81), "A Boolian Algebra with One Constant", published 1933 in Collected Papers v. 4, paragraphs 12–20. Reprinted 1989 in Writings of Charles S. Peirce v. 4, pp. 218–21, Google [1]. See Roberts, Don D. (2009), The Existential Graphs of Charles S. Peirce, p. 131.
  37. .
  38. .
  39. .
  40. ^ "Switching Theory/Relay Circuit Network Theory/Theory of Logical Mathematics". IPSJ Computer Museum. Information Processing Society of Japan.
  41. ^ Radomir S. Stanković (University of Niš), Jaakko T. Astola (Tampere University of Technology), Mark G. Karpovsky (Boston University), Some Historical Remarks on Switching Theory, 2007, DOI 10.1.1.66.1248
  42. ^ ISSN 1456-2774. Archived from the original (PDF) on 2021-03-08.
  43. , retrieved 2021-03-17
  44. .
  45. .
  46. .
  47. .
  48. ^ Grier 2013, p. 138.
  49. ^ "Gödel and the limits of logic". plus.maths.org. 2006-06-01. Retrieved 2020-05-01.
  50. ^ a b Copeland, B. Jack (2019). "The Church-Turing Thesis". In Zalta, Edward N. (ed.). Stanford Encyclopedia of Philosophy (Spring 2019 ed.). Metaphysics Research Lab, Stanford University. Retrieved 2020-05-01.
  51. ^ a b c d e f g "Turing's Automatic Computing Engine". The Modern History of Computing. Stanford Encyclopedia of Philosophy. Metaphysics Research Lab, Stanford University. 2017.
  52. ^ Barker-Plummer, David (1995-09-14). "Turing Machines". Stanford Encyclopedia of Philosophy. Retrieved 2013-02-20.
  53. ^ Booth, Kathleen HV, "Machine language for Automatic Relay Computer", Birkbeck College Computation Laboratory, University of London
  54. S2CID 14606587.
  55. ^ Rojas, Raúl. "How to Make Zuse's Z3 a Universal Computer". Archived from the original on 2014-07-14.
  56. Science Museum (London), archived on 18 November 2010.
  57. ^ "BBC News – How Alan Turing's Pilot ACE changed computing". BBC News. May 15, 2010.
  58. ^ a b "The First "Computer Bug"". CHIPS. 30 (1). United States Navy: 18. January–March 2012. Retrieved 2023-12-03.
  59. OCLC 2654027.
  60. .
  61. ^ "Von Neumann Architecture - an overview | ScienceDirect Topics". www.sciencedirect.com. Retrieved 2021-03-17.
  62. ^ .
  63. ^ "Accumlator" Def. 3. Oxford Dictionaries. Archived from the original on May 18, 2013.
  64. ISSN 2371-9621.
  65. .
  66. .
  67. .
  68. .
  69. ^ "The Creativity Post | What is Computational Creativity?". The Creativity Post. Retrieved 2021-03-04.
