History of computing

Source: Wikipedia, the free encyclopedia.

The history of computing is longer than the history of computing hardware and modern computing technology and includes the history of methods intended for pen and paper or for chalk and slate, with or without the aid of tables.

Concrete devices

Digital computing is intimately tied to the representation of numbers.[1] But long before abstractions like the number arose, there were mathematical concepts to serve the purposes of civilization. These concepts are implicit in concrete practices such as:

  - One-to-one correspondence,[2] a rule to count how many items (e.g. on a tally stick), eventually abstracted into numbers
  - Comparison to a standard, a method for assuring reproducibility in a measurement
  - The 3-4-5 right triangle, a device for assuring a right angle, using ropes with 12 evenly spaced knots[3]

Numbers

Eventually, the concept of numbers became concrete and familiar enough for counting to arise, at times with sing-song mnemonics to teach sequences to others. All known human languages, except the Piraha language, have words for at least "one" and "two", and even some animals like the blackbird can distinguish a surprising number of items.[5]

Advances in the numeral system and mathematical notation eventually led to the discovery of mathematical operations such as addition, subtraction, multiplication, division, squaring, square root, and so forth. In time the operations were formalized, and concepts about them became understood well enough to be stated formally, and even proven. See, for example, Euclid's algorithm for finding the greatest common divisor of two numbers, illustrated below.
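Euclid's procedure survives essentially unchanged in modern code. The following minimal Python sketch (an illustration added here, not part of the original article) is one conventional rendering:

    def gcd(a: int, b: int) -> int:
        """Euclid's algorithm: replace the pair (a, b) by (b, a mod b)
        until the remainder is zero; the last nonzero value is the GCD."""
        while b != 0:
            a, b = b, a % b
        return a

    print(gcd(1071, 462))  # -> 21

Each step strictly shrinks the remainder, so the loop always terminates.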

By the High Middle Ages, the positional Hindu–Arabic numeral system had reached Europe, which allowed for the systematic computation of numbers. During this period, the representation of a calculation on paper allowed the calculation of mathematical expressions, and the tabulation of mathematical functions such as the square root, the common logarithm (for use in multiplication and division), and the trigonometric functions. By the time of Isaac Newton's research, paper or vellum was an important computing resource, and even in our present time, researchers like Enrico Fermi would cover random scraps of paper with calculations, to satisfy their curiosity about an equation.[6] Even into the period of programmable calculators, Richard Feynman would unhesitatingly compute by hand any steps which overflowed the memory of the calculators, just to learn the answer; by 1976 Feynman had purchased an HP-25 calculator with a 49 program-step capacity; if a differential equation required more than 49 steps to solve, he could simply continue his computation by hand.[7]

Early computation

Mathematical statements need not be abstract only; when a statement can be illustrated with actual numbers, the numbers can be communicated and a community can arise. This allows the repeatable, verifiable statements which are the hallmark of mathematics and science. These kinds of statements have existed for thousands of years, and across multiple civilizations, as shown below:

The earliest known tool for use in computation is the Sumerian abacus, thought to have been invented in Babylon c. 2700–2300 BC. Its original style of usage was by lines drawn in sand with pebbles. Abaci of a more modern design are still used as calculation tools today. This was the first known calculator and the most advanced system of calculation then known, preceding Archimedes by 2,000 years.

In c. 1050–771 BC, the south-pointing chariot was invented in ancient China. It was the first known geared mechanism to use a differential gear, which was later used in analog computers. The Chinese also invented a more sophisticated abacus from around the 2nd century BC known as the Chinese abacus.[8]

In the 5th century BC in ancient India, the grammarian Pāṇini formulated the grammar of Sanskrit in 3959 rules known as the Ashtadhyayi, which was highly systematized and technical. Pāṇini used metarules, transformations and recursions.[9]

In the 3rd century BC, Archimedes used the mechanical principle of balance (see Archimedes Palimpsest § Mathematical content) to calculate mathematical problems, such as the number of grains of sand in the universe (The Sand Reckoner), which also required a recursive notation for numbers (e.g., the myriad myriad).

Around 200 BC the development of gears had made it possible to create devices in which the positions of wheels would correspond to positions of astronomical objects.[10] By about 100 AD Hero of Alexandria had described an odometer-like device that could be driven automatically and could effectively count in digital form.[10] But it was not until the 1600s that mechanical devices for digital computation appear to have actually been built.

The Antikythera mechanism is believed to be the earliest known mechanical analog computer.[11] It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to circa 100 BC.

The Russian abacus, the schoty (Russian: счёты, plural of Russian: счёт, counting), was one of the earliest abacuses ever created. It usually has a single slanted deck, with ten beads on each wire (except one wire, usually positioned near the user, with four beads for quarter-ruble fractions). Older models have another 4-bead wire for quarter-kopeks, which were minted until 1916. The Russian abacus is often used vertically, with each wire running left to right like the lines in a book. The wires are usually bowed to bulge upward in the center, to keep the beads pinned to either of the two sides. It is cleared when all the beads are moved to the right; during manipulation, beads are moved to the left.
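The bead-per-wire representation is simply positional decimal notation made physical. A short Python sketch illustrates the mapping (names are invented here, and the quarter-fraction wires are ignored):

    def schoty_wires(n: int, wires: int = 8) -> list[int]:
        """Map a number onto a decimal abacus: one wire per power of ten,
        with the count of beads moved on each wire equal to that digit."""
        beads = []
        for _ in range(wires):
            beads.append(n % 10)
            n //= 10
        return beads[::-1]  # most significant wire first

    print(schoty_wires(1916))  # [0, 0, 0, 0, 1, 9, 1, 6]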

Mechanical analog computer devices appeared again a thousand years later in the medieval Islamic world, developed by Muslim astronomers, such as the mechanical geared astrolabe of Abū Rayhān al-Bīrūnī and the torquetum of Jabir ibn Aflah. According to Simon Singh, Muslim mathematicians also made important advances in cryptography, such as the development of cryptanalysis and frequency analysis by Alkindus.[12][13] Programmable machines were also invented by Muslim engineers, such as the automatic flute player of the Banū Mūsā brothers and Al-Jazari's castle clock, which is considered to be the first programmable analog computer.[17]

During the Middle Ages, several European philosophers made attempts to produce analog computer devices. Influenced by the Arabs and Scholasticism, Majorcan philosopher Ramon Llull (1232–1315) devoted a great part of his life to defining and designing several logical machines that, by combining simple and undeniable philosophical truths, could produce all possible knowledge. These machines were never actually built, as they were more of a thought experiment to produce new knowledge in systematic ways; although they could make simple logical operations, they still needed a human being for the interpretation of results. Llull's work had a strong influence on Gottfried Leibniz (early 18th century), who developed his ideas further, and built several calculating tools using them.

Indeed, when John Napier discovered logarithms for computational purposes in the early 17th century, there followed a period of considerable progress by inventors and scientists in making calculating tools. In 1837 Charles Babbage first described his Analytical Engine, which is accepted as the first design for a modern computer, and in 1843 Ada Lovelace published an algorithm for the engine to compute the Bernoulli numbers, a complex calculation requiring a recursive algorithm. This is considered to be the first example of a true computer program, a series of instructions that act upon data not known in full until the program is run. Following Babbage, although unaware of his earlier work, Percy Ludgate in 1909 published the 2nd of the only two designs for mechanical analytical engines in history.[18]
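Lovelace's table of operations for the engine is not reproduced here, but the recursive definition it exploited can be sketched in modern Python (an illustration, not her actual program):

    from fractions import Fraction
    from functools import lru_cache
    from math import comb

    @lru_cache(maxsize=None)
    def bernoulli(m: int) -> Fraction:
        """B_0 = 1; for m >= 1, solve sum_{j=0..m} C(m+1, j) * B_j = 0
        for B_m, which requires all earlier values -- a recursion."""
        if m == 0:
            return Fraction(1)
        return -Fraction(1, m + 1) * sum(
            comb(m + 1, j) * bernoulli(j) for j in range(m)
        )

    print([bernoulli(n) for n in range(8)])
    # [1, -1/2, 1/6, 0, -1/30, 0, 1/42, 0] (as Fraction objects)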

Several examples of analog computation survived into recent times. A planimeter is a device which does integrals, using distance as the analog quantity. Until the 1980s, HVAC systems used air both as the analog quantity and the controlling element. Unlike modern digital computers, analog computers are not very flexible, and need to be reconfigured (i.e., reprogrammed) manually to switch them from working on one problem to another. Analog computers had an advantage over early digital computers in that they could be used to solve complex problems using behavioral analogues while the earliest attempts at digital computers were quite limited.

The Smith chart is a well-known nomogram.

Since computers were rare in this era, the solutions were often hard-coded into paper forms such as nomograms,[19] which could then produce analog solutions to these problems, such as the distribution of pressures and temperatures in a heating system.

Digital electronic computers

The “brain” [computer] may one day come down to our level [of the common people] and help with our income-tax and book-keeping calculations. But this is speculation and there is no sign of it so far.

— British newspaper The Star in a June 1949 news article about the EDSAC computer, long before the era of personal computers.[20]

None of the early computational devices were really computers in the modern sense, and it took considerable advancement in mathematics and theory before the first modern computers could be designed.

In an 1886 letter, Charles Sanders Peirce described how logical operations could be carried out by electrical switching circuits.[21] During 1880–81 he showed that NOR gates alone (or alternatively NAND gates alone) can be used to reproduce the functions of all the other logic gates, but this work remained unpublished until 1933.[22] The first published proof was by Henry M. Sheffer in 1913, so the NAND logical operation is sometimes called the Sheffer stroke; the logical NOR is sometimes called Peirce's arrow.[23] Consequently, these gates are sometimes called universal logic gates.[24]
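As a modern illustration of this universality result (a Python sketch added here, not period code), every basic gate can be composed from a NAND primitive alone:

    def nand(a: int, b: int) -> int:
        """The single primitive: 1 unless both inputs are 1."""
        return 1 - (a & b)

    def not_(a):    return nand(a, a)
    def and_(a, b): return not_(nand(a, b))
    def or_(a, b):  return nand(not_(a), not_(b))
    def nor(a, b):  return not_(or_(a, b))

    # Truth table for OR, built purely from NAND gates
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, or_(a, b))  # 0 0 0 / 0 1 1 / 1 0 1 / 1 1 1

The same construction works starting from NOR, which is why both gates are called universal.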

Eventually, vacuum tubes replaced relays for logic operations. Lee De Forest's modification, in 1907, of the Fleming valve can be used as a logic gate. Ludwig Wittgenstein introduced a version of the 16-row truth table as proposition 5.101 of Tractatus Logico-Philosophicus (1921). Walther Bothe, inventor of the coincidence circuit, received part of the 1954 Nobel Prize in Physics for the first modern electronic AND gate in 1924. Konrad Zuse designed and built electromechanical logic gates for his computer Z1 (from 1935–38).

The first recorded idea of using digital electronics for computing was the 1931 paper "The Use of Thyratrons for High Speed Automatic Counting of Physical Phenomena" by C. E. Wynn-Williams.[25] From 1934 to 1936, NEC engineer Akira Nakashima published a series of papers introducing switching circuit theory, using digital electronics for Boolean algebraic operations,[26][27][28] influencing Claude Shannon's seminal 1938 paper "A Symbolic Analysis of Relay and Switching Circuits".[29]

The 1937 Atanasoff–Berry computer design was the first digital electronic computer, though it was not programmable. The Z3 computer, built by German inventor Konrad Zuse in 1941, was the first programmable, fully automatic computing machine, but it was not electronic.

In 1936, Alan Turing published his seminal paper "On Computable Numbers, with an Application to the Entscheidungsproblem", in which he modeled computation with a one-dimensional storage tape; this led to the idea of the universal Turing machine and Turing-complete programming systems.

During World War II, ballistics computing was done by women, who were hired as "computers." The term computer remained one that referred mostly to women (now seen as "operators") until 1945, after which it took on the modern definition of machinery it presently holds.[30]

The ENIAC (Electronic Numerical Integrator And Computer) was the first electronic general-purpose computer, announced to the public in 1946. It was Turing-complete,[citation needed] digital, and capable of being reprogrammed to solve a full range of computing problems. Women implemented the programming for machines like the ENIAC, and men created the hardware.[30]

The Manchester Baby was the first electronic stored-program computer. It was built at the Victoria University of Manchester by Frederic C. Williams, Tom Kilburn and Geoff Tootill, and ran its first program on 21 June 1948.[31]

The bipolar transistor, invented in 1947, began to replace the vacuum tube in computer designs from the mid-1950s. However, early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, which limited them to a number of specialised applications.[39]

In 1954, 95% of computers in service were being used for engineering and scientific purposes.[40]

Personal computers

The metal–oxide–semiconductor field-effect transistor (MOSFET, or MOS transistor), invented by Mohamed Atalla and Dawon Kahng at Bell Labs in 1959, was the first truly compact transistor that could be miniaturised and mass-produced for a wide range of uses. The MOSFET made it possible to build high-density integrated circuit chips, leading to what is known as the computer revolution.[46][47] The MOSFET is the most widely used transistor in computers,[48][49] and is the fundamental building block of digital electronics.[50]

The single-chip microprocessor grew out of MOS integrated circuit technology. Busicom's Masatoshi Shima worked on a multi-chip CPU design in 1968,[55][54] before Sharp's Tadashi Sasaki conceived of a single-chip CPU design, which he discussed with Busicom and Intel in 1968.[56] The Intel 4004 was then developed as a single-chip microprocessor from 1969 to 1970, led by Intel's Federico Faggin, Marcian Hoff, and Stanley Mazor, and Busicom's Masatoshi Shima.[54] The chip was mainly designed and realized by Faggin, with his silicon-gate MOS technology.[51] The microprocessor led to the microcomputer revolution, with the development of the microcomputer, which would later be called the personal computer (PC).

Most early microprocessors, such as the Intel 8008 and Intel 8080, were 8-bit. Texas Instruments released the first fully 16-bit microprocessor, the TMS9900 processor, in June 1976, and used it in the TI-99/4 and TI-99/4A computers.

The 1980s brought about significant advances with microprocessors that greatly impacted the fields of engineering and other sciences. The Motorola 68000 microprocessor had a processing speed that was far superior to the other microprocessors being used at the time, and Apple used it in the Lisa, the first commercially sold personal computer with a graphical user interface, released in 1983. After the Lisa failed commercially, Apple released its first Macintosh computer, still running on the Motorola 68000 microprocessor, but with only 128KB of RAM, one floppy drive, and no hard drive in order to lower the price.

In the late 1980s and early 1990s, we see more advancements with computers becoming more useful for actual computational purposes.[clarification needed] In 1989, Apple released the Macintosh Portable, which weighed 7.3 kg (16 lb) and was extremely expensive, costing US$7,300. At launch it was one of the most powerful laptops available, but due to the price and weight it was not met with great success, and it was discontinued only two years later. That same year Intel introduced the Touchstone Delta supercomputer, which had 512 microprocessors. This technological advancement was very significant, as it served as a model for some of the fastest multi-processor systems in the world. It was even used as a prototype by Caltech researchers, for projects like real-time processing of satellite images and simulating molecular models for various fields of research.

Supercomputers

In terms of supercomputing, the first widely acknowledged supercomputer was the Control Data Corporation (CDC) 6600, built in 1964 by Seymour Cray; it was succeeded by the CDC 7600.[57][58] In 1972, Cray left CDC and began his own company, Cray Research Inc.[61] With support from investors in Wall Street, an industry fueled by the Cold War, and without the restrictions he had within CDC, he created the Cray-1 supercomputer. With a clock speed of 80 MHz or 136 megaFLOPS, Cray developed a name for himself in the computing world. By 1982, Cray Research produced the Cray X-MP equipped with multiprocessing, and in 1985 released the Cray-2, which continued the trend of multiprocessing and clocked at 1.9 gigaFLOPS. Cray Research developed the Cray Y-MP in 1988, but afterwards struggled to continue producing supercomputers, largely because the Cold War had ended, demand for cutting-edge computing by colleges and the government declined drastically, and demand for microprocessing units increased.

Today, supercomputers are still used by the governments of the world and educational institutions for computations such as simulations of natural disasters, genetic variant searches within a population relating to disease, and more. As of November 2020, the fastest supercomputer is Fugaku.

Navigation and astronomy

Starting with known special cases, the calculation of logarithms and trigonometric functions can be performed by looking up numbers in a mathematical table, and interpolating between known cases. For small enough differences, this linear operation was accurate enough for use in navigation and astronomy in the Age of Exploration. The uses of interpolation have thrived in the past 500 years: by the twentieth century Leslie Comrie and W.J. Eckert systematized the use of interpolation in tables of numbers for punch card calculation.
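A Python sketch of the hand method follows (illustrative only; the four-figure table is generated here rather than taken from any historical volume):

    import math

    # A four-figure table of common logarithms at intervals of 0.1
    table = {i / 10: round(math.log10(i / 10), 4) for i in range(10, 101)}

    def interp_log10(x: float) -> float:
        """Linear interpolation between adjacent tabulated entries,
        as done by eye and pencil with a printed table."""
        i = int(x * 10)              # index of the entry just below x
        lo, hi = i / 10, (i + 1) / 10
        frac = (x - lo) / (hi - lo)
        return table[lo] + frac * (table[hi] - table[lo])

    print(interp_log10(2.718))   # ~0.43424
    print(math.log10(2.718))    # ~0.43425, close enough to navigate by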

Weather prediction

The numerical solution of differential equations, notably the Navier-Stokes equations, was an important stimulus to computing, with Lewis Fry Richardson's numerical approach to solving differential equations.[citation needed] The first computerized weather forecast was performed in 1950 by a team composed of the American meteorologists Jule Charney, Philip Thompson and Larry Gates, the Norwegian meteorologist Ragnar Fjørtoft, the applied mathematician John von Neumann, and the ENIAC programmer Klara Dan von Neumann.[59][60]
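The 1950 forecast integrated the barotropic vorticity equation; the kernel of any such computation, stepping a differential equation forward in small time increments, can be suggested by the simplest scheme, Euler's method (a Python sketch, not the historical code):

    def euler(f, y0: float, t0: float, t1: float, steps: int) -> float:
        """Integrate dy/dt = f(t, y) from t0 to t1 with fixed steps."""
        h = (t1 - t0) / steps
        t, y = t0, y0
        for _ in range(steps):
            y += h * f(t, y)
            t += h
        return y

    # dy/dt = y with y(0) = 1 has the exact solution e ~ 2.71828 at t = 1
    print(euler(lambda t, y: y, 1.0, 0.0, 1.0, 100_000))  # ~2.71827

Production forecasting used, and still uses, far more accurate schemes over grids of points, but the step-by-step structure is the same.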

Symbolic computations

By the late 1960s, computer systems could perform symbolic algebraic manipulations well enough to pass college-level calculus courses.[citation needed]
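Those early systems (MACSYMA, begun in 1968, is a well-known example) were the forerunners of today's computer algebra systems. The flavor of what they automated can be shown with the modern SymPy library (a present-day stand-in, not the historical software):

    import sympy as sp

    x = sp.symbols("x")
    print(sp.diff(x**3 * sp.sin(x), x))            # x**3*cos(x) + 3*x**2*sin(x)
    print(sp.integrate(sp.exp(x) * sp.cos(x), x))  # exp(x)*sin(x)/2 + exp(x)*cos(x)/2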

Important women and their contributions

Women are often underrepresented in STEM fields when compared to their male counterparts.[65] However, there have been notable examples of women in the history of computing, such as Ada Lovelace, Grace Hopper, and the women who programmed the ENIAC.

See also

References

  1. ^ "Digital Computing - Dictionary definition of Digital Computing | Encyclopedia.com: FREE online dictionary". www.encyclopedia.com. Retrieved 2017-09-11.
  2. ^ "One-to-One Correspondence: 0.5". Victoria Department of Education and Early Childhood Development. Archived from the original on 20 November 2012.
  3. ^ Weisstein, Eric W. "3, 4, 5 Triangle". mathworld.wolfram.com. Retrieved 2017-09-11.
  4. .
  5. ^ "DIY: Enrico Fermi's Back of the Envelope Calculations".
  6. ^ "Try numbers" was one of Feynman's problem solving techniques.
  7. ^ Xu Yue (190 CE) Supplementary Notes on the Art of Figures, a book of the Eastern Han Dynasty
  8. .
  9. ^ .
  10. ^ "Project Overview". The Antikythera Mechanism Research Project. Retrieved 2020-01-15.
  11. ^ "Islam, Knowledge, and Science". University of Southern California. Archived from the original on 2008-01-19. Retrieved 2008-01-22.
  12. ^ Simon Singh, The Code Book, pp. 14-20
  13. ^ "Al-Kindi, Cryptgraphy, Codebreaking and Ciphers". Retrieved 2007-01-12.
  14. ..
  15. ^ Archived from the original on March 1, 2014. Retrieved 2008-09-06.
  16. ^ "Percy E. Ludgate Prize in Computer Science" (PDF). The John Gabriel Byrne Computer Science Collection. Retrieved 2020-01-15.
  17. ^ Steinhaus, H. (1999). Mathematical Snapshots (3rd ed.). New York: Dover. pp. 92–95, p. 301.
  18. ^ "Tutorial Guide to the EDSAC Simulator" (PDF). Retrieved 2020-01-15.
  19. ^ Burks, Arthur W., "Review: Charles S. Peirce, The new elements of mathematics", Bulletin of the American Mathematical Society v. 84, n. 5 (1978), pp. 913–18, see 917. PDF Eprint.
  20. ^ Peirce, C. S. (manuscript winter of 1880–81), "A Boolian Algebra with One Constant", published 1933 in Collected Papers v. 4, paragraphs 12–20. Reprinted 1989 in Writings of Charles S. Peirce v. 4, pp. 218–21, Google [1]. See Roberts, Don D. (2009), The Existential Graphs of Charles S. Peirce, p. 131.
  21. .
  22. .
  23. ^ History of Research on Switching Theory in Japan, IEEJ Transactions on Fundamentals and Materials, Vol. 124 (2004) No. 8, pp. 720-726, Institute of Electrical Engineers of Japan
  24. ^ Switching Theory/Relay Circuit Network Theory/Theory of Logical Mathematics, IPSJ Computer Museum, Information Processing Society of Japan
  25. ^ Radomir S. Stanković, Jaakko Astola (2008), Reprints from the Early Days of Information Sciences: TICSP Series On the Contributions of Akira Nakashima to Switching Theory, TICSP Series #40, Tampere International Center for Signal Processing, Tampere University of Technology
  26. CiteSeerX 10.1.1.66.1248.
  27. ^ .
  28. .
  29. .
  30. .
  31. ^ Lavington, Simon (1998), A History of Manchester Computers (2 ed.), Swindon: The British Computer Society, pp. 34–35
  32. ^ Early Computers, Information Processing Society of Japan
  33. ^ a b 【Electrotechnical Laboratory】 ETL Mark III Transistor-Based Computer, Information Processing Society of Japan
  34. ^ Early Computers: Brief History, Information Processing Society of Japan
  35. ^ Martin Fransman (1993), The Market and Beyond: Cooperation and Competition in Information Technology, page 19, Cambridge University Press
  36. ^ .
  37. .
  38. ^ "1960 - Metal Oxide Semiconductor (MOS) Transistor Demonstrated". The Silicon Engine. Computer History Museum.
  39. .
  40. ^ "Who Invented the Transistor?". Computer History Museum. 4 December 2013. Retrieved 20 July 2019.
  41. JSTOR 24923169.
  42. "The relative simplicity and low power requirements of MOSFETs have fostered today's microcomputer revolution."
  43. .
  44. ^ "Remarks by Director Iancu at the 2019 International Intellectual Property Conference". United States Patent and Trademark Office. June 10, 2019. Retrieved 20 July 2019.
  45. ^ "Dawon Kahng". National Inventors Hall of Fame. Retrieved 27 June 2019.
  46. ^ "Martin Atalla in Inventors Hall of Fame, 2009". Retrieved 21 June 2013.
  47. ^ "Triumph of the MOS Transistor". YouTube. Computer History Museum. 6 August 2010. Retrieved 21 July 2019.
  48. ^ a b c "1971: Microprocessor Integrates CPU Function onto a Single Chip". Computer History Museum. Retrieved 22 July 2019.
  49. .
  50. ^ "1968: Silicon Gate Technology Developed for ICs". Computer History Museum. Retrieved 22 July 2019.
  51. ^ a b c Federico Faggin, The Making of the First Microprocessor, IEEE Solid-State Circuits Magazine, Winter 2009, IEEE Xplore
  52. ^ Nigel Tout. "The Busicom 141-PF calculator and the Intel 4004 microprocessor". Retrieved November 15, 2009.
  53. ^ Aspray, William (1994-05-25). "Oral-History: Tadashi Sasaki". Interview #211 for the Center for the History of Electrical Engineering. The Institute of Electrical and Electronics Engineers, Inc. Retrieved 2013-01-02.
  54. ^ Conner, Stuart. "Stuart's TM 990 Series 16-bit Microcomputer Modules". www.stuartconner.me.uk. Retrieved 2017-09-05.
  55. ^ "Computers | Timeline of Computer History | Computer History Museum". www.computerhistory.org. Retrieved 2017-09-05.
  56. ^ Vaughan-Nichols, Steven (November 27, 2017). "A super-fast history of supercomputers: From the CDC 6600 to the Sunway TaihuLight".
  57. ^ "CDC 7600".
  58. ^ "Seymour R. Cray". Encyclopædia Britannica.
  59. ^ Charney, Fjörtoft and von Neumann, 1950, Numerical Integration of the Barotropic Vorticity Equation Tellus, 2, 237-254
  60. ^ Witman, Sarah (16 June 2017). "Meet the Computer Scientist You Should Thank For Your Smartphone's Weather App". Smithsonian. Retrieved 22 July 2017.
  61. Retrieved 2020-01-15.
  62. ^ Myers, Blanca (March 3, 2018). "Women and Minorities in Tech, By the Numbers". Wired.

External links

British history links