Fifth Generation Computer Systems
The Fifth Generation Computer Systems (FGCS) initiative was a ten-year project launched in 1982 by Japan's Ministry of International Trade and Industry (MITI) to create computers based on massively parallel computing and logic programming. It was carried out by the Institute for New Generation Computer Technology (ICOT).
The term "fifth generation" was intended to convey the system as being advanced: In the history of computing hardware, there were four "generations" of computers. Computers using vacuum tubes were called the first generation; transistors and diodes, the second; integrated circuits, the third; and those using microprocessors, the fourth. Whereas previous computer generations had focused on increasing the number of logic elements in a single CPU, the fifth generation, it was widely believed at the time, would instead turn to massive numbers of CPUs to gain performance.[citation needed]
Background
From the late 1960s until the early 1970s, there was much talk about "generations" of computer hardware, then usually organized into three generations.
- First generation: Thermionic vacuum tubes. Mid-1940s. IBM pioneered the arrangement of vacuum tubes in pluggable modules. The IBM 650 was a first-generation computer.
- Second generation: Transistors. 1956. The era of miniaturization begins. Transistors are much smaller than vacuum tubes, draw less power, and generate less heat. Discrete transistors are soldered to circuit boards, with interconnections accomplished by stencil-screened conductive patterns on the reverse side. The IBM 7090 was a second-generation computer.
- Third generation: Integrated circuits (silicon chips containing multiple transistors). 1964. A pioneering example is the ACPX module used in the IBM 360/91, which, by stacking layers of silicon over a ceramic substrate, accommodated over 20 transistors per chip; the chips could be packed together onto a circuit board to achieve unprecedented logic densities. The IBM 360/91 was a hybrid second and third-generation computer.
Omitted from this taxonomy is the "zeroth-generation" computer based on metal gears (such as the IBM 407) or mechanical relays (such as the Mark I), and the post-third-generation computers based on very-large-scale integrated (VLSI) circuits.
There was also a parallel set of generations for software:
- First generation: Machine language.
- Second generation: Low-level programming languages such as Assembly language.
- Third generation: Structured high-level programming languages such as C, COBOL, and FORTRAN.
- Fourth generation: "Non-procedural" high-level programming languages (such as object-oriented languages).[1]
Throughout these multiple generations up to the 1970s, Japan built computers following U.S. and British leads. In the mid-1970s, the Ministry of International Trade and Industry stopped following Western leads and started looking into the future of computing on a small scale. It asked the Japan Information Processing Development Center (JIPDEC) to indicate a number of future directions, and in 1979 offered it a three-year contract to carry out more in-depth studies jointly with industry and academia. It was during this period that the term "fifth-generation computer" started to be used.
Prior to the 1970s, MITI guidance had achieved successes such as an improved steel industry, the creation of the oil supertanker, the automobile industry, consumer electronics, and computer memory.
The primary fields for investigation from this initial project were:
- Inference computer technologies for knowledge processing
- Computer technologies to process large-scale databases and knowledge bases
- High-performance workstations
- Distributed functional computer technologies
- Supercomputers for scientific calculation
Project launch
The aim was to build parallel computers for artificial intelligence applications using concurrent logic programming. The project imagined an "epoch-making" computer with supercomputer-like performance running on top of large databases (as opposed to a traditional filesystem), using a logic programming language to define and access the data with massively parallel processing. It envisioned a prototype machine with a performance between 100 million and 1 billion LIPS (logical inferences per second), at a time when typical workstations were capable of about 100,000 LIPS.
Ehud Shapiro captured the rationale and motivations driving this project:[3]
"As part of Japan's effort to become a leader in the computer industry, the Institute for New Generation Computer Technology has launched a revolutionary ten-year plan for the development of large computer systems which will be applicable to knowledge information processing systems. These Fifth Generation computers will be built around the concepts of logic programming. In order to refute the accusation that Japan exploits knowledge from abroad without contributing any of its own, this project will stimulate original research and will make its results available to the international research community."
Logic programming
The target defined by the FGCS project was to develop "Knowledge Information Processing systems" (roughly meaning, applied artificial intelligence). The chosen tool to implement this goal was logic programming, an approach characterized by Maarten Van Emden, one of its founders, as:[4]
- The use of logic to express information in a computer.
- The use of logic to present problems to a computer.
- The use of logical inference to solve these problems.
More technically, it can be summed up in two equations:
- Program = Set of axioms.
- Computation = Proof of a statement from axioms.
The axioms typically used are universal axioms of a restricted form, called Horn clauses or definite clauses. The statement proved in a computation is an existential statement. The proof is constructive and provides values for the existentially quantified variables: these values constitute the output of the computation.
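To make these two equations concrete, the following minimal Python sketch (not code from the FGCS project; the predicates and the engine are invented for illustration) treats a propositional Horn-clause program as a set of axioms and answers a query by forward chaining, so that the computation literally is a search for a proof:

```python
# A Horn-clause "program" as a set of axioms: each clause is a pair
# (head, body); a clause with an empty body is a fact.
program = [
    ("mortal", ["human"]),  # rule: mortal if human
    ("human", []),          # fact: human
]

def provable(goal, clauses):
    """Forward-chain over propositional Horn clauses: repeatedly derive
    the head of any clause whose whole body is already derived, until
    the goal is derived or nothing new can be added."""
    derived = set()
    changed = True
    while changed:
        changed = False
        for head, body in clauses:
            if head not in derived and all(atom in derived for atom in body):
                derived.add(head)
                changed = True
    return goal in derived

print(provable("mortal", program))    # True: "mortal" is proved from the axioms
print(provable("immortal", program))  # False: no proof exists
```

In the first-order Horn clauses the project actually used, proving an existential goal also binds its variables, and those bindings constitute the program's output.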
Logic programming was thought of as something that unified various gradients of computer science (software engineering, databases, computer architecture, and artificial intelligence). It seemed that logic programming was a key missing connection between knowledge engineering and parallel computer architectures.
Results
After having influenced the consumer electronics field during the 1970s and the automotive world during the 1980s, Japan had developed a strong reputation. The launch of the FGCS project spread the belief that parallel computing was the future of all performance gains, producing a wave of apprehension in the computer field. Soon parallel projects were set up in the United States as the Strategic Computing Initiative and the Microelectronics and Computer Technology Corporation (MCC), in the United Kingdom as Alvey, and in Europe as the European Strategic Program on Research in Information Technology (ESPRIT), as well as the European Computer-Industry Research Centre (ECRC) in Munich, a collaboration between ICL in Britain, Bull in France, and Siemens in Germany.
The project ran from 1982 to 1994, spending a little less than ¥57 billion (about US$320 million) total.
Concurrent logic programming
In 1982, during a visit to ICOT, Ehud Shapiro invented Concurrent Prolog, a novel programming language that integrated logic programming and concurrent programming. Concurrent Prolog is a process-oriented language, which embodies dataflow synchronization and guarded-command indeterminacy as its basic control mechanisms. Shapiro described the language in ICOT Technical Report TR-003,[7] which presented a Concurrent Prolog interpreter written in Prolog. Shapiro's work on Concurrent Prolog inspired a change in the direction of the FGCS from focusing on a parallel implementation of Prolog to focusing on concurrent logic programming as the software foundation for the project.[3] It also inspired the concurrent logic programming language Guarded Horn Clauses (GHC) by Ueda, which was the basis of KL1, the programming language that was finally designed and implemented by the FGCS project as its core programming language.
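As a loose illustration of the dataflow synchronization described above (a Python analogue, not ICOT code, and not the full semantics of Concurrent Prolog), the following sketch mimics two concurrent processes communicating over an incrementally produced stream: the consumer suspends, as a Concurrent Prolog process would on an unbound logic variable, until the producer binds the next element.

```python
import threading
import queue

def producer(stream: queue.Queue) -> None:
    # Plays the role of a process incrementally instantiating a shared
    # stream variable: [0, 1, 4, 9, 16 | Tail], then closing the stream.
    for n in range(5):
        stream.put(n * n)
    stream.put(None)  # sentinel standing in for the empty stream []

def consumer(stream: queue.Queue) -> None:
    # stream.get() blocks while the next element is still "unbound":
    # a crude analogue of suspension on an unbound logic variable.
    while (item := stream.get()) is not None:
        print("consumed", item)

s = queue.Queue()
threading.Thread(target=producer, args=(s,)).start()
consumer(s)
```

Real GHC and KL1 additionally commit to one of several guarded clauses (guarded-command indeterminacy), which this sketch does not model.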
The FGCS project and its findings contributed greatly to the development of the concurrent logic programming field. The project produced a new generation of promising Japanese researchers.
Commercial failure
Five running Parallel Inference Machines (PIM) were eventually produced: PIM/m, PIM/p, PIM/i, PIM/k, and PIM/c. The project also produced applications to run on these systems, such as the parallel database management system Kappa, the legal reasoning system HELIC-II, and the automated theorem prover MGTP, as well as bioinformatics applications.
The FGCS Project did not meet with commercial success for reasons similar to the Lisp machine companies and Thinking Machines. The highly parallel computer architecture was eventually surpassed in speed by less specialized hardware (for example, Sun workstations and Intel x86 machines).
A primary problem was the choice of concurrent logic programming as the bridge between the parallel computer architecture and the use of logic as a knowledge representation and problem-solving language for AI applications. This never happened cleanly; a number of languages were developed, all with their own limitations. In particular, the committed-choice feature of concurrent constraint logic programming interfered with the logical semantics of the languages.[8]
Another problem was that existing CPU performance quickly overcame the barriers that experts anticipated in the 1980s, and the value of parallel computing dropped to the point where it was for some time used only in niche situations. Although a number of workstations of increasing capacity were designed and built over the project's lifespan, they generally found themselves soon outperformed by "off the shelf" units available commercially.
The project also failed to incorporate outside innovations. During its lifespan, GUIs became mainstream in computers, the internet enabled locally stored databases to become distributed, and even simple research projects provided better real-world results in data mining.
The FGCS workstations had no appeal in a market where general-purpose systems could replace and outperform them. This parallels the Lisp machine market, where rule-based systems such as CLIPS could run on general-purpose computers, making expensive Lisp machines unnecessary.[9]
Ahead of its time
In summary, the Fifth-Generation project was revolutionary, and accomplished some basic research that anticipated future research directions. Many papers and patents were published. MITI established a committee which assessed the performance of the FGCS Project as having made major contributions in computing, in particular eliminating bottlenecks in parallel processing software and the realization of intelligent interactive processing based on large knowledge bases. However, the committee was strongly biased to justify the project, so this overstates the actual results.[5]
Many of the themes seen in the Fifth-Generation project are now being re-interpreted in current technologies, as the hardware limitations foreseen in the 1980s were finally reached in the 2000s. When CPU clock speeds began to move into the 3–5 GHz range, CPU power dissipation and other problems became more important. The ability of industry to produce ever-faster single-CPU systems (linked to Moore's law about the periodic doubling of transistor counts) began to be threatened.
In the early 21st century, many flavors of parallel computing began to proliferate, including multi-core architectures at the low end and massively parallel processing at the high end. Ordinary consumer machines and game consoles began to have parallel processors such as the Intel Core, AMD K10, and Cell. Graphics card companies like Nvidia and AMD began introducing large parallel systems like CUDA and OpenCL.
It appears, however, that these new technologies do not cite FGCS research. It is not clear if FGCS was leveraged to facilitate these developments in any significant way. No significant impact of FGCS on the computing industry has been demonstrated.[citation needed]
References
- ^ "Roger Clarke's Software Generations".
- ^ J. Marshall Unger, The Fifth Generation Fallacy (New York: Oxford University Press, 1987)
- ^ Shapiro, Ehud (1983). "The fifth generation project – a trip report". Communications of the ACM. 26 (9): 637–641. S2CID 5955109.
- ^ Van Emden, Maarten H., and Robert A. Kowalski. "The semantics of predicate logic as a programming language." Journal of the ACM 23.4 (1976): 733-742.
- ^ .
- ISBN 978-1-4899-7144-9.
- ^ Shapiro E. A subset of Concurrent Prolog and its interpreter, ICOT Technical Report TR-003, Institute for New Generation Computer Technology, Tokyo, 1983. Also in Concurrent Prolog: Collected Papers, E. Shapiro (ed.), MIT Press, 1987, Chapter 2.
- ^ Hewitt, Carl. "Inconsistency Robustness in Logic Programming". arXiv, 2009.
- S2CID 35914860. Archived from the original (PDF) on 12 February 2012.