The Information Age (also known as the Third Industrial Revolution, Computer Age, Digital Age, Silicon Age, New Media Age, Internet Age, or the Digital Revolution) is a historical period that began in the mid-20th century and continues into the early 21st century. It is characterized by a rapid shift from traditional industries, as established during the Industrial Revolution, to an economy centered on information technology.[2] The onset of the Information Age has been linked to the development of the transistor in 1947[2] and the optical amplifier in 1957.[3]
These technological advances have had a significant impact on the way information is processed and transmitted.
Digital communication became economical for widespread adoption after the invention of the personal computer. Claude Shannon, a Bell Labs mathematician, is credited with laying out the foundations of digitalization in his pioneering 1948 article, A Mathematical Theory of Communication.[6]
The digital revolution converted technology from analog to digital format, which made it possible to produce copies that were identical to the original. In digital communications, for example, repeating hardware could regenerate the digital signal and pass it on with no loss of information. Of equal importance to the revolution was the ability to easily move digital information between media, and to access or distribute it remotely.
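To make this concrete, here is a minimal sketch (an illustration, not from the source) of why a digital repeater can pass a signal on without loss while an analog chain accumulates noise: the repeater only has to decide whether each received value is a 0 or a 1, so small amounts of noise are removed at every hop. The noise level and threshold used here are arbitrary assumptions for demonstration.

```python
import random

# Illustration: an analog chain carries noise forward hop after hop,
# while a digital repeater re-decides each bit and so removes the noise at every hop.

def noisy_hop(values, noise=0.05):
    """Simulate one transmission hop that adds a little random noise."""
    return [v + random.uniform(-noise, noise) for v in values]

def digital_repeater(values):
    """Regenerate a clean bit stream by thresholding each value at 0.5."""
    return [1.0 if v >= 0.5 else 0.0 for v in values]

original_bits = [1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0]

analog = list(original_bits)
digital = list(original_bits)
for _ in range(50):                                  # 50 hops through the network
    analog = noisy_hop(analog)                       # noise accumulates every hop
    digital = digital_repeater(noisy_hop(digital))   # noise is removed every hop

print("analog after 50 hops: ", [round(v, 2) for v in analog])
print("digital after 50 hops:", digital)             # identical to the original bits
```

Because the per-hop noise never pushes a value across the 0.5 threshold, the regenerated stream matches the original exactly, whereas the analog values drift further from their starting points with every hop.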
The turning point of the revolution was the change from analog to digitally recorded music. During the 1980s, the digital format of optical compact discs gradually replaced analog formats, such as vinyl records and cassette tapes, as the popular medium of choice.[8]
From the late 1940s, universities, the military, and businesses developed digital computer systems to replicate and automate previously manual mathematical calculations, with the LEO being the first commercially available general-purpose computer.
Other important technological developments included the invention of the monolithic integrated circuit by Robert Noyce at Fairchild Semiconductor in 1959.
In 1962, AT&T deployed the T-carrier for long-haul pulse-code modulation (PCM) digital voice transmission. The T1 format carried 24 pulse-code modulated, time-division multiplexed speech signals, each encoded as a 64 kbit/s stream, plus 8 kbit/s of framing information that facilitated synchronization and demultiplexing at the receiver. Over the subsequent decades the digitization of voice became the norm for all but the last mile (where analog continued to be the norm right into the late 1990s).
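As a rough illustration (not from the source), the familiar 1.544 Mbit/s T1 line rate follows directly from the figures above: 24 channels of 64 kbit/s PCM voice plus 8 kbit/s of framing. The sketch below assumes the standard PCM telephony parameters of 8,000 frames per second, 8 bits per sample, and one framing bit per frame.

```python
# Back-of-the-envelope check of the T1 line rate described above.
SAMPLE_RATE_HZ = 8_000          # one frame is sent per sampling interval
BITS_PER_SAMPLE = 8
CHANNELS = 24
FRAMING_BITS_PER_FRAME = 1

bits_per_frame = CHANNELS * BITS_PER_SAMPLE + FRAMING_BITS_PER_FRAME  # 193 bits
line_rate_bps = bits_per_frame * SAMPLE_RATE_HZ                       # 1,544,000 bit/s

per_channel_bps = BITS_PER_SAMPLE * SAMPLE_RATE_HZ                    # 64,000 bit/s per voice channel
framing_bps = FRAMING_BITS_PER_FRAME * SAMPLE_RATE_HZ                 # 8,000 bit/s of framing

print(f"T1 line rate: {line_rate_bps / 1e6:.3f} Mbit/s "
      f"({CHANNELS} x {per_channel_bps // 1000} kbit/s voice + {framing_bps // 1000} kbit/s framing)")
```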
Following the development of MOS integrated circuits and the first single-chip microprocessors came the microcomputer revolution that began in the 1970s.
MOS technology also led to the development of semiconductor image sensors suitable for digital cameras.
The public was first introduced to the concepts that led to the Internet when a message was sent over the ARPANET in 1969. Packet-switched networks were developed in the late 1960s and early 1970s using a variety of protocols. The ARPANET in particular led to the development of protocols for internetworking, in which multiple separate networks could be joined into a network of networks.
The Whole Earth movement of the 1960s advocated the use of new technology.[18]
In the 1970s, the home computer was introduced,[19] as well as time-sharing computers,[20] the video game console, and the first coin-op video games,[21][22] and the golden age of arcade video games began with Space Invaders. As digital technology proliferated, and the switch from analog to digital record keeping became the new standard in business, a relatively new job description was popularized: the data entry clerk. Culled from the ranks of secretaries and typists of earlier decades, the data entry clerk's job was to convert analog data (customer records, invoices, etc.) into digital data.
In developed nations, computers achieved semi-ubiquity during the 1980s as they made their way into schools, homes, business, and industry.
Millions of people purchased home computers, making household names of early personal computer manufacturers such as Apple, Commodore, and Tandy. To this day the Commodore 64 is often cited as the best-selling computer of all time, having sold 17 million units (by some accounts)[23] between 1982 and 1994.
In 1984, the U.S. Census Bureau began collecting data on computer and Internet use in the United States; their first survey showed that 8.2% of all U.S. households owned a personal computer in 1984, and that households with children under the age of 18 were nearly twice as likely to own one at 15.3% (middle and upper middle class households were the most likely to own one, at 22.9%).[24] By 1989, 15% of all U.S. households owned a computer, and nearly 30% of households with children under the age of 18 owned one.[citation needed] By the late 1980s, many businesses were dependent on computers and digital technology.
Motorola created the first mobile phone, the Motorola DynaTAC, in 1983. However, this device used analog communication; digital cell phones were not sold commercially until 1991, when the 2G network started to be opened in Finland to accommodate the unexpected demand for cell phones that was becoming apparent in the late 1980s.
Compute! magazine predicted that CD-ROM would be the centerpiece of the revolution, with multiple household devices reading the discs.[25]
The first true digital camera was created in 1988, and the first digital cameras were marketed in December 1989 in Japan and in 1990 in the United States.[26] By the mid-2000s, digital cameras had eclipsed traditional film in popularity.
Digital ink was also invented in the late 1980s. Disney's CAPS system (created in 1988) was used for a scene in 1989's The Little Mermaid and for all of their animated films between 1990's The Rescuers Down Under and 2004's Home on the Range.
1989–2005: Invention of the World Wide Web, mainstreaming of the Internet, Web 1.0
The first public digital HDTV broadcast was of the 1990 FIFA World Cup in June of that year; it was played in 10 theaters in Spain and Italy. However, HDTV did not become a standard until the mid-2000s outside Japan.
Throughout the 1990s, however, dial-up was the only connection type affordable by individual users; the present-day mass Internet culture was not yet possible.
In 1989, about 15% of all households in the United States owned a personal computer.[32]
For households with children, nearly 30% owned a computer in 1989, and in 2000, 65% owned one.
Cell phones of the 2000s became much more advanced than phones of the 1990s, most of which only took calls or at most allowed for the playing of simple games. Text messaging became widely used in the late 1990s worldwide, except in the United States, where it did not become commonplace until the early 2000s.[citation needed]
The digital revolution became truly global in this time as well; after revolutionizing society in the developed world in the 1990s, it spread to the masses in the developing world in the 2000s.
HDTV became the standard television broadcasting format in many countries by the end of the decade. In September and December 2006 respectively, Luxembourg and the Netherlands became the first countries to completely transition from analog to digital television. In September 2007, a majority of U.S. survey respondents reported having broadband internet at home.[36] According to estimates from Nielsen Media Research, approximately 45.7 million U.S. households in 2006 (roughly 40 percent of the approximately 114.4 million total) owned a dedicated home video game console,[37][38] and by 2015, 51 percent of U.S. households owned one, according to an Entertainment Software Association annual industry report.[39][40] By 2012, over 2 billion people used the Internet, twice the number using it in 2007. Cloud computing had entered the mainstream by the early 2010s. In January 2013, a majority of U.S. survey respondents reported owning a smartphone.[41] By 2016, half of the world's population was connected,[42] and as of 2020, that number had risen to 67%.[43]
In the late 1980s, less than 1% of the world's technologically stored information was in digital format, while it was 94% in 2007, with more than 99% by 2014.[44]
It is estimated that the world's capacity to store information increased from 2.6 (optimally compressed) exabytes in 1986 to about 5 zettabytes in 2014.
1990
Cell phone subscribers: 12.5 million (0.25% of world population in 1990)[46]
Internet users: 2.8 million (0.05% of world population in 1990)[47]
2000
Cell phone subscribers: 1.5 billion (19% of world population in 2002)[47]
Internet users: 631 million (11% of world population in 2002)[47]
2010
Cell phone subscribers: 4 billion (68% of world population in 2010)[48]
Internet users: 1.8 billion (26.6% of world population in 2010)[42]
2020
Cell phone subscribers: 4.78 billion (62% of world population in 2020)[49]
Internet users: 4.54 billion (59% of world population in 2020)[50]
Overview of early developments
Library expansion and Moore's law
Library expansion was calculated in 1945 by Fremont Rider to double in capacity every 16 years where sufficient space was made available.[51] He advocated replacing bulky, decaying printed works with miniaturized microform analog photographs, which could be duplicated on demand for library patrons and other institutions.
The world's technological capacity to store information grew from 2.6 (optimally compressed) exabytes in 1986 to about 5 zettabytes in 2014,[45] the informational equivalent of 4,500 stacks of printed books from the earth to the sun.[citation needed]
The amount of digital data stored appears to be growing approximately exponentially, reminiscent of Moore's law. Similarly, Kryder's law describes how the amount of available storage space grows approximately exponentially.[55][56][57][53]
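As an illustration of what this exponential growth implies (not a figure from the source), the two storage estimates quoted above can be turned into a rough doubling time. The sketch below assumes the article's figures of 2.6 exabytes in 1986 and about 5 zettabytes in 2014.

```python
import math

# Implied doubling time of the world's storage capacity, using the figures quoted above.
capacity_1986 = 2.6e18   # bytes (2.6 exabytes, optimally compressed)
capacity_2014 = 5.0e21   # bytes (~5 zettabytes)
years = 2014 - 1986

doublings = math.log2(capacity_2014 / capacity_1986)   # ~10.9 doublings
doubling_time = years / doublings                       # ~2.6 years per doubling

print(f"{doublings:.1f} doublings over {years} years "
      f"-> capacity doubled roughly every {doubling_time:.1f} years")
```

In other words, under these figures the world's storage capacity doubled roughly every two to three years over that period.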
Information transmission
The world's technological capacity to receive information through one-way broadcast networks was 1.2 (optimally compressed) zettabytes in 2000 and 1.9 zettabytes in 2007, the information equivalent of 174 newspapers per person per day.[44]
The world's effective capacity to exchange information through two-way telecommunication networks was 281 petabytes of (optimally compressed) information in 1986; 471 petabytes in 1993; 2.2 (optimally compressed) exabytes in 2000; and 65 (optimally compressed) exabytes in 2007, the information equivalent of six newspapers per person per day.[44] In the 1990s, the spread of the Internet caused a sudden leap in access to and ability to share information in businesses and homes globally. A computer that cost $3,000 in 1997 would cost $2,000 two years later and $1,000 the following year, due to the rapid advancement of technology.[citation needed]
Computation
The world's technological capacity to compute information with human-guided general-purpose computers grew from 3.0 × 10^8 MIPS in 1986 to 6.4 × 10^12 MIPS in 2007. An article in Trends in Ecology and Evolution in 2016 reported that:[45]
Even in 2007, humanity's general-purpose computers were capable of performing well over 10^18 instructions per second. Estimates suggest that the storage capacity of an individual human brain is about 10^12 bytes. On a per capita basis, this is matched by current digital storage (5 × 10^21 bytes per 7.2 × 10^9 people).
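A quick arithmetic check of the per-capita comparison in the quoted passage (an illustration, not from the source), using only the figures it gives:

```python
# Per-capita storage versus the quoted brain-capacity estimate.
total_storage_bytes = 5e21     # ~5 x 10^21 bytes of digital storage (from the passage)
world_population = 7.2e9       # ~7.2 x 10^9 people (from the passage)
brain_capacity_bytes = 1e12    # ~10^12 bytes per human brain (from the passage)

per_capita_bytes = total_storage_bytes / world_population   # ~6.9e11 bytes per person

print(f"Digital storage per person: {per_capita_bytes:.2e} bytes")
print(f"Ratio to the brain-capacity estimate: {per_capita_bytes / brain_capacity_bytes:.2f}")
# ~0.69, i.e. the same order of magnitude as the ~10^12-byte estimate for a brain
```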
Genetic information
Genetic code may also be considered part of the information revolution. Now that sequencing has been computerized, the genome can be rendered and manipulated as data. This started with DNA sequencing, invented by Walter Gilbert and Allan Maxam[58] in 1976–1977 and by Frederick Sanger in 1977, grew steadily with the Human Genome Project, initially conceived by Gilbert, and finally reached practical applications of sequencing, such as gene testing, after the discovery by Myriad Genetics of the BRCA1 breast cancer gene mutation. Sequence data in GenBank has grown from the 606 genome sequences registered in December 1982 to the 231 million genomes registered in August 2021. An additional 13 trillion incomplete sequences are registered in the Whole Genome Shotgun submission database as of August 2021. The information contained in these registered sequences has doubled every 18 months.[59]
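To give a feel for what an 18-month doubling time means in practice, the following sketch (an illustration, not from the source) projects the multiplicative growth such a rate implies over a few time spans.

```python
# What an 18-month (1.5-year) doubling time implies for data volume.
def growth_factor(years: float, doubling_time_years: float = 1.5) -> float:
    """Multiplicative growth after `years`, given the doubling time."""
    return 2 ** (years / doubling_time_years)

for years in (3, 10, 30):
    print(f"After {years:2d} years: ~{growth_factor(years):,.0f}x the data")
# After  3 years: ~4x; after 10 years: ~102x; after 30 years: ~1,048,576x
```

At that rate, a data set grows roughly a hundredfold per decade, which is consistent with the explosive growth of sequence repositories described above.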
At rare moments in human history, periods of innovation have transformed human life. The Neolithic Age, the Scientific Age and the Industrial Age all, ultimately, induced discontinuous and irreversible changes in the economic, social and cultural elements of the daily life of most people. Traditionally, these epochs have taken place over hundreds, or in the case of the Neolithic Revolution, thousands of years, whereas the Information Age swept to all parts of the globe in just a few years, as a result of the rapidly advancing speed of information exchange.
Between 7,000 and 10,000 years ago, during the Neolithic period, humans began to domesticate animals, to farm grains, and to replace stone tools with ones made of metal. These innovations allowed nomadic hunter-gatherers to settle down. Villages formed along rivers, and the emergence of writing (hieroglyphs in Egypt in 3,500 B.C., writing in Egypt in 2,560 B.C., and in Minoa and China around 1,450 B.C.) enabled ideas to be preserved for extended periods and to spread extensively. In all, Neolithic developments, augmented by writing as an information tool, laid the groundwork for the advent of civilization.
The Scientific Age began in the period between Copernicus's 1543 publication of the heliocentric model, in which the planets orbit the Sun, and Newton's publication of the laws of motion and gravity in Principia in 1687. This age of discovery continued through the 18th century, accelerated by widespread use of the moveable-type printing press invented by Johannes Gutenberg.
The Industrial Age began in Great Britain in 1760 and continued into the mid-19th century. The invention of machines such as the power loom (a mechanical textile weaver) by Edmund Cartwright, the rotating-shaft steam engine by James Watt, and the cotton gin by Eli Whitney, along with processes for mass manufacturing, came to serve the needs of a growing global population. The Industrial Age harnessed steam and waterpower to reduce the dependence on animal and human physical labor as the primary means of production. Thus, the core of the Industrial Revolution was the generation and distribution of energy from coal and water to produce steam and, later in the 20th century, electricity.
The Information Age also requires electricity to power the global networks of computers that process and store data. However, what dramatically accelerated the Information Age's adoption, compared with previous ages, was the speed with which knowledge could be transferred and pervade the entire human family within a few short decades. This acceleration came about with the adoption of a new form of power. Beginning in 1972, engineers devised ways to harness light to convey data through fiber optic cable. Today, light-based optical networking systems at the heart of telecom networks and the Internet span the globe and carry most of the information traffic to and from users and data storage systems.