Bit
The bit is the most basic unit of information in computing and digital communication. The name is a portmanteau of binary digit. The bit represents a logical state with one of two possible values. These values are most commonly represented as either "1" or "0", but other representations such as true/false, yes/no, on/off, or +/− are also widely used. The relation between these values and the physical states of the underlying storage or computing device is a matter of convention, and different assignments may be used even within the same device or program.
A contiguous group of binary digits is commonly called a bit string, a bit vector, or a single-dimensional (or multi-dimensional) bit array. A group of eight bits is called one byte, although historically the size of the byte was not strictly defined.
In information theory, one bit is the information entropy of a random binary variable that is 0 or 1 with equal probability, or the information that is gained when the value of such a variable becomes known.
The symbol for the binary digit is either "bit", per the IEC 80000-13:2008 standard, or the lowercase character "b", as recommended by the IEEE 1541-2002 standard.
History
The encoding of data by discrete bits was used in the punched cards invented by Basile Bouchon and Jean-Baptiste Falcon (1732), developed by Joseph Marie Jacquard (1804), and later adopted by Semyon Korsakov, Charles Babbage, Herman Hollerith, and early computer manufacturers. Claude E. Shannon first used the word "bit" in his seminal 1948 paper "A Mathematical Theory of Communication", attributing its origin to John W. Tukey.
Physical representation
A bit can be stored by a digital device or other physical system that exists in either of two possible distinct states. These may be the two stable states of a flip-flop, two positions of an electrical switch, two distinct voltage or current levels allowed by a circuit, two distinct levels of light intensity, or two directions of magnetization or polarization.
Bits can be implemented in several forms. In most modern computing devices, a bit is usually represented by an electrical voltage or current pulse, or by the electrical state of a flip-flop circuit.
For devices using positive logic, a digit value of 1 (or a logical value of true) is represented by a more positive voltage relative to the representation of 0.
Transmission and processing
Bits are transmitted one at a time in serial transmission, and by multiple bits simultaneously in parallel transmission. Data transfer rates are usually measured in decimal SI multiples of the unit bit per second (bit/s), such as kbit/s.
Storage
In the earliest non-electronic information processing devices, such as Jacquard's loom or Babbage's Analytical Engine, a bit was often stored as the position of a mechanical lever or gear, or the presence or absence of a hole at a specific point of a paper card or tape.
In the 1950s and 1960s, these methods were largely supplanted by magnetic storage devices such as magnetic-core memory, magnetic tapes, drums, and disks, where a bit was represented by the polarity of magnetization of a certain area of a ferromagnetic film, or by a change in polarity from one direction to the other.
In modern semiconductor memory, such as dynamic random-access memory, the two values of a bit may be represented by two levels of electric charge stored in a capacitor.
Unit and symbol
The bit is not defined in the International System of Units (SI). However, the International Electrotechnical Commission issued standard IEC 60027, which specifies that the symbol for binary digit should be 'bit', used in all multiples, such as 'kbit' for kilobit.[11] The lower-case letter 'b' is also widely used and was recommended by the IEEE 1541 Standard (2002). In contrast, the upper-case letter 'B' is the standard and customary symbol for byte.
Multiple bits
Multiple bits may be expressed and represented in several ways. For convenience of representing commonly recurring groups of bits in information technology, several units of information have traditionally been used. The most common is the unit byte, coined by Werner Buchholz in June 1956, which historically was used to represent the group of bits used to encode a single character of text (until UTF-8 multibyte encoding took over) in a computer,[2][12][13][14][15] and for this reason it was used as the basic addressable element in many computer architectures. Hardware design converged on eight bits per byte, the implementation in near-universal use today. However, because of the ambiguity of relying on the underlying hardware design, the unit octet was defined to explicitly denote a sequence of eight bits.
Computers usually manipulate bits in groups of a fixed size, conventionally named "words". Like the byte, the number of bits in a word also varies with the hardware design, and is typically between 8 and 80 bits, or even more in some specialized computers. In the 21st century, retail personal or server computers have a word size of 32 or 64 bits.
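The native word size of a machine can be probed at run time. As an illustrative sketch (assuming a CPython interpreter, where the size of a native pointer is a common proxy for the word size):

```python
import struct
import sys

# Width of a native pointer in bits -- a common proxy for the machine
# word size (64 on typical modern desktop and server platforms).
word_bits = struct.calcsize("P") * 8

# On most CPython builds, sys.maxsize is 2**(word_bits - 1) - 1,
# so its bit_length gives a consistency check.
assert word_bits == sys.maxsize.bit_length() + 1

print(word_bits)  # 32 or 64 depending on the platform
```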
The International System of Units defines a series of decimal prefixes for multiples of standardized units, which are commonly also used with the bit and the byte. The prefixes kilo (10³) through yotta (10²⁴) increment by multiples of one thousand, and the corresponding units are the kilobit (kbit) through the yottabit (Ybit).
Information capacity and information compression
When the information capacity of a storage system or a communication channel is presented in bits or bits per second, this often refers to binary digits, which is a computer hardware capacity to store binary data. The information capacity of a storage system is only an upper bound to the quantity of information stored therein. If the two possible values of one bit of storage are not equally likely, that bit of storage contains less than one bit of information.
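The amount of information carried by a possibly biased bit is given by the binary entropy function. A minimal sketch (the function name is illustrative, not from the source):

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy, in bits (shannons), of a binary variable
    that is 1 with probability p and 0 with probability 1 - p."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # exactly 1.0: a fair bit carries one full bit
print(binary_entropy(0.9))  # ~0.469: a biased bit carries less than one bit
```

This is why optimally compressed content occupies fewer hardware digits than the raw storage capacity suggests.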
For example, it is estimated that the combined technological capacity of the world to store information provides 1,300 exabytes of hardware digits. However, when this storage space is filled and the corresponding content is optimally compressed, this represents only 295 exabytes of information.
Bit-based computing
Certain bitwise computer processor instructions (such as bit set) operate at the level of manipulating bits rather than manipulating data interpreted as an aggregate of bits.
In the 1980s, when bitmapped computer displays became popular, some computers provided specialized bit block transfer instructions to set or copy the bits that corresponded to a given rectangular area on the screen.
In most computers and programming languages, when a bit within a group of bits, such as a byte or word, is referred to, it is usually specified by a number from 0 upwards corresponding to its position within the byte or word. However, 0 can refer to either the most significant bit or the least significant bit, depending on the context.
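The two numbering conventions can be contrasted in a short sketch (function names are illustrative; LSB-0 counts from the least significant end, MSB-0 from the most significant end):

```python
def bit_lsb0(word: int, i: int, width: int = 8) -> int:
    """Bit i under LSB-0 numbering: bit 0 is the least significant bit."""
    return (word >> i) & 1

def bit_msb0(word: int, i: int, width: int = 8) -> int:
    """Bit i under MSB-0 numbering: bit 0 is the most significant bit."""
    return (word >> (width - 1 - i)) & 1

b = 0b1000_0001
print(bit_lsb0(b, 0))  # 1 -- the rightmost bit
print(bit_msb0(b, 0))  # 1 -- the leftmost bit of the 8-bit word
print(bit_lsb0(b, 1))  # 0 -- the same index names a different bit
```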
Other information units
Similar to torque and energy in physics, information-theoretic information and data storage size have the same dimensionality of units of measurement, but there is in general no meaning to adding, subtracting, or otherwise combining the units mathematically, although one may act as a bound on the other.
Units of information used in information theory include the shannon (Sh), the natural unit of information (nat) and the hartley (Hart). One shannon is the maximum amount of information needed to specify the state of one bit of storage. These are related by 1 Sh ≈ 0.693 nat ≈ 0.301 Hart.
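These conversion factors follow directly from the logarithm base that defines each unit (base 2 for the shannon, base e for the nat, base 10 for the hartley), as a short sketch shows:

```python
import math

# One shannon expressed in the other units: the factors are just
# logarithms of 2 in the base that defines each unit.
NAT_PER_SH = math.log(2)      # ~0.693 nat per shannon (base e)
HART_PER_SH = math.log10(2)   # ~0.301 Hart per shannon (base 10)

def sh_to_nat(x: float) -> float:
    return x * NAT_PER_SH

def sh_to_hart(x: float) -> float:
    return x * HART_PER_SH

print(sh_to_nat(1.0))   # ~0.693
print(sh_to_hart(1.0))  # ~0.301
```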
Some authors also define a binit as an arbitrary information unit equivalent to some fixed but unspecified number of bits.[18]
See also
- Byte
- Integer (computer science)
- Primitive data type
- Trit (ternary digit)
- Qubit (quantum bit)
- Bitstream
- Entropy (information theory)
- Bit rate and baud rate
- Binary numeral system
- Ternary numeral system
- Shannon (unit)
- Nibble
References
- ^ IBM 360 used 8-bit characters, although not ASCII directly. Thus Buchholz's "byte" caught on everywhere. I myself did not like the name for many reasons. […]
- ^ Anderson, John B.; Johannesson, Rolf (2006), Understanding Information Transmission
- ^ Haykin, Simon (2006), Digital Communications
- ^ IEEE Std 260.1-2004
- ^ "Units: B". Archived from the original on 2016-05-04.
- ^ J. W. Tukey.
- ISBN 0-252-72548-4. Archived from the original (PDF) on 1998-07-15.
- ^ National Institute of Standards and Technology (2008), Guide for the Use of the International System of Units. Online version. Archived 3 June 2016 at the Wayback Machine
- alphanumeric work, or to handle bytes of only one bit for logical analysis, or to offset the bytes by any number of bits. […]
- System/360 took over many of the Stretch concepts, including the basic byte and word sizes, which are powers of 2. For economy, however, the byte size was fixed at the 8 bit maximum, and addressing at the bit level was replaced by byte addressing. […]
- LCCN 61-10466, archived from the original (PDF) on 2017-04-03, retrieved 2017-04-03
- ^ a b Information in Small Bits is a book produced as part of a non-profit outreach project of the IEEE Information Theory Society. The book introduces Claude Shannon and basic concepts of Information Theory to children 8 and older using relatable cartoon stories and problem-solving activities.
- ^ "The World's Technological Capacity to Store, Communicate, and Compute Information" Archived 2013-07-27 at the Wayback Machine, especially Supporting online material Archived 2011-05-31 at the Wayback Machine, Martin Hilbert and Priscila López (2011), Science, 332(6025), 60-65; free access to the article through here: martinhilbert.net/WorldInfoCapacity.html
- ISBN 978-0-07059117-2. Archived from the original on 2017-03-27.
External links
- Bit Calculator – a tool providing conversions between bit, byte, kilobit, kilobyte, megabit, megabyte, gigabit, gigabyte
- BitXByteConverter – a tool for computing file sizes, storage capacity, and digital information in various units