Bit
A bit is the most basic unit of information in computing and digital communication. The name is a portmanteau of binary digit. The bit represents a logical state with one of two possible values, most commonly 0 and 1, which may also be interpreted as false and true respectively.
The symbol for the binary digit is either "bit" or "b"; the byte, by contrast, is represented by the uppercase "B".
Representations
A contiguous group of binary digits is commonly called a bit string, a bit vector, or a single-dimensional (or multi-dimensional) bit array. A group of eight bits is called a byte, while a group of four bits is called a nibble. Most programming languages associate the Boolean data type with the underlying storage of a single bit.
Multiple bits may be represented in a variety of ways. The most common representation is the byte; however, because of ambiguity between different architectures, the unit octet was defined to specifically represent eight bits.
Computers commonly use groups of bits called words, whose sizes are usually, but not always, multiples of eight. Standard word lengths in modern computing are 32 and 64 bits.