Bit

A bit is the most basic unit of information in computing and digital communication. The name is a portmanteau of binary digit. A bit represents a logical state with one of two possible values, most commonly written as 0 and 1, which may also be interpreted as the logical values false and true respectively.

The symbol for the binary digit is either "bit" or the lowercase "b", while the byte is denoted by the uppercase "B".
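
The distinction matters in practice: a data rate quoted in megabits per second (Mb/s) is one eighth of the same figure in megabytes per second (MB/s). A minimal sketch in C (the variable names and the 100 Mb/s figure are illustrative):

    #include <stdio.h>

    int main(void) {
        /* One byte (B) is eight bits (b), so a rate in megabits per
         * second divided by 8 gives megabytes per second. */
        double rate_mbit = 100.0;            /* hypothetical 100 Mb/s link */
        double rate_mbyte = rate_mbit / 8.0; /* 12.5 MB/s */
        printf("%.1f Mb/s = %.1f MB/s\n", rate_mbit, rate_mbyte);
        return 0;
    }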

Representations

Decimal (SI)

Value     Symbol  Name
1000      kbit    kilobit
1000²     Mbit    megabit
1000³     Gbit    gigabit
1000⁴     Tbit    terabit
1000⁵     Pbit    petabit
1000⁶     Ebit    exabit
1000⁷     Zbit    zettabit
1000⁸     Ybit    yottabit
1000⁹     Rbit    ronnabit
1000¹⁰    Qbit    quettabit

Binary (IEC)

Value     Symbol  Name      Customary (memory)
1024      Kibit   kibibit   Kbit, Kb (kilobit)
1024²     Mibit   mebibit   Mbit, Mb (megabit)
1024³     Gibit   gibibit   Gbit, Gb (gigabit)
1024⁴     Tibit   tebibit
1024⁵     Pibit   pebibit
1024⁶     Eibit   exbibit
1024⁷     Zibit   zebibit
1024⁸     Yibit   yobibit
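
The decimal and binary prefixes diverge as the magnitude grows; a short C sketch making the ratios explicit (values follow the tables above):

    #include <stdio.h>

    int main(void) {
        /* SI: 1 kilobit = 1000 bits; IEC: 1 kibibit = 1024 bits. */
        double kilobit = 1000.0;
        double kibibit = 1024.0;

        /* At the giga scale the binary unit is already about 7.4% larger:
         * 1 Gibit = 1024^3 bits versus 1 Gbit = 1000^3 bits. */
        double gigabit = 1000.0 * 1000.0 * 1000.0;
        double gibibit = 1024.0 * 1024.0 * 1024.0;

        printf("Kibit/kbit = %.4f\n", kibibit / kilobit);   /* 1.0240 */
        printf("Gibit/Gbit = %.4f\n", gibibit / gigabit);   /* 1.0737 */
        return 0;
    }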

A contiguous group of binary digits is commonly called a bit string, a bit vector, or a (one- or multi-dimensional) bit array. A group of eight bits is called a byte, while a group of four bits is called a nibble. Most programming languages provide a Boolean data type that conceptually corresponds to a single bit, although it is typically stored in at least one byte.
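
As a concrete illustration, the nibbles and individual bits of a byte can be isolated with shifts and masks; a minimal sketch in C (the value 0xB5 is an arbitrary example):

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        uint8_t byte = 0xB5;                       /* 1011 0101 in binary */

        uint8_t high_nibble = (byte >> 4) & 0x0F;  /* upper four bits: 0xB */
        uint8_t low_nibble  = byte & 0x0F;         /* lower four bits: 0x5 */

        /* Test a single bit: bit 2 of 0xB5 is 1. */
        int bit2 = (byte >> 2) & 1;

        printf("high nibble: 0x%X, low nibble: 0x%X, bit 2: %d\n",
               high_nibble, low_nibble, bit2);
        return 0;
    }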

Multiple bits may be grouped in a variety of ways. The most common grouping is the byte; however, because the size of a byte historically varied between architectures, the unit octet was defined to denote exactly eight bits.

Computers commonly manipulate bits in fixed-size groups called words, whose size is usually, but not always, a multiple of eight. Typical word sizes in modern computing are 32 and 64 bits.
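
On most platforms the native word size matches the width of a pointer, so it can be inspected at runtime; a sketch in C assuming a conventional flat memory model:

    #include <stdio.h>
    #include <limits.h>

    int main(void) {
        /* On common 32-bit platforms this prints 32; on 64-bit, 64.
         * CHAR_BIT is the number of bits in a byte (8 on virtually
         * all modern systems). */
        printf("word size: %zu bits\n", sizeof(void *) * CHAR_BIT);
        return 0;
    }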

See also

Orders of magnitude of data