Dictionary of Computer Terminology

From The Free On-line Dictionary of Computing (05 January 2017) [foldoc]:

bit (b): binary digit. The unit of information; the amount of information obtained by asking a yes-or-no question; a computational quantity that can take on one of two values, such as false and true or 0 and 1; the smallest unit of storage, sufficient to hold one bit.

A bit is said to be "set" if its value is true or 1, and "reset" or "clear" if its value is false or 0. One speaks of setting and clearing bits. To toggle or "invert" a bit is to change it, either from 0 to 1 or from 1 to 0.

The term "bit" first appeared in print in the computer-science sense in 1949, and seems to have been coined by the eminent statistician John Tukey. Tukey records that it evolved over a lunch table as a handier alternative to "bigit" or "binit".

See also: flag, trit, mode bit, byte, word.

[Jargon File] (2002-01-22)
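
The setting, clearing, and toggling described in the entry correspond directly to the bitwise OR, AND-with-complement, and XOR operators found in most programming languages. Below is a minimal sketch in C; the choice of C, the variable name flags, and the use of bit position 3 are illustrative assumptions, not part of the dictionary entry.

    #include <stdio.h>

    int main(void) {
        unsigned char flags = 0;      /* all bits clear (false / 0) */

        flags |= 1u << 3;             /* set bit 3: it now reads as true / 1 */
        printf("after set:    0x%02X\n", flags);

        flags &= ~(1u << 3);          /* clear ("reset") bit 3: back to false / 0 */
        printf("after clear:  0x%02X\n", flags);

        flags ^= 1u << 3;             /* toggle ("invert") bit 3: 0 -> 1 */
        printf("after toggle: 0x%02X\n", flags);

        return 0;
    }

Compiled with any standard C compiler, the three printf lines show the byte going from 0x08 to 0x00 and back to 0x08, mirroring the set, clear, and toggle operations named above.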