bit

n.

[from the mainstream meaning and "Binary digIT"]

  1. [techspeak] The unit of information; the amount of information obtained from knowing the answer to a yes-or-no question for which the two outcomes are equally probable (a worked formula follows these definitions).

  2. [techspeak] A computational quantity that can take on one of two values, such as true and false or 0 and 1.

  3. A mental flag: a reminder that something should be done eventually. "I have a bit set for you." (I haven't seen you for a while, and I'm supposed to tell or ask you something.)

  4. More generally, a (possibly incorrect) mental state of belief. "I have a bit set that says that you were the last guy to hack on EMACS." (Meaning "I think you were the last guy to hack on EMACS, and what I am about to say is predicated on this, so please stop me if this isn't true.") "I just need one bit from you" is a polite way of indicating that you intend only a short interruption for a question that can presumably be answered yes or no.
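In sense 1, the bit is the unit of Shannon entropy. As a worked instance of the definition, a yes-or-no question with two equally probable outcomes carries

    H = -\sum_i p_i \log_2 p_i
      = -\tfrac{1}{2}\log_2\tfrac{1}{2} - \tfrac{1}{2}\log_2\tfrac{1}{2}
      = \tfrac{1}{2} + \tfrac{1}{2}
      = 1 \text{ bit}.

If the two outcomes are not equally probable, the answer carries less than one bit.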

A bit is said to be set if its value is true or 1, and reset or clear if its value is false or 0. One speaks of setting and clearing bits. To toggle or invert a bit is to change it, either from 0 to 1 or from 1 to 0. See also flag, trit, mode bit.
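These operations map directly onto the bitwise operators of C. A minimal sketch (the variable names and the choice of bit 3 are illustrative, not canonical):

    #include <stdio.h>

    int main(void) {
        unsigned flags = 0;        /* all bits clear, i.e. 0 */
        unsigned mask  = 1u << 3;  /* mask selecting bit 3 */

        flags |= mask;             /* set the bit: its value becomes 1 */
        printf("after set:    %u\n", (flags >> 3) & 1u);

        flags &= ~mask;            /* clear (reset) the bit: back to 0 */
        printf("after clear:  %u\n", (flags >> 3) & 1u);

        flags ^= mask;             /* toggle (invert) the bit: 0 -> 1 */
        printf("after toggle: %u\n", (flags >> 3) & 1u);
        return 0;
    }

OR-with-mask, AND-with-complement, and XOR-with-mask are the conventional idioms for setting, clearing, and toggling respectively.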

The term bit first appeared in print in the computer-science sense in a 1948 paper by information theorist Claude Shannon, and was there credited to the early computer scientist John Tukey (who also seems to have coined the term software). Tukey records that bit evolved over a lunch table as a handier alternative to bigit or binit, at a conference in the winter of 1943-44.