What Is a Nibble in Computing, and What Does It Equal?

Despite how it may seem on the outside, you don’t actually need to be a computer engineer to understand the different terms and units of information in computing.

Whether it be a terabyte, a gigabyte, a megabyte, or something as tiny as a nibble — yes, a nibble — units of information in computing are a lot less complex than their names would suggest.

What exactly is a nibble in computing? And how much does a nibble equal compared to these other units of information in computing? Let’s break it all down below.

A Nibble in Computing Explained

In computing, a nibble can be spelled as nybl, nyble, or nybble; not to mention tetrade or half-byte, which the unit also goes by from time to time. The last of these alternative names clues us into what kind of unit of information a nibble actually is: half of a byte.

However, if you don’t know what a byte amounts to, then this doesn’t do you much good, does it? To add to the confusion, you can also think of a nibble as four times the size of a bit, or double the size of a crumb.

The term nibble comes from its meaning of “half a byte,” since “byte” is a homophone of the English word “bite.”

Here’s how it all breaks down: a bit is the smallest unit of information in computing, a single binary digit that holds one of just two values, 0 or 1. A crumb, by this metric, is equivalent to two bits. A nibble, by extension, is equivalent to four bits.

A byte, then, would have to be eight bits. You can continue multiplying like this ad nauseam and getting a new unit of information every time, but for the purpose of this piece, we’ll cut it off here at the nibble.
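To make that ladder concrete, here’s a minimal Python sketch (the constant names are purely illustrative, not from any standard library) that spells out how each unit stacks up in bits:

```python
# Sizes of small units of information, measured in bits.
BIT = 1             # a single binary digit: 0 or 1
CRUMB = 2 * BIT     # two bits
NIBBLE = 2 * CRUMB  # four bits
BYTE = 2 * NIBBLE   # eight bits

print(NIBBLE)          # 4  -> a nibble is four bits
print(BYTE // NIBBLE)  # 2  -> a byte holds exactly two nibbles
```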

The term “nibble” dates back to at least the late 1950s. Washington State University computer scientist David B. Benson recalls hearing and using the term for half a byte as early as 1958.

However, it’s entirely possible the term “nibble” could pre-date even this early recollection. The alternative spellings, such as “nybble” and “nyble,” began sprouting up in subsequent decades to more closely match the spelling of “byte.”

Alternative Uses of Nibble in Computing

At various points in the nibble’s history, there have been some alternative uses (with alternative meanings) that differ from the standard definition above. Let’s take a look at a couple of these alternate takes on the nibble throughout computer history. Notably, both of them come from Apple.

The Apple II

The Apple II is an 8-bit home computer.
In 1977, with the official launch of the Apple II line of microcomputers, a nibble was alternatively used to represent five bits or six bits. Even stranger? The two uses appeared interchangeably. This is one of the few times that nibble has stood for something other than the traditional definition of four bits.

The Integrated Woz Machine

Oddly enough, the technical documentation for the Integrated Woz Machine, a single-chip floppy disk controller used in Apple and Macintosh products in the early 1980s, frequently used “nibble” to refer to an 8-bit quantity. This is quite an uncommon usage, as “byte” is traditionally the 8-bit unit in computing. Apple, it seems, really struggled to stick to one usage.

Frequently Asked Questions

How much is a bit in computing?

In computing, a bit equates to a binary digit: either one (1) or zero (0). No more, no less. It’s often compared to a switch or a power button, existing as either “on” or “off” but never both at the same time.

How much is a crumb in computing?

If a bit is equivalent to a binary digit, then a crumb is the same as saying two bits. You might also see a crumb referred to as a quarter byte. In truth, those are just two ways of saying the same thing: a crumb is worth two bits.

How much is a nibble in computing?

A nibble is worth four bits, two crumbs, or half a byte. Any of these definitions holds true in computing; which one you use simply depends on context. No matter how you say it, though, a nibble is always equivalent to four bits.
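As a rough illustration (hypothetical values, using Python’s ordinary bitwise operators), you can pull the two nibbles out of a single byte with a mask and a shift; conveniently, each nibble corresponds to one hexadecimal digit:

```python
value = 0xA7  # one byte: 1010 0111 in binary

high_nibble = (value >> 4) & 0xF  # top four bits    -> 0xA (10)
low_nibble = value & 0xF          # bottom four bits -> 0x7 (7)

print(hex(high_nibble), hex(low_nibble))  # 0xa 0x7
```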

How much is a byte in computing?

A byte is equivalent to eight binary digits, or “bits.” (The word “bit” is itself a portmanteau of “binary digit,” a unit worth either 1 or 0 but never both.) You might also see a byte referred to as four crumbs or two nibbles. A thousand bytes equals one kilobyte, and a thousand kilobytes equals one megabyte.
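Here’s a quick sketch of that decimal scaling (this assumes the SI convention of 1,000 bytes per kilobyte used above, rather than the binary 1,024-byte kibibyte):

```python
BITS_PER_BYTE = 8
BYTES_PER_KILOBYTE = 1_000
KILOBYTES_PER_MEGABYTE = 1_000

bytes_per_megabyte = BYTES_PER_KILOBYTE * KILOBYTES_PER_MEGABYTE
print(bytes_per_megabyte)                  # 1000000 bytes in a megabyte
print(bytes_per_megabyte * BITS_PER_BYTE)  # 8000000 bits in a megabyte
```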

What is the biggest unit of information in computing?

While technically there’s no limit to the number of units of information out there (just as there’s technically no limit to the number of numbers, period), the International System of Units recognizes the yottabyte as the largest unit of information in computing today. It is equivalent to a quadrillion gigabytes, or one septillion bytes. (That’s a lot of nibbles.)
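The arithmetic behind those equivalences is easy to check with a couple of powers of ten (a sketch using the decimal SI prefixes, where giga is 10^9 and yotta is 10^24):

```python
BYTES_PER_GIGABYTE = 10**9    # giga  = 10^9
BYTES_PER_YOTTABYTE = 10**24  # yotta = 10^24

# One yottabyte expressed in gigabytes: 10^15, i.e. one quadrillion.
print(BYTES_PER_YOTTABYTE // BYTES_PER_GIGABYTE)  # 1000000000000000
```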
