Gigabit Explained

The gigabit is a multiple of the unit bit for digital information or computer storage. The prefix giga (symbol G) is defined in the International System of Units (SI) as a multiplier of 10⁹ (1 billion in the short scale),[1] and therefore

1 gigabit = 10⁹ bits = 1,000,000,000 bits.

The gigabit has the unit symbol Gbit or Gb.

Using the common byte size of 8 bits, 1 Gbit is equal to 125 megabytes (MB) or approximately 119 mebibytes (MiB).
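The arithmetic behind these conversions can be checked with a short sketch (a minimal Python illustration; the variable names are ours, not standard):

```python
GIGABIT_BITS = 10**9              # 1 Gbit under the SI definition

bytes_total = GIGABIT_BITS // 8   # 8 bits per byte -> 125,000,000 bytes
megabytes = bytes_total / 10**6   # decimal megabytes (MB) -> 125.0
mebibytes = bytes_total / 2**20   # binary mebibytes (MiB) -> about 119.2

print(megabytes)                  # 125.0
print(round(mebibytes, 1))        # 119.2
```

The difference arises only from the divisor: MB uses the decimal 10⁶ while MiB uses the binary 2²⁰.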

The gigabit is closely related to the gibibit, a unit multiple derived from the binary prefix gibi (symbol Gi) of the same order of magnitude,[2] which is equal to 2³⁰ bits = 1,073,741,824 bits, or approximately 7% larger than the gigabit.
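The roughly 7% gap follows directly from the two definitions (another brief Python sketch under the same illustrative naming):

```python
GIGABIT_BITS = 10**9    # SI prefix giga: decimal
GIBIBIT_BITS = 2**30    # binary prefix gibi: 1,073,741,824 bits

ratio = GIBIBIT_BITS / GIGABIT_BITS     # 1.073741824
excess_percent = (ratio - 1) * 100      # about 7.4% larger

print(ratio)            # 1.073741824
print(round(excess_percent, 1))         # 7.4
```

The exact excess is about 7.37%, which the article rounds to "approximately 7%".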

Notes and References

  1. The NIST Reference on Constants, Units, and Uncertainty: SI prefixes. http://physics.nist.gov/cuu/Units/prefixes.html
  2. The NIST Reference on Constants, Units, and Uncertainty: Prefixes for binary multiples. http://physics.nist.gov/cuu/Units/binary.html