The term gigabyte is commonly used to mean either 1000³ bytes or 1024³ bytes. The latter usage originated as compromise technical jargon for byte multiples that needed to be expressed by powers of 2 but lacked a convenient name. As 1024 (2¹⁰) is approximately 1000 (10³), roughly corresponding to SI multiples, it was used for binary multiples as well.
In 1998 the International Electrotechnical Commission (IEC) published standards for binary prefixes, requiring that the gigabyte strictly denote 1000³ bytes and the gibibyte denote 1024³ bytes. By the end of 2007, the IEC standard had been adopted by the IEEE, EU, and NIST, and in 2009 it was incorporated in the International System of Quantities. Nevertheless, the term gigabyte continues to be widely used with the following two different meanings:
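Because the two meanings assign different byte counts to the same prefix, the discrepancy between them is easy to quantify. The following is a minimal Python sketch of that arithmetic; the constant names GB_DECIMAL and GIB_BINARY are illustrative, not from any standard or library:

```python
# Illustrative sketch of the two competing definitions of "gigabyte".
# GB_DECIMAL and GIB_BINARY are made-up names for this example.

GB_DECIMAL = 1000 ** 3   # SI/IEC gigabyte: 1 GB  = 10**9 bytes
GIB_BINARY = 1024 ** 3   # IEC gibibyte:    1 GiB = 2**30 bytes

print(GB_DECIMAL)   # 1000000000
print(GIB_BINARY)   # 1073741824

# Relative discrepancy between the two meanings at the giga scale:
print((GIB_BINARY - GB_DECIMAL) / GB_DECIMAL)   # 0.073741824, about 7.4%
```

The roughly 7.4% gap at the giga scale grows with each step up the prefix ladder, which is why the ambiguity is more noticeable for gigabytes than it was for kilobytes.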
Base 10 definition
- 1 GB = 1000000000 bytes (= 1000³ B = 10⁹ B) is the definition recommended by the International System of Units (SI) and the International Electrotechnical Commission (IEC).[2] This definition is used in networking contexts and for most storage media, particularly hard drives, flash-based storage,[3][4] and DVDs, and is also consistent with other uses of the SI prefix in computing, such as CPU clock speeds or measures of performance. The Mac OS X file manager from version 10.6 onward is a notable example of this usage in software: since Snow Leopard, file sizes are reported in decimal units.[5] A conversion sketch follows below.
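As a rough sketch of the decimal convention in practice, the hypothetical helper below (decimal_gb_to_bytes is an illustrative name, not a real library function) converts a capacity quoted in decimal gigabytes, as printed on hard-drive packaging, to an exact byte count:

```python
# Hypothetical helper: convert a capacity quoted in decimal gigabytes
# (the convention used by drive manufacturers) to an exact byte count.
def decimal_gb_to_bytes(gb: float) -> int:
    return int(gb * 1000 ** 3)

print(decimal_gb_to_bytes(500))   # 500000000000 bytes on a "500 GB" drive
```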
Base 2 definition
- 1 GiB = 1073741824 bytes (= 1024³ B = 2³⁰ B) is the definition used by Microsoft Windows in reference to computer memory (e.g., RAM). This definition is synonymous with the unambiguous IEC standard name gibibyte.
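The gap between the two definitions explains a familiar effect: a drive marketed as 500 GB (decimal) is reported as roughly 465 GiB by software that uses the binary definition. A small illustrative Python calculation, assuming the marketed figure is exact:

```python
# Sketch: how a drive marketed as "500 GB" (decimal) appears when the
# same byte count is expressed in binary units (GiB).
raw_bytes = 500 * 1000 ** 3        # capacity as marketed: 500 decimal GB
in_gib = raw_bytes / 1024 ** 3     # the same bytes expressed in GiB
print(f"{in_gib:.2f} GiB")         # 465.66 GiB
```

No bytes are lost in this conversion; the same quantity is simply being divided by 2³⁰ instead of 10⁹.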