Today, I learned that:
The common expression “gigabyte”, so often used today when talking about quantities of data, does not always mean the same number of bytes. Here is the story:
Some time ago, I discovered that a new type of USB flash drive (‘pen drive’) was on the market: the handy little memory device that lets one move chunks of data from one computer to another. This particular device is especially suited for transferring data between a desktop/laptop computer and a smartphone, because it has two different connectors: a regular USB-A male connector on one end and a micro-USB AB receptacle on the other. I decided to buy the smallest one, specified at 16 GB of memory.
When it arrived, before even saving any file onto the drive, I investigated whether it really could hold 16 GB as specified. Well, it could not! My Mac showed that the maximum capacity, when formatted with the Extended File Allocation Table (exFAT) file system (which is compatible with OS X, Windows and Android), was 15 549 300 736 bytes, reported by the Mac as 15.55 GB. In Windows, it was even less: 14.4 GB. So what explains this difference of more than 1 GB?
The reason is that Apple uses the decimal definition of the gigabyte, where 1 gigabyte (GB) = 1000 × 1000 × 1000 bytes = 1 000 000 000 bytes, while Microsoft uses the binary definition, where 1 gigabyte (more correctly, 1 gibibyte, GiB) = 1024 × 1024 × 1024 bytes = 1 073 741 824 bytes.
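The two reported capacities are easy to reproduce with a few lines of Python. This is just a sketch of the arithmetic; the byte count is the one my Mac reported for the exFAT-formatted drive, and the exact rounding Windows applies when displaying the figure may differ:

```python
# Capacity of the drive in bytes, as reported by the Mac
capacity_bytes = 15_549_300_736

GB = 1000 ** 3    # decimal gigabyte (SI): 1 000 000 000 bytes
GiB = 1024 ** 3   # binary gibibyte (IEC): 1 073 741 824 bytes

print(f"{capacity_bytes / GB:.2f} GB")    # decimal figure, as on the Mac
print(f"{capacity_bytes / GiB:.2f} GiB")  # binary figure, as in Windows (labelled 'GB')
```

This prints 15.55 GB and 14.48 GiB: the same number of bytes, two different “gigabytes”. Windows apparently shows 14.4 rather than 14.48, presumably displaying only one decimal place.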
The recommended international standard, in line with the International System of Units (SI), is to use the prefixes kilo, mega, giga, etc. only for powers of 1000, and the binary prefixes kibi, mebi, gibi, etc. for powers of 1024.
The word gibi is also used in Brazil to denote a comics magazine, in Swedish called ‘serietidning’.
… That’s what I learned in school!