Actually, that's the thing... there is no single "true" definition, because half the industry uses one convention while the other half uses another. The decimal (SI) definition of a gigabyte is one billion bytes (10^9), but because computers address memory in powers of two, the binary convention makes a gigabyte 2^30 bytes (1,073,741,824). That's why the IEEE adopted the IEEE 1541 standard, which gives the base-2 sizes their own names: 2^30 bytes, commonly called a gigabyte, becomes a gibibyte (abbreviated "GiB"). The rest is outlined in the link.
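To see how much the two conventions actually diverge, here's a quick Python sketch (the "500 GB" drive size is just an illustrative number, not from any particular product):

    # Decimal vs. binary size conventions.
    GB = 10**9    # decimal gigabyte (SI): 1,000,000,000 bytes
    GiB = 2**30   # binary gibibyte (IEEE 1541): 1,073,741,824 bytes

    drive_bytes = 500 * GB  # a drive marketed as "500 GB" (illustrative)

    print(f"1 GiB / 1 GB = {GiB / GB:.4f}")               # ~1.0737, about a 7.4% gap
    print(f"500 GB drive = {drive_bytes / GiB:.1f} GiB")  # ~465.7 GiB

That roughly 7% gap is exactly why a drive sold as "500 GB" shows up as about 465 GB in an operating system that counts in base 2: both numbers describe the same number of bytes, just under different conventions.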
Incidentally, this is the result you get if you look up the definition of gigabyte at www.dictionary.com: the definitions contradict each other. Until this is resolved, people will continue to be confused.