Correct me if I’m wrong, but I believe the difference lies in the way the size is reported. Computers use binary, people use decimal.

Traditionally, computer people used binary. Since the equipment they were working with used binary, it was easier to understand if they used the same number system.

In binary 1 Kilobyte = 2^10 bytes = 1024 bytes.

In binary 1 Megabyte = 2^20 bytes = 1,048,576 bytes.

In binary 1 Gigabyte = 2^30 bytes = 1,073,741,824 bytes.

Somewhere along the line someone must have decided to start rating their products using decimal rather than binary.

In decimal 1 Kilobyte = 10^3 bytes = 1,000 bytes.

In decimal 1 Megabyte = 10^6 bytes = 1,000,000 bytes.

In decimal 1 Gigabyte = 10^9 bytes = 1,000,000,000 bytes.
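The two unit systems above can be written out as a quick Python sketch (just to illustrate the arithmetic, the constant names are my own):

```python
# Binary (power-of-two) units, as computers traditionally count them.
KILO_BIN = 2**10   # 1,024 bytes
MEGA_BIN = 2**20   # 1,048,576 bytes
GIGA_BIN = 2**30   # 1,073,741,824 bytes

# Decimal (power-of-ten) units, as products are labeled.
KILO_DEC = 10**3   # 1,000 bytes
MEGA_DEC = 10**6   # 1,000,000 bytes
GIGA_DEC = 10**9   # 1,000,000,000 bytes

# The gap per "Gigabyte" is what causes all the confusion below.
print(GIGA_BIN - GIGA_DEC)  # 73741824 bytes of difference
```

Notice the gap grows with each prefix: about 2.4% at the Kilo level, 4.9% at Mega, and 7.4% at Giga.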

So a 250 Gigabyte drive, rated in decimal, is about 232.8 Gigabytes in binary.

(or: the drive the salesman sold you as 250 Gig is reported as 232 by Windows)

The math:

250 * 10^9 = 250,000,000,000 bytes decimal.

1 Gig binary = 1,073,741,824 bytes.

In binary bytes, that new 250 Gig drive is 250,000,000,000 / 1,073,741,824 = about 232.83 Gig as seen by the computer you purchased the drive for.
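That division is easy to check yourself (a quick Python sketch of the math above):

```python
# A "250 GB" drive as sold (decimal) vs. as the OS reports it (binary).
bytes_on_drive = 250 * 10**9            # 250,000,000,000 bytes
binary_gig = 2**30                      # 1,073,741,824 bytes

print(round(bytes_on_drive / binary_gig, 2))  # 232.83
```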

The same goes for that 4.7 Gigabyte DVD-R you bought to store binary data on.

4.7 Gigabytes = 4,700,000,000 bytes in decimal.

Or 4,700,000,000 / 1,073,741,824 = about 4.3772 actual computer Gigabytes. Hence, your burning program reports the capacity as 4.38 Gig rather than 4.7 Gig.
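Same arithmetic, sketched out:

```python
# The "4.7 GB" DVD-R: decimal rating vs. binary capacity as the burner sees it.
dvd_bytes = 4.7 * 10**9                 # 4,700,000,000 bytes

print(round(dvd_bytes / 2**30, 4))      # 4.3772
print(round(dvd_bytes / 2**30, 2))      # 4.38 -- what the burning program shows
```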

I think all floppy disks are still rated in binary bytes. And optical disks are rated in binary up through the 700 MB CDRs, which are actually 80-minute discs, and I could be wrong, but isn’t that why they really hold about 702 MB?
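The 80-minute figure can be checked with a rough sketch. Assuming a Mode-1 data CD (75 sectors per second, 2,048 data bytes per sector; those sector figures are my assumption, not from the post), the numbers land close to the capacity quoted above:

```python
# 80-minute CD as Mode-1 data.
# Assumed: 75 sectors per second, 2048 data bytes per sector.
cd_bytes = 80 * 60 * 75 * 2048          # 737,280,000 bytes

print(cd_bytes / 2**20)                 # binary megabytes
```

That comes out to just over 703 binary MB, in the same ballpark as the roughly 702 MB reported for an 80-minute disc.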

I’m not sure at what point hard drives started being rated in decimal rather than binary. I do know my own 40 meg drive really was 40 binary megs.

So why the switch to decimal? To make it easier for folks to understand how many bytes their disk can hold? Probably the official explanation.

My more cynical explanation is that it was simply a marketing strategy: it makes the capacity seem greater. For instance, say two manufacturers have identical-capacity drives (or optical disks) on the shelf. One is rated in decimal as 4.7 Gig, the other in binary as 4.38 Gig. Which will outsell the other? Or a 250 Gig decimal hard drive versus a 232 Gig binary one?

Myself, if I knew they were the same, I would consider the manufacturer that labeled their media in binary to be more honest. But they can’t do that and compete. Once one manufacturer does it, they all must.