It would be one thing if there were no overlap between computer and scientific
uses, but that just isn't so. For example, the PCI bus can burst transfer at a
theoretical maximum frequency of 33.0 MHz times 4 bytes/transfer. So you would
think that this totals 132.0 MB/second, and indeed that's the way it is usually
reported. But that MB is million, not 2^20. If we insist on MB meaning 2^20,
we only get 125.885 MB/second. A similar problem occurs in other cases where
mega is used in a frequency or 1/time context, such as Fast SCSI-2 being 10.0
mega-transfers/second at one byte per transfer. With MB = 2^20, Fast SCSI-2 is
only good for 9.54 MB/second. We can't just move "mega" around without thinking in these
contexts.
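Here's a quick Python sketch of that arithmetic, just to make the two readings
of "mega" explicit (the figures are the ones above; the rates() helper is
purely illustrative):

    # "mega" as used for frequencies (10^6) vs. as used for memory (2^20).
    MB_DECIMAL = 10**6
    MB_BINARY = 2**20

    def rates(transfers_per_sec, bytes_per_transfer):
        # Return (decimal MB/s, binary MB/s) for a given transfer rate.
        bytes_per_sec = transfers_per_sec * bytes_per_transfer
        return bytes_per_sec / MB_DECIMAL, bytes_per_sec / MB_BINARY

    # PCI burst: 33.0 MHz x 4 bytes/transfer -> 132.000 vs 125.885
    print("PCI burst  : %.3f vs %.3f MB/s" % rates(33.0e6, 4))
    # Fast SCSI-2: 10.0 mega-transfers/s x 1 byte -> 10.000 vs 9.537
    print("Fast SCSI-2: %.3f vs %.3f MB/s" % rates(10.0e6, 1))

Either way the gap is just the factor 2^20 / 10^6 = 1.048576, a bit under 5%.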
Since there's no truly consistent way to view this, I don't really care that
much which way it's reported, so long as a footnote clearly specifies which
meaning pertains. Because I've been working with SCSI a lot lately, I'm just
as happy with MB = million for storage devices, since it keeps the time/space
computation easy.
Leonard