I know that a byte is 8 bits, but I glanced at something somewhere saying that the smallest size the Mac can process is 16 bits <shrug> Maybe I'm wrong?
You mean 16 bits in one "Byte"? At least that contradicts every definition I have ever seen in a textbook.
I have an idea: perhaps there could be platform-dependent types, for example:

    VAR MyVar : PCByte;

This way it would be treated as 8 bits on all platforms. The natural types can stay; all one has to do is use the platform type to compensate where necessary. That way, 1) the data works consistently across all platforms, because everyone knows that a PCByte is 8 bits, and 2) if a byte is a different size on another platform, it doesn't matter, because a PCByte is the same no matter where it's compiled. I guess this could be accomplished by making PCByte a natural Byte, but ignoring anything above the low 8 bits in the code.
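For what it's worth, something close to this already seems expressible in GPC itself. Here is a minimal sketch, assuming GPC's size-specified integer extension; the name PCByte is just the hypothetical one from above, and Cardinal (8) is GPC-specific syntax, so treat this as an illustration rather than a portable guarantee:

    program PCByteDemo;

    { Hypothetical PCByte from the suggestion above: an unsigned type
      that is 8 bits wide everywhere, declared via GPC's extension
      Cardinal (n), where n is the width in bits. }
    type
      PCByte = Cardinal (8);

    var
      MyVar: PCByte;

    begin
      MyVar := 255;  { largest value that fits in 8 unsigned bits }
      WriteLn ('SizeOf (MyVar) = ', SizeOf (MyVar))  { expect 1 }
    end.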
It would not be difficult to implement a compiler switch (*$16-bit *) (or ...). For GPC, though, it should not be 16 bits, but 64 bits!
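On that point, GPC's size-specified types already let you ask for a 64-bit integer explicitly instead of relying on whatever Integer happens to be on a given machine. A minimal sketch, again assuming the Integer (n) extension (the program and type names here are made up for illustration):

    program WideDemo;

    { GPC extension: Integer (n) requests a signed integer of at
      least n bits, so a 64-bit value can be declared explicitly. }
    type
      Int64Bit = Integer (64);

    var
      Big: Int64Bit;

    begin
      Big := 1;
      Big := Big shl 40;  { needs more than 32 bits to hold 2^40 }
      WriteLn ('2^40 = ', Big)
    end.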
-- See ya! Orlando Llanes