Frank Heckenbach wrote:
Prof A Olowofoyeku (The African Chief) wrote:
I have built gpc-20040516 for AMD64 (platform: Fedora Core 2, 64-bit).
Binaries can be obtained here: http://gnu-pascal.de/contrib/chief/fedora-2/
Thanks.
Built and tested on an Athlon-64 3.2+. The testsuite ran fine, except for these:

TEST az20.pas: ./test_run: line 334: 4929 Segmentation fault ./"$A_OUT" "$1"
TEST confarr5.pas: ./test_run: line 334: 9516 Segmentation fault ./"$A_OUT" "$1"
TEST fjf746.pas: ./test_run: line 334: 14662 Segmentation fault ./"$A_OUT" "$1"
TEST martin3.pas: ./test_run: line 334: 27275 Segmentation fault ./"$A_OUT" "$1"
TEST pack10.pas: ./test_run: line 334: 30277 Segmentation fault ./"$A_OUT" "$1"
TEST pack5.pas: ./test_run: line 334: 30396 Segmentation fault ./"$A_OUT" "$1"
These are all known problems with packed arrays on certain platforms, AFAIK.
One curious thing. "Writeln (sizeof (Integer))" prints "4", as does "Writeln (sizeof (Cardinal))". Shouldn't these be 64-bit?
GPC follows the C model here. Apparently `int' is 32 bit, and `long int' (`MedInt' in GPC) is probably 64 bits.
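A quick way to see the mapping on a given target is something like the following sketch (MedInt is a GPC extension, and the numbers printed of course depend on the ABI; under the usual x86-64 C model I'd expect 4, 4 and 8):

  program TypeSizes (Output);
  begin
    { sizes in bytes; `MedInt' is GPC's counterpart of C `long int' }
    WriteLn ('SizeOf (Integer)  = ', SizeOf (Integer));
    WriteLn ('SizeOf (Cardinal) = ', SizeOf (Cardinal));
    WriteLn ('SizeOf (MedInt)   = ', SizeOf (MedInt));
    WriteLn ('MaxInt            = ', MaxInt)
  end.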
I'm not sure whether this is optimal (I don't know whether this and other 64-bit processors can do 32-bit operations efficiently), but deviating from the C types would probably cause more problems as far as C interfaces are concerned ... :-/
Frank
Regarding efficiency: in AMD64 long mode, 32-bit operands are the default operand size; it is 64-bit operations that need a REX prefix, so the instruction-length penalty falls on full-width arithmetic rather than on 32-bit operations.
As for the idea of an integer type stuck below the natural word size (a "mudstuck integer"), it would not be the first time; plenty of x86 compilers left integer at 16 bits after the architecture moved to 32 bits.
In any case, the ISO 7185 standard is clear: the range of integer is -maxint to maxint, and all calculations shall occur within that range; i.e., integer is the maximum-range type on any system. A program is supposed to use maxint to adjust its behaviour as required, or to declare a specific subrange, perhaps so that the compiler can issue a compile-time message if the range is larger than the target can handle.
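Roughly, the standard-conforming ways of dealing with this look like the sketch below (the bound 1000000000 is just an arbitrary example):

  program RangeCheck (Output);
  type
    { an explicit subrange; a compiler for a small target can reject or
      at least flag this declaration at compile time }
    Big = -1000000000..1000000000;
  var
    b: Big;
  begin
    { alternatively, the program adjusts its behaviour via maxint }
    if MaxInt >= 1000000000 then
      WriteLn ('wide integer range, maxint = ', MaxInt)
    else
      WriteLn ('narrow integer range, maxint = ', MaxInt);
    b := 123456789;
    WriteLn (b)
  end.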
By the way, the AMD64 ABI (the calling and sizing convention that gcc follows), which requires that int remain 32 bits, violates the C standard as well, since C expects a plain int to have the natural size suggested by the architecture.