[wplug] Linux vs OS X version of GCC

Logan ascii at psu.edu
Thu Apr 20 01:30:35 EDT 2006


For class, I've been using my OS X laptop to do a bunch of assignments, but a recent one hit a segmentation fault when the professor ran it on the Linux server. We had to modify that lab, so I tracked down the error; here's the offending line:

args[arg_count][strlen(args[arg_count++])] = 0; // null terminate (with a self-referential strlen())
	/* ok, here was the bug in Mac vs Linux:
	 * I had "args[arg_count++][strlen(args[arg_count])] = 0;"
	 * You can see the change above; apparently Linux increments my
	 * arg_count before getting that second [], and Mac gets both [][]
	 * before incrementing arg_count.
	 * Either that, or it's gcc version 4.0.0 vs version 3.2.3.
	 * Thus Linux segfaults when we try to get strlen(args[invalid]).
	 * I thought C was supposed to be portable. Who's correct here?
	 */
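
Here's a stripped-down test case that isolates the construct, in case anyone wants to try it against their own gcc. It's a sketch, not the actual lab code; the two placeholder strings have different lengths so the output shows which order a given compiler picked:

#include <stdio.h>
#include <string.h>

int main(void)
{
	char buf0[] = "hello";
	char buf1[] = "hi";
	char *args[] = { buf0, buf1 };
	int arg_count = 0;

	/* Same shape as the lab line: arg_count is incremented in one
	 * subexpression and read in another.  If the second subscript
	 * sees the old value (0), strlen(args[0]) is 5 and "hello" is
	 * unchanged; if it sees the new value (1), strlen(args[1]) is 2
	 * and args[0] gets truncated to "he". */
	args[arg_count++][strlen(args[arg_count])] = 0;

	printf("args[0] = \"%s\"\n", args[0]);
	return 0;
}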

So which is correct? Or is this an ambiguity in C that I should just avoid?
The code above works in both versions of gcc, so I suppose if I just increment in the second [] it's "portable enough" for class ;-)
I'd still like to know about the first-argument version, though.
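
From what I can tell, the underlying issue is that arg_count is modified and also read elsewhere in the same expression, with no sequence point in between, which C leaves undefined, so either compiler would be "correct". The boring two-statement version sidesteps the question entirely; it should be a drop-in replacement for the marked line in the sketch above:

	/* Null-terminate first, then advance the counter.  arg_count is
	 * only read in the first statement, so there is nothing left for
	 * the compiler to order. */
	args[arg_count][strlen(args[arg_count])] = 0;
	arg_count++;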

The Linux server runs gcc version 3.2.3 20030502 (Red Hat Linux 3.2.3-54).
My Mac runs gcc version 4.0.0 20041026 (Apple Computer, Inc. build 4061); I believe it once warned me that this version generates broken code on occasion.


