[wplug] Linux vs OS X version of GCC

Eric Cooper ecc at cmu.edu
Thu Apr 20 12:19:15 EDT 2006


On Thu, Apr 20, 2006 at 08:04:01AM -0400, Jonathan S. Billings wrote:
> Logan wrote:
> >For class, I've been using my OS X laptop to do a bunch of assignments, 
> >but a recent one had a Segmentation Fault when the professor ran it on 
> >the Linux server. We had to modify that lab, so I found the error, 
> >here's the offending line:
> >
> >args[arg_count][strlen(args[arg_count++])] = 0;//null terminate (with a 
> >self-referential strlen())
> >    /* ok, here was the bug in Mac vs linux
> >    *  I had "args[arg_count++][strlen(args[arg_count])] = 0;"
> >    *  You can see the change above, apparently Linux increments my 
> >arg_count before getting that second []
> >    *  and mac gets both [][] before incrementing arg_count.
> >    *  Either that, or it's gcc version 4.0.0 vs version 3.2.3.
> >    *  Thus Linux segfaults when we try to get strlen(args[invalid]).
> >    *  I thought C was supposed to be portable. Who's correct here?
> >    */

The order of evaluation here is deliberately left unspecified in
programming language semantics, so that compilers have more freedom
in the code they generate.

The first online version of a C standard I could find via Google is
this draft of the C99 standard:
http://www.open-std.org/jtc1/sc22/wg14/www/docs/n1124.pdf

In a footnote to section 6.5, "Expressions", it points out that
statements such as
    i = ++i + 1;
    a[i++] = i;
are undefined: they modify a variable and also read (or modify) it
again in the same expression, with no sequence point in between. The
quoted line does the same thing with arg_count.

So if you want portable code, use explicit assignments to intermediate
variables to disambiguate these kinds of constructs.
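
For example, here is one way to rewrite the quoted line, as a sketch
using its names (and assuming args is an array of char pointers and
arg_count is an integer index):

    char *arg = args[arg_count];    /* read arg_count exactly once     */
    arg[strlen(arg)] = 0;           /* null-terminate, as before       */
    arg_count++;                    /* increment in its own statement  */

Now the result no longer depends on the compiler's order of
evaluation. If I remember right, recent GCC versions will also warn
about the original construct when you compile with -Wall (via
-Wsequence-point), though I wouldn't count on it catching every case.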

-- 
Eric Cooper             e c c @ c m u . e d u

