Re: Formal Semantics for C

In an earlier message, Corky Cartwright wrote:

> All of the "semantics" that I have seen for C cheat on the issue of
> unsafe operations.  Either they codify the behavior of a particular
> C implementation (including a complete machine-level memory model)
> or they punt when an unsafe operation is performed.
> Compiler optimization researchers perform non-trivial analyses of C
> programs (e.g., a pointer alias analysis) by ducking the issue of
> unsafe operations: they simply assume that unsafe operations are
> never executed.

And this is a perfectly reasonable assumption; they don't have to
preserve any particular behaviour when a program is undefined, so they
need only check that their optimisation satisfies the following

  program P doesn't exhibit undefined behaviour

      ==>

  (behaviour(P) = behaviour(optimised(P)))

This is a strong and very useful assumption, but I don't think it
constitutes "ducking" the issue.  

A concrete, though contrived, example of the sort of optimisation
this makes possible is turning

    int f(int y, int z) {
       int x = y / z;
       if (z != 0) { S } else { T }
       return E;
    }

into

    int f(int y, int z) {
       int x = y / z;
       return E;
    }

The test of z is redundant: the division y / z has already been
evaluated by the time the test runs, so any execution that reaches it
with z == 0 is already undefined.  The standard explicitly allows
optimisers to do this, because the undefined behaviour that might
arise from dividing by zero could just as well include the behaviour
of the new program.

The various formal semantics for C would give you a basis for proving
the implication above for this and other examples.