
Re: [very] basic question

It's been a few years, but here we go:

In the realm of formal semantics, types are sets with a bit of
structure.  A set is a collection of objects, period.  To make
programming language semantics work out (e.g., to ensure the existence
of fixed points), types are required to be sets with an ordering
relation that satisfies certain properties.  Such a set, together with
an ordering relation satisfying the necessary properties (see below),
is usually called a Domain.

The relation is often called something like is-less-defined-than and is
typically written with a less-than-or-equal-to operator, often with the
less-than symbol squared off (though sometimes confusingly written like
the subset symbol).  I'll write <= in ASCII.  Like
less-than-or-equal-to, the relation is reflexive, antisymmetric, and
transitive.  This relation is NOT less-than.  For example, 3 and 4 are
both elements of the INT type (integers).  For type theorists, 3 and 4
are not comparable (neither is <= the other).  Likewise, TRUE and FALSE
are elements of BOOLEAN and are incomparable.
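
If a concrete picture helps, here is a rough Haskell sketch (my own
names, nothing standard): the integers with an explicit Bottom element
(Bottom is introduced properly just below), together with the
is-less-defined-than relation:

    data FlatInt = Bottom | I Integer
      deriving (Eq, Show)

    -- leq x y  means  "x is no more defined than y"
    leq :: FlatInt -> FlatInt -> Bool
    leq Bottom _ = True        -- Bottom is below everything
    leq x      y = x == y      -- otherwise only reflexivity holds

    -- leq (I 3) (I 4) and leq (I 4) (I 3) are both False:
    -- 3 and 4 are incomparable, just as described above.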

Exactly what properties are required of the sets and ordering relation
is a subject of some research and the object of much ingenuity in the
world of programming language semantics.  Some early semantics required
that domains be complete lattices, but we know that if all types are
CPOs (complete partial orders), then things work very neatly.  A CPO is
a set and ordering relation such that every chain of elements has a
least upper bound in the domain (and, for our purposes, the domain also
has a least element).  For simple sets like the integers, we can make a
domain, like INT, by

 1. defining the partial order so that all the elements of the set
    (the integers) are pairwise incomparable, and
 2. adding a new element called Bottom (written like the `is
    perpendicular to' symbol, an upside-down T) that is defined to be
    less-defined-than all the other elements.

Bottom represents the `undefined integer' if you like, and is the
semantic value of a non-terminating computation.
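
In a lazy language like Haskell you can watch this happen: a definition
that never terminates denotes Bottom, and whether a program hangs
depends on whether it ever demands that value.  A tiny sketch (loop is
my name for it):

    loop :: Integer
    loop = loop            -- never terminates; its denotation is Bottom

    -- fst (42, loop)  evaluates to 42: the second component is never
    -- demanded, so its Bottom-ness never matters.
    -- loop + 1        never terminates: (+) demands the value of loop,
    -- and that value is Bottom.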

The structure of the domain of functions becomes very interesting and is
a long topic all its own.

In fact, you don't need CPOs to make sure fixed points always exist and
are well-defined, but the idea is always the same: whenever there is a
possibility of taking a fixed point of something, the chain of
approximations built from the set must have a least upper bound,
otherwise the fixed point does not exist.  A CPO makes sure every such
chain in *any* domain has a LUB (least upper bound), so it is easy to
see that if domains are CPOs, we win.  Less restrictive structures take
advantage of the fact that a programming language's semantics never
needs the LUB of an arbitrary subset of a type, only of certain chains
of values.  In the early days of denotational semantics, researchers
required domains to be complete lattices (every subset of elements has
a least upper bound and a greatest lower bound).
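
The usual construction behind all of this is Kleene's: the least fixed
point of a (continuous) function f is the least upper bound of the
chain Bottom <= f(Bottom) <= f(f(Bottom)) <= ...  In Haskell the fixed
point operator is a one-liner, and recursion can be rebuilt from it
(fix also lives in Data.Function; factStep is my own name here):

    fix :: (a -> a) -> a
    fix f = f (fix f)          -- least fixed point of f

    -- Recursion as a fixed point: factorial without naming itself.
    factStep :: (Integer -> Integer) -> (Integer -> Integer)
    factStep rec n = if n == 0 then 1 else n * rec (n - 1)

    factorial :: Integer -> Integer
    factorial = fix factStep   -- factorial 5 ==> 120

    -- Semantically, factorial is the least upper bound of the chain
    --   Bottom <= factStep Bottom <= factStep (factStep Bottom) <= ...
    -- that is, the functions defined on no inputs, then on {0}, then
    -- on {0,1}, and so on.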

At this point my memory is fuzzy, and what I know is probably out of
date (people have figured out less restrictive mathematical models that
still guarantee fixed points).  And this may be more than you want to
know anyway :-)

-Mark