Lambda calculus w/theory of computation

This question is a bit off topic, but I hope it is of sufficient 
interest [because of lambda calculus] to this group anyway.

I will be teaching our Theory of Computation course this coming fall, 
again.  The topics are the standard: regular sets and finite automata, 
context-free languages and push-down automata, and Turing machines and 
undecidability.

I personally find Turing machines to be a very unsatisfying model of 
computation, and would prefer to rely on other models.  My favorites 
would be lambda calculus, RAMs, and imperative language models (e.g., 
the LOOP language).  My belief is that students would find the three of 
these easier to relate to than Turing machines.  However, I have been 
unable to find a book for our undergraduate (junior-level) course that 
covers these models.  Occasionally, a book will cover the RAM model as a 
side-light, but I have been unable to find a book, for example, that 
covers lambda calculus.
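As a small aside (not from any particular textbook), one reason the lambda calculus can feel more approachable to students than Turing machines is that its core ideas can be demonstrated directly in a language they already use. The sketch below, in Python for illustration, encodes the Church numerals and addition as plain functions; all names here are my own choices, not from a specific course or text.

```python
# Church numerals: the number n is represented as a function that
# applies f to x exactly n times.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

# Addition: (m + n) applies f m times on top of n applications.
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

# Decode a Church numeral into an ordinary Python int for inspection.
def to_int(n):
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # prints 5
```

Students can experiment with such definitions interactively, which makes the claim that "computation is just function application" concrete in a way a tape-and-head machine rarely is.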

Does anyone have any suggestions for books (or even detailed lecture 
notes) that cover the basic topics relating to languages (regular and 
context-free grammars and machines) and undecidability as well as a 
less "traditional" approach to computability?  [Though of course the 
lambda calculus was contemporaneous with - or even earlier than - Turing 
machines.]  Reply directly to me and I will summarize later if there is 
sufficient interest.

	Kim Bruce
	Williams College