Brief sketches of the courses are included below. For more details come by to see me or visit Penn's Blackboard site.

A basic appreciation of probabilistic ideas is essential in any graduate program in engineering and the sciences. This course is intended to provide graduate students who have had little or no exposure to probability theory with a broad introduction to the fundamental ideas. Students with adequate undergraduate preparation in probability are encouraged to seek out the more challenging course ESE 530 instead.

This course covers basic principles of probability: discrete and continuous probability spaces; combinatorial probabilities; conditional probability and independence; Bayes' rule and the theorem of total probability; arithmetic and lattice distributions; the Bernoulli scheme: binomial, Poisson, geometric, negative binomial, hypergeometric, and multinomial distributions; continuous distributions: densities related to the uniform, exponential, and normal; random variables and vectors; distribution functions, probability density functions, and probability mass functions; independent random variables; measures of central tendency (mean, median, mode); mathematical expectation; conditional expectation; moment generating functions and characteristic functions; tail inequalities (Markov, Chebyshev, Chernoff); the law of large numbers; the central limit theorem; recurrent events; Markov chains.
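As a small illustration of two of the topics above, Bayes' rule and the theorem of total probability, consider the classic diagnostic-test computation. This sketch is not part of the course materials, and all of the numbers in it are invented for the example.

```python
# Hypothetical diagnostic test: all probabilities below are invented
# purely to illustrate the two formulas.
p_disease = 0.01            # prior P(D)
p_pos_given_disease = 0.95  # sensitivity P(+ | D)
p_pos_given_healthy = 0.05  # false-positive rate P(+ | not D)

# Theorem of total probability:
#   P(+) = P(+ | D) P(D) + P(+ | not D) P(not D)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' rule:
#   P(D | +) = P(+ | D) P(D) / P(+)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos

print(round(p_disease_given_pos, 3))  # → 0.161
```

Even with a sensitive test, a positive result on a rare condition leaves the posterior probability small, a point the conditional-probability unit of the course makes precise.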

This rapidly moving graduate course provides a largely non-measure-theoretic but rigorous development of fundamental ideas in probability theory and random processes. This course is a prerequisite for subsequent courses in communication theory and telecommunications such as ESE 576 and Tcom 501. The course is also suitable for students seeking a broad graduate-level exposure to probabilistic ideas and principles with applications in diverse settings.

The menu from which course topics are selected includes: discrete and continuous probability spaces; combinatorial probabilities; conditional probability and independence; Bayes' rule and the theorem of total probability; the Poisson paradigm: the principle of inclusion and exclusion, Bonferroni's inequalities, the Lovasz local lemma, and Janson's inequalities; arithmetic and lattice distributions: the Bernoulli scheme and distributions related to the binomial and the Poisson; the de Moivre-Laplace limit theorem; continuous distributions: the uniform, exponential, normal, and related densities; order statistics and limit theorems; what is randomness? congruential random number generators and tests for randomness; random variables and vectors; distribution functions, probability density functions, and probability mass functions; independent random variables; Borel's normal law and unexpected connections with number theory; measures of central tendency (mean, median, mode); mathematical expectation, from naive expectation to the Lebesgue theory; moments; conditional expectation; moment generating functions and characteristic functions; tail inequalities (Markov, Chebyshev, Chernoff); the weak and strong laws of large numbers; probability sieves, auxiliary randomisation, and the Stein-Chen method; the central limit theorem; random processes; Gaussian and Poisson processes; stationarity and ergodicity; correlation functions; spectral densities; filtered random processes; bandlimited processes and the sampling theorem.
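To give one concrete taste of the "congruential random number generators and tests for randomness" topic on the menu, here is a minimal linear congruential generator together with a crude equidistribution check. This is a sketch only; the constants are the well-known Numerical Recipes choices, and the course may of course use different generators and far sharper statistical tests.

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """A linear congruential generator: x_{n+1} = (a x_n + c) mod m.

    Yields an endless stream of pseudo-random integers in [0, m).
    Constants are the classic Numerical Recipes parameters.
    """
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

# A crude test for randomness: after scaling to [0, 1), the sample
# mean should be near 1/2 if the generator equidistributes well.
gen = lcg(seed=42)
n = 10_000
mean = sum(next(gen) / 2**32 for _ in range(n)) / n
print(mean)
```

The sample mean lands close to 1/2, though passing this one test is of course far from a proof of randomness, which is precisely the point of the "what is randomness?" discussion.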

A solid foundation in undergraduate probability at the level of Stat 430 or Sys 301 is required. Undergraduates will need to see the instructor for permission to register.

The theory of *information* made
its debut in Claude Shannon's epochal paper in 1948. The theory
revolutionised how we view the world and is central to the information
revolution of the late 20th century.

The central concept in information theory is that of entropy as a mathematical measure of information. The theory is rich and has applications to a variety of areas including reliable communication, complexity theory, statistical physics, learning theory, universal gambling and investment, hypothesis testing, data compression, disordered systems, and large deviations in chance experiments. We will focus on statistical communication as a major theme and, in particular, explore the ramifications of the theory of information for reliable communication. In several digressions, however, we will also explore the bearing of information-theoretic ideas on diverse fields including Kolmogorov complexity, competitively optimal data compression, the horse race and investment strategies, and hypothesis testing.
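As a brief illustration of entropy as a measure of information (a sketch, not part of the course materials), consider the Shannon entropy of a coin toss, measured in bits:

```python
from math import log2

def entropy(probs):
    """Shannon entropy H = -sum p log2(p), skipping zero-probability terms."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of information per toss...
print(entropy([0.5, 0.5]))  # → 1.0

# ...while a biased coin carries strictly less, since its outcome
# is partly predictable.
print(entropy([0.9, 0.1]))
```

The fact that predictability reduces entropy is the germ of data compression: sources with low entropy admit shorter descriptions on average.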

A solid foundation in probability theory is required; ENM 503, ESE 530, or Stat 530 will fit the bill.