CIS 515, Fall 2020
Fundamentals of Linear Algebra and Optimization
August 3, 2020 (page under construction)
Instructor: Jean H.
Gallier, Levine 476, 8-4405, firstname.lastname@example.org
Jean: zoom link and time TBA
**Welcome to CIS 515!**
Since a lot of material for the fully online version of this course,
MCIT 515, is available online, I plan to make use of this material,
supplemented by extra slides. Consequently, I plan to cover
substantially more material
this Fall 2020
than I used to cover in the past. In particular,
I will cover some elements of optimization theory
(the Lagrangian framework, ADMM) and some
topics from machine learning:
- Hard Margin Support Vector Machines (SVM)
- Soft Margin Support Vector Machines (SVM)
- Solving Hard Margin SVM and Soft Margin SVM using ADMM
- Linear Regression; Learning a Linear or an Affine Function
- Solving Lasso Regression using ADMM
- Solving Elastic Net using ADMM
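As a small taste of the last few topics above, here is a minimal sketch of solving the Lasso with ADMM. This is illustrative only, not the course's official code (the course projects use Matlab); the function names, step size `rho`, and the toy data are all assumptions made for the example.

```python
import numpy as np

def soft_threshold(v, k):
    # Proximal operator of the l1 norm: shrink each entry toward 0 by k.
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def lasso_admm(A, b, lam, rho=1.0, iters=200):
    # Minimize (1/2)||Ax - b||^2 + lam*||x||_1 via ADMM.
    n = A.shape[1]
    z = np.zeros(n)
    u = np.zeros(n)
    # Precompute the inverse for the x-update (fine for small n).
    M = np.linalg.inv(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(iters):
        x = M @ (Atb + rho * (z - u))          # x-update: ridge-like solve
        z = soft_threshold(x + u, lam / rho)   # z-update: soft thresholding
        u = u + x - z                          # dual (scaled multiplier) update
    return z

# Tiny demo: recover a sparse vector from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
x_true = np.zeros(10)
x_true[[1, 4]] = [3.0, -2.0]
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = lasso_admm(A, b, lam=0.5)
```

The three updates are the standard ADMM splitting of the Lasso objective: a smooth least-squares piece handled by a linear solve, a nonsmooth l1 piece handled by its proximal operator, and a running dual variable tying them together.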
This course will be fully taught online.
In order to increase the level of interaction between
the students and the instructor(s), I propose the following format:
- Online lessons will be available in advance.
- Every student is expected to listen to
recorded lessons and read the corresponding material
in the book before every class.
- A list of the material to be listened to and read
will be available on this web page a week before
the actual lecture.
During lecture time, I intend to:
- Answer questions about the material presented online
for the lesson.
- Occasionally present important proofs.
- In general, attempt to motivate, demystify, and
put in context the material of the lesson.
- Give an idea of how to apply the material to solve
the homework problems.
Typically, I will not
lecture during class time, although I may occasionally use some time
to do this.
Consequently, a heavier burden will be
placed on the student to listen to and read the lessons
in order to keep up with the course.
On the other hand, you will have greater flexibility in deciding
when to listen to and read the lessons in preparation for the
actual class, which, I hope, will be more of an interactive discussion.
We will try this learning mode. If it does not work we will switch
back to a more traditional lecturing mode.
There will be no midterms and no final exam,
but instead homework
problems (some challenging)
and (Matlab) projects (about seven).
The official textbooks are
Linear Algebra and Optimization with Applications to Machine Learning, Vol I and Vol II, by Gallier and Quaintance, World Scientific (2020).
Relevant chapters will be available as needed;
see Slides and Notes
A Word of Advice:
Expect to be held to high standards, and conversely!
In addition to slides, I will post
lecture notes. Please, read the course notes regularly, and
start working early on the problems sets. They will be hard!
Take pride in your work. Be clear, rigorous, neat, and concise.
Preferably, use a good text processor, such as LaTeX, to
write up your solutions.
Due to the difficulty of the homework problems and in order to
give you an opportunity to learn how to collaborate
more effectively (I do not mean "copy"), I will allow you
to work in small groups.
A group consists of AT MOST THREE students.
You are allowed to collaborate
with the same person(s) an unrestricted number of times.
Only one homework submission per group.
All members of a group
will get the SAME grade on a homework or a project
(please, list all names in a group).
It is forbidden to use solutions of problems posted on the internet.
If you use resources other than the textbook (or the recommended textbooks)
or the class notes, you must cite these references.
I assume that you are all responsible adults.
Verbatim copies of old solutions, or blatantly
isomorphic solutions, are easily detectable.
DO NOT copy solutions from old solution
sheets, from books, from solutions posted on the internet, or from friends!
Either credit will be split among the perpetrators, or worse!