Math 247A 2010
Tu Thur 2-3.15 in APM 2402
This page contains mostly things you can read if you like.
Some will serve as backup to the lectures.
There is nothing available that serves as a coherent exposition of the topics,
so these readings, standing alone, are an inefficient direct path through the material.
What is posted now gives some idea,
but not a good one, of what we will cover.
Indeed, more will be added.
It is well known that:
You can't thrill all of the students all of the time,
but hopefully every student will find some readings helpful.
Some slides introducing sums of squares
vs. LMIs, and applications of these to Lyapunov function calculation.
See Lyapunov reading below for more details.
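As a concrete warm-up (my own illustration, not from the slides; the matrix A below is a hypothetical stable system), here is a minimal sketch of computing a quadratic Lyapunov function V(x) = x^T P x by solving the Lyapunov equation with SciPy. The SOS/LMI machinery in the slides generalizes this linear-algebra step to polynomial systems.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical stable linear system x' = A x (eigenvalues -1 and -3).
A = np.array([[-1.0, 2.0],
              [0.0, -3.0]])

# Solve the Lyapunov equation A^T P + P A = -Q with Q = I.
# scipy's convention is a X + X a^H = q, so pass a = A^T, q = -Q.
Q = np.eye(2)
P = solve_continuous_lyapunov(A.T, -Q)

# V(x) = x^T P x is a Lyapunov function iff P is positive definite.
print(np.linalg.eigvalsh(P).min() > 0)  # True, since A is stable
```

For this A one can check by hand that P = [[1/2, 1/4], [1/4, 1/3]], which is positive definite.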
COMPLETELY POSITIVE MAPS:
These are the natural notion of positive linear functionals
for noncommutative situations, i.e., situations with matrix unknowns.
A quick wiki exposition of COMPLETELY POSITIVE MAPS (in 3 parts):
This is the definition of a C^* algebra; it is almost a fancy way
of talking about an algebra A of operators on a Hilbert space H
which is invariant under taking adjoints of operators in A and
which is closed in the operator norm. The notion of C^* algebra
just distills the basic rules of algebraic manipulation you are
allowed to use when doing calculations with matrices or
more generally operators on Hilbert space.
1. Take a look,
especially at the examples, but do not get hung up on this stuff:
2. Intro to completely positive maps.
You can STOP reading this at the Kraus Theorem.
Once you know the definition of a completely positive map, the following will make sense:
3. A pretty notation-lean exposition.
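To make the definition concrete, here is a small numerical sketch (my own illustration, not from the readings) using Choi's criterion: a linear map Phi on n x n matrices is completely positive iff its Choi matrix, built block-by-block from the values Phi(E_ij) on matrix units, is positive semidefinite. The classic example is the transpose map, which is positive but not completely positive.

```python
import numpy as np

def choi(phi, n=2):
    """Assemble the n^2 x n^2 Choi matrix C = sum_ij E_ij (x) phi(E_ij)."""
    C = np.zeros((n * n, n * n))
    for i in range(n):
        for j in range(n):
            E = np.zeros((n, n))
            E[i, j] = 1.0                     # matrix unit E_ij
            C[i*n:(i+1)*n, j*n:(j+1)*n] = phi(E)
    return C

C_transpose = choi(lambda X: X.T)  # transpose map: positive, but not CP
C_identity = choi(lambda X: X)     # identity map: CP (Kraus operator I)

print(np.linalg.eigvalsh(C_transpose).min())  # -1.0, so not PSD: not CP
print(np.linalg.eigvalsh(C_identity).min())   # ~0: PSD up to rounding, CP
```

The Choi matrix of the transpose map is the swap operator, whose eigenvalue -1 witnesses the failure of complete positivity even though the transpose preserves positivity of each single matrix.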
ALTERNATIVE EXPOSITION OF COMPLETELY POSITIVE MAPS:
A long, thorough, systematic exposition of the basics, which repeats a lot of the above:
A Dutch master's thesis that spends a lot
of time (more than you need) on the background
definitions and setup.
More than most of you want to know about complete positivity:
At last we get to one of THE MAIN RESULTS:
Matrix convexity is equivalent to having an LMI;
the proof uses the nc Hahn-Banach Theorem.
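For orientation, here is the standard form of the objects in that statement; the symbols below are generic, not taken from the paper.

```latex
% A monic linear pencil with symmetric matrix coefficients A_j:
L(x) \;=\; I + A_1 x_1 + \cdots + A_g x_g, \qquad A_j = A_j^{T}.
% Its positivity domain, a spectrahedron:
\mathcal{D}_L \;=\; \{\, x \in \mathbb{R}^g \;:\; L(x) \succeq 0 \,\}.
% The nc version evaluates L on tuples of symmetric matrices
% X = (X_1, \dots, X_g) of every size:
L(X) \;=\; I \otimes I + \sum_{j=1}^{g} A_j \otimes X_j \;\succeq\; 0.
```

The theorem says, roughly, that a matrix convex set (with some regularity) is exactly the solution set of such an nc LMI.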
CLASS: We followed some of this (Sections 3, 4, 5) in the first 3 weeks.
We will do later chapters soon in class.
A survey article on real algebraic geometry, both commutative
and nc. It should be fairly modular, so if you do not like
the first part, skip it.
RAG, commutative and nc, with a brief description of some applications.
COMPUTING LYAPUNOV FUNCTIONS using RAG
Markov Processes on Graphs and Convex Optimization
Convex optimization of eigenvalues of Laplacian on Graphs
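As a tiny illustration of the objects involved (a made-up example, not taken from the paper): the graph Laplacian L = D - A of the 4-cycle and its spectrum. The second-smallest eigenvalue, the algebraic connectivity, is the kind of quantity such convex optimization problems tune by reweighting edges.

```python
import numpy as np

# Adjacency matrix of the 4-cycle C4 (vertices 0-1-2-3-0).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

# Graph Laplacian: degree matrix minus adjacency matrix.
L = np.diag(A.sum(axis=1)) - A

eigs = np.sort(np.linalg.eigvalsh(L))
print(np.round(eigs, 6))  # eigenvalues of C4: 0, 2, 2, 4
```

The smallest eigenvalue is always 0 (constant vector); the gap to the next one measures how well connected the graph is.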
Finding the LOWEST RANK MATRIX in a given subspace of matrices:
A fairly readable first paper on the subject.
The LOW RANK MATRIX COMPLETION PROBLEM specifies a particular type of subspace.
Here is a simple algorithm; maybe somebody could say what it is
and what its claimed properties are:
A simple computer algorithm for low rank matrix completion.
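Since the posted algorithm is left unidentified above, the following is only a guess at the flavor: a minimal sketch of alternating projections (truncate to rank 1, then restore the observed entries), one common simple scheme for matrix completion. The matrix M and the missing-entry pattern are made up for illustration; the posted algorithm may well differ.

```python
import numpy as np

# Hypothetical rank-1 matrix with two entries unobserved.
M = np.outer([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])   # true low rank matrix
observed = np.ones_like(M, dtype=bool)
observed[0, 2] = observed[2, 0] = False           # hide two entries

X = np.where(observed, M, 0.0)                    # start: unknowns set to 0
for _ in range(500):
    # Project onto rank-1 matrices: keep only the top singular triple.
    U, s, Vt = np.linalg.svd(X)
    X = s[0] * np.outer(U[:, 0], Vt[0])
    # Project back onto the data: restore the observed entries.
    X[observed] = M[observed]

print(np.round(X[0, 2], 4))  # recovered missing entry, close to 3.0
```

Here the rank-1 completion is uniquely determined (e.g. X[0,2] must equal M[0,1]*M[1,2]/M[1,1] = 3), so the iteration converges to the true matrix; the papers below study when such recovery succeeds with many more missing entries.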
Low rank matrix completion: analysis of success.
E. Candes - T. Tao, error estimates on success.