Young man, in mathematics you don't understand things. You just get used to them. (John von Neumann)

My notes (slides), written since the start of my PhD (2017). Email me if you catch a mistake or typo.

Algebra

• Tropical semiring / max-plus semiring

• The max-plus semiring
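
As a quick illustration (my own toy snippet, not part of the notes), the max-plus semiring replaces ordinary addition with max and ordinary multiplication with +; the additive identity is -∞ and the multiplicative identity is 0:

```python
# Max-plus semiring: "addition" is max, "multiplication" is +,
# additive identity is -inf, multiplicative identity is 0.
NEG_INF = float("-inf")

def tadd(a, b):          # tropical "addition"
    return max(a, b)

def tmul(a, b):          # tropical "multiplication"
    return a + b

def tmatmul(A, B):
    """Max-plus matrix product: C[i][j] = max_k (A[i][k] + B[k][j])."""
    n, m, p = len(A), len(B), len(B[0])
    return [[max(A[i][k] + B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

# Powers of a weight matrix in max-plus compute max-weight paths,
# the mirror image of shortest paths in the min-plus semiring.
A = [[0, 2],
     [1, 0]]
A2 = tmatmul(A, A)       # max-weight 2-step paths
```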

Continuous Optimization

• $$\ddot{X} + \gamma(t) \dot{X} + \nabla f(X) = 0$$
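
One well-known damping choice is $\gamma(t) = 3/t$, under which this ODE is the continuous-time limit of Nesterov's accelerated gradient method (Su, Boyd, and Candès). A rough numerical integration on a toy quadratic (the test function, step size, and horizon are all my own choices, not from the notes) might look like:

```python
import numpy as np

# Forward integration of  X'' + gamma(t) X' + grad f(X) = 0
# with gamma(t) = 3/t, on the toy quadratic f(x) = x1^2 + 5*x2^2.
def grad_f(x):
    return np.array([2.0 * x[0], 10.0 * x[1]])

def integrate(x0, dt=1e-3, t0=1.0, steps=20000):
    x = np.array(x0, dtype=float)
    v = np.zeros_like(x)              # start at rest
    t = t0
    for _ in range(steps):
        v += dt * (-(3.0 / t) * v - grad_f(x))   # semi-implicit Euler
        x += dt * v
        t += dt
    return x

x_final = integrate([1.0, 1.0])       # trajectory decays toward the minimizer 0
```

The known guarantee for this ODE is $f(X(t)) - f^\star = O(1/t^2)$, which the toy run illustrates numerically.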

• Principle of least action, Euler-Lagrange equation of motion and Lagrangian of $$\ddot{X} + \gamma(t) \dot{X} + \nabla f(X) = 0$$

• Primal-dual method

• Lagrange Multiplier, Convex conjugate, Deriving the dual problem using the convex conjugate

• Duality, KKT conditions, Slater's constraint qualification

• Augmented Lagrangian Method, Douglas-Rachford splitting algorithm, ADMM
  - ADMM with consensus constraint for problems with multiple indicator functions
  - Chambolle-Pock primal-dual method, accelerated Chambolle-Pock primal-dual method with preconditioning

• Multiple-point/pseudo-Newton methods and linear sub-space of sequence

• Non-convex Optimization

• Proximal operator is non-expansive and firmly non-expansive
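
A quick numeric check of this property (not a proof; my own snippet) using the prox of $\lambda\|\cdot\|_1$, i.e. soft-thresholding: firm non-expansiveness $\|P(x)-P(y)\|^2 \le \langle P(x)-P(y),\, x-y\rangle$ implies ordinary non-expansiveness $\|P(x)-P(y)\| \le \|x-y\|$.

```python
import numpy as np

def soft_threshold(x, lam):
    """Prox of lam*||.||_1, evaluated componentwise."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

rng = np.random.default_rng(0)
ok = True
for _ in range(1000):
    x, y = rng.normal(size=5), rng.normal(size=5)
    px, py = soft_threshold(x, 0.7), soft_threshold(y, 0.7)
    d = px - py
    firm = d @ d <= d @ (x - y) + 1e-12                      # firmly non-expansive
    nonexp = np.linalg.norm(d) <= np.linalg.norm(x - y) + 1e-12
    ok = ok and firm and nonexp
```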

• Polyak-Łojasiewicz inequality, Kurdyka-Łojasiewicz property
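
A small numeric illustration (my own toy setup) of the Polyak-Łojasiewicz inequality $\tfrac{1}{2}\|\nabla f(x)\|^2 \ge \mu\,(f(x) - f^\star)$ for the quadratic $f(x) = \tfrac{1}{2}x^\top A x$ with $A \succ 0$, where $\mu$ is the smallest eigenvalue of $A$ and $f^\star = 0$. (PL also holds for some non-convex functions; the quadratic is just the easiest check.)

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(4, 4))
A = M @ M.T + np.eye(4)            # symmetric positive definite
mu = np.linalg.eigvalsh(A).min()   # PL constant for this quadratic

holds = True
for _ in range(1000):
    x = rng.normal(size=4)
    f = 0.5 * x @ A @ x            # f(x) - f* with f* = 0
    g = A @ x                      # grad f(x)
    holds = holds and (0.5 * g @ g >= mu * f - 1e-9)
```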

• Convergence of PALM on non-convex problems, part 2: the generated sequence converges to a critical point

• Inertial Proximal Alternating Linearized Minimization (iPALM)

• Nonconvex problems that are “nice”

• Strict saddle point property and escaping saddle point, trust region method

• General Acceleration Strategies for optimization algorithms

• Acceleration by extrapolation or linear combination of sequence

• Acceleration by domain transformation: preconditioning

• Acceleration by subset sampling: randomization approach, multi-grid approach, safe feature removal approach

• Acceleration by hardware: parallelization and distributed computing

• Linear programming, Semidefinite programming, Polynomial Optimization

• Semidefinite programming

• Polynomial programming

• Polynomial as linear combination of monomials

• Square matricial representation of polynomial

• Incomplete Basis and Newton’s Polytope

• Geometry of the spectrahedron

Matrix Completion

• Theory

• $$\partial \| \mathbf{X} \|_* = \Big \{ \mathbf{U}\mathbf{V}^\top + \mathbf{W} \, \Big | \, \mathbf{W} \in \mathbb{R}^{m \times n}, \mathbf{U}^\top \mathbf{W} = \mathbf{0}, \mathbf{W}\mathbf{V} = \mathbf{0}, \| \mathbf{W} \|_2 \leq 1 \Big \}$$
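
A numeric sanity check of this subdifferential formula (my own snippet): build an admissible $G = \mathbf{U}\mathbf{V}^\top + \mathbf{W}$ and verify the subgradient inequality $\|\mathbf{Y}\|_* \ge \|\mathbf{X}\|_* + \langle G, \mathbf{Y} - \mathbf{X}\rangle$ on random test points.

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, r = 5, 4, 2
X = rng.normal(size=(m, r)) @ rng.normal(size=(r, n))   # rank-2 matrix

U_full, s, Vt_full = np.linalg.svd(X)
U, Vt = U_full[:, :r], Vt_full[:r, :]    # singular vectors of nonzero part

# Admissible W: project a random matrix onto the orthogonal complement
# of the column/row spaces of X, then scale so that ||W||_2 <= 1.
W = rng.normal(size=(m, n))
W = (np.eye(m) - U @ U.T) @ W @ (np.eye(n) - Vt.T @ Vt)
W /= max(np.linalg.norm(W, 2), 1.0)

G = U @ Vt + W
nuc = lambda M: np.linalg.norm(M, "nuc")

ok = True
for _ in range(200):
    Y = rng.normal(size=(m, n))
    ok = ok and (nuc(Y) >= nuc(X) + np.sum(G * (Y - X)) - 1e-9)
```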

• The Schatten–von Neumann norm

• Algorithm

• MC by Majorization-Minimization, by Proximal Point Algorithm / primal-dual method, by Augmented Lagrangian Method, by Douglas-Rachford splitting algorithm, by ADMM, by Iterative Reweighted Least Squares
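
As a minimal sketch of one of these approaches (a proximal-gradient / "soft-impute" flavor; the function names, regularization weight, and toy data below are my own choices), matrix completion via $\min_X \tfrac{1}{2}\|P_\Omega(X - M)\|_F^2 + \lambda\|X\|_*$ reduces to repeated singular value thresholding:

```python
import numpy as np

def svt(X, lam):
    """Singular value thresholding: prox of lam*||.||_* at X."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - lam, 0.0)) @ Vt

def complete(M_obs, mask, lam=0.05, iters=300):
    """Proximal gradient with unit step on the observed-entry loss."""
    X = np.zeros_like(M_obs)
    for _ in range(iters):
        G = mask * (X - M_obs)          # gradient of the data-fit term
        X = svt(X - G, lam)             # prox step
    return X

# Toy problem: rank-1 ground truth, roughly half the entries observed.
rng = np.random.default_rng(3)
u, v = rng.normal(size=(6, 1)), rng.normal(size=(1, 6))
M = u @ v
mask = (rng.random(M.shape) < 0.5).astype(float)
X_hat = complete(mask * M, mask)
```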

Machine Learning

• Machine Learning applications

On software engineering

• Version control

• On efficient coding for experiments comparing multiple algorithms

• On using LaTeX

• On writing static webpages