Some Resources for Numerical Linear Algebra

Abstract: This article introduces some resources for numerical linear algebra.

[If you are interested in algorithms, mathematics, or computer science, you are welcome to follow me and read more original articles.]
My website: 潮汐朝夕的生活实验室
My WeChat public account: 算法题刷刷
My Zhihu: 潮汐朝夕
My GitHub: FennelDumplings
My LeetCode: FennelDumplings


fastai Numerical Linear Algebra Course

This course comes from the summer 2017 session of the Masters of Science in Analytics program at the University of San Francisco (taught to graduating students on track to become data scientists). It is taught in Python with Jupyter notebooks, using libraries such as scikit-learn and NumPy in most lessons, plus Numba (a library that JIT-compiles Python to fast machine code for better performance) and PyTorch (a GPU-enabled alternative to NumPy) in a few lessons.

Main contents:

  • Course Logistics

    • My background
    • Teaching Approach
    • Importance of Technical Writing
    • List of Excellent Technical Blogs
    • Linear Algebra Review Resources
  • Why are we here?

    • We start with a high level overview of some foundational concepts in numerical linear algebra.
    • Matrix and Tensor Products
    • Matrix Decompositions
    • Accuracy
    • Memory use
    • Speed
    • Parallelization & Vectorization
  • Topic Modeling with NMF and SVD

    • We will use the newsgroups dataset to try to identify the topics of different posts. We use a term-document matrix that represents the frequency of the vocabulary in the documents. We factor it using NMF, and then with SVD.
    • Topic Frequency-Inverse Document Frequency (TF-IDF)
    • Singular Value Decomposition (SVD)
    • Non-negative Matrix Factorization (NMF)
    • Stochastic Gradient Descent (SGD)
    • Intro to PyTorch
    • Truncated SVD
  • Background Removal with Robust PCA

    • Another application of SVD is identifying the people in, and removing the background from, a surveillance video. We will cover robust PCA, which uses randomized SVD; randomized SVD in turn uses the LU factorization.
    • Load and View Video Data
    • SVD
    • Principal Component Analysis (PCA)
    • L1 Norm Induces Sparsity
    • Robust PCA
    • LU factorization
    • Stability of LU
    • LU factorization with Pivoting
    • History of Gaussian Elimination
    • Block Matrix Multiplication
  • Compressed Sensing with Robust Regression

    • Compressed sensing is critical to allowing CT scans with lower radiation: the image can be reconstructed with less data. Here we will learn the technique and apply it to CT images.
    • Broadcasting
    • Sparse matrices
    • CT Scans and Compressed Sensing
    • L1 and L2 regression
  • Predicting Health Outcomes with Linear Regressions

    • Linear regression in sklearn
    • Polynomial Features
    • Speeding up with Numba
    • Regularization and Noise
  • How to Implement Linear Regression

    • How did Scikit Learn do it?
    • Naive solution
    • Normal equations and Cholesky factorization
    • QR factorization
    • SVD
    • Timing Comparison
    • Conditioning & Stability
    • Full vs Reduced Factorizations
    • Matrix Inversion is Unstable
  • PageRank with Eigen Decompositions

    • We have applied SVD to topic modeling, background removal, and linear regression. SVD is intimately connected to the eigen decomposition, so we will now learn how to calculate eigenvalues for a large matrix. We will use DBpedia data, a large dataset of Wikipedia links, because here the principal eigenvector gives the relative importance of different Wikipedia pages (this is the basic idea of Google’s PageRank algorithm). We will look at 3 different methods for calculating eigenvectors, of increasing complexity (and increasing usefulness!).
    • SVD
    • DBpedia Dataset
    • Power Method
    • QR Algorithm
    • Two-phase approach to finding eigenvalues
    • Arnoldi Iteration
  • Implementing QR Factorization

    • Gram-Schmidt
    • Householder
    • Stability Examples
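The topic-modeling pipeline from the course outline above can be sketched in a few lines of scikit-learn. The tiny four-document corpus below is an illustrative stand-in for the newsgroups dataset used in the lectures:

```python
# Sketch of the topic-modeling pipeline: build a TF-IDF term-document
# matrix, then factor it with NMF and with truncated SVD.
# The corpus is a toy stand-in for the newsgroups dataset.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF, TruncatedSVD

docs = [
    "space nasa launch orbit satellite",
    "orbit satellite nasa space shuttle",
    "hockey game team players score",
    "team players hockey season score",
]

# TF-IDF matrix: documents x vocabulary
vecs = TfidfVectorizer()
X = vecs.fit_transform(docs)
vocab = vecs.get_feature_names_out()

def top_words(components, n=3):
    """Top-weighted vocabulary words for each latent topic."""
    return [[vocab[i] for i in row.argsort()[::-1][:n]] for row in components]

# Non-negative factorization X ~ W @ H; rows of H are topics
nmf = NMF(n_components=2, init="nndsvd", random_state=0)
W = nmf.fit_transform(X)
print("NMF topics:", top_words(nmf.components_))

# Truncated SVD keeps only the top singular vectors
svd = TruncatedSVD(n_components=2, random_state=0)
svd.fit(X)
print("SVD topics:", top_words(svd.components_))
```

Each row of `nmf.components_` (and of `svd.components_`) is a "topic" over the vocabulary; printing its top-weighted words shows what the topic is about.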
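The randomized SVD at the heart of the robust PCA lecture can be sketched as follows. The low-rank test matrix is synthetic, and the oversampling parameter is a common default rather than anything prescribed by the course:

```python
# Sketch of randomized SVD: project onto a random low-dimensional
# subspace, orthonormalize, then take an exact SVD of the much
# smaller projected matrix.
import numpy as np

def randomized_svd(A, k, n_oversample=10):
    """Approximate rank-k SVD of A using a random range finder."""
    rng = np.random.default_rng(0)
    m, n = A.shape
    # A random projection captures (most of) the range of A
    Omega = rng.standard_normal((n, k + n_oversample))
    Q, _ = np.linalg.qr(A @ Omega)
    # Exact SVD of the small matrix Q^T A
    U_small, s, Vt = np.linalg.svd(Q.T @ A, full_matrices=False)
    return (Q @ U_small)[:, :k], s[:k], Vt[:k, :]

# Low-rank test matrix (exactly rank 3)
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 3)) @ rng.standard_normal((3, 100))
U, s, Vt = randomized_svd(A, k=3)
```

Because the test matrix is exactly rank 3 and the random subspace is wider than that, the reconstruction `U @ diag(s) @ Vt` recovers `A` to machine precision; on real video data the same recipe gives a good low-rank approximation at a fraction of the cost of a full SVD.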
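The three least-squares solvers compared in the linear-regression lectures can be sketched side by side on synthetic data. On this well-conditioned problem all three agree; the normal equations square the condition number, so they are the first to break down on ill-conditioned problems:

```python
# Least squares min ||Ax - b|| three ways: normal equations with
# Cholesky, QR, and SVD. The data is synthetic and well conditioned.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 3))
b = rng.standard_normal(100)

# 1. Normal equations: solve (A^T A) x = A^T b via a Cholesky factor
L = np.linalg.cholesky(A.T @ A)
y = np.linalg.solve(L, A.T @ b)       # solve the lower-triangular system
x_chol = np.linalg.solve(L.T, y)      # then the upper-triangular one

# 2. QR: A = QR, then solve R x = Q^T b
Q, R = np.linalg.qr(A)                # reduced QR
x_qr = np.linalg.solve(R, Q.T @ b)

# 3. SVD: A = U S V^T, then x = V S^{-1} U^T b
U, s, Vt = np.linalg.svd(A, full_matrices=False)
x_svd = Vt.T @ ((U.T @ b) / s)
```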
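The power method, the simplest of the three eigenvector methods listed in the PageRank section, can be sketched on a toy link matrix (the 3-page "web" below is illustrative, not from the DBpedia data):

```python
# Power method: repeated multiplication by A converges to the
# eigenvector of the largest-magnitude eigenvalue, which for a link
# matrix is the PageRank-style importance vector.
import numpy as np

def power_method(A, n_iter=100):
    """Return the dominant eigenvalue and eigenvector of A."""
    x = np.ones(A.shape[0])
    for _ in range(n_iter):
        x = A @ x
        x /= np.linalg.norm(x)   # renormalize to avoid overflow
    lam = x @ A @ x              # Rayleigh quotient estimate
    return lam, x

# Column-stochastic link matrix of a tiny 3-page "web" (illustrative)
A = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])
lam, v = power_method(A)         # dominant eigenvalue is 1
```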
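A minimal Householder QR, the stable algorithm implemented in the final lecture, checked against NumPy's built-in factorization:

```python
# QR factorization via Householder reflections: at step k a reflector
# zeros the entries below the diagonal in column k.
import numpy as np

def householder_qr(A):
    """Factor A (m x n, m >= n) as Q @ R, Q orthogonal, R upper triangular."""
    m, n = A.shape
    R = A.astype(float).copy()
    Q = np.eye(m)
    for k in range(n):
        x = R[k:, k]
        s = 1.0 if x[0] >= 0 else -1.0   # sign choice avoids cancellation
        v = x.copy()
        v[0] += s * np.linalg.norm(x)
        v /= np.linalg.norm(v)
        R[k:, k:] -= 2.0 * np.outer(v, v @ R[k:, k:])   # apply H_k on the left
        Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v)     # accumulate Q = H_1...H_n
    return Q, R

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
Q, R = householder_qr(A)
```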

Numerical Linear Algebra

By Trefethen and Bau. The best-known book on numerical linear algebra.

Main contents

  • Part I. Fundamentals:
    1. Matrix-vector multiplication
    2. Orthogonal vectors and matrices
    3. Norms
    4. The singular value decomposition
    5. More on the SVD
  • Part II. QR Factorization and Least Squares:
    1. Projectors
    2. QR factorization
    3. Gram-Schmidt orthogonalization
    4. MATLAB
    5. Householder triangularization
    6. Least squares problems
  • Part III. Conditioning and Stability:
    1. Conditioning and condition numbers
    2. Floating point arithmetic
    3. Stability
    4. More on stability
    5. Stability of Householder triangularization
    6. Stability of back substitution
    7. Conditioning of least squares problems
    8. Stability of least squares algorithms
  • Part IV. Systems of Equations:
    1. Gaussian elimination
    2. Pivoting
    3. Stability of Gaussian elimination
    4. Cholesky factorization
  • Part V. Eigenvalues:
    1. Eigenvalue problems
    2. Overview of Eigenvalue algorithms
    3. Reduction to Hessenberg or tridiagonal form
    4. Rayleigh quotient, inverse iteration
    5. QR algorithm without shifts
    6. QR algorithm with shifts
    7. Other Eigenvalue algorithms
    8. Computing the SVD
  • Part VI. Iterative Methods:
    1. Overview of iterative methods
    2. The Arnoldi iteration
    3. How Arnoldi locates Eigenvalues
    4. GMRES
    5. The Lanczos iteration
    6. From Lanczos to Gauss quadrature
    7. Conjugate gradients
    8. Biorthogonalization methods
    9. Preconditioning
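The stability results in Parts III and IV can be seen concretely: Gaussian elimination without pivoting destroys accuracy when a pivot is tiny, while the partially pivoted solve used by LAPACK does not. A minimal sketch on the classic 2x2 example:

```python
# Gaussian elimination without pivoting on a matrix with a tiny pivot:
# the multiplier 1/1e-20 blows up and swamps the other entries.
import numpy as np

def lu_no_pivot(A):
    """LU factorization without row exchanges (unstable for small pivots)."""
    n = A.shape[0]
    U = A.astype(float).copy()
    L = np.eye(n)
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]
            U[i, k:] -= L[i, k] * U[k, k:]
    return L, U

A = np.array([[1e-20, 1.0],
              [1.0,   1.0]])
b = np.array([1.0, 2.0])         # exact solution is close to (1, 1)

L, U = lu_no_pivot(A)
x_bad = np.linalg.solve(U, np.linalg.solve(L, b))   # first component is lost
x_good = np.linalg.solve(A, b)   # LAPACK uses partial pivoting
```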

Advanced Linear Algebra: Foundations to Frontiers

The content of this course overlaps substantially with the book above; it can be used as a companion reference.


Numerical Methods: Design, Analysis, and Computer Implementation of Algorithms

By Anne Greenbaum and Timothy Chartier. A reference text.


Coding The Matrix: Linear Algebra Through Computer Science Applications

Note that the code in this book is somewhat dated and may no longer run as-is.

