### Calculus

…

#### Leibniz's Rule for Differentiating Integrals

Theorem: Assume that $f(x,t)$ and $\frac{\partial f}{\partial x}(x,t)$ are continuous and that $a(x)$ and $b(x)$ are differentiable. Let

$$F(x) = \int_{a(x)}^{b(x)} f(x,t)\,dt$$

Then $F$ is differentiable and

$$F'(x) = f(x, b(x))\,b'(x) - f(x, a(x))\,a'(x) + \int_{a(x)}^{b(x)} \frac{\partial f}{\partial x}(x,t)\,dt$$
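As a quick numeric sanity check of the rule, the sketch below uses a made-up integrand $f(x,t) = xt$ with limits $a(x) = 0$, $b(x) = x$ (these example functions are not from the notes) and compares a finite-difference derivative of $F$ against the Leibniz formula:

```python
import numpy as np

def trap(y, t):
    """Trapezoid rule for samples y on grid t."""
    return np.sum((y[1:] + y[:-1]) * np.diff(t)) / 2

# F(x) = ∫_0^x f(x,t) dt with the hypothetical integrand f(x,t) = x*t,
# so a(x) = 0, b(x) = x, and ∂f/∂x = t.
def F(x, n=100_000):
    t = np.linspace(0.0, x, n)
    return trap(x * t, t)

x = 1.3
h = 1e-5
numeric_deriv = (F(x + h) - F(x - h)) / (2 * h)   # central-difference F'(x)

# Leibniz: F'(x) = f(x,b(x))·b'(x) - f(x,a(x))·a'(x) + ∫ ∂f/∂x dt
t = np.linspace(0.0, x, 100_000)
leibniz = (x * x) * 1.0 - 0.0 + trap(t, t)        # = x^2 + x^2/2 = 3x^2/2

print(numeric_deriv, leibniz)   # both ≈ 1.5 * x**2
```

For this integrand $F(x) = x^3/2$, so both numbers should agree with $3x^2/2$.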

…

#### Improper Integral

Let $I \subseteq \mathbb{R}$ be an interval and consider

$$\int_I f(x)\,dx$$

the integral is improper if either

- $I$ is unbounded

- $f$ is unbounded on $I$

for the first case, we can introduce the limits, e.g.

$$\int_{-\infty}^{\infty} f(x)\,dx = \lim_{a \to -\infty} \int_a^c f(x)\,dx + \lim_{b \to \infty} \int_c^b f(x)\,dx$$

and the improper integral converges iff each of the parts has a finite limit.

Comparison Test: Often we are simply concerned with whether an improper integral converges, rather than calculating its value explicitly.

Assume $f$ and $g$ are bounded, have at most finitely many discontinuities, and such that $0 \le f(x) \le g(x)$ for all $x \ge a$.

- if $\int_a^{\infty} g(x)\,dx$ converges, then $\int_a^{\infty} f(x)\,dx$ converges

- if $\int_a^{\infty} f(x)\,dx$ diverges, then $\int_a^{\infty} g(x)\,dx$ diverges

Useful comparisons:

- $\int_1^{\infty} \frac{1}{x^p}\,dx$ converges iff $p > 1$

- $\int_0^1 \frac{1}{x^p}\,dx$ converges iff $p < 1$

- Exponentials crush polynomials
- Fact: given any $p > 0$ and $a > 0$ there exist constants $C, x_0$ such that $x^p \le C e^{ax}$ for all $x \ge x_0$

- Polynomials crush logs
- Fact: given any $p > 0$ there exist constants $C, x_0$ such that $\ln x \le C x^p$ for all $x \ge x_0$
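A small numeric illustration of the $p$-test on $[1, \infty)$ (the cutoffs and grid sizes below are arbitrary choices, not from the notes): partial integrals of $x^{-2}$ settle toward a finite value while those of $x^{-1}$ keep growing like $\ln X$.

```python
import numpy as np

# Partial integrals ∫_1^X x^(-p) dx for growing cutoffs X:
# they converge iff p > 1.
def partial_integral(p, X, n=200_000):
    x = np.linspace(1.0, X, n)
    y = x ** (-p)
    return np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2  # trapezoid rule

for X in (10.0, 100.0, 1000.0):
    print(X, partial_integral(2.0, X), partial_integral(1.0, X))
# p = 2: partial integrals equal 1 - 1/X, approaching 1 (converges)
# p = 1: partial integrals equal ln(X), growing without bound (diverges)
```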

#### Lagrange Multipliers

Suppose that $f : \mathbb{R}^n \to \mathbb{R}$ and $g : \mathbb{R}^n \to \mathbb{R}$ are smooth functions and $c$ is a constant. If $f$ has a local max or min subject to $g(x) = c$ at $x^*$, then at least one of the following two conditions holds:

- there is $\lambda \in \mathbb{R}$ such that $\nabla f(x^*) = \lambda \nabla g(x^*)$

- $\nabla g(x^*) = 0$
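A toy check of the multiplier condition, using the made-up problem "maximize $f(x,y) = xy$ subject to $x + y = 2$" (whose solution is $x = y = 1$ with $\lambda = 1$): a brute-force search along the constraint should land where $\nabla f = \lambda \nabla g$.

```python
import numpy as np

# Maximize f(x, y) = x*y subject to g(x, y) = x + y = 2 (hypothetical example).
# Parametrize the constraint by x, with y = 2 - x, and search a fine grid.
x = np.linspace(-1.0, 3.0, 400_001)
y = 2.0 - x
f = x * y
i = np.argmax(f)                      # constrained maximizer
x_star, y_star = x[i], y[i]           # expect (1, 1)

grad_f = np.array([y_star, x_star])   # ∇f = (y, x)
grad_g = np.array([1.0, 1.0])         # ∇g = (1, 1)
lam = grad_f[0] / grad_g[0]           # multiplier from the first component
print(x_star, y_star, lam)            # ∇f = λ∇g holds with λ ≈ 1
```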

### ODEs

#### Separable

$$\frac{dy}{dx} = f(x)\,g(y)$$

thus

$$\int \frac{dy}{g(y)} = \int f(x)\,dx$$

#### First Order Linear ODEs

format

$$y' + p(x)\,y = q(x)$$

find an integrating factor

$$\mu(x) = e^{\int p(x)\,dx}$$

then

$$y = \frac{1}{\mu(x)} \left( \int \mu(x)\,q(x)\,dx + C \right)$$
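A numeric check of the integrating-factor recipe on a made-up equation: for $y' + 2y = e^{-x}$ with $y(0) = 2$ (not from the notes), $\mu = e^{2x}$ gives $(e^{2x}y)' = e^{x}$, hence $y = e^{-x} + C e^{-2x}$ and $C = 1$. The residual of this candidate solution should vanish.

```python
import numpy as np

# Candidate solution of the hypothetical ODE y' + 2y = e^(-x), y(0) = 2,
# obtained via the integrating factor μ(x) = e^(2x): y = e^(-x) + e^(-2x).
def y(x):
    return np.exp(-x) + np.exp(-2 * x)

x = np.linspace(0.0, 3.0, 1000)
h = 1e-6
yprime = (y(x + h) - y(x - h)) / (2 * h)     # numeric derivative y'
residual = yprime + 2 * y(x) - np.exp(-x)    # y' + 2y - q(x), should be ≈ 0
print(np.max(np.abs(residual)))
```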

#### Second Order Linear ODEs

General form

$$y'' + p(x)\,y' + q(x)\,y = g(x) \qquad \text{(N)}$$

the homogeneous form is

$$y'' + p(x)\,y' + q(x)\,y = 0 \qquad \text{(H)}$$

- if $y_1, y_2$ are solutions of (H), and $c_1, c_2$ are constants, then $c_1 y_1 + c_2 y_2$ is a solution of (H)

- if $y_1$ is a solution of (H), and $y_2$ is a solution of (N), then $y_1 + y_2$ is a solution of (N)

- if $y_1, y_2$ are solutions of (N), then $y_1 - y_2$ is a solution of (H)

To solve

Case(I): $a y'' + b y' + c y = 0$ (constant coefficients)

Case(II): $a x^2 y'' + b x y' + c y = 0$ (Euler equation)

for Case(I), try $y = e^{rx}$;

the equation $a r^2 + b r + c = 0$ has two roots $r_1, r_2$

- if $r_1 \ne r_2$ are real, then $y = c_1 e^{r_1 x} + c_2 e^{r_2 x}$

- if $r_1 = r_2 = r$, then $y = (c_1 + c_2 x)\, e^{r x}$

- if $r_{1,2} = \alpha \pm i\beta$, then $y = e^{\alpha x} \left( c_1 \cos \beta x + c_2 \sin \beta x \right)$

for Case(II),

assume $x > 0$ and $y = x^r$, compute

$$y' = r x^{r-1}, \qquad y'' = r(r-1)\,x^{r-2}$$

Substitution into $a x^2 y'' + b x y' + c y = 0$ yields

$$a\,r(r-1) + b\,r + c = 0$$

the equation $a r^2 + (b - a) r + c = 0$, has two roots $r_1, r_2$

- if $r_1 \ne r_2$ are real, then $y = c_1 x^{r_1} + c_2 x^{r_2}$

- if $r_1 = r_2 = r$, then $y = (c_1 + c_2 \ln x)\, x^{r}$

- if $r_{1,2} = \alpha \pm i\beta$, then $y = x^{\alpha} \left( c_1 \cos(\beta \ln x) + c_2 \sin(\beta \ln x) \right)$
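A quick check of the characteristic-root recipe for constant coefficients, on a made-up equation $y'' - 3y' + 2y = 0$ (roots $1$ and $2$, so solutions are combinations of $e^{x}$ and $e^{2x}$):

```python
import numpy as np

# Verify that y = c1*e^x + c2*e^(2x) solves the hypothetical ODE
# y'' - 3y' + 2y = 0 (characteristic equation r^2 - 3r + 2 = 0, roots 1, 2).
c1, c2 = 1.5, -0.5
def y(x):
    return c1 * np.exp(x) + c2 * np.exp(2 * x)

x = np.linspace(0.0, 1.0, 200)
h = 1e-4
yp = (y(x + h) - y(x - h)) / (2 * h)              # numeric y'
ypp = (y(x + h) - 2 * y(x) + y(x - h)) / h**2     # numeric y''
residual = ypp - 3 * yp + 2 * y(x)
print(np.max(np.abs(residual)))   # ≈ 0
```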


### Matrices & Linear Algebra

…

#### Basic Properties of the Transpose

Let $A, B \in \mathbb{R}^{m \times n}$, $c$ a scalar

…

#### Subspaces

A set $S \subseteq \mathbb{R}^n$ is called a subspace of $\mathbb{R}^n$ provided that $0 \in S$ and

$$c_1 x + c_2 y \in S$$

for all $x, y \in S$ and all scalars $c_1, c_2$

Given $A \in \mathbb{R}^{m \times n}$, there are two important subspaces associated with $A$:

- Null space $N(A) = \{ x \in \mathbb{R}^n : A x = 0 \}$, a subspace of $\mathbb{R}^n$

- Range $R(A) = \{ A x : x \in \mathbb{R}^n \}$, a subspace of $\mathbb{R}^m$

For the system of equations $A x = b$

- it has a solution iff $b \in R(A)$. If $R(A) = \mathbb{R}^m$, it will be solvable for every $b$

- if $N(A) = \{0\}$, the system of equations has at most one solution
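A sketch of the solvability criterion with a made-up $3 \times 2$ matrix: $Ax = b$ is solvable exactly when appending $b$ as a column does not increase the rank, and a trivial null space makes the solution unique.

```python
import numpy as np

# Solvability of Ax = b: a solution exists iff b ∈ R(A),
# i.e. iff rank([A | b]) == rank(A). Hypothetical example matrix.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])          # R(A) = {(s, t, s + t)}
b_in = np.array([2.0, 3.0, 5.0])    # lies in R(A)
b_out = np.array([2.0, 3.0, 7.0])   # does not

r = np.linalg.matrix_rank(A)
print(np.linalg.matrix_rank(np.column_stack([A, b_in])) == r)    # solvable
print(np.linalg.matrix_rank(np.column_stack([A, b_out])) == r)   # not solvable
# rank(A) = 2 = number of columns, so N(A) = {0}: solutions are unique.
```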

#### Linear Combination and Spans

Consider a list of vectors $v_1, \dots, v_k$ in $\mathbb{R}^n$,

- A linear combination of $v_1, \dots, v_k$ is $c_1 v_1 + \cdots + c_k v_k$

- The set of all linear combinations of $v_1, \dots, v_k$ is called the span of $v_1, \dots, v_k$, denoted by $\operatorname{span}(v_1, \dots, v_k)$. $\operatorname{span}(v_1, \dots, v_k)$ is a subspace of $\mathbb{R}^n$

- For a subspace $S$, the smallest positive integer $k$ such that $S = \operatorname{span}(v_1, \dots, v_k)$ for some vectors $v_1, \dots, v_k$ is called the dimension of $S$, denoted by $\dim S$
- let $A \in \mathbb{R}^{m \times n}$, then $\dim N(A) + \dim R(A) = n$

#### Rank of a Matrix

If $A \in \mathbb{R}^{m \times n}$, then

- $\dim R(A)$ is called the rank of $A$, thus $\operatorname{rank}(A) \le \min(m, n)$

- when $\operatorname{rank}(A) = \min(m, n)$, this matrix is full rank

Properties of Rank: Let $A \in \mathbb{R}^{m \times n}$

- if $A$ has real entries then $\operatorname{rank}(A^T A) = \operatorname{rank}(A A^T) = \operatorname{rank}(A)$
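A check of the last property on a made-up rank-1 matrix:

```python
import numpy as np

# rank(A^T A) = rank(A A^T) = rank(A) for real A.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])     # second row = 2 * first row, so rank 1
print(np.linalg.matrix_rank(A),
      np.linalg.matrix_rank(A.T @ A),
      np.linalg.matrix_rank(A @ A.T))
```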

…

#### Inverse

An $n \times n$ matrix $A$ is said to be invertible if there exists $B \in \mathbb{R}^{n \times n}$ such that

$$AB = BA = I$$

in which case $B$ is denoted $A^{-1}$.

Properties: ($A, B \in \mathbb{R}^{n \times n}$, $A$ is invertible)

- if $c$ is a nonzero scalar, $(cA)^{-1} = \frac{1}{c} A^{-1}$

- if $B$ is invertible, $(AB)^{-1} = B^{-1} A^{-1}$

- $A^T$ is invertible iff $A$ is, and $(A^T)^{-1} = (A^{-1})^T$
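A check of these inverse identities on small made-up invertible matrices:

```python
import numpy as np

# Verify (cA)^(-1) = (1/c) A^(-1), (AB)^(-1) = B^(-1) A^(-1),
# and (A^T)^(-1) = (A^(-1))^T on hypothetical example matrices.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
B = np.array([[1.0, 1.0], [0.0, 2.0]])
c = 5.0

inv = np.linalg.inv
print(np.allclose(inv(c * A), inv(A) / c))        # scalar rule
print(np.allclose(inv(A @ B), inv(B) @ inv(A)))   # note the reversed order
print(np.allclose(inv(A.T), inv(A).T))            # transpose rule
```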

#### Determinants

Only a square matrix can be input into the *determinant function*. Let $A \in \mathbb{R}^{n \times n}$; $\det(A)$ is a scalar

- If $A$ is diagonal then $\det(A) = a_{11} a_{22} \cdots a_{nn}$

- $\det(A) \ne 0$ iff the columns of $A$ are linearly independent
- $A$ is invertible ($\iff \det(A) \ne 0$)

Row operations and Determinants

- If $B$ is obtained by multiplying one row of $A$ by a scalar $c$ then $\det(B) = c \det(A)$

- If $B$ is obtained by interchanging two rows of $A$ then $\det(B) = -\det(A)$

- If $B$ is obtained by adding a scalar multiple of one row of $A$ to another row then $\det(B) = \det(A)$
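The three row-operation rules, checked on a made-up $3 \times 3$ matrix:

```python
import numpy as np

# Effect of row operations on det(A), hypothetical example matrix.
A = np.array([[2.0, 0.0, 1.0],
              [1.0, 3.0, 0.0],
              [0.0, 1.0, 1.0]])
d = np.linalg.det(A)

B1 = A.copy(); B1[0] *= 4.0                 # scale row 0 by 4
B2 = A.copy(); B2[[0, 1]] = B2[[1, 0]]      # swap rows 0 and 1
B3 = A.copy(); B3[2] += 7.0 * B3[0]         # add 7*(row 0) to row 2

print(np.isclose(np.linalg.det(B1), 4.0 * d))   # scaled by 4
print(np.isclose(np.linalg.det(B2), -d))        # sign flip
print(np.isclose(np.linalg.det(B3), d))         # unchanged
```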

#### Trace of a Matrix

Let $A \in \mathbb{R}^{n \times n}$, the trace of $A$ is defined by

$$\operatorname{tr}(A) = \sum_{i=1}^{n} a_{ii}$$

let $A, B \in \mathbb{R}^{n \times n}$ and $c$ a scalar, then $\operatorname{tr}(A + B) = \operatorname{tr}(A) + \operatorname{tr}(B)$, $\operatorname{tr}(cA) = c \operatorname{tr}(A)$, and $\operatorname{tr}(AB) = \operatorname{tr}(BA)$
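A quick check of the trace identities on made-up matrices:

```python
import numpy as np

# Linearity of the trace and the cyclic property tr(AB) = tr(BA).
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [5.0, 2.0]])
c = 3.0

print(np.isclose(np.trace(A + B), np.trace(A) + np.trace(B)))
print(np.isclose(np.trace(c * A), c * np.trace(A)))
print(np.isclose(np.trace(A @ B), np.trace(B @ A)))
```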

#### Eigenvalues

Only apply for square matrices.

Def. A scalar $\lambda$ is said to be an eigenvalue for $A$ provided there exists a nonzero $x$ such that $Ax = \lambda x$. Any nonzero $x$ such that $Ax = \lambda x$ is called an eigenvector for $A$ associated with the eigenvalue $\lambda$

- $\det(A - \lambda I) = 0$ is called the characteristic equation

- $p(\lambda) = \det(A - \lambda I)$ is called the characteristic polynomial for $A$

for eigenvalues $\lambda_1, \dots, \lambda_n$ (counted with multiplicity), $\operatorname{tr}(A) = \sum_i \lambda_i$ and $\det(A) = \prod_i \lambda_i$
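A check on a made-up $2 \times 2$ matrix whose characteristic polynomial $\lambda^2 - 7\lambda + 10$ has roots $5$ and $2$:

```python
import numpy as np

# Eigenpairs of a hypothetical matrix, plus tr(A) = Σλ and det(A) = Πλ.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam, V = np.linalg.eig(A)            # eigenvalues and eigenvectors

for i in range(len(lam)):
    print(np.allclose(A @ V[:, i], lam[i] * V[:, i]))   # A v = λ v

print(np.isclose(lam.sum(), np.trace(A)))
print(np.isclose(lam.prod(), np.linalg.det(A)))
```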

#### Similar Matrices

Let $A, B \in \mathbb{R}^{n \times n}$. If there exists an invertible $S$ such that

$$B = S^{-1} A S$$

then $A$ and $B$ are similar. And then $A$ and $B$

- have the same eigenvalues
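Checking the shared spectrum of similar matrices, with a made-up $A$ and an arbitrary invertible $S$:

```python
import numpy as np

# Similar matrices B = S^(-1) A S share eigenvalues.
A = np.array([[4.0, 1.0], [2.0, 3.0]])     # eigenvalues 2 and 5
S = np.array([[1.0, 2.0], [0.0, 1.0]])     # any invertible S works
B = np.linalg.inv(S) @ A @ S

eig_A = np.sort(np.linalg.eigvals(A))
eig_B = np.sort(np.linalg.eigvals(B))
print(eig_A, eig_B)   # identical up to numerical error
```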

#### Positive Definite and Positive Semidefinite Matrices

Let $A$ be a real matrix with $A = A^T$

- $A$ is positive definite iff
    - $x^T A x > 0$ for all $x \in \mathbb{R}^n$ with $x \ne 0$
    - All eigenvalues of $A$ are strictly positive
    - $\det(A_k) > 0$ for all $k = 1, \dots, n$, where $A_k$ is the upper left $k \times k$ block of $A$

- $A$ is positive semidefinite iff
    - $x^T A x \ge 0$ for all $x \in \mathbb{R}^n$
    - All eigenvalues of $A$ are nonnegative

Every covariance matrix is positive semidefinite
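A sketch of that last fact on randomly generated data (the sample sizes and test vector are arbitrary choices): the sample covariance matrix has a nonnegative spectrum and a nonnegative quadratic form.

```python
import numpy as np

# Sample covariance matrices are positive semidefinite:
# eigenvalues >= 0 and v^T C v >= 0 for any v.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))            # 50 samples, 3 variables
C = np.cov(X, rowvar=False)             # 3x3 sample covariance

eigvals = np.linalg.eigvalsh(C)         # C is symmetric -> eigvalsh
print(np.all(eigvals >= -1e-12))        # nonnegative spectrum

v = np.array([1.0, -2.0, 0.5])
print(v @ C @ v >= -1e-12)              # quadratic form nonnegative
```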

#### Minimization and Convex Functions

The Hessian matrix is defined by

$$\left[ Hf(x) \right]_{ij} = \frac{\partial^2 f}{\partial x_i \, \partial x_j}(x)$$

Taylor's theorem:

$$f(y) = f(x) + \nabla f(x)^T (y - x) + \frac{1}{2} (y - x)^T \, Hf(\xi) \, (y - x)$$

for some $\xi$ on the line segment between $x$ and $y$.

Theorem: $f$ is convex iff $Hf(x)$ is positive semidefinite for every $x$.

Theorem: Assume $f$ is convex and $x^*$ satisfies $\nabla f(x^*) = 0$, then

$$f(x) \ge f(x^*) \quad \text{for all } x$$

i.e. $f$ attains a global minimum at $x^*$
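A small check using the made-up quadratic $f(x) = x^T Q x$ with $Q$ positive definite (Hessian $2Q$, gradient zero at the origin), so the theorems predict a global minimum at $x^* = 0$:

```python
import numpy as np

# f(x) = x^T Q x has constant Hessian 2Q; Q PSD -> f convex,
# and ∇f(0) = 0 -> global minimum at the origin.
Q = np.array([[2.0, 1.0],
              [1.0, 2.0]])                 # eigenvalues 1 and 3, so PD
H = 2 * Q                                  # Hessian of f

print(np.all(np.linalg.eigvalsh(H) >= 0))  # PSD Hessian -> convex

f = lambda x: x @ Q @ x
rng = np.random.default_rng(1)
samples = rng.normal(size=(1000, 2))
print(np.all([f(x) >= f(np.zeros(2)) - 1e-12 for x in samples]))
```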

#### Constrained Optimization Problem

Let $A \in \mathbb{R}^{n \times n}$ be given and assume that $A$ is real and $A = A^T$. Constraints: $x^T x = 1$

define $f : \mathbb{R}^n \to \mathbb{R}$ by

$$f(x) = x^T A x$$

Let $\lambda_1, \dots, \lambda_n$ be the eigenvalues of $A$ arranged so that

$$\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_n$$

Theorem: Under the preceding assumptions, we have

- $\max_{x^T x = 1} f(x) = \lambda_1$ and the maximum is attained at each unit vector $x$ satisfying $A x = \lambda_1 x$

- $\min_{x^T x = 1} f(x) = \lambda_n$ and the minimum is attained at each unit vector $x$ satisfying $A x = \lambda_n x$

Proof:

let $g(x) = x^T x$. Then $\nabla g(x) = 2x \ne 0$ whenever $x^T x = 1$.

Since the set $\{x : x^T x = 1\}$ is nonempty, closed, and bounded and the function $f$ is continuous, we know that $f$ attains a maximum and a minimum on it. According to the Lagrange multiplier theorem, if a maximum or minimum is attained at $x^*$, then there is a scalar $\lambda$ such that

$$\nabla f(x^*) = \lambda \nabla g(x^*)$$

since $\nabla f(x) = 2Ax$ and $\nabla g(x) = 2x$, this gives

$$A x^* = \lambda x^*$$

thus $x^*$ must be an eigenvector for $A$. And the corresponding maximum/minimum is

$$f(x^*) = (x^*)^T A x^* = \lambda \, (x^*)^T x^* = \lambda$$

therefore, $\max_{x^T x = 1} f(x) = \lambda_1$, and $\min_{x^T x = 1} f(x) = \lambda_n$
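A Monte Carlo illustration on a made-up symmetric matrix with eigenvalues $1, 3, 4$ (the matrix and sample count are arbitrary): over random unit vectors, $x^T A x$ stays inside $[\lambda_{\min}, \lambda_{\max}]$ and its sample extremes approach those eigenvalues.

```python
import numpy as np

# max/min of x^T A x over the unit sphere = largest/smallest eigenvalue.
A = np.array([[3.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 3.0]])              # eigenvalues 1, 3, 4
lam = np.linalg.eigvalsh(A)                  # ascending order

rng = np.random.default_rng(2)
X = rng.normal(size=(200_000, 3))
X /= np.linalg.norm(X, axis=1, keepdims=True)   # random unit vectors
vals = np.einsum('ij,jk,ik->i', X, A, X)        # x^T A x per sample

print(lam[-1], vals.max())   # sample max approaches λ_max = 4
print(lam[0], vals.min())    # sample min approaches λ_min = 1
```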

#### Orthogonality and Projections

Definition: Let $x, y \in \mathbb{R}^n$ be given, we say that $x$ and $y$ are orthogonal if $x^T y = 0$.

A list of vectors $u_1, \dots, u_k$ is orthonormal iff

$$u_i^T u_j = \begin{cases} 1 & i = j \\ 0 & i \ne j \end{cases}$$

Every orthonormal list of vectors is linearly independent.

Theorem: Let $S$ be a subspace of $\mathbb{R}^n$, then there is an orthonormal list $u_1, \dots, u_k$ such that $S = \operatorname{span}(u_1, \dots, u_k)$

Theorem: Let $S$ be a subspace of $\mathbb{R}^n$, then there is exactly one matrix $P \in \mathbb{R}^{n \times n}$ satisfying

- $P x = x$ for all $x \in S$ and $P x = 0$ for all $x \in S^{\perp}$

For $x \in \mathbb{R}^n$

- $x - P x$ is orthogonal to $s$ for every $s \in S$

$P$ is called the orthogonal projection matrix for $S$.

#### Gram-Schmidt Procedure

Given a linearly independent list of vectors $v_1, \dots, v_k$, it's possible to produce an orthonormal list $u_1, \dots, u_k$ such that

$$\operatorname{span}(u_1, \dots, u_j) = \operatorname{span}(v_1, \dots, v_j) \quad \text{for } j = 1, \dots, k$$

Constructing a Projection Matrix

Let $S$ be a subspace of $\mathbb{R}^n$ and choose an orthonormal list of vectors $u_1, \dots, u_k$ such that $S = \operatorname{span}(u_1, \dots, u_k)$

Let $Q \in \mathbb{R}^{n \times k}$ be the matrix whose $j$-th column is $u_j$. Then

$$P = Q Q^T$$
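A sketch tying the two ideas together on made-up vectors: Gram-Schmidt produces orthonormal columns $Q$, and $P = QQ^T$ behaves like the orthogonal projection onto their span.

```python
import numpy as np

# Gram-Schmidt on a linearly independent list, then P = Q Q^T.
def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        w = v - sum((u @ v) * u for u in basis)   # strip components along earlier u's
        basis.append(w / np.linalg.norm(w))
    return np.column_stack(basis)

v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, 0.0, 1.0])
Q = gram_schmidt([v1, v2])            # 3x2 with orthonormal columns
P = Q @ Q.T                           # projection onto span(v1, v2)

print(np.allclose(Q.T @ Q, np.eye(2)))   # orthonormal columns
print(np.allclose(P @ v1, v1))           # P fixes vectors in S
print(np.allclose(P @ P, P))             # idempotent
x = np.array([0.0, 0.0, 5.0])
print(np.allclose(Q.T @ (x - P @ x), 0)) # x - Px is orthogonal to S
```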

#### L-U Decompositions

Let $A \in \mathbb{R}^{n \times n}$; an L-U decomposition of $A$ is a factorization into a product of the form

$$A = LU$$

where $L$ is lower triangular and $U$ is upper triangular.

Suppose this holds, and we want to solve $Ax = b$. Then $L(Ux) = b$: first solve $Ly = b$ for $y$, then solve $Ux = y$ for $x$.
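A minimal sketch of the factor-then-substitute idea, using Doolittle elimination without pivoting on a made-up matrix (real implementations pivot for stability; the nonzero-pivot assumption is noted in the code):

```python
import numpy as np

# L-U decomposition without pivoting (Doolittle), then solve Ax = b
# via Ly = b followed by Ux = y.
def lu(A):
    n = A.shape[0]
    L, U = np.eye(n), A.astype(float)
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]      # assumes nonzero pivots
            U[i, k:] -= L[i, k] * U[k, k:]
    return L, U

A = np.array([[4.0, 3.0], [6.0, 3.0]])
b = np.array([10.0, 12.0])
L, U = lu(A)
y = np.linalg.solve(L, b)    # forward-substitution step
x = np.linalg.solve(U, y)    # back-substitution step
print(np.allclose(L @ U, A), np.allclose(A @ x, b))
```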

#### Cholesky Decomposition

A special type of L-U decomposition.

Let $A \in \mathbb{R}^{n \times n}$. Assume that $A$ has real entries, $A = A^T$, and $A$ is positive definite. Then there exists exactly one lower triangular $L$ with positive entries on the diagonal such that

$$A = L L^T$$

If $A$ is symmetric and positive semidefinite, then it still has a Cholesky decomposition provided we allow some diagonal elements of $L$ to be 0. In this case, $L$ is not necessarily unique.

General Cholesky Algorithm

- $A_j$: the upper left $j \times j$ block of $A$

- $a_j$: the first $j - 1$ entries in column $j$ of $A$

- $l_j$: the first $j - 1$ entries in column $j$ of $L^T$

Then

$$L_{11} = \sqrt{A_{11}}$$

and for $j = 2, \dots, n$ we solve

$$L_{j-1}\, l_j = a_j$$

for $l_j$ (where $L_{j-1}$ denotes the upper left $(j-1) \times (j-1)$ block of $L$), then put

$$L_{jj} = \sqrt{A_{jj} - l_j^T l_j}$$
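The column-by-column recipe above can be sketched directly (the example matrix is made up; the result is compared against NumPy's built-in factorization):

```python
import numpy as np

# Column-by-column Cholesky: solve L_{j-1} l_j = a_j for each column,
# then set the diagonal entry L_jj = sqrt(A_jj - l_j^T l_j).
def cholesky(A):
    n = A.shape[0]
    L = np.zeros_like(A, dtype=float)
    L[0, 0] = np.sqrt(A[0, 0])
    for j in range(1, n):
        a_j = A[:j, j]                         # first j entries of column j of A
        l_j = np.linalg.solve(L[:j, :j], a_j)  # forward-substitution step
        L[j, :j] = l_j
        L[j, j] = np.sqrt(A[j, j] - l_j @ l_j)
    return L

A = np.array([[4.0, 2.0, 2.0],
              [2.0, 5.0, 3.0],
              [2.0, 3.0, 6.0]])   # symmetric positive definite
L = cholesky(A)
print(np.allclose(L @ L.T, A))                  # A = L L^T
print(np.allclose(L, np.linalg.cholesky(A)))    # matches NumPy's factor
```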

