Linear algebra: matrices. Horacio Rodríguez.


Introduction
Some of the slides are reused from my course on graph-based methods in NLP (U. Alicante, 2008), so some of the slides are in Spanish. Material can be obtained from Wikipedia (under the articles on matrices, linear algebra, ...). Another interesting source is Wolfram MathWorld. Several mathematical software packages provide implementations of the matrix operations and decompositions:
–Matlab (I have tested some features)
–Maple
–Mathematica

Vector spaces
–dimension
–bases
–subspaces
–kernel
–image
–linear maps
–orthogonal bases
Metric spaces
–orthonormal bases
Matrix representation of a linear map
Basic operations on matrices

Basic concepts
–Hermitian (self-adjoint) matrix: A = A*, i.e. A equals the conjugate of its transpose.
A real symmetric matrix is Hermitian (A* = A^T).
A Hermitian matrix is normal.
All its eigenvalues are real.
Eigenvectors corresponding to distinct eigenvalues are orthogonal.
It is possible to find a basis consisting only of eigenvectors.
–Normal matrix: A*A = AA*; if A is real, A^T A = A A^T.
–Unitary matrix: A*A = AA* = I_n; if A is real, unitary ⇔ orthogonal.
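These properties can be checked numerically. A minimal NumPy sketch (illustrative, not from the slides; the random Hermitian matrix is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B + B.conj().T            # A is Hermitian by construction

# A equals the conjugate of its transpose
assert np.allclose(A, A.conj().T)

# Hermitian implies normal: A*A == AA*
assert np.allclose(A.conj().T @ A, A @ A.conj().T)

# the eigenvalues of a Hermitian matrix are real
w = np.linalg.eigvals(A)
print(np.max(np.abs(w.imag)))  # numerically ~0
```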

Transpose of a matrix
–The transpose of a matrix A is another matrix A^T created by any one of the following equivalent actions:
write the rows of A as the columns of A^T
write the columns of A as the rows of A^T
reflect A along its main diagonal (which starts from the top left) to obtain A^T

Positive definite matrix
–For complex matrices, a Hermitian matrix M is positive definite if z*Mz > 0 for all non-zero complex vectors z. The quantity z*Mz is always real because M is a Hermitian matrix.
–For real matrices, an n × n real symmetric matrix M is positive definite if z^T M z > 0 for all non-zero vectors z with real entries (i.e. z ∈ R^n).
–A Hermitian (or real symmetric) matrix is positive definite iff all its eigenvalues are > 0.
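The eigenvalue criterion gives a practical test for positive definiteness. A hedged NumPy sketch (the test matrix is an arbitrary positive definite example, not from the slides):

```python
import numpy as np

# classic symmetric positive definite matrix (1-D discrete Laplacian)
M = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

# eigvalsh is the eigenvalue routine for symmetric/Hermitian input;
# by the iff above, M is positive definite iff all eigenvalues are > 0
eigenvalues = np.linalg.eigvalsh(M)
is_pos_def = bool(np.all(eigenvalues > 0))
print(is_pos_def)  # True
```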

Block decomposition
Some concepts to recall from matrix algebra:
–Decomposition of a matrix into blocks (rectangular blocks)

Block decomposition
–Decomposition of a matrix into blocks
Direct sum A ⊕ B, with A m × n and B p × q
Block diagonal matrices (square)
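The direct sum A ⊕ B can be sketched in NumPy as follows (the helper `direct_sum` is a hypothetical name, not a library function):

```python
import numpy as np

def direct_sum(A, B):
    """Direct sum A ⊕ B: place A (m x n) and B (p x q) on the
    diagonal of an (m+p) x (n+q) zero matrix."""
    m, n = A.shape
    p, q = B.shape
    out = np.zeros((m + p, n + q), dtype=np.result_type(A, B))
    out[:m, :n] = A
    out[m:, n:] = B
    return out

A = np.array([[1, 2], [3, 4]])
B = np.array([[5]])
S = direct_sum(A, B)   # 3 x 3 block diagonal matrix
print(S)
```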

Matrix decomposition
–Different decompositions are used to implement efficient matrix algorithms.
–For instance, when solving a system of linear equations Ax = b, the matrix A can be decomposed via the LU decomposition, which factorizes a matrix into a lower triangular matrix L and an upper triangular matrix U. Writing the system as L(Ux) = b, the two triangular systems Ly = b and Ux = y are much easier to solve than the original.
–Matrix decomposition at Wikipedia:
Decompositions related to solving systems of linear equations
Decompositions based on eigenvalues and related concepts

LU decomposition
–Matrix decompositions
LU
–A a square complex n × n matrix
–A = LU
–L lower triangular
–U upper triangular
LDU
–A = LDU
–L unit lower triangular (diagonal entries are 1)
–U unit upper triangular (diagonal entries are 1)
–D diagonal matrix
LUP
–A = LUP
–L lower triangular
–U upper triangular
–P permutation matrix
»entries are only 0 or 1, with a single 1 in each row and column

LU decomposition
–Existence
An LUP decomposition exists for any square matrix A.
When P is an identity matrix, the LUP decomposition reduces to the LU decomposition.
If the LU decomposition exists, so does the LDU decomposition.
–Applications
The LUP and LU decompositions are useful in solving an n-by-n system of linear equations Ax = b.
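As a sketch of how LU factorization turns Ax = b into two triangular solves, here is a toy NumPy implementation (no pivoting, so it assumes nonzero pivots; production code would use a library routine instead):

```python
import numpy as np

def lu_solve(A, b):
    """Solve Ax = b via A = LU (Doolittle, no pivoting):
    factor A, then forward-substitute Ly = b and
    back-substitute Ux = y."""
    n = len(A)
    L = np.eye(n)
    U = A.astype(float)          # astype copies, A is untouched
    for k in range(n - 1):       # Gaussian elimination
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]
            U[i, k:] -= L[i, k] * U[k, k:]
    y = np.zeros(n)              # forward substitution: L y = b
    for i in range(n):
        y[i] = b[i] - L[i, :i] @ y[:i]
    x = np.zeros(n)              # back substitution: U x = y
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

A = np.array([[4.0, 3.0], [6.0, 3.0]])
b = np.array([10.0, 12.0])
x = lu_solve(A, b)
print(x)   # [1. 2.]
```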

Cholesky decomposition
–Matrix decompositions
Cholesky
–A Hermitian, positive definite
»and therefore, in particular, square real symmetric positive definite matrices
–A = LL*, or equivalently A = U*U
–L lower triangular with strictly positive diagonal entries
–the Cholesky decomposition is a special case of the symmetric LU decomposition, with L = U* (or U = L*)
–the Cholesky decomposition is unique

Cholesky decomposition
–Cholesky decomposition in Matlab
A must be positive definite; otherwise, MATLAB displays an error message. Both full and sparse matrices are allowed.
Syntax
–R = chol(A)
–L = chol(A,'lower')
–[R,p] = chol(A)
–[L,p] = chol(A,'lower')
–[R,p,S] = chol(A)
–[R,p,s] = chol(A,'vector')
–[L,p,s] = chol(A,'lower','vector')

Cholesky decomposition
–Example
The binomial coefficients arranged in a symmetric array create an interesting positive definite matrix.
n = 5
X = pascal(n)
X =
     1     1     1     1     1
     1     2     3     4     5
     1     3     6    10    15
     1     4    10    20    35
     1     5    15    35    70

Cholesky decomposition
–Example
It is interesting because its Cholesky factor consists of the same coefficients, arranged in an upper triangular matrix.
R = chol(X)
R =
     1     1     1     1     1
     0     1     2     3     4
     0     0     1     3     6
     0     0     0     1     4
     0     0     0     0     1

Cholesky decomposition
–Example
Destroy the positive definiteness by subtracting 1 from the last element.
X(n,n) = X(n,n)-1
X =
     1     1     1     1     1
     1     2     3     4     5
     1     3     6    10    15
     1     4    10    20    35
     1     5    15    35    69
Now an attempt to find the Cholesky factorization fails.
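The MATLAB session above can be approximated in NumPy (a sketch, not the original session; note that `numpy.linalg.cholesky` returns the lower factor L with X = LL^T, so L.T plays the role of MATLAB's R):

```python
import numpy as np
from math import comb

# symmetric Pascal matrix: X[i, j] = C(i + j, i)
n = 5
X = np.array([[comb(i + j, i) for j in range(n)] for i in range(n)],
             dtype=float)

L = np.linalg.cholesky(X)          # lower Cholesky factor
assert np.allclose(L @ L.T, X)     # X = L L^T

# destroy positive definiteness: the factorization now fails
X[n - 1, n - 1] -= 1
try:
    np.linalg.cholesky(X)
    failed = False
except np.linalg.LinAlgError:
    failed = True
print(failed)
```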

QR decomposition
–QR
–A a real m × n matrix
–A = QR
–R upper triangular m × n
–Q orthogonal (QQ^T = I) m × m
–similarly
»QL
»RQ
»LQ
–If A is nonsingular (invertible), the factorization is unique if the entries of the main diagonal of R are required to be positive.
–Computed by the Gram-Schmidt orthonormalization process.

QR decomposition
–QR in Matlab: Syntax
–[Q,R] = qr(A) (full and sparse matrices)
–[Q,R] = qr(A,0) (full and sparse matrices)
–[Q,R,E] = qr(A) (full matrices)
–[Q,R,E] = qr(A,0) (full matrices)
–X = qr(A) (full matrices)
–R = qr(A) (sparse matrices)
–[C,R] = qr(A,B) (sparse matrices)
–R = qr(A,0) (sparse matrices)
–[C,R] = qr(A,B,0) (sparse matrices)

QR decomposition
–Example:
A = [ ]
This is a rank-deficient matrix; the middle column is the average of the other two columns. The rank deficiency is revealed by the factorization:
[Q,R] = qr(A)
Q =
R =
The triangular structure of R gives it zeros below the diagonal; the zero on the diagonal in R(3,3) implies that R, and consequently A, does not have full rank.
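A NumPy sketch of the same idea. The numeric matrix of the original slide is not shown above, so this example uses an illustrative matrix with the same property (the middle column is the average of the other two):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])   # column 2 = (column 1 + column 3) / 2

Q, R = np.linalg.qr(A)
assert np.allclose(Q @ R, A)              # A = QR
assert np.allclose(Q.T @ Q, np.eye(3))    # Q is orthogonal

# the (numerically) zero diagonal entry of R reveals the rank deficiency
print(abs(R[2, 2]))   # ~0, so A does not have full rank
```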

Projection
–A projection is a matrix P such that P² = P (idempotent).
A projection maps the space W onto a subspace U and leaves the points of that subspace unchanged:
–x ∈ U, the range of the projection: Px = x
–x ∈ V, the null space of the projection: Px = 0
W = U ⊕ V; U and V are complementary.
The only eigenvalues are 0 and 1, with W_0 = V and W_1 = U.
Orthogonal projections: U and V are orthogonal.

Centering matrix
A symmetric and idempotent matrix which, multiplied by a vector, has the same effect as subtracting from each component of the vector the mean of its components.
I_n identity matrix of size n
1 column vector of n ones
C_n = I_n − (1/n) 1 1^T
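A quick NumPy check of the definition (the vector is an arbitrary illustrative choice):

```python
import numpy as np

# build C_n = I_n - (1/n) 1 1^T and verify its properties
n = 4
ones = np.ones((n, 1))
C = np.eye(n) - (1.0 / n) * (ones @ ones.T)

x = np.array([1.0, 2.0, 3.0, 6.0])   # mean is 3.0
centered = C @ x
print(centered)   # [-2. -1.  0.  3.]

assert np.allclose(centered, x - x.mean())   # subtracts the mean
assert np.allclose(C, C.T)                   # symmetric
assert np.allclose(C @ C, C)                 # idempotent
```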

Eigendecomposition
–A special case of linear maps are endomorphisms, i.e. maps f: V → V.
–In this case, vectors v can be compared to their image under f, f(v). Any vector v satisfying λ · v = f(v), where λ is a scalar, is called an eigenvector of f with eigenvalue λ.
–v is an element of the kernel of the difference f − λ · I.
–In the finite-dimensional case, this can be rephrased using determinants: f having eigenvalue λ is the same as det(f − λ · I) = 0, the characteristic polynomial of f.
–The vector space V may or may not possess an eigenbasis, i.e. a basis consisting of eigenvectors. This phenomenon is governed by the Jordan canonical form of the map.
–The spectral theorem describes the infinite-dimensional case.

Eigendecomposition
–Decomposition of a matrix A into eigenvalues and eigenvectors
–Each eigenvalue is paired with its corresponding eigenvector
–This decomposition is often named matrix diagonalization
–nondegenerate eigenvalues λ_1 ... λ_n
–D is the diagonal matrix formed with the set of eigenvalues
–linearly independent eigenvectors X_1 ... X_n
–P is the matrix whose columns are the eigenvectors
–AX_i = λ_i X_i, i.e. AP = PD
–if the n eigenvalues are distinct, P is invertible
–A = PDP^-1
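A sketch of the diagonalization A = PDP^-1 in NumPy (the test matrix is an arbitrary one with distinct eigenvalues, not from the slides):

```python
import numpy as np

A = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.0, 0.0, 5.0]])   # triangular, eigenvalues 2, 3, 5

eigenvalues, P = np.linalg.eig(A)   # columns of P are the eigenvectors
D = np.diag(eigenvalues)

# AP = PD, column by column: A x_i = lambda_i x_i
assert np.allclose(A @ P, P @ D)

# distinct eigenvalues, so P is invertible and A = P D P^-1
assert np.allclose(A, P @ D @ np.linalg.inv(P))
print(sorted(eigenvalues.real))   # [2.0, 3.0, 5.0]
```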

Eigendecomposition
–Spectral theorem
Conditions for a matrix to be diagonalizable:
A a Hermitian matrix on a space V (complex or real) endowed with an inner product
–⟨Ax, y⟩ = ⟨x, Ay⟩
There exists an orthonormal basis of V consisting of eigenvectors of A. The eigenvalues are real.
Spectral decomposition of A
–for each distinct eigenvalue λ, V_λ = {v ∈ V: Av = λv}
–V is the direct sum of the V_λ
–Diagonalization
if A is normal (and hence if it is Hermitian, and hence if it is real symmetric) then there exists a decomposition
–A = UΛU*
–Λ is diagonal; its entries are the eigenvalues of A
–U is unitary; its columns are the eigenvectors of A

Eigendecomposition
–The case of non-symmetric matrices
r_k right eigenvectors: A r_k = λ_k r_k
l_k left eigenvectors: l_k^T A = λ_k l_k^T
–If A is real: A^T l_k = λ_k l_k
–If A is symmetric: r_k = l_k

Eigendecomposition
–Eigendecomposition in Matlab
–Syntax
d = eig(A)
d = eig(A,B)
[V,D] = eig(A)
[V,D] = eig(A,'nobalance')
[V,D] = eig(A,B)
[V,D] = eig(A,B,flag)

Jordan Normal Form
–Jordan normal form
A square n × n matrix A is diagonalizable iff the sum of the dimensions of its eigenspaces is n ⇔ it has n linearly independent eigenvectors.
Not all matrices are diagonalizable.
Given A, there always exists an invertible matrix P such that
–A = PJP^-1
–J has nonzero entries only on the main diagonal and the superdiagonal
–J is in Jordan normal form
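A small numerical illustration of a non-diagonalizable (defective) matrix, using a 2 × 2 Jordan block as an example (a sketch, not from the slides):

```python
import numpy as np

# Jordan block with eigenvalue 4: algebraic multiplicity 2
J = np.array([[4.0, 1.0],
              [0.0, 4.0]])

# dimension of the eigenspace = n - rank(J - 4 I)
geometric_multiplicity = 2 - np.linalg.matrix_rank(J - 4 * np.eye(2))
print(geometric_multiplicity)   # 1

# only one independent eigenvector, so J has no eigenbasis
# and cannot be diagonalized; J is already in Jordan normal form
```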

Jordan Normal Form
–Example
Consider the following matrix:
The characteristic polynomial of A is:
The eigenvalues are 1, 2, 4 and 4.
The eigenspace corresponding to the eigenvalue 1 can be found by solving the equation Av = v.
The geometric multiplicity (i.e. the dimension of the eigenspace of the given eigenvalue) of each of the three eigenvalues is one. Therefore, the two eigenvalues equal to 4 correspond to a single Jordan block.

Jordan Normal Form
–Example
The Jordan normal form of the matrix A is the direct sum of the three Jordan blocks. The matrix J is almost diagonal. This is the Jordan normal form of A.

Schur Normal Form
–Matrix decompositions
Schur
–A a square complex n × n matrix
–A = QUQ*
–Q unitary
–Q* the conjugate transpose of Q
–U upper triangular
–The diagonal entries of U are the eigenvalues of A

SVD
–Matrix decompositions
SVD
–A generalization of the spectral theorem
–M an m × n matrix
–M = UΣV*
–U m × m unitary; its columns form an orthonormal basis of the output space
–V n × n unitary; its columns form an orthonormal basis of the input space
–V* the conjugate transpose of V
–Σ a diagonal matrix with non-negative entries, the singular values
–Mv = σu, M*u = σv, with σ a singular value, u a left singular vector, v a right singular vector
–The columns of U are the left singular vectors u
–The columns of V are the right singular vectors v
–Application to dimensionality reduction: Principal Components Analysis
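A NumPy sketch of the factorization M = UΣV* (the matrix M is an arbitrary real example, so V* reduces to V^T):

```python
import numpy as np

M = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

# s holds the singular values in decreasing order; Vt is V*
U, s, Vt = np.linalg.svd(M, full_matrices=True)

# rebuild the m x n diagonal matrix Sigma from the singular values
Sigma = np.zeros_like(M)
Sigma[:2, :2] = np.diag(s)

assert np.allclose(U @ Sigma @ Vt, M)       # M = U Sigma V*
assert np.all(s >= 0)                       # singular values are non-negative
assert np.allclose(U.T @ U, np.eye(3))      # U unitary (orthogonal here)
assert np.allclose(Vt @ Vt.T, np.eye(2))    # V unitary
print(np.round(s, 6))
```

Keeping only the largest singular values (truncating Σ) is the low-rank approximation behind Principal Components Analysis.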