[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]Contents

Writing up solutions to these exercises and problems turns out to take a great deal of time, and I am sorry I have not been able to keep at it consistently. I hope to pick it up again later.

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]PrI.6.1

Given a basis $U=(u_1,\cdots,u_n)$, not necessarily orthonormal, in $\mathscr{H}$, how would you compute the biorthogonal basis $(v_1,\cdots,v_n)$? Find a formula that expresses $\langle v_j,x\rangle$ for each $x\in\mathscr{H}$ and $j=1,\cdots,n$ in terms of Gram matrices.

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.5.10

Every $k\times k$ positive matrix $A=(a_{ij})$ can be realised as a Gram matrix, i.e., vectors $x_j$, $1\leq j\leq k$, can be found so that $a_{ij}=\langle x_i,x_j\rangle$ for all $i,j$.
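
For example, the positive matrix $\begin{pmatrix}1&1\\1&2\end{pmatrix}$ is the Gram matrix of $x_1=(1,0)$ and $x_2=(1,1)$ in $\mathbb{C}^2$: $$\langle x_1,x_1\rangle=1,\quad \langle x_1,x_2\rangle=\langle x_2,x_1\rangle=1,\quad \langle x_2,x_2\rangle=2.$$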

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.5.9

(Schur's Theorem) If $A$ is positive, then $$\operatorname{per}(A)\geq \det A.$$
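
For a $2\times 2$ positive matrix this can be checked directly: $$\operatorname{per}\begin{pmatrix}a&b\\ \bar b&d\end{pmatrix}=ad+|b|^2\ \geq\ ad-|b|^2=\det\begin{pmatrix}a&b\\ \bar b&d\end{pmatrix}.$$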

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.5.8

Prove that for any matrices $A,B$ we have $$|\operatorname{per}(AB)|^2\leq \operatorname{per}(AA^*)\cdot \operatorname{per}(B^*B).$$ (The corresponding relation for determinants is an easy equality.)
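
The determinant equality referred to, for square $A,B$: $$|\det(AB)|^2=\det(AB)\,\det\big((AB)^*\big)=\det A\,\det B\,\det B^*\,\det A^*=\det(AA^*)\cdot\det(B^*B).$$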

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.5.7

Prove that for any vectors $$u_1,\cdots,u_k,\quad v_1,\cdots,v_k,$$ we have $$|\det(\langle u_i,v_j\rangle)|^2 \leq \det\big(\langle u_i,u_j\rangle\big)\cdot \det\big(\langle v_i,v_j\rangle\big),$$ $$|\operatorname{per}(\langle u_i,v_j\rangle)|^2 \leq \operatorname{per}\big(\langle u_i,u_j\rangle\big)\cdot \operatorname{per}\big(\langle v_i,v_j\rangle\big).$$
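
For $k=1$ both inequalities reduce to the Cauchy-Schwarz inequality $$|\langle u_1,v_1\rangle|^2\leq \langle u_1,u_1\rangle\,\langle v_1,v_1\rangle.$$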

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.5.6

Let $A$ be a nilpotent operator. Show how to obtain, from a Jordan basis for $A$, a Jordan basis of $\wedge^2A$.
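
For example, if $A$ is a single nilpotent Jordan block on the basis $e_1,e_2,e_3$ (so $Ae_1=0$, $Ae_2=e_1$, $Ae_3=e_2$), then on the basis $e_1\wedge e_2,\ e_1\wedge e_3,\ e_2\wedge e_3$ of $\wedge^2\mathscr{H}$ we get $$(\wedge^2A)(e_2\wedge e_3)=e_1\wedge e_2,\qquad (\wedge^2A)(e_1\wedge e_2)=(\wedge^2A)(e_1\wedge e_3)=0,$$ so $\wedge^2A$ consists of Jordan blocks of sizes $2$ and $1$.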

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.5.5

Show that the inner product $$\langle x_1\vee \cdots \vee x_k,\ y_1\vee \cdots\vee y_k\rangle$$ is equal to the permanent of the $k\times k$ matrix $\big(\langle x_i,y_j\rangle\big)$.
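
For $k=2$ this reads $$\langle x_1\vee x_2,\ y_1\vee y_2\rangle=\langle x_1,y_1\rangle\langle x_2,y_2\rangle+\langle x_1,y_2\rangle\langle x_2,y_1\rangle.$$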

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.5.4

If $\dim \mathscr{H}=3$, then $\dim \otimes^3\mathscr{H} =27$, $\dim \wedge^3\mathscr{H} =1$ and $\dim \vee^3\mathscr{H} =10$. In terms of an orthonormal basis of $\mathscr{H}$, write an element of $(\wedge^3\mathscr{H} \oplus \vee^3\mathscr{H})^\perp$.
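
The dimension count: $$\dim\otimes^3\mathscr{H}=3^3=27,\qquad \dim\wedge^3\mathscr{H}=\binom{3}{3}=1,\qquad \dim\vee^3\mathscr{H}=\binom{3+3-1}{3}=10,$$ so the orthogonal complement of $\wedge^3\mathscr{H}\oplus\vee^3\mathscr{H}$ in $\otimes^3\mathscr{H}$ has dimension $27-1-10=16$.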

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.5.3

Let $\mathscr{M}$ be a $p$-dimensional subspace of $\mathscr{H}$ and $\mathscr{N}$ its orthogonal complement. Choosing $j$ vectors from $\mathscr{M}$ and $k-j$ vectors from $\mathscr{N}$ and forming the linear span of the antisymmetric tensor products of all such vectors, we get different subspaces of $\wedge^k\mathscr{H}$; for example, one of those is $\wedge^k\mathscr{M}$. Determine all the subspaces thus obtained and their dimensionalities. Do the same for $\vee^k\mathscr{H}$.

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.5.2

The elementary tensors $x\otimes \cdots \otimes x$, with all factors equal, are all in the subspace $\vee^k\mathscr{H}$. Do they span it?
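
For $k=2$ the polarization identity $$(x+y)\otimes(x+y)-x\otimes x-y\otimes y=x\otimes y+y\otimes x$$ shows that the symmetric tensor $x\otimes y+y\otimes x$ already lies in the span of the tensors with equal factors.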

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.5.1

Show that the inner product $$\langle x_1\wedge \cdots \wedge x_k,\ y_1\wedge \cdots\wedge y_k\rangle$$ is equal to the determinant of the $k\times k$ matrix $\big(\langle x_i,y_j\rangle\big)$.
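
For $k=2$ this reads $$\langle x_1\wedge x_2,\ y_1\wedge y_2\rangle=\langle x_1,y_1\rangle\langle x_2,y_2\rangle-\langle x_1,y_2\rangle\langle x_2,y_1\rangle.$$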

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.4.6

Let $A$ and $B$ be two matrices (not necessarily of the same size). Relative to the lexicographically ordered basis on the space of tensors, the matrix for $A\otimes B$ can be written in block form as follows: if $A=(a_{ij})$, then $$A\otimes B=\begin{pmatrix} a_{11}B&\cdots&a_{1n}B\\ \vdots&\ddots&\vdots\\ a_{n1}B&\cdots&a_{nn}B \end{pmatrix}.$$
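
A concrete instance: $$\begin{pmatrix}1&2\\3&4\end{pmatrix}\otimes\begin{pmatrix}0&1\\1&0\end{pmatrix}=\begin{pmatrix}0&1&0&2\\1&0&2&0\\0&3&0&4\\3&0&4&0\end{pmatrix}.$$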

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.4.5

Suppose it is known that $\mathscr{M}$ is an invariant subspace for $A$. What invariant subspaces for $A\otimes A$ can be obtained from this information alone?

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.4.4

(1). There is a natural isomorphism between the spaces $\mathscr{K}\otimes \mathscr{H}^*$ and $\mathscr{L}(\mathscr{H},\mathscr{K})$ in which the elementary tensor $k\otimes h^*$ corresponds to the linear map that takes a vector $u$ of $\mathscr{H}$ to $\langle h,u\rangle k$. This linear transformation has rank one, and all rank one transformations can be obtained in this way.

 

(2). An explicit description of this isomorphism $\varphi$ is outlined below. Let $e_1,\cdots,e_n$ be an orthonormal basis for $\mathscr{H}$ and for $\mathscr{H}^*$. Let $f_1,\cdots,f_m$ be an orthonormal basis of $\mathscr{K}$. Identify each element of $\mathscr{L}(\mathscr{H},\mathscr{K})$ with its matrix with respect to these bases. Let $E_{ij}$ be the matrix all of whose entries are zero except the $(i,j)$-entry, which is $1$. Show that $\varphi(f_i\otimes e_j)=E_{ij}$ for all $1\leq i\leq m$, $1\leq j\leq n$. Thus, if $A$ is any $m\times n$ matrix with entries $a_{ij}$, then $$\varphi^{-1}(A)=\sum_{i,j}a_{ij}(f_i\otimes e_j) =\sum_{j}(Ae_j)\otimes e_j.$$

 

(3). The space $\mathscr{L}(\mathscr{H},\mathscr{K})$ is a Hilbert space with inner product $$\langle A,B\rangle= \operatorname{tr} A^*B.$$ The set $\{E_{ij}:\ 1\leq i\leq m,\ 1\leq j\leq n\}$ is an orthonormal basis for this space. Show that the map $\varphi$ is a Hilbert space isomorphism; i.e., $$\langle \varphi^{-1}(A),\varphi^{-1}(B)\rangle =\langle A,B\rangle\quad\text{for all }A,B.$$
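
A quick check of the orthonormality in (3): $$\langle E_{ij},E_{kl}\rangle=\operatorname{tr}\big(E_{ij}^*E_{kl}\big)=\sum_{a,b}\overline{(E_{ij})_{ab}}\,(E_{kl})_{ab}=\delta_{ik}\delta_{jl}.$$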

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.4.1

 

Let $x,y,z$ be linearly independent vectors in $\mathscr{H}$. Find a necessary and sufficient condition that a vector $w$ must satisfy in order that the bilinear functional $$F(u,v)=\langle x,u\rangle\langle y,v\rangle+\langle z,u\rangle\langle w,v\rangle$$ be elementary.

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.3.7

For every matrix $A$, the matrix $$\begin{pmatrix} I&A\\ 0&I \end{pmatrix}$$ is invertible and its inverse is $$\begin{pmatrix} I&-A\\ 0&I \end{pmatrix}.$$ Use this to show that if $A,B$ are any two $n\times n$ matrices, then $$\begin{pmatrix} I&A\\ 0&I \end{pmatrix}^{-1}\begin{pmatrix} AB&0\\ B&0 \end{pmatrix} \begin{pmatrix} I&A\\ 0&I \end{pmatrix}=\begin{pmatrix} 0&0\\ B&BA \end{pmatrix}.$$ This implies that $AB$ and $BA$ have the same eigenvalues. (This last fact can be proved in another way as follows. If $B$ is invertible, then $AB=B^{-1}(BA)B$, so $AB$ and $BA$ have the same eigenvalues. Since invertible matrices are dense in the space of all matrices, and the roots of a polynomial vary continuously with its coefficients, the conclusion holds in general.)
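
Written out, the block multiplication is $$\begin{pmatrix} I&-A\\ 0&I \end{pmatrix}\begin{pmatrix} AB&0\\ B&0 \end{pmatrix}=\begin{pmatrix} 0&0\\ B&0 \end{pmatrix},\qquad \begin{pmatrix} 0&0\\ B&0 \end{pmatrix}\begin{pmatrix} I&A\\ 0&I \end{pmatrix}=\begin{pmatrix} 0&0\\ B&BA \end{pmatrix}.$$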

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.3.6

If $A$ is a contraction, show that $$A^*(I-AA^*)^{1/2}=(I-A^*A)^{1/2}A^*.$$ Use this to show that if $A$ is a contraction on $\mathscr{H}$, then the operators $$U=\begin{pmatrix} A&(I-AA^*)^{1/2}\\ (I-A^*A)^{1/2}&-A^* \end{pmatrix},$$ $$V=\begin{pmatrix} A&-(I-AA^*)^{1/2}\\ (I-A^*A)^{1/2}&A^* \end{pmatrix}$$ are unitary operators on $\mathscr{H}\oplus \mathscr{H}$.
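
For instance, writing $D_A=(I-A^*A)^{1/2}$ and $D_{A^*}=(I-AA^*)^{1/2}$, the displayed relation and its adjoint $AD_A=D_{A^*}A$ give $$UU^*=\begin{pmatrix} AA^*+D_{A^*}^2 & AD_A-D_{A^*}A\\ D_AA^*-A^*D_{A^*} & D_A^2+A^*A \end{pmatrix}=\begin{pmatrix} I&0\\ 0&I \end{pmatrix}.$$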

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.3.1

Let $A=A_1\oplus A_2$. Show that

(1). $W(A)$ is the convex hull of $W(A_1)$ and $W(A_2)$; i.e., the smallest convex set containing $W(A_1)\cup W(A_2)$.

(2). $$\begin{aligned} \|A\|&=\max\{\|A_1\|,\|A_2\|\},\\ \operatorname{spr}(A)&=\max\{\operatorname{spr}(A_1),\operatorname{spr}(A_2)\},\\ w(A)&=\max\{w(A_1),w(A_2)\}. \end{aligned}$$
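
A simple illustration: for $A=(1)\oplus(i)=\operatorname{diag}(1,i)$, $$W(A)=\{t\cdot 1+(1-t)\,i:\ 0\leq t\leq 1\},\qquad \|A\|=\operatorname{spr}(A)=w(A)=1.$$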

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.2.10

(1). The numerical radius defines a norm on $\mathscr{L}(\mathscr{H})$.

(2). $w(UAU^*)=w(A)$ for all $U\in U(n)$.

(3). $w(A)\leq \|A\|\leq 2w(A)$ for all $A$.

(4). $w(A)=\|A\|$ if (but not only if) $A$ is normal.
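
The Jordan block $A=\begin{pmatrix}0&1\\0&0\end{pmatrix}$ shows that the factor $2$ in (3) cannot be improved: here $W(A)$ is the closed disc of radius $1/2$ centred at $0$, so $$w(A)=\tfrac12,\qquad \|A\|=1=2\,w(A).$$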

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.2.9

(1). When $A$ is normal, the set $W(A)$ is the convex hull of the eigenvalues of $A$. For nonnormal matrices, $W(A)$ may be bigger than the convex hull of its eigenvalues. For Hermitian operators, the first statement says that $W(A)$ is the closed interval whose endpoints are the smallest and the largest eigenvalues of $A$.

(2). If a unit vector $x$ belongs to the linear span of the eigenspaces corresponding to eigenvalues $\lambda_1,\cdots,\lambda_k$ of a normal operator $A$, then $\langle x,Ax\rangle$ lies in the convex hull of $\lambda_1,\cdots,\lambda_k$. (This fact will be used frequently in Chapter III.)
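
For example, for the normal matrix $A=\operatorname{diag}(1,i,-1)$, $W(A)$ is the closed triangle with vertices $1$, $i$, $-1$; while for the nonnormal Jordan block $\begin{pmatrix}0&1\\0&0\end{pmatrix}$, whose only eigenvalue is $0$, $W(A)$ is the closed disc of radius $1/2$ about $0$.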

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.2.8

For any matrix $A$ the series $$ex exp A=I+A+frac{A^2}{2!}+cdots+frac{A^n}{n!}+cdots eex$$ converges. This is called the exponential of $A$. The matrix $A$ is always invertible and $$ex (exp A)^{-1}=exp(-A). eex$$ Conversely, every invertible matrix can be expressed as the exponential of some matrix. Every unitary matrix can be expressed as the exponential of a skew-Hermitian matrix.

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.2.7

The set of all invertible matrices is a dense open subset of the set of all $n\times n$ matrices. The set of all unitary matrices is a compact subset of the set of all $n\times n$ matrices. These two sets are also groups under multiplication. They are called the general linear group $GL(n)$ and the unitary group $U(n)$, respectively.
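
The density can be seen directly: if $A$ is singular, then $A+\varepsilon I$ is invertible for every $\varepsilon>0$ such that $-\varepsilon$ is not an eigenvalue of $A$, which excludes at most $n$ values; hence $A=\lim_{\varepsilon\to 0}(A+\varepsilon I)$ along invertible matrices.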

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.2.6

If $\|A\|<1$, then $I-A$ is invertible, and $$(I-A)^{-1}=I+A+A^2+\cdots,$$ a convergent power series. This is called the Neumann series.
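
The key estimate: the partial sums satisfy $$(I-A)\sum_{m=0}^{N}A^m=I-A^{N+1}\longrightarrow I \quad\text{as } N\to\infty,$$ since $\|A^{N+1}\|\leq\|A\|^{N+1}\to 0$.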

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.2.5

Show that matrices with distinct eigenvalues are dense in the space of all $n\times n$ matrices. (Use the Schur triangularisation.)

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.2.4

(1). The singular value decomposition leads to the polar decomposition: Every operator $A$ can be written as $A=UP$, where $U$ is unitary and $P$ is positive. In this decomposition the positive part $P$ is unique, $P=|A|$. The unitary part $U$ is unique if $A$ is invertible.

(2). An operator $A$ is normal if and only if the factors $U$ and $P$ in the polar decomposition of $A$ commute.

(3). We have derived the polar decomposition from the singular value decomposition. Show that it is possible to derive the latter from the former.
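
The derivation mentioned in (3), written out: if $A=W\Sigma V^*$ is a singular value decomposition, with $W,V$ unitary and $\Sigma$ diagonal with nonnegative entries, then $$A=(WV^*)(V\Sigma V^*)=UP,\qquad U=WV^*\ \text{unitary},\quad P=V\Sigma V^*\ \text{positive}.$$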

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.2.3

(1). Let $\{A_\alpha\}$ be a family of mutually commuting operators. Then there exists a common Schur basis for $\{A_\alpha\}$. In other words, there exists a unitary $Q$ such that $Q^*A_\alpha Q$ is upper triangular for all $\alpha$.

(2). Let $\{A_\alpha\}$ be a family of mutually commuting normal operators. Then there exists a unitary $Q$ such that $Q^*A_\alpha Q$ is diagonal for all $\alpha$.

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.2.2

Show that the following statements are equivalent:

(1). $A$ is positive.

(2). $A=B^*B$ for some $B$.

(3). $A=T^*T$ for some upper triangular $T$.

(4). $A=T^*T$ for some upper triangular $T$ with nonnegative diagonal entries. If $A$ is positive definite, then the factorization in (4) is unique. This is called the Cholesky decomposition of $A$.
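
A small instance of (4): $$\begin{pmatrix}4&2\\2&2\end{pmatrix}=T^*T,\qquad T=\begin{pmatrix}2&1\\0&1\end{pmatrix}.$$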

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.2.1

For fixed bases in $\mathscr{H}$ and $\mathscr{K}$, the matrix of $A^*$ is the conjugate transpose of the matrix of $A$.
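
For example, with the standard bases of $\mathbb{C}^2$, $$A=\begin{pmatrix}1&i\\0&2\end{pmatrix}\ \Longrightarrow\ A^*=\begin{pmatrix}1&0\\-i&2\end{pmatrix}.$$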

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.1.3

Use the QR decomposition to prove Hadamard's inequality: if $X=(x_1,\cdots,x_n)$, then $$|\det X|\leq \prod_{j=1}^n \|x_j\|.$$ Equality holds here if and only if the $x_j$ are mutually orthogonal or some $x_j$ is zero.
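
For instance, for $X=\begin{pmatrix}1&1\\0&1\end{pmatrix}$ we get $$|\det X|=1<\sqrt2=\|x_1\|\,\|x_2\|,$$ and the inequality is strict because the two columns are not orthogonal.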

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.1.2

Let $X$ be any basis of $\mathscr{H}$ and let $Y$ be the basis biorthogonal to it. Using matrix multiplication, $X$ gives a linear transformation from $\mathbb{C}^n$ to $\mathscr{H}$. The inverse of this is given by $Y^*$. In the special case when $X$ is orthonormal (so that $Y=X$), this transformation is inner-product-preserving if the standard inner product is used on $\mathbb{C}^n$.

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.1.1

Given any $k$-tuple of linearly independent vectors $X$ as above, there exists a $k$-tuple $Y$ biorthogonal to it. If $k=n$, this $Y$ is unique.
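
For example, in $\mathbb{C}^2$ with $x_1=(1,0)$ and $x_2=(1,1)$ (so $k=n=2$), the unique biorthogonal pair is $y_1=(1,-1)$, $y_2=(0,1)$; whereas if $X$ consists of $x_1=(1,1)$ alone ($k=1<n$), both $(1,0)$ and $(0,1)$ are biorthogonal to it, so $Y$ is not unique.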

Original post: https://www.cnblogs.com/zhangzujin/p/4101248.html