On a bound on the norm of a matrix with a simple pole

  • #1
psie
TL;DR Summary
In Ordinary Differential Equations by Andersson and Böiers, I'm reading about so-called weakly singular systems, that is, ##\pmb{x}'(z)=A(z)\pmb{x}(z)## where ##A(z)## is analytic except at the origin where it has a simple pole (this means all its entries are analytic with at most a simple pole). I'm confused about an estimate made in a proof in this section.
Let ##A(z)## be a matrix function with a simple pole at the origin; in other words, we can expand it into a Laurent series of the form ##\frac1{z}A_{-1}+A_0+zA_1+\ldots##, where ##A_i## are constant matrices and ##A_{-1}\neq 0##. Fix ##\theta_0\in[0,2\pi)## and ##c\in(0,1)## (here ##1## could also be any other real, finite number) and let ##0<s<c##. My textbook claims that $$\lVert A(se^{i\theta_0})\rVert\leq m|se^{i\theta_0}|^{-1}=\frac{m}{s},\qquad 0<s<c,$$ for some ##m>0## and that this should follow from the inequality ##\lVert A\rVert\leq \left(\sum_{j,k=0}^n |a_{jk}|^2\right)^{1/2}##. I do not understand this, because consider for instance $$\begin{bmatrix} \frac1{z}&1\\ 2&3 \end{bmatrix}=\frac1{z}\begin{bmatrix}1&0\\ 0&0\end{bmatrix}+\begin{bmatrix}0&1\\ 2&3\end{bmatrix}$$ I don't see how the claimed inequality follows from ##\lVert A\rVert\leq \left(\sum_{j,k=0}^n |a_{jk}|^2\right)^{1/2}## in this case, since it seems like we can't factor out ##\frac1{s}## from the sum.
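A quick numerical probe of the claim for this example matrix, as a sketch only: the values ##\theta_0=\pi/4## and ##c=0.9## are assumptions, not from the text, and the norm used is the Frobenius bound quoted above. If ##\lVert A(se^{i\theta_0})\rVert\leq m/s## holds, then ##s\,\lVert A(se^{i\theta_0})\rVert## must stay bounded as ##s\to 0##.

```python
import numpy as np

# Probe the claimed bound for A(z) = [[1/z, 1], [2, 3]] along the ray
# z = s * exp(i * theta_0): if ||A(z)|| <= m/s, then s * ||A(z)|| stays bounded.
# theta_0 = pi/4 and c = 0.9 are assumed values, used only for illustration.
theta0, c = np.pi / 4, 0.9
for s in [0.5, 0.1, 0.01, 0.001, 1e-6]:
    z = s * np.exp(1j * theta0)
    A = np.array([[1.0 / z, 1.0], [2.0, 3.0]])
    frob = np.linalg.norm(A, 'fro')          # (sum_{j,k} |a_{jk}|^2)^{1/2}
    print(f"s = {s:.0e},  s * ||A||_F = {s * frob:.4f}")
# Here s * ||A||_F = sqrt(1 + 14 s^2), which tends to ||A_{-1}||_F = 1 as s -> 0.
```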
 
  • #2
Hill
If I am not mistaken, we could replace all entries that do not have poles with ##b_{jk}=\frac 1 s a_{jk}## and leave ##b_{jk} =a_{jk}## for entries with the poles. Then $$\lVert A\rVert\leq \left(\sum_{j,k=0}^n |a_{jk}|^2\right)^{1/2}\leq \left(\sum_{j,k=0}^n |b_{jk}|^2\right)^{1/2}$$ and we can factor out ##\frac 1 s## from the sum on the right.
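For the ##2\times 2## example in the opening post this can be made concrete; the sketch below assumes ##c=0.9##, and the entrywise constants ##C_{jk}## are just one admissible choice. Each entry satisfies ##|a_{jk}(se^{i\theta})|\leq C_{jk}/s## for ##0<s<c##, so ##1/s## does factor out of the square root.

```python
import numpy as np

c = 0.9                                # assumed value, 0 < c < 1
# Entrywise majorants for A(z) = [[1/z, 1], [2, 3]] on 0 < |z| = s < c:
# |1/z| = 1/s,  |1| <= c/s,  |2| <= 2c/s,  |3| <= 3c/s   (since c/s > 1).
C = np.array([[1.0, c], [2.0 * c, 3.0 * c]])
m = np.linalg.norm(C, 'fro')           # 1/s factored out of the square root

for s in np.linspace(1e-3, c, 50, endpoint=False):
    for theta in np.linspace(0.0, 2.0 * np.pi, 16):
        z = s * np.exp(1j * theta)
        A = np.array([[1.0 / z, 1.0], [2.0, 3.0]])
        assert np.linalg.norm(A, 'fro') <= m / s + 1e-12
print("||A(z)||_F <= m/s verified on a grid, with m =", round(m, 4))
```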
 
  • #3
psie
Hill said:
If I am not mistaken, we could replace all entries that do not have poles with ##b_{jk}=\frac 1 s a_{jk}## and leave ##b_{jk} =a_{jk}## for entries with the poles. Then $$\lVert A\rVert\leq \left(\sum_{j,k=0}^n |a_{jk}|^2\right)^{1/2}\leq \left(\sum_{j,k=0}^n |b_{jk}|^2\right)^{1/2}$$ and we can factor out ##\frac 1 s## from the sum on the right.
If I understand you right, you mean that we simply write ##a_{jk}=\frac{s}{s}a_{jk}\leq\frac{c}{s}a_{jk}## and ##c## gets absorbed by ##a_{jk}##.
 
  • #4
Hill
psie said:
If I understand you right, you mean that we simply write ##a_{jk}=\frac{s}{s}a_{jk}\leq\frac{c}{s}a_{jk}## and ##c## gets absorbed by ##a_{jk}##.
Yes, this is another way to put it.
 
  • #5
Careful, I don't think we can say, ##\frac{s}{s}a_{jk}\leq\frac{c}{s}a_{jk}##. But we can say, ##|\frac{s}{s}a_{jk}|\leq|\frac{c}{s}a_{jk}|##, and this is all we need.
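A one-line illustration of the distinction, with arbitrary sample values that are not from the thread:

```python
s, c, a = 0.5, 0.9, -2.0                      # sample values with 0 < s < c and a < 0
print((s / s) * a <= (c / s) * a)             # False: the bare inequality fails for negative a
print(abs((s / s) * a) <= abs((c / s) * a))   # True: the absolute-value version holds
```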
 
  • #6
fresh_42
How about this?

Set ##A(z)=A_{-1}z^{-1} +B(z)## where ##\|A_{-1}\|=m'\, , \,\|B\|=b<\infty ## and ##m=m'+bc^2.## Then
\begin{align*}
\|A(z)\|&\leq \|A_{-1}\|\cdot \dfrac{1}{|z|} + \|B(z)\|\leq \|A_{-1}\|\cdot \dfrac{1}{|z|} + \|B\|\cdot|z|\\
&\leq \dfrac{m'}{s}+ b\cdot s \leq \dfrac{m'}{s}+ b\cdot c =\dfrac{m'+bsc}{s}\leq\dfrac{m}{s}
\end{align*}
 
  • #7
fresh_42
I am not sure whether ##\|B(z)\|\leq \|B\|\cdot |z|## is true. Maybe we need something different as an upper bound for the regular part of the Laurent series. There must be something that keeps it from going to infinity, namely staying within the radius of convergence.
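A minimal sketch of why that step is suspect, under the (assumed) reading ##\|B\|=\|A_0\|##, which is not from the thread: a nonzero constant regular part already violates ##\|B(z)\|\leq\|B\|\cdot|z|## near the origin.

```python
import numpy as np

# If B(z) = A_0 is a nonzero constant matrix, then ||B(z)|| = ||A_0|| for all z,
# while ||B|| * |z| -> 0 as z -> 0, so the estimate ||B(z)|| <= ||B|| * |z|
# fails for small |z|  (here ||B|| is read as ||A_0||, an assumed interpretation).
A_0 = np.array([[0.0, 1.0], [2.0, 3.0]])
B_norm = np.linalg.norm(A_0, 'fro')

z = 0.01
lhs = np.linalg.norm(A_0, 'fro')   # ||B(z)||, independent of z
rhs = B_norm * abs(z)              # ||B|| * |z|
print(lhs <= rhs)                  # prints False
```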
 
  • #8
psie
fresh_42 said:
I am not sure whether ##\|B(z)\|\leq \|B\|\cdot |z|## is true. Maybe we need something different as an upper bound for the regular part of the Laurent series. There must be something that keeps it from going to infinity, namely staying within the radius of convergence.
Are you thinking about the inequality ##\lVert A x\rVert\leq \lVert A \rVert\cdot \lVert x\rVert## where ##x## is a vector? This inequality is e.g. listed here. Given ##B(z)##, I'm confused about what ##\|B\|## would be in this case.
 
  • #9
fresh_42
psie said:
Are you thinking about the inequality ##\lVert A x\rVert\leq \lVert A \rVert\cdot \lVert x\rVert## where ##x## is a vector? This inequality is e.g. listed here. Given ##B(z)##, I'm confused about what ##\|B\|## would be in this case.
Yes, that was my first impetus because it is a standard reflex to apply ##\|Mx\|\leq\|M\|\cdot\|x\|.## But it is probably wrong here.

Nevertheless, ##A(z)## has to be bounded away from its pole, since otherwise it would have another pole. And if ##A(z)## is bounded at points ##z=se^{i\theta}## that keep a fixed distance from the pole, then ##B(z)=A_0+A_1z+A_2z^2+\ldots## is bounded there, too.

Assume ##B## were not bounded on the compact disc ##D:=\{|z| \leq c\},## say at some ##p\in D.## Then
$$
\|A(p)\| =\|A_{-1}p^{-1}+ B(p)\| =\infty .
$$
Since the only pole of ##A## is at ##z=0,## this would force ##p=0##; but ##B(0)=A_0## is a constant matrix and therefore finite. Hence ##B## is bounded on ##D##, and then
$$
\|B(z)\| \leq \sup_{z\in D}\|B(z)\| =:d<\infty .
$$
The corrected version is thus:
fresh_42 said:
Set ##A(z)=A_{-1}z^{-1} +B(z)## where ##\|A_{-1}\|=m'## and ##m=m'+dc.## Then
\begin{align*}
\|A(z)\|&\leq \|A_{-1}\|\cdot \dfrac{1}{|z|} + \|B(z)\|\leq \|A_{-1}\|\cdot \dfrac{1}{|z|} + d\\
&=\|A_{-1}\| \cdot \dfrac{1}{s}+ \dfrac{ds}{s}=\dfrac{m'+ds}{s} \leq\dfrac{m}{s}
\end{align*}
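A numerical check of this corrected estimate for the example from post #1, as a sketch only: the value ##c=0.9##, the sampling grid, and the use of the Frobenius norm are assumptions, and ##d## is estimated by sampling the disc (exact here because the regular part happens to be constant).

```python
import numpy as np

A_m1 = np.array([[1.0, 0.0], [0.0, 0.0]], dtype=complex)   # A_{-1}
A_0  = np.array([[0.0, 1.0], [2.0, 3.0]], dtype=complex)   # A_0

def B(z):
    # Regular part of A(z) = A_{-1}/z + B(z); constant for this example.
    return A_0

c = 0.9                                # assumed value of c
m_prime = np.linalg.norm(A_m1, 'fro')  # m' = ||A_{-1}||

# d = sup_{|z| <= c} ||B(z)||, estimated on a grid over the closed disc.
grid = [r * np.exp(1j * t) for r in np.linspace(0.0, c, 20)
        for t in np.linspace(0.0, 2.0 * np.pi, 20)]
d = max(np.linalg.norm(B(z), 'fro') for z in grid)
m = m_prime + d * c                    # constant from the corrected estimate

for s in np.linspace(1e-3, c, 50, endpoint=False):
    z = s * np.exp(1j * np.pi / 3)     # arbitrary direction theta_0
    A = A_m1 / z + B(z)
    assert np.linalg.norm(A, 'fro') <= m / s + 1e-12
print("||A(s e^{i theta_0})||_F <= m/s for 0 < s < c, with m =", round(m, 4))
```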
 
