Math Equations

Thursday, November 26, 2015

Continuous matrices

This will be a naive discussion of functions roughly of the form $A: \mathbb{R}^2\to\mathbb{R}$, but allowing delta behaviour, which I only know informally so far. Call such an $A$ a continuous matrix. Addition $A+B$ is done componentwise, and multiplication $AB$ is carried out by integrating
$$(AB)(x,y)=\int_{-\infty}^{+\infty}A(x,z)B(z,y)dz.$$
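
As a quick numerical sketch (my own, with an arbitrary Gaussian test kernel): discretizing on a grid turns a continuous matrix into an ordinary matrix, and the integral above into an ordinary matrix product weighted by the grid spacing $dz$.

```python
import numpy as np

# Grid on which the continuous matrices are sampled.
L, N = 12.0, 601
t = np.linspace(-L, L, N)
dz = t[1] - t[0]

# Normal density with variance s2, used to build a test kernel.
def g(u, s2):
    return np.exp(-u**2 / (2 * s2)) / np.sqrt(2 * np.pi * s2)

# A(x, z) = standard normal density in x - z.
A = g(t[:, None] - t[None, :], 1.0)

# (AA)(x, y) = ∫ A(x, z) A(z, y) dz: a matrix product weighted by dz.
AA = A @ A * dz

# Closed form: convolving two N(0, 1) densities gives an N(0, 2) density,
# so (AA)(x, y) should match g(x - y, 2) away from the grid edges.
expected = g(t[:, None] - t[None, :], 2.0)
print(np.abs(AA - expected)[100:-100, 100:-100].max())  # tiny
```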

The plot operator $P$, defined through $P_f(x,y)=\delta(x-f(y))$, turns a single-variable function into a continuous matrix. One interesting property, following from the sifting behaviour of the delta, is that
$$ P_fP_g=\int_{-\infty}^{+\infty}\delta(x-f(z))\delta(z-g(y))\,dz=\delta(x-f(g(y)))=P_{f\circ g}.$$ Combined with the usual associativity and distributivity of matrix operations, this makes it possible to interpret an arbitrary continuous matrix as a superposition of graphs $A = \sum_i \alpha_iP_{f_i}$ of single-variable functions $f_i$, and to interpret the multiplication of two matrices $A,B$ as a weaving together of all possible compositions: $AB = \sum_{i,j}\alpha_i\beta_jP_{f_i\circ g_j}$. But all this is just conceptual speculation.
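
True deltas can't go on a computer, but here is a rough sanity check (my own sketch) with $\delta$ replaced by a narrow Gaussian: the product $P_fP_g$ comes out slightly broadened, but its ridge should trace the graph of $f\circ g$.

```python
import numpy as np

L, N = 6.0, 1201
t = np.linspace(-L, L, N)
dz = t[1] - t[0]
sig = 0.05  # mollifier width; keep dz << sig
delta = lambda u: np.exp(-u**2 / (2 * sig**2)) / (sig * np.sqrt(2 * np.pi))

f = lambda u: np.sin(u)   # arbitrary test functions
g = lambda u: 0.5 * u

# Mollified plot operators: P_f(x, y) = delta(x - f(y)).
Pf = delta(t[:, None] - f(t)[None, :])
Pg = delta(t[:, None] - g(t)[None, :])

prod = Pf @ Pg * dz  # candidate for P_{f∘g}

# For each column y, the ridge of the product should sit at x = f(g(y)).
ridge = t[np.argmax(prod, axis=0)]
print(np.abs(ridge - f(g(t))).max())  # on the order of the grid spacing
```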


Here is a new light on Fourier transforms: take $F(x,y)=e^{-ixy}$. The inversion formula $F^2f(x)=\tau f(-x)$, with $\tau=2\pi$, can be written as the matrix identity $F^2=\tau\,\delta(x+y)$ and verified using the definition of matrix multiplication.
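
Writing it out, the verification is just the distributional identity $\int_{-\infty}^{+\infty}e^{-izt}\,dz=2\pi\,\delta(t)$:
$$F^2(x,y)=\int_{-\infty}^{+\infty}e^{-ixz}e^{-izy}\,dz=\int_{-\infty}^{+\infty}e^{-iz(x+y)}\,dz=\tau\,\delta(x+y),$$ and applying this matrix to $f$ gives $F^2f(x)=\tau\int\delta(x+y)f(y)\,dy=\tau f(-x)$ as claimed.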


In simple cases of signal analysis, a linear operator $T$ transforming the signal is characterized by its impulse response $h=T\delta$ via $Tx(t)=\int_{-\infty}^{+\infty}h(t-u)x(u)\,du$, i.e. through the convolution $Tx=h*x$. The $h*$ part can be written as a matrix. Indeed, for any functions $x,y$,
$$ x*y(t) = \int_{-\infty}^{+\infty}x(t-u)y(u)du = C_xy(t) $$ where $C_f(x,y)=f(x-y)$ in general. In this case we have $Tx=h*x=C_hx$ so that $T=C_h$ in the same sense that a finite-dimensional linear transformation is equal to its transformation matrix.
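
Discretized, $C_h$ is just a Toeplitz matrix, and $C_hx$ should agree with an ordinary discrete convolution. A minimal sketch (my own; it assumes a symmetric grid with an odd number of points so that the 'same'-mode alignment works out):

```python
import numpy as np

# Symmetric grid, odd number of points.
L, N = 8.0, 401
t = np.linspace(-L, L, N)
dt = t[1] - t[0]

h = np.exp(-t**2)                      # impulse response
x = np.where(np.abs(t) < 1, 1.0, 0.0)  # a box signal

# C_h(x, y) = h(x - y): a Toeplitz matrix built from the impulse response.
C = np.exp(-(t[:, None] - t[None, :])**2)

via_matrix = C @ x * dt
via_convolve = np.convolve(h, x, mode="same") * dt

print(np.abs(via_matrix - via_convolve).max())  # ~0 (floating point noise)
```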


This reminds me of the familiar way to define complex multiplication:
$$\left(\begin{matrix}a\\b\end{matrix}\right)\left(\begin{matrix}c\\d\end{matrix}\right)=\left(\begin{matrix}a&-b\\b&a\end{matrix}\right)\left(\begin{matrix}c\\d\end{matrix}\right).$$ Maybe it is more common to use matrices in both places, but with this formulation we see that $zw=M_zw$, where $M_z$ is chosen in the obvious way. The similarity between $C$ and $M$ is that both are obtained by repeating data along diagonals.
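
A quick numpy check of $zw=M_zw$ (my own illustration; it also confirms $M_zM_w=M_{zw}$, which is the point of the next paragraph):

```python
import numpy as np

def M(z):
    """The 2x2 real matrix representing multiplication by z = a + bi."""
    a, b = z.real, z.imag
    return np.array([[a, -b],
                     [b,  a]])

z, w = 2 + 3j, -1 + 4j

# zw as a matrix acting on the vector (c, d), versus the plain complex product.
print(M(z) @ np.array([w.real, w.imag]))  # [-14.  5.]
print(z * w)                              # (-14+5j)

# The representation also turns products into matrix products: M_z M_w = M_{zw}.
print(np.allclose(M(z) @ M(w), M(z * w)))  # True
```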

Another similarity: in the same way that the matrix model respects the associativity of complex number multiplication, with $M_uM_vM_w$ computing $uvw$ regardless of bracketing, the continuous matrices respect the associativity of convolutions, with $C_fC_gC_h$ computing $f*g*h$. And the same holds for the distributive property.
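
Both observations reduce to the same kind of identity: the representation turns multiplication in the algebra into matrix multiplication. For complex numbers this is $M_uM_v=M_{uv}$; for convolution matrices, substituting $u=z-y$ gives
$$(C_fC_g)(x,y)=\int_{-\infty}^{+\infty}f(x-z)g(z-y)\,dz=\int_{-\infty}^{+\infty}f(x-y-u)g(u)\,du=(f*g)(x-y)=C_{f*g}(x,y),$$ so associativity and distributivity of $*$ are inherited from those of matrix multiplication.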