This will be a naive discussion of functions roughly of the form $A: \mathbb{R}^2\to\mathbb{R}$, but allowing delta behaviour, which I only know informally so far. Call such an $A$ a continuous matrix. Addition $A+B$ is done componentwise and multiplication $AB$ is carried out by integrating
$$(AB)(x,y)=\int_{-\infty}^{+\infty}A(x,z)\,B(z,y)\,dz.$$
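As a quick sanity check (my own, with arbitrary Gaussian kernels, not part of the discussion above), the integral product can be approximated by an ordinary matrix product times the grid step:

```python
import numpy as np

# Discretize the continuous index z on a uniform grid; a "continuous matrix"
# A(x, y) becomes an ordinary n x n array sampled on that grid.
t = np.linspace(-10.0, 10.0, 401)
dz = t[1] - t[0]

A = np.exp(-(t[:, None] - t[None, :]) ** 2)   # A(x, z) = exp(-(x-z)^2)
B = np.exp(-(t[:, None] - t[None, :]) ** 2)   # B(z, y) = exp(-(z-y)^2)

# (AB)(x, y) = integral of A(x, z) B(z, y) dz  ~  Riemann sum = dz * (A @ B)
AB = dz * (A @ B)

# Closed form for this choice: sqrt(pi/2) * exp(-(x-y)^2 / 2)
exact = np.sqrt(np.pi / 2) * np.exp(-(t[:, None] - t[None, :]) ** 2 / 2)

# Compare away from the grid edges, where truncating the integral costs nothing.
inner = np.abs(t) <= 5
err = np.max(np.abs((AB - exact)[np.ix_(inner, inner)]))
```

The agreement is essentially exact in the interior of the grid; only near the edges does truncating the integration range show up.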
The plot operator $P$, defined through $P_f(x,y)=\delta(x-f(y))$, creates a continuous matrix. One interesting property, when combined with the usual associativity and distributivity of matrix operations, is that
$$P_fP_g=\int_{-\infty}^{+\infty}\delta(x-f(z))\,\delta(z-g(y))\,dz=\delta(x-f(g(y)))=P_{f\circ g}.$$
This property makes it possible to interpret an arbitrary continuous matrix as a superposition of graphs $A = \sum \alpha_iP_{f_i}$ of single-variable functions $f_i$, and to interpret multiplication of two matrices $A,B$ as a weaving together of all possible compositions: $AB = \sum\alpha_i\beta_jP_{f_i\circ g_j}.$ But all this is just conceptual speculation.
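The composition property survives discretization, if the delta is modeled crudely as a single spike of height $1/dz$ at the nearest grid point (my own construction; the grid and the functions $f,g$ are illustrative choices):

```python
import numpy as np

t = np.linspace(-5.0, 5.0, 201)
n, dz = len(t), t[1] - t[0]

def P(f):
    # Discretized plot operator: column y gets a spike of height 1/dz
    # at the grid point nearest f(y).
    M = np.zeros((n, n))
    idx = np.argmin(np.abs(t[:, None] - f(t)[None, :]), axis=0)
    M[idx, np.arange(n)] = 1.0 / dz
    return M

f = lambda s: s / 2 + 1          # chosen so all ranges stay inside the grid
g = lambda s: np.sin(s)

# P_f P_g = integral of delta(x - f(z)) delta(z - g(y)) dz ~ dz * (P(f) @ P(g))
prod = dz * (P(f) @ P(g))
comp = P(lambda s: f(g(s)))      # P_{f o g} built directly

# The spike locations should agree up to one grid cell of rounding.
off = np.abs(np.argmax(prod, axis=0) - np.argmax(comp, axis=0))
```

The spikes of $P_fP_g$ land on (or next to) the graph of $f\circ g$, as the formula predicts.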
Here is a new light on Fourier transforms: $F(x,y)=e^{-ixy}$. The inversion formula $FFf(x)=\tau f(-x)$ (with $\tau=2\pi$) can be written as $F^2=\tau\,\delta(x+y)$ and verified using the definition of matrix multiplication.
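This too can be checked numerically (my own check; the midpoint grid and the shifted Gaussian are arbitrary choices). Applying the discretized matrix $F$ twice to a test function should return $\tau f(-x)$:

```python
import numpy as np

# Symmetric midpoint grid, so that f(-x) is just the reversed sample vector.
n, L = 1600, 6.0
dz = 2 * L / n
t = -L + dz * (np.arange(n) + 0.5)

F = np.exp(-1j * np.outer(t, t))          # F(x, y) = e^{-ixy}
f = np.exp(-(t - 1.0) ** 2 / 2)           # Gaussian centred at 1 (asymmetric
                                          # on purpose, to see the flip)

FFf = dz * (F @ (dz * (F @ f)))           # F^2 f via two matrix products
flip = f[::-1]                            # f(-x) on this symmetric grid

err = np.max(np.abs(FFf - 2 * np.pi * flip))
```

Up to discretization error, $F^2f$ is indeed $2\pi$ times the reflected input, which is the matrix statement $F^2=\tau\,\delta(x+y)$ acting on $f$.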
In simple cases of signal analysis, a linear operator $T$ transforming the signal is characterized by its impulse response $h=T\delta$ in the fashion $Tx(t)=\int_{-\infty}^{+\infty}h(t-u)\,x(u)\,du$, i.e. through the convolution $Tx=h*x$. The $h*$ part can be written as a matrix. Indeed, for any functions $x,y$,
$$x*y(t)=\int_{-\infty}^{+\infty}x(t-u)\,y(u)\,du=C_xy(t)$$
where $C_f(x,y)=f(x-y)$ in general. In this case we have $Tx=h*x=C_hx$, so that $T=C_h$ in the same sense that a finite-dimensional linear transformation is equal to its transformation matrix.
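Concretely (a sketch with standard normal densities as $h$ and $x$, my choice, so that $h*x$ has a closed form), the operator "convolve with $h$" really is the matrix $C_h(x,y)=h(x-y)$:

```python
import numpy as np

t = np.linspace(-10.0, 10.0, 401)
dt = t[1] - t[0]

h = lambda s: np.exp(-s ** 2 / 2) / np.sqrt(2 * np.pi)   # N(0,1) density
x = h(t)                                                 # input signal, sampled

C_h = h(t[:, None] - t[None, :])          # C_h[i, j] = h(t_i - t_j)
Tx = dt * (C_h @ x)                       # Tx = h * x as a matrix product

# Convolution of two N(0,1) densities is the N(0,2) density.
exact = np.exp(-t ** 2 / 4) / np.sqrt(4 * np.pi)
err = np.max(np.abs(Tx - exact))
```

The matrix-vector product reproduces the analytic convolution to high accuracy.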
This reminds me of the familiar way to define complex multiplication:
$$\begin{pmatrix}a\\b\end{pmatrix}\begin{pmatrix}c\\d\end{pmatrix}=\begin{pmatrix}a&-b\\b&a\end{pmatrix}\begin{pmatrix}c\\d\end{pmatrix}.$$
Maybe it's more common to use matrices for both factors, but we see that with this formulation $zw=M_zw$ where $M_z$ is chosen in the obvious way. The similarity between $C$ and $M$ is that both are obtained by repeating data across diagonals.
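A minimal sketch of this (names `M` and `vec` are mine): $M_z$ applied to the component vector of $w$ gives the components of $zw$, and the matrices also compose correctly among themselves, $M_zM_w=M_{zw}$:

```python
import numpy as np

def M(z):
    # The 2x2 matrix repeating z = a + bi across the diagonals.
    return np.array([[z.real, -z.imag],
                     [z.imag,  z.real]])

def vec(z):
    # Component vector (a, b) of z = a + bi.
    return np.array([z.real, z.imag])

z, w = 2 + 3j, -1 + 4j

prod_as_vector = M(z) @ vec(w)            # components of z*w
prod_as_matrix = M(z) @ M(w)              # equals M(z*w)
```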
Another similarity: in the same way that the matrix model verifies the associativity of complex multiplication, $M_uM_vM_w$ mirroring $uvw$, it verifies the associativity $C_fC_gC_h$ of the convolutions $f*g*h$. And the same holds for the distributive property.
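The underlying fact is that the convolution matrices compose like the convolutions themselves: $dz\,C_fC_g$ has entries $(f*g)(x-y)$, i.e. it equals $C_{f*g}$. A numerical sketch (Gaussian densities are my choice, since $f*g$ then has a closed form):

```python
import numpy as np

t = np.linspace(-10.0, 10.0, 201)
dz = t[1] - t[0]
D = t[:, None] - t[None, :]               # D[i, j] = t_i - t_j

gauss = lambda s, v: np.exp(-s ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)
C_f = gauss(D, 1.0)                       # C_f(x, y) = f(x - y), f = N(0,1)
C_g = gauss(D, 1.0)
C_fg = gauss(D, 2.0)                      # f*g is the N(0,2) density

# Compare away from the grid edges, where truncating the integral is harmless.
inner = np.abs(t) <= 5
err = np.max(np.abs((dz * C_f @ C_g - C_fg)[np.ix_(inner, inner)]))
```

Since matrix multiplication is associative, $C_fC_gC_h$ needs no parentheses, and by this identity neither does $f*g*h$.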