With reference to this image,
I want to build a concrete understanding of the interconnection between Digital Signal Processing and Computer Vision. What would be the applications of this interconnection?
My attempt: You can treat an image as a discretization of the perspective projection of a continuous 3D region, given a near and far plane. Using this discretization, filters from signal processing can be applied to modify the signal (a 2D array) for various purposes, including edge detection, image denoising, object detection, instance/semantic/general segmentation, and many other tasks. Notably, 2-D wavelet transforms born out of signal processing are used to transform the image signal to another basis, using function spaces spanned by a finite set of orthogonal functions, perform filtering in that space, and then invert the transform to obtain a modified image.
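For concreteness, here is a rough sketch of the wavelet-domain filtering I have in mind, assuming NumPy and PyWavelets are available; the wavelet choice, decomposition level, and threshold are arbitrary placeholders:

```python
import numpy as np
import pywt

def wavelet_denoise(image, wavelet="db2", level=2, threshold=0.1):
    # Forward 2-D discrete wavelet transform: express the image in the wavelet basis.
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    approx, details = coeffs[0], coeffs[1:]
    # Filter in the transform domain: soft-threshold the detail coefficients,
    # keeping the coarse approximation untouched.
    new_details = [
        tuple(pywt.threshold(d, threshold, mode="soft") for d in level_details)
        for level_details in details
    ]
    # Invert the transform to get the modified image back in the pixel domain.
    return pywt.waverec2([approx] + new_details, wavelet)
```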
Actually, signal processing is more closely related to image processing than to computer vision. But with the advances in machine learning, computer vision uses machine learning methods that draw from advanced statistical signal processing and from estimation, detection, and classification. Signal processing is a broader term than image processing. While image processing deals with 2D image signals (plus a temporal dimension for video), there are applications in advanced signal processing where the signal has many more degrees of freedom, containing hyperspectral information in addition to spatial information, e.g. radar applications and distributed sensor processing and diagnostics.
There's not a whole lot at this point; basic filtering works on images too, but object detection is a complex problem that goes beyond the typical correlation measurements often used for detection in DSP settings.
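To illustrate the kind of correlation-based detection I mean, here is a minimal sketch (assuming NumPy and SciPy; `image` and `template` are placeholder arrays):

```python
import numpy as np
from scipy.signal import correlate2d

def locate_template(image, template):
    # Zero-mean both signals so the correlation peak is not dominated by overall brightness.
    img = image.astype(float) - image.mean()
    tpl = template.astype(float) - template.mean()
    # 2-D cross-correlation: slide the template over the image and accumulate the match score.
    score = correlate2d(img, tpl, mode="same")
    # The peak of the correlation surface is the best-matching location.
    return np.unravel_index(np.argmax(score), score.shape)
```

This kind of matched-filter-style detector works when the target's appearance is known and stable, which is rarely true for general object detection.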
A lot of DSP is done on fairly basic signals; by that I mean that for things like synchronization, the transmitted signal is generally simple and well defined.
I wish I could find some nice images on the web to illustrate. I might look for some to add later.
Digital Signal Processing is a very broad discipline that means processing an "analog" signal (in some sense) by digital means. Normally the signal is uniformly sampled (that is, equally-spaced samples in time or space or whatever). I am most used to signals in time, but images can first be described as bitmaps, which are functions of two variables $x$ and $y$ (or a vector function of two variables if color saturation and hue are included along with intensity):
$$ \begin{align} \text{intensity} \qquad & I(x,y) \\ \text{color saturation} \qquad & C_b(x,y) \\ \text{color hue} \qquad & C_r(x,y) \\ \end{align}$$
Now, any algorithm that does anything to that bitmap signal, such as processing it to create a new bitmap that looks different in some way, is "digital signal processing" in a broad sense.
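As a minimal sketch of that broad sense, here is a simple smoothing (blur) filter applied to an intensity bitmap by 2-D convolution, assuming NumPy and SciPy; the kernel size is an arbitrary placeholder:

```python
import numpy as np
from scipy import ndimage

def blur(bitmap, size=5):
    # Uniform averaging kernel: convolving with it low-pass filters the image.
    kernel = np.ones((size, size)) / (size * size)
    # The output is a new bitmap that "looks different" (smoother) than the input.
    return ndimage.convolve(bitmap, kernel, mode="reflect")
```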
In a more specific sense, consider a simple, one-dimensional example of DSP on a single-variable function: a raster scan of a bitmap (intensity only) with the intensity $I(x,y)$ fixed at a single value of $y$. This function of a single variable, $I(x,y_0)$, is the scan across a single horizontal line of the image. One application of DSP would be edge detection, where a sudden change in the function might indicate an edge. That could give you an algorithm for tracing shapes in the two-dimensional image.
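A minimal sketch of that scan-line idea, assuming NumPy; the threshold is an arbitrary placeholder:

```python
import numpy as np

def scanline_edges(scanline, threshold=30.0):
    # First difference approximates the derivative of I(x, y0) along the row.
    diff = np.diff(scanline.astype(float))
    # A sudden change in intensity (large |difference|) marks a candidate edge position.
    return np.nonzero(np.abs(diff) > threshold)[0]
```

Running this on every row (and every column) and linking the resulting points is one crude way to trace shapes in the two-dimensional image.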
The two-dimensional FFT is a signal processing algorithm applied to bitmaps. There are other similar algorithms applied to bitmaps and to other descriptions of images. All of these can, in a sense, fall under the DSP discipline.
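For example, here is a minimal sketch of filtering a bitmap through the two-dimensional FFT, assuming NumPy; the cutoff fraction is an arbitrary placeholder:

```python
import numpy as np

def fft_lowpass(bitmap, cutoff=0.1):
    # Transform to the spatial-frequency domain and centre the zero-frequency bin.
    spectrum = np.fft.fftshift(np.fft.fft2(bitmap))
    rows, cols = bitmap.shape
    y, x = np.ogrid[:rows, :cols]
    # Circular mask that keeps only low spatial frequencies.
    radius = cutoff * min(rows, cols)
    mask = (y - rows / 2) ** 2 + (x - cols / 2) ** 2 <= radius ** 2
    # Invert the transform to get a blurred (low-pass filtered) bitmap.
    return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum * mask)))
```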