# Signals
An analog signal \( s \) is a finite real-valued function \( s(t) \) of a continuous variable \( t \) (called time), defined for all times on the interval \( -\infty < t < +\infty \).
A digital signal \( s \) is a bounded discrete-valued sequence \( S_n \) with a single index \( n \) (called discrete time), defined for all time \( n = -\infty, \cdots, +\infty \). Bounded discrete values are exactly the kind of numbers represented by computer words with some finite number of bits.
Finiteness is a physical requirement and comes in three varieties: finite signal value, finite energy, and finite bandwidth.
## Definitions

### Continuous Time Signals
The energy of a signal \( s \) is defined as:

\[ E = \int_{-\infty}^{+\infty} s^2(t) \, dt \]

The power of a signal \( s \) is defined as:

\[ P = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{+T} s^2(t) \, dt \]

The Root Mean Square (RMS) of a signal \( s \) is defined as:

\[ s_{\text{RMS}} = \sqrt{P} \]
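As a quick numerical check of the energy definition, the integral can be approximated by sampling a finite-energy signal on a fine grid. The decaying pulse \( s(t) = e^{-t} \) for \( t \geq 0 \) (a hypothetical example, not from the text) has exact energy \( \int_0^\infty e^{-2t} \, dt = 1/2 \). A minimal sketch using NumPy:

```python
import numpy as np

# Hypothetical finite-energy signal: s(t) = e^{-t} for t >= 0, zero before.
# Its exact energy is the integral of e^{-2t} from 0 to infinity, i.e. 1/2.
t = np.linspace(0.0, 20.0, 200_001)   # e^{-t} is negligible beyond t = 20
dt = t[1] - t[0]
s = np.exp(-t)

energy = np.sum(s**2) * dt            # Riemann-sum approximation of the integral
```

With this grid spacing the approximation agrees with the exact value 1/2 to better than three decimal places.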
### Discrete Signals
For discrete signals, the same definitions apply with the integrals replaced by sums.
The energy of a discrete signal \( S \) is defined as:

\[ E = \sum_{n=-\infty}^{+\infty} S_n^2 \]

The power of a discrete signal \( S \) is defined as:

\[ P = \lim_{N \to \infty} \frac{1}{2N+1} \sum_{n=-N}^{+N} S_n^2 \]

The Root Mean Square (RMS) of a discrete signal \( S \) is defined as:

\[ S_{\text{RMS}} = \sqrt{P} \]
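For a signal that is nonzero only over a finite window, the energy sum is finite, and averaging over the window gives the practical power and RMS estimates. A small sketch (the sample values are made up for illustration; strictly, a finite-energy signal has zero power in the infinite-time limit, so the windowed average is what one computes in practice):

```python
import numpy as np

S = np.array([1.0, -2.0, 3.0, -2.0, 1.0])   # hypothetical finite-length signal

energy = np.sum(S**2)          # 1 + 4 + 9 + 4 + 1 = 19
power = energy / len(S)        # average power over the 5-sample window
rms = np.sqrt(power)           # root mean square over the window
```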
## Bandwidth
A precise definition of bandwidth is deferred until later, but an intuition can be given now. Bandwidth is a measure of speed, not size: a signal that fluctuates rapidly has a higher bandwidth than one that varies only slowly. Requiring finite bandwidth therefore imposes a smoothness constraint, disallowing sudden jump discontinuities and sharp corners.
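This intuition can be checked with a discrete Fourier transform: a rapidly fluctuating sinusoid concentrates its energy in a higher frequency bin than a slowly varying one. A sketch with arbitrarily chosen frequencies:

```python
import numpy as np

n = np.arange(256)
slow = np.sin(2 * np.pi * 4 * n / 256)    # 4 cycles across the window
fast = np.sin(2 * np.pi * 60 * n / 256)   # 60 cycles: fluctuates far more rapidly

def peak_bin(x):
    """Frequency bin holding the strongest nonnegative-frequency component."""
    return int(np.argmax(np.abs(np.fft.rfft(x))))

# The faster-fluctuating signal's spectral peak sits at a higher frequency.
assert peak_bin(fast) > peak_bin(slow)
```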
## Characteristics of Signals
Signals are characterized as being deterministic or stochastic. If deterministic, they can be periodic or non-periodic. If stochastic, they can be stationary or non-stationary. Additionally, signals can be finite or infinite in time duration, and they can have finite bandwidth or a full spectrum.
## The Vector Space of All Possible Signals
Can all possible signals be described in terms of some set of basic signals?
To treat signals as the vectors of a vector space, they must obey the basic axioms of vector spaces. These include:

Addition: Signal addition \( s = s_1 + s_2 \) according to \( s(t) = s_1(t) + s_2(t) \) for all \( t \) and \( S_n = S_{1,n} + S_{2,n} \) for all \( n \).

Zero Vector: The constant signal \( s(t) = 0 \) for all \( t \), and \( S_n = 0 \) for all times \( n \).

Inverse: The inverse \( -s \) according to \( (-s)(t) = -s(t) \) for all \( t \) and \( (-S)_n = -S_n \) for all \( n \).

Scalar Multiplication: Multiplication by a real number \( a \) as \( (a s)(t) = a \, s(t) \) for all \( t \) and \( (a S)_n = a S_n \) for all \( n \).
Inner Product: The dot product as shown below:

\[ \langle s_1, s_2 \rangle = \int_{-\infty}^{+\infty} s_1(t) \, s_2(t) \, dt \]

For discrete signals:

\[ \langle S_1, S_2 \rangle = \sum_{n=-\infty}^{+\infty} S_{1,n} \, S_{2,n} \]
Norm: The squared norm of a signal is its energy. For a signal \( s \):

\[ \| s \|^2 = \langle s, s \rangle = \int_{-\infty}^{+\infty} s^2(t) \, dt = E \]

For a discrete signal \( S \):

\[ \| S \|^2 = \sum_{n=-\infty}^{+\infty} S_n^2 = E \]
Metric: The energy of the difference signal obeys all the requirements of a distance. The metric (or distance) between two signals \( s_1 \) and \( s_2 \) is:

\[ d(s_1, s_2) = \| s_1 - s_2 \| = \sqrt{\int_{-\infty}^{+\infty} \left( s_1(t) - s_2(t) \right)^2 dt} \]

For discrete signals \( S_1 \) and \( S_2 \):

\[ d(S_1, S_2) = \sqrt{\sum_{n=-\infty}^{+\infty} \left( S_{1,n} - S_{2,n} \right)^2} \]
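These definitions translate directly into array operations. A short sketch with made-up finite-length signals, chosen so that the pair happens to be orthogonal:

```python
import numpy as np

X = np.array([1.0, 2.0, 2.0, 0.0])    # hypothetical discrete signals
Y = np.array([0.0, 1.0, -1.0, 3.0])

inner = np.dot(X, Y)                       # <X, Y> = 0 + 2 - 2 + 0 = 0, so X is orthogonal to Y
energy_X = np.dot(X, X)                    # ||X||^2 = 1 + 4 + 4 + 0 = 9
distance = np.sqrt(np.dot(X - Y, X - Y))   # metric d(X, Y) = ||X - Y||
```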
## Examples

### Analog Signal Example
Consider an analog signal \( s(t) = A \sin(2 \pi f t) \) where \( A \) is the amplitude and \( f \) is the frequency. Since the sinusoid persists for all time, its total energy is infinite; over a single period \( T = 1/f \), the energy is:

\[ E_T = \int_0^{T} A^2 \sin^2(2 \pi f t) \, dt = \frac{A^2 T}{2} \]

The power of the signal is:

\[ P = \frac{A^2}{2} \]
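The result \( P = A^2/2 \) can be verified numerically by averaging \( s^2(t) \) over one period on a dense grid (the amplitude and frequency values below are arbitrary):

```python
import numpy as np

A, f = 2.0, 5.0                      # arbitrary amplitude and frequency
T = 1.0 / f                          # one period
t = np.linspace(0.0, T, 100_001)
s = A * np.sin(2 * np.pi * f * t)

power = np.mean(s**2)                # time average of s^2 over one period, close to A^2/2 = 2.0
```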
### Digital Signal Example
Consider a digital signal \( S_n = A \sin(2 \pi f n) \) where \( A \) is the amplitude and \( f \) is the frequency. As in the analog case, the total energy is infinite; when \( f = 1/N \) for an integer \( N > 2 \), the energy over one period is:

\[ E_N = \sum_{n=0}^{N-1} A^2 \sin^2(2 \pi n / N) = \frac{A^2 N}{2} \]

The power of the signal is:

\[ P = \frac{A^2}{2} \]
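The discrete-time power can be checked the same way: with \( f = 0.1 \) the signal repeats every 10 samples, and averaging \( S_n^2 \) over a whole number of periods gives \( A^2/2 \) (parameter values are arbitrary):

```python
import numpy as np

A, f = 3.0, 0.1                      # arbitrary amplitude; period of 1/f = 10 samples
n = np.arange(1000)                  # exactly 100 full periods
S = A * np.sin(2 * np.pi * f * n)

power = np.mean(S**2)                # close to A^2/2 = 4.5
```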
## Vector Space of Signals
Since signals form a vector space, the theorems of linear algebra guarantee that there is a basis \( \{ \phi_k \} \), i.e., a set of signals in terms of which any signal \( s \) can be expanded:

\[ s = \sum_k c_k \, \phi_k \]
From linear algebra, we know that every vector space has a basis, but in general this basis is not unique. Similarly, for the vector space of signals, there is a lot of flexibility in the choice of basis. The most common choices are based on signals we have already met, namely shifted unit impulses (SUIs) and sinusoids.
### Shifted Unit Impulses (SUI)
The shifted unit impulse with shift \( m \) is the signal whose value at time \( n \) is:

\[ S_n = \delta_{nm} \]

where \( \delta_{nm} \) equals one when \( n = m \), and zero for all other times.
### Sinusoids
Sinusoids are another common basis for signals. When we represent a signal by expanding it in the basis of SUIs, we say that the signal is in the time domain. When we use the basis of sinusoids, we say that the signal is in the frequency domain.
### Expansion of Digital Signals
We can expand any digital signal in terms of SUIs as follows:

\[ S_n = \sum_{m=-\infty}^{+\infty} S_m \, \delta_{nm} \]

This proves that the SUIs span the entire space. No two SUIs overlap; they are orthogonal and linearly independent (no SUI can be expanded in terms of the others), and so they form a valid basis.
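The expansion can be demonstrated directly: weighting each SUI by the corresponding sample value and summing reproduces the original signal. A sketch for a short, made-up signal:

```python
import numpy as np

S = np.array([2.0, -1.0, 0.5, 3.0])   # hypothetical short digital signal
N = len(S)

def sui(m, length):
    """Shifted unit impulse: one at time m, zero everywhere else."""
    e = np.zeros(length)
    e[m] = 1.0
    return e

# S_n = sum over m of S_m * delta_{nm}
reconstruction = sum(S[m] * sui(m, N) for m in range(N))

overlap = np.dot(sui(0, N), sui(1, N))   # distinct SUIs do not overlap
```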
## Most Common Signals
| Signal Name | Continuous Format | Digital Format |
|---|---|---|
| Unit Constant | \( s(t) = A \) | \( S_n = A \) |
| Unit Step | \( s(t) = u(t) \) | \( S_n = u[n] \) |
| Unit Impulse | \( s(t) = \delta(t) \) | \( S_n = \delta[n] \) |
| Square Wave | \( s(t) = A \cdot \text{sgn}(\sin(\omega t)) \) | \( S_n = A \cdot \text{sgn}(\sin(\omega n)) \) |
| Sinusoid | \( s(t) = A \cdot \sin(\omega t + \phi) \) | \( S_n = A \cdot \sin(\omega n + \phi) \) |
| Damped Sinusoid | \( s(t) = A e^{-\alpha t} \sin(\omega t) \) | \( S_n = A e^{-\alpha n} \sin(\omega n) \) |
| Real Exponential | \( s(t) = A e^{\alpha t} \) | \( S_n = A e^{\alpha n} \) |
| Complex Sinusoid | \( s(t) = A e^{j(\omega t + \phi)} \) | \( S_n = A e^{j(\omega n + \phi)} \) |
| Damped Complex Sinusoid | \( s(t) = A e^{-\alpha t} e^{j(\omega t + \phi)} \) | \( S_n = A e^{-\alpha n} e^{j(\omega n + \phi)} \) |
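Digital versions of several table entries can be generated by evaluating the formulas on an integer time axis (the parameter values below are arbitrary):

```python
import numpy as np

n = np.arange(-5, 6)                         # discrete time axis, n = -5 .. 5

unit_step = np.where(n >= 0, 1.0, 0.0)       # u[n]
unit_impulse = np.where(n == 0, 1.0, 0.0)    # delta[n]

A, omega, alpha, phi = 1.0, 0.5, 0.1, 0.0    # arbitrary parameters
square = A * np.sign(np.sin(omega * n))      # note sgn(0) = 0 at the zero crossings
sinusoid = A * np.sin(omega * n + phi)
damped = A * np.exp(-alpha * n) * np.sin(omega * n)
```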