Exploring Orthogonality: From Vectors to Functions

Keywords: orthogonality, vectors, functions, dot product, inner product, discrete, Python programming, data analysis, visualization

Orthogonality

Orthogonality is a mathematical property signifying the absence of correlation between two vectors (signals): the vectors or signals involved share no common component and are, in that sense, mutually independent.

Two vectors (signals) A and B are said to be orthogonal (perpendicular, in vector algebra) when their inner product (also known as the dot product) is zero.

\[ A \perp B \Leftrightarrow \left<A, B \right> = A_1 \cdot B_1 + A_2 \cdot B_2 + \cdots + A_n \cdot B_n = 0\]

Example: Let’s show that the two vectors \(\overrightarrow{A} = \binom{-2}{3}\) and \(\overrightarrow{B} = \binom{3}{2}\) are orthogonal:

\[\overrightarrow{A} \cdot \overrightarrow{B} = A_x B_x + A_y B_y = (-2)(3) + (3)(2) = 0 \]

Let’s verify that the angle between the vectors is \(90^{\circ}\):

\[ \theta = \cos^{-1} \left( \frac{\overrightarrow{A} \cdot \overrightarrow{B}}{|\overrightarrow{A} | |\overrightarrow{B}|} \right) = \cos^{-1}(0) = 90 ^{\circ} \]
Figure 1: Two vectors exhibiting orthogonality
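The angle computation above can be sketched numerically with NumPy, using the same example vectors:

```python
import numpy as np

A = np.array([-2.0, 3.0])
B = np.array([3.0, 2.0])

# theta = arccos( (A . B) / (|A| |B|) )
cos_theta = np.dot(A, B) / (np.linalg.norm(A) * np.linalg.norm(B))
theta_deg = np.degrees(np.arccos(cos_theta))
print(theta_deg)  # 90.0
```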

To find the dot product of two vectors, multiply their corresponding components and sum the results. Here is the general formula (in matrix notation) for checking the orthogonality of two complex-valued vectors \(\vec{a}\) and \(\vec{b}\):

\[\vec{a} \perp \vec{b} \Rightarrow \left< \vec{a}, \vec{b} \right> = \begin{bmatrix} a_1^* & a_2^* & \cdots & a_n^* \\ \end{bmatrix} \begin{bmatrix} b_1 \\ b_2\\ \vdots \\ b_n \\ \end{bmatrix} = 0 \]
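For complex-valued vectors the first argument is conjugated, exactly as in the matrix notation above. A minimal sketch with NumPy's `np.vdot` (which conjugates its first argument); the two example vectors here are chosen only for illustration:

```python
import numpy as np

# Illustrative complex vectors (assumed example, not from the text)
a = np.array([1 + 0j, 1j])
b = np.array([1j, 1 + 0j])

# <a, b> = sum(conj(a_i) * b_i); np.vdot conjugates its first argument
inner = np.vdot(a, b)
print(inner)                  # 0j -> the vectors are orthogonal
print(np.isclose(inner, 0))  # True
```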

Here’s an example Python snippet that demonstrates how to check whether two vectors, given as lists, are orthogonal.

import numpy as np
import matplotlib.pyplot as plt

def dot_product(vector1, vector2):
    """Sum of the element-wise products of two equal-length vectors."""
    if len(vector1) != len(vector2):
        raise ValueError("Vectors must have the same length.")
    return sum(x * y for x, y in zip(vector1, vector2))

def are_orthogonal(vector1, vector2, tol=1e-9):
    # Compare against a small tolerance so the test also works for
    # floating-point components, not just exact integers.
    return abs(dot_product(vector1, vector2)) < tol

# Example vectors
vectorA = [-2, 3]
vectorB = [3, 2]

# Check if vectors are orthogonal
if are_orthogonal(vectorA, vectorB):
    print("The vectors are orthogonal.")
else:
    print("The vectors are not orthogonal.")

# Plotting the vectors
origin = [0], [0]  # Origin point for the vectors

plt.quiver(*origin, vectorA[0], vectorA[1], angles='xy', scale_units='xy', scale=1, color='r', label='Vector A')
plt.quiver(*origin, vectorB[0], vectorB[1], angles='xy', scale_units='xy', scale=1, color='b', label='Vector B')

plt.xlim(-5, 5)
plt.ylim(-5, 5)
plt.xlabel('x')
plt.ylabel('y')
plt.title('Plot of Vectors')
plt.grid(True)
plt.legend()
plt.show()

Orthogonality of continuous functions

Orthogonality, in the context of functions, can be seen as a broader concept akin to the orthogonality observed in vectors. Geometrically, orthogonal vectors are perpendicular to each other since their dot product equals zero.

When computing the dot product of two vectors, their components are multiplied and summed. However, when considering the “dot” product of functions, a similar approach is taken. Functions are treated as if they were vectors with an infinite number of components, and the dot product is obtained by multiplying the functions together and integrating over a specific interval.

Let f(t) and g(t) be two continuous functions (imagined as two vectors) on the closed interval [a, b] (i.e., a ≤ t ≤ b). For the functions to be orthogonal on this interval, their inner product must be zero:

\[ \left<f,g\right> = \int_a^b f(t) g(t) dt = 0 \Rightarrow \text{f(t) and g(t) are orthogonal}\]

Here is a small Python script to check whether two given functions are orthogonal.

Python Script

import sympy
import numpy as np
import matplotlib.pyplot as plt

# Test the orthogonality of functions
x = sympy.Symbol('x')
f = sympy.sin(x)  # First function
g = sympy.cos(2*x)  # Second function
a = 0  # interval lower limit
b = 2*sympy.pi  # interval upper limit
interval = (a, b)  # integration interval
inner_product = sympy.integrate(f*g, (x, interval[0], interval[1]))

if sympy.N(inner_product) == 0:
    print("The functions",str(f),"and",str(g),"are orthogonal over the interval [",str(a), ",",str(b),"].")
else:
    print("The functions",str(f),"and",str(g),"are not orthogonal over the interval [",str(a), ",",str(b),"].")

# Plotting the functions
x_vals = np.linspace(float(interval[0]), float(interval[1]), 100)
f_vals = np.sin(x_vals)
g_vals = np.cos(2*x_vals)

plt.plot(x_vals, f_vals, label=str(f))
plt.plot(x_vals, g_vals, label=str(g))
plt.plot(x_vals, f_vals*g_vals, label=str(f*g))
plt.xlabel('x')
plt.ylabel('Function values')
plt.legend()
plt.title('Plot of functions')
plt.grid(True)
plt.show()

Output

The functions sin(x) and cos(2*x) are orthogonal over the interval [ 0 , 2*pi ]

Orthogonality of discrete functions

To check the orthogonality of discrete functions, you can use the concept of the inner product (same as above). In discrete settings, the inner product can be thought of as the sum of the element-wise products of the function values at corresponding points.

Here’s an example code snippet in Python that demonstrates how to check the orthogonality of two discrete functions:

import numpy as np

def inner_product(f, g):
    if len(f) != len(g):
        raise ValueError("Functions must have the same length.")
    return np.sum(f * g)

def are_orthogonal(f, g, tol=1e-9):
    # A tolerance comparison also covers floating-point sample values
    return abs(inner_product(f, g)) < tol

# Example functions (discrete)
f = np.array([1, 0, -1, 0])
g = np.array([0, 1, 0, -1])

# Check if functions are orthogonal
if are_orthogonal(f, g):
    print("The functions are orthogonal.")
else:
    print("The functions are not orthogonal.")
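This discrete inner product is the foundation of the DFT [1]: the DFT basis vectors are mutually orthogonal under exactly this test (with conjugation, since they are complex). A small sketch, using an assumed length of N = 8:

```python
import numpy as np

N = 8
n = np.arange(N)

def dft_basis(k):
    # k-th DFT basis vector: phi_k[n] = exp(-2j*pi*k*n/N)
    return np.exp(-2j * np.pi * k * n / N)

# Inner product with conjugation of the first argument
ip_distinct = np.vdot(dft_basis(1), dft_basis(3))
ip_same = np.vdot(dft_basis(2), dft_basis(2))

print(np.isclose(ip_distinct, 0))  # True: distinct basis vectors are orthogonal
print(np.isclose(ip_same, N))      # True: each basis vector has squared norm N
```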

References

[1] Smith, J.O. Mathematics of the Discrete Fourier Transform (DFT) with Audio Applications, Second Edition.

Orthogonality of OFDM

OFDM, known as Orthogonal Frequency Division Multiplexing, is a digital modulation technique that divides a wideband signal into several narrowband signals. By doing so, it elongates the symbol duration of each narrowband signal compared to the original wideband signal, effectively minimizing the impact of time dispersion caused by multipath delay spread.

OFDM is categorized as a form of multicarrier modulation (MCM), in which multiple user symbols are transmitted simultaneously on distinct subcarriers whose frequency bands overlap yet remain orthogonal to each other.

OFDM provides the same number of channels as traditional Frequency Division Multiplexing (FDM), but because its channels (subcarriers) are arranged in an overlapping manner, OFDM significantly reduces the bandwidth requirement.

OFDM equation

Consider an OFDM system that transmits a user symbol stream \(s_i\) (rate \(R_u\)) over a set of \(N\) subcarriers. Therefore, the symbol rate of each subcarrier is \(R_s = \frac{R_u}{N}\) and the symbol duration is \(T_s = \frac{N}{R_u}\).
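These rate relations can be checked with a quick numeric sketch; the user rate and subcarrier count below are assumed values chosen only for illustration:

```python
# Hypothetical numbers, chosen only to illustrate the rate relations
R_u = 1.0e6   # user symbol rate: 1 Msym/s (assumed)
N = 64        # number of subcarriers (assumed)

R_s = R_u / N   # per-subcarrier symbol rate
T_s = N / R_u   # OFDM symbol duration

print(R_s)  # 15625.0 symbols/s per subcarrier
print(T_s)  # 6.4e-05 s, i.e. 64 microseconds
```

Each subcarrier thus signals N times more slowly than the original stream, which is what makes the symbol duration long compared to the channel's multipath delay spread.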

The incoming symbol stream is split into \(N\) symbol streams, and each stream is multiplied by a function \(\Phi_k\) taken from a family of orthonormal functions \(\Phi_k, \; k \in \left\{0, 1, \cdots, N-1 \right\}\).

In OFDM, these orthonormal functions are complex exponentials:

\[\Phi_k (t) = \begin{cases} e^{j 2 \pi f_k t}, & \quad \text{for } t \in \left[ 0, T_s\right] \\ 0, & \quad \text{otherwise} \end{cases} \quad \quad \quad (1) \]

For simplicity, let's assume BPSK modulation, so each user symbol \(s_i \in \left\{-1, 1 \right\}\), and let \(g_i\) be the individual gain of each subchannel. The OFDM symbol is formed by multiplexing the symbols onto the subchannels and combining them:

\[S (t) =\frac{1}{N} \sum_{k=0}^{N-1} s_k \cdot g_k \cdot \Phi_k(t) \quad \quad \quad (2)\]

The individual subcarrier signals are

\[s_k (t) = s_k \cdot g_k \cdot e^{j 2 \pi f_k t} \quad \quad \quad (3)\]

For a consecutive stream of input symbols \(m = 0,1, \cdots\) the OFDM equation is given by

\[S(t) = \sum_{m = 0 }^{ \infty} \left[\frac{1}{N} \sum_{k=0}^{N-1} s_{k,m} \cdot g_{k,m} \cdot \Phi_{k}(t - m T_s) \right] \quad \quad \quad (4)\]

With \(g_{k,m} = 1\), the OFDM equation is given by

\[S(t) = \sum_{m = 0 }^{ \infty} \left[\frac{1}{N} \sum_{k=0}^{N-1} s_{k,m} \cdot e^{j 2 \pi f_k \left(t - m T_s \right)} \right] \quad \quad \quad (5)\]
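On the subcarrier grid \(f_k = k/T_s\), sampling one OFDM symbol of equation (2) at \(t_n = n T_s / N\) reduces the synthesis to an inverse DFT, which is how practical transmitters generate the waveform. A sketch of that equivalence, assuming \(g_k = 1\) and randomly drawn BPSK symbols:

```python
import numpy as np

N = 8
Ts = 1.0
rng = np.random.default_rng(0)
s = rng.choice([-1.0, 1.0], size=N)   # BPSK symbols on the N subcarriers

# Direct synthesis of eq. (2): sample S(t) at t_n = n*Ts/N with f_k = k/Ts
k = np.arange(N)
t = np.arange(N) * Ts / N
S = np.array([(1/N) * np.sum(s * np.exp(2j * np.pi * k / Ts * tn)) for tn in t])

# The same samples drop out of an inverse DFT (numpy's ifft includes the 1/N factor)
S_ifft = np.fft.ifft(s)

print(np.allclose(S, S_ifft))  # True
```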

Orthogonality

The functions \(\Phi_k\) by which the symbols on the subcarriers are multiplied are orthonormal over the symbol period \(T_s\). That is,

\[ \left< \Phi_p (t), \Phi_q (t) \right> = \frac{1}{T_s} \int_{0}^{T_s} \Phi_p (t) \cdot \Phi^*_q (t) dt = \delta_{p,q} \quad \quad \quad (6) \]

where, \(\delta_{p,q}\) is the Kronecker delta given by

\[\delta_{p,q} = \begin{cases} 1, & \quad p=q \\ 0, & \quad \text{otherwise} \end{cases} \]

For \(p \neq q\), the right-hand side of equation (6) equals 0 (satisfying orthogonality) if and only if \(2 \pi \left(f_p - f_q \right)T_s = 2 \pi k\), where \(k\) is a non-zero integer. This implies that the spacing between two subcarriers, for them to be orthogonal, must be

\[\Delta f = f_p - f_q = \frac{k}{T_s} \quad \quad (7)\]

Hence, the smallest distance between two subcarriers, for them to be orthogonal, must be

\[ \Delta f = \frac{1}{T_s} \quad \quad (8)\]
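The spacing condition can be verified numerically by approximating the integral in equation (6) with a Riemann sum; the symbol period and sample count below are assumed values:

```python
import numpy as np

Ts = 1.0      # symbol period (assumed)
fs = 1000     # number of samples approximating the integral (assumed)
t = np.linspace(0, Ts, fs, endpoint=False)
dt = Ts / fs

def inner(fp, fq):
    # Discrete approximation of (1/Ts) * integral of Phi_p(t) * conj(Phi_q(t))
    return np.sum(np.exp(2j*np.pi*fp*t) * np.conj(np.exp(2j*np.pi*fq*t))) * dt / Ts

print(np.isclose(inner(3/Ts, 3/Ts), 1))    # True: same subcarrier -> 1
print(np.isclose(inner(3/Ts, 4/Ts), 0))    # True: spacing 1/Ts -> orthogonal
print(np.isclose(inner(3/Ts, 3.5/Ts), 0))  # False: spacing 0.5/Ts breaks orthogonality
```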

This implies that each subcarrier completes \(k\) additional full cycles per symbol period compared to its neighbor. For example, Figure 2, which plots the real and imaginary parts of three OFDM subcarriers (with \(k=1\)), shows that each successive subcarrier contains one additional full cycle per symbol period compared to the previous one.

Figure 2: Three orthogonal subcarriers of OFDM

With \(N\) subcarriers, the total bandwidth occupied by one OFDM symbol will be \(B \approx N \cdot \Delta f \; (Hz) \)

Figure 3: OFDM spectrum illustrating 12 subcarriers

Benefits of orthogonality

Orthogonality ensures that each subcarrier’s frequency is precisely spaced and aligned with the others. This property prevents interference between subcarriers, even in a multipath channel, which greatly improves the system’s robustness against fading and other channel impairments.

The orthogonality property allows subcarriers to be placed close together without causing mutual interference. As a result, OFDM can efficiently utilize the available spectrum, enabling high data rates and maximizing spectral efficiency, making it ideal for high-speed data transmission in wireless communication systems.

Reference

[1] Chakravarthy, A. S. Nunez, and J. P. Stephens, “TDCS, OFDM, and MC-CDMA: a brief tutorial,” IEEE Radio Commun., vol. 43, pp. 11-16, Sept. 2005.