A Unified Framework for Multi-Hop Wireless Relaying with Hardware Impairments

Ehsan Soleimani-Nasab, Sinem Coleri
Information Theory (cs.IT); Signal Processing (eess.SP)
2023-11-29
Relaying extends the coverage area and improves the reliability of wireless communication systems by mitigating the effect of fading on the received signal. Most technical contributions in this context assume ideal hardware (ID), neglecting transceiver non-idealities such as phase noise, in-phase/quadrature mismatch, and high-power-amplifier nonlinearities. These non-idealities distort the received signal by perturbing its phase and attenuating its amplitude, and the resulting performance degradation is further magnified as the transmission frequency increases. In this paper, we investigate the aggregate impact of hardware impairments (HI) on a general multi-hop relay system using amplify-and-forward (AF) and decode-and-forward (DF) relaying techniques over a general H-fading model, which subsumes free-space optics, radio frequency, millimeter-wave, Terahertz, and underwater fading models. Closed-form expressions for the outage probability, bit error probability, and ergodic capacity are derived in terms of H-functions. Following an asymptotic analysis at high signal-to-noise ratio (SNR), practical optimization problems are formulated to find the optimal HI level of each hop subject to a constraint on the total HI level. Analytical solutions are derived for the Nakagami-m fading channel, a special case of H-fading, for both AF and DF relaying. The overall instantaneous signal-to-noise-plus-distortion ratio is shown to reach a ceiling at high SNR that is inversely proportional to the HI levels of all hops' transceivers, in contrast to the ID case.
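The SNDR ceiling mentioned in the abstract can be illustrated with a minimal numeric sketch. It assumes the widely used aggregate distortion-noise model, in which each hop's distortion power scales with the signal power via an impairment level kappa^2, so the per-hop SNDR is gamma / (kappa^2 * gamma + 1); the end-to-end SNDR of a multi-hop AF chain is then upper-bounded by the weakest hop. The specific kappa^2 values and the three-hop setup below are illustrative assumptions, not parameters from the paper, whose exact closed-form ceiling for general H-fading is derived in terms of H-functions.

```python
def hop_sndr(snr, kappa_sq):
    # Per-hop SNDR under the aggregate distortion-noise model:
    # distortion power grows linearly with signal power,
    # scaled by the impairment level kappa_sq = kappa^2.
    return snr / (kappa_sq * snr + 1.0)

def multihop_sndr_upper_bound(snrs, kappa_sqs):
    # Standard upper bound for a multi-hop AF chain:
    # the end-to-end SNDR cannot exceed that of the weakest hop.
    return min(hop_sndr(g, k) for g, k in zip(snrs, kappa_sqs))

kappa_sqs = [0.01, 0.02, 0.015]  # assumed HI levels for a 3-hop example
for snr_db in [10, 20, 30, 40, 60]:
    snr = 10 ** (snr_db / 10)
    bound = multihop_sndr_upper_bound([snr] * 3, kappa_sqs)
    print(f"{snr_db} dB -> SNDR bound {bound:.2f}")
```

Raising the per-hop SNR, the bound saturates near 1 / max(kappa_sq) = 50 instead of growing without limit, which is the qualitative behavior the paper proves: under HI the instantaneous SNDR hits a ceiling governed by the transceivers' impairment levels, whereas under ID it grows unboundedly with SNR.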
PDF: A Unified Framework for Multi-Hop Wireless Relaying with Hardware Impairments.pdf