Bill Kay, Audun Myers, Thad Boydston, Emily Ellwein, Cameron Mackenzie, Erik Lentz
Keywords:
Computer Science, Information Theory, Information Theory (cs.IT), Discrete Mathematics (cs.DM)
journal:
--
date:
2023-12-01
Abstract
Shannon entropy is the preeminent tool for measuring the level of uncertainty (and, conversely, information content) in a random variable. In the field of communications, entropy can be used to express the information content of given signals (represented as time series) by considering random variables that sample from specified subsequences. In this paper, we discuss how an entropy variant, the permutation entropy, can be used to study and classify radio frequency signals in a noisy environment. The permutation entropy is the entropy of the random variable that samples occurrences of permutation patterns from a time series given a fixed window length, making it a function of the distribution of permutation patterns. Since the permutation entropy depends only on the relative order of the data, it is (globally) amplitude-agnostic and thus allows for comparison between signals at different scales. This article describes a permutation-patterns approach to a data-driven problem in radio frequency communications research and includes a primer on all background material that is not specific to permutation patterns. An empirical analysis of the methods herein on radio frequency data is included. No prior knowledge of signals analysis is assumed, and permutation-pattern-specific notation is introduced as needed. This article serves as a self-contained introduction to the relationship between permutation patterns, entropy, and signals analysis for studying radio frequency signals, and includes results on a classification task.
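As a concrete illustration (not taken from the paper itself), the following is a minimal Python sketch of the quantity the abstract describes: it counts the ordinal (permutation) patterns observed in sliding windows of a fixed length over a time series and returns the Shannon entropy of their empirical distribution. The function name, the window length of 3, and the example series are assumptions made for the sake of the example.

import math
from collections import Counter

def permutation_entropy(series, order=3):
    """Shannon entropy (in bits) of the distribution of ordinal patterns
    seen in sliding windows of length `order` over `series`."""
    patterns = Counter()
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # Ordinal pattern: the permutation of indices that sorts the window,
        # so only the relative order of the values matters, not their scale.
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        patterns[pattern] += 1
    total = sum(patterns.values())
    return -sum((c / total) * math.log2(c / total) for c in patterns.values())

# A monotone series yields a single pattern and entropy 0; an irregular
# series spreads mass over several patterns and yields higher entropy.
print(permutation_entropy([1, 2, 3, 4, 5, 6, 7]))        # 0.0
print(permutation_entropy([4, 7, 9, 10, 6, 11, 3]))      # > 0

Because the pattern extraction discards the actual values and keeps only their relative order, rescaling the series leaves the result unchanged, which is the amplitude-agnostic property mentioned above.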