About this Digital Document
Adaptive algorithms for receivers employing antenna arrays have recently received significant attention for radar systems applications. In the majority of these algorithms, the clutter-plus-noise covariance matrix is characterized using samples taken from range cells surrounding the test cell. If the covariance matrix of the test cell differs from the average covariance matrix of the surrounding range cells, significant performance degradation may result. In this dissertation, approximate and exact performance expressions are derived for such cases when any of a set of popular space-time adaptive processing (STAP) algorithms is used; no such performance equations existed prior to this dissertation. Numerical analysis illustrates how variations in the parameters of these equations affect the probability of detection and the probability of false alarm, and Monte Carlo simulations that closely match the predictions of these expressions are included.

Man-made interference, which is typically non-Gaussian, is another major source of performance degradation. Just as the covariance matrix of the surrounding range cells is typically assumed to match that of the test cell exactly, the non-Gaussian component of the interference is usually ignored in the analysis and design of adaptive radar and communication systems. To address this issue, a statistical noise model is developed in this dissertation by mathematically modeling the physical mechanisms that generate interference and noise in receivers employing antenna arrays. Such models have been lacking for cases where the observations may be statistically dependent from antenna to antenna. The model derived here is applicable to a wide variety of practical situations, and it is both canonical in form and computationally manageable.
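The covariance-estimation step the abstract describes can be sketched numerically. The following is a minimal illustration, not the dissertation's analysis: it estimates the clutter-plus-noise covariance from secondary (surrounding) range-cell snapshots, forms a sample-matrix-inversion (SMI) adaptive weight, and measures the SINR loss relative to the clairvoyant filter that knows the true covariance. All parameter values (antenna and pulse counts, the exponential-correlation covariance model, the steering vector) are assumptions chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed, not from the dissertation):
# N antennas times M pulses gives NM-dimensional space-time snapshots.
N, M = 4, 8
dim = N * M          # space-time snapshot dimension
K = 4 * dim          # number of surrounding (secondary) range cells

# True clutter-plus-noise covariance for the secondary cells: an
# exponentially correlated clutter term plus white noise (an assumed model).
idx = np.arange(dim)
R_true = 0.9 ** np.abs(idx[:, None] - idx[None, :]) + np.eye(dim)

# Draw K secondary-cell snapshots: complex Gaussian with covariance R_true.
L = np.linalg.cholesky(R_true)
Z = (rng.standard_normal((dim, K)) + 1j * rng.standard_normal((dim, K))) / np.sqrt(2)
X = L @ Z

# Sample covariance matrix estimated from the surrounding range cells.
R_hat = (X @ X.conj().T) / K

# SMI adaptive weight for a hypothetical space-time steering vector s.
s = np.ones(dim, dtype=complex) / np.sqrt(dim)
w = np.linalg.solve(R_hat, s)

# SINR of the adaptive filter evaluated against the TRUE covariance,
# normalized by the optimal (known-covariance) SINR. The ratio lies in
# (0, 1]; a test-cell covariance differing from R_true would degrade it
# further, which is the mismatch the dissertation quantifies.
sinr_adaptive = np.abs(w.conj() @ s) ** 2 / np.real(w.conj() @ R_true @ w)
sinr_optimal = np.real(s.conj() @ np.linalg.solve(R_true, s))
loss = float(sinr_adaptive / sinr_optimal)
print(round(loss, 3))
```

With K equal to a few multiples of the snapshot dimension, the loss ratio stays well below its value of 1 for the known-covariance filter but remains usable; Monte Carlo repetition of exactly this kind underlies the simulation checks the abstract mentions.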