Prior asymptotic performance analyses are based on the series expansion of the moment-generating function (MGF) or the probability density function (PDF) of the channel coefficients. However, these techniques fail for lognormal fading channels because the Taylor series of the PDF of a lognormal random variable is zero at the origin and the MGF has no explicit form. Although the lognormal fading model has been widely applied in wireless communications and free-space optical communications, few analytical tools are available to provide elegant performance expressions for correlated lognormal channels. In this paper, we propose a novel framework to analyze the asymptotic outage probabilities of selection combining (SC), equal-gain combining (EGC), and maximum-ratio combining (MRC) over equally correlated lognormal fading channels. Based on these closed-form results, we show: 1) the outage probability of EGC or MRC becomes an infinitely small quantity compared to that of SC at high signal-to-noise ratio (SNR); 2) channel correlation can cause an infinite performance loss at high SNR; and 3) negatively correlated lognormal channels can outperform independent lognormal channels. The analyses reveal insights into the long-standing problem of asymptotic performance analysis over correlated lognormal channels, and circumvent time-consuming Monte Carlo simulation and numerical integration.
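The abstract's closed-form expressions are not reproduced here, but the Monte Carlo baseline it says the framework circumvents can be sketched directly. The snippet below is a minimal illustration, not the paper's method: it draws equally correlated Gaussian exponents through a common-factor construction (corr(X_i, X_j) = rho for all i ≠ j), exponentiates them to obtain lognormal channel amplitudes, and estimates the outage probability of SC, EGC, and MRC against a fixed SNR threshold. All parameter values (rho, sigma, the threshold) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_branches, n_samples = 3, 100_000
rho, sigma = 0.5, 0.5        # assumed equal correlation and lognormal shape
gamma_th = 1.0               # assumed outage threshold (unit noise power)

# Equally correlated Gaussians via a common factor:
# X_i = sqrt(rho)*Z0 + sqrt(1-rho)*Z_i  =>  corr(X_i, X_j) = rho for i != j.
z0 = rng.standard_normal((n_samples, 1))
zi = rng.standard_normal((n_samples, n_branches))
x = np.sqrt(rho) * z0 + np.sqrt(1.0 - rho) * zi
h = np.exp(sigma * x)        # lognormal channel amplitudes

gamma = h**2                 # per-branch instantaneous SNR (unit noise)
snr_sc = gamma.max(axis=1)                  # selection combining
snr_mrc = gamma.sum(axis=1)                 # maximum-ratio combining
snr_egc = h.sum(axis=1)**2 / n_branches     # equal-gain combining

p_sc = np.mean(snr_sc < gamma_th)
p_egc = np.mean(snr_egc < gamma_th)
p_mrc = np.mean(snr_mrc < gamma_th)
print(f"outage: SC={p_sc:.4f}  EGC={p_egc:.4f}  MRC={p_mrc:.4f}")
```

Since the MRC output SNR dominates both SC and EGC sample by sample, the estimated outage probabilities satisfy P_MRC <= P_EGC and P_MRC <= P_SC, consistent with the abstract's claim that EGC/MRC outage vanishes relative to SC at high SNR. Resolving the high-SNR tail this way requires ever more samples, which is exactly the cost the closed-form asymptotics avoid.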