This work proposes a cross-layered caching strategy for parameter estimation in wireless sensor networks (WSNs). Here, sensors first gather information about common parameters of interest and then forward the information to an edge server for final inference. The collaborative nature of this application enables the caching of information that is linearly compressed across sensors, rather than of individual observations. Because the parameters are assumed to be correlated over time, the estimation quality at the edge server can be improved by combining present and past information, where the latter is obtained from cached data. The data caching and accessing strategies are jointly designed to minimize the expected mean-squared error (MSE) of the requested parameter estimates. We first consider a single-cache, single-server scenario under ideal accessing assumptions and propose a greedy one-step-ahead (OSA) caching strategy that determines the optimal linear combination of observations to cache by minimizing the expected MSE of the requested parameter estimate in the next time slot. We adopt an alternating optimization approach in which the combining coefficients at the cache and the linear estimator at the server are optimized in turn until convergence. The proposed OSA caching strategy is then extended to the multi-cache, multi-server scenario with constraints on the accessing costs at both the caches and the edge servers. The alternating-optimization subproblems in this case are non-convex due to the additional access constraints and are therefore solved by adopting a successive convex approximation (SCA) procedure. Numerical simulations are provided to demonstrate the effectiveness of the proposed caching strategies.
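
To make the alternating structure concrete, the following is a minimal numerical sketch of the single-cache, single-server case. It assumes a Gauss-Markov parameter model theta_{t+1} = A theta_t + w_t and linear Gaussian sensor observations y_t = H theta_t + v_t; the cache stores a linear combination c_t = B^T y_t, and the server forms a linear estimate W [y_{t+1}; c_t] of the next-slot parameter. The model matrices (A, H, Q, R, S), their values, and the gradient-based update of the cache coefficients B are illustrative assumptions, not the paper's derivation; only the LMMSE update of the server estimator W and the overall alternation between the two variables mirror the description in the abstract.

    import numpy as np

    # Illustrative sketch of the alternating optimization described in the abstract
    # (single-cache, single-server, one-step-ahead caching). All model choices
    # below are assumptions made for this example, not the paper's formulation.
    rng = np.random.default_rng(0)
    p, m, c = 2, 4, 1                 # parameter dim, observations per slot, cache size

    A = 0.9 * np.eye(p)               # assumed temporal correlation of the parameters
    H = rng.standard_normal((m, p))   # assumed sensor observation matrix
    Q = 0.1 * np.eye(p)               # assumed process-noise covariance
    R = 0.2 * np.eye(m)               # assumed observation-noise covariance
    S = np.eye(p)                     # assumed (stationary) covariance of theta_t

    def second_order_stats(B):
        """Second-order statistics of theta_{t+1} and z = [y_{t+1}; B.T @ y_t]."""
        S1 = A @ S @ A.T + Q                      # cov(theta_{t+1})
        Ryy1 = H @ S1 @ H.T + R                   # cov(y_{t+1})
        Ryy0 = H @ S @ H.T + R                    # cov(y_t)
        Ry1y0 = H @ A @ S @ H.T                   # cross-cov(y_{t+1}, y_t)
        Rtz = np.hstack([S1 @ H.T, A @ S @ H.T @ B])          # cov(theta_{t+1}, z)
        Rzz = np.block([[Ryy1, Ry1y0 @ B],
                        [B.T @ Ry1y0.T, B.T @ Ryy0 @ B]])      # cov(z)
        return S1, Rtz, Rzz

    def expected_mse(B, W):
        """Expected MSE of the next-slot linear estimate W @ z."""
        S1, Rtz, Rzz = second_order_stats(B)
        return np.trace(S1 - 2 * W @ Rtz.T + W @ Rzz @ W.T)

    B = rng.standard_normal((m, c))   # cache combining coefficients
    W = np.zeros((p, m + c))          # server linear estimator
    for _ in range(50):
        # Step 1: with B fixed, the MSE-optimal linear estimator is the LMMSE solution.
        S1, Rtz, Rzz = second_order_stats(B)
        W = Rtz @ np.linalg.inv(Rzz)
        # Step 2: with W fixed, improve B by a finite-difference gradient step
        # (a stand-in for the paper's coefficient update, which is not reproduced here).
        eps, step = 1e-5, 0.02
        base = expected_mse(B, W)
        grad = np.zeros_like(B)
        for i in range(m):
            for j in range(c):
                Bp = B.copy()
                Bp[i, j] += eps
                grad[i, j] = (expected_mse(Bp, W) - base) / eps
        B = B - step * grad

    print("expected MSE after alternating optimization:", expected_mse(B, W))

Under these assumptions the expected MSE is non-increasing in the estimator step (each LMMSE update is exact) and decreases in the coefficient step for a sufficiently small step size, which is the sense in which the alternation converges; the multi-cache, multi-server extension with access-cost constraints described in the abstract would additionally require an SCA treatment of the resulting non-convex subproblems, which this sketch does not attempt.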