
Abstract

In networked control systems (NCSs), where multiple processes share a resource-limited, time-varying, and cost-sensitive network, communication delay is inevitable. It is influenced primarily by two factors: the control systems, which deploy intermittent sensor sampling to reduce communication cost by withholding non-urgent transmissions, and the network, which performs resource management to limit excessive traffic and, ultimately, data loss. In a heterogeneous scenario, where each control system tolerates only a specific level of sensor-to-controller latency, these delay sensitivities must be taken into account in the design of control and network policies to achieve the desired performance guarantees. We propose a cross-layer optimal co-design of control, sampling, and resource management policies for an NCS consisting of multiple stochastic linear time-invariant systems that close their sensor-to-controller loops over a shared network. Aligned with advanced communication technology, we assume the network offers a range of transmission services with different latencies at given prices. Each local sampler decides either to pay a higher cost for access to a low-latency channel or to delay sending a state sample at a reduced price. A resource manager residing in the network's data-link layer arbitrates channel access and re-allocates resources whenever link capacities are exceeded. The performance of each local closed-loop system is measured by a combination of a linear-quadratic Gaussian (LQG) cost and a suitable communication cost, and the overall objective is for all three policy makers to jointly minimize a defined social cost. We derive optimal control, sampling, and resource allocation policies under different cross-layer awareness models, including constant and time-varying parameters, and show that higher awareness generally improves performance at the expense of higher computational complexity.
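
The abstract does not reproduce the paper's cost functions; as a rough, hedged sketch under standard LQG assumptions, the combined per-loop objective and the social cost might take a form like the following, where $x_{i,k}$ and $u_{i,k}$ denote the state and input of loop $i$, $Q_i$ and $R_i$ are the usual LQG weights, and $\theta_{i,k}$ with price $\lambda_i(\theta_{i,k})$ is a hypothetical variable encoding the transmission service chosen at step $k$ (all notation here is assumed, not taken from the paper):

% Hedged sketch only; requires amsmath/amssymb. The paper's actual formulation may differ.
\begin{align}
  J_i &= \limsup_{T \to \infty} \frac{1}{T}\,
         \mathbb{E}\!\left[\sum_{k=0}^{T-1}
         x_{i,k}^{\top} Q_i\, x_{i,k}
         + u_{i,k}^{\top} R_i\, u_{i,k}
         + \lambda_i(\theta_{i,k})\right],
         % per-loop LQG cost plus communication price
  \\
  J_{\mathrm{social}} &= \sum_{i=1}^{N} J_i ,
         % social cost jointly minimized by the control, sampling,
         % and resource allocation policy makers
\end{align}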
