Abstract

Human-leading truck platooning systems have been proposed to leverage the benefits of both human supervision and vehicle autonomy. Equipped with human guidance and autonomous technology, human-leading truck platooning systems are more versatile in handling uncertain traffic conditions than fully automated platooning systems. This paper presents a novel distributed stochastic model predictive control (DSMPC) design for a human-leading heavy-duty truck platoon. The proposed DSMPC design integrates a stochastic driver behavior model of the human-driven leader truck with a distributed formation control design for the following automated trucks in the platoon. The driver behavior of the human-driven leader truck is learned by a stochastic inverse reinforcement learning (SIRL) approach. The proposed stochastic driver behavior model aims to learn a distribution of cost functions, which captures the richness and uniqueness of human driver behaviors, from a given set of driver-specific demonstrations. The distributed formation control consists of a serial DSMPC with guaranteed recursive feasibility, closed-loop chance constraint satisfaction, and string stability. Simulation studies are conducted to investigate the efficacy of the proposed design under several realistic traffic scenarios. Compared to a baseline platoon control strategy (deterministic distributed model predictive control), the proposed DSMPC achieves superior control performance, with fewer constraint violations and smaller spacing errors.
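
The closed-loop chance constraint satisfaction mentioned in the abstract typically rests on reformulating a probabilistic constraint into a deterministic, tightened one. Below is a minimal Python sketch, not taken from the paper, of this standard stochastic-MPC step applied to an inter-vehicle spacing constraint under an assumed zero-mean Gaussian spacing-error model; the names d_min, sigma_e, and epsilon are illustrative assumptions, not symbols from the paper.

# Minimal illustrative sketch (not the paper's implementation): tighten an
# individual chance constraint on inter-vehicle spacing into a deterministic
# bound, assuming the predicted spacing error is zero-mean Gaussian.
from scipy.stats import norm

def tightened_spacing_bound(d_min: float, sigma_e: float, epsilon: float) -> float:
    """Return d_tight such that enforcing nominal_spacing >= d_tight implies
    Pr(actual_spacing >= d_min) >= 1 - epsilon when the spacing error is
    zero-mean Gaussian with standard deviation sigma_e."""
    # Gaussian quantile back-off: Phi^{-1}(1 - epsilon) * sigma_e
    return d_min + norm.ppf(1.0 - epsilon) * sigma_e

# Example: guarantee at least 5 m spacing with probability 0.95, given a
# 0.4 m standard deviation on the predicted spacing error.
print(tightened_spacing_bound(d_min=5.0, sigma_e=0.4, epsilon=0.05))

Enforcing the nominal spacing against the tightened bound at every step is what allows a stochastic MPC scheme to certify chance-constraint satisfaction in closed loop; the paper's serial DSMPC additionally has to guarantee recursive feasibility and string stability across the platoon.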
