
Toward Multiple-Phase MDP Model for Charging Station Recommendation



There is an increasing need for charging station recommendation to minimize the overall charging time for electric vehicles and to balance load across charging stations. To meet this need, we model the recommendation problem as a Markov Decision Process (MDP). However, the traditional MDP model suffers from the 'curse of dimensionality'. To address this issue, we propose an extension of the MDP: the multiple-phase MDP, in which the state transition is decomposed into several phases so as to reduce the complexity of both the state space and the state transitions. This is done by introducing two states in addition to the normal MDP state: the post-decision state and the intermediate decision state. We then propose an online learning algorithm to solve the formulated multiple-phase MDP model. Thanks to the reduced state-space and state-transition complexities, the proposed online algorithm converges quickly. Compared with other recommendation mechanisms, such as game-theory-based and Q-learning-based recommendation, our simulation evaluation demonstrates that our approach achieves good performance.
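The phase decomposition described in the abstract can be illustrated with a minimal sketch. This is not the paper's algorithm: the queue-based cost model, the TD-style value update, and names such as `post_decision` and `exogenous_arrivals` are all illustrative assumptions, standing in for the paper's post-decision and intermediate decision states.

```python
import random

random.seed(0)

N_STATIONS = 3
ALPHA = 0.1                    # learning rate for the online value update (assumed)
values = [0.0] * N_STATIONS    # learned value estimate per post-decision state

def recommend(queues):
    """Phase 1 (decision): recommend the station minimizing queue plus learned value."""
    return min(range(N_STATIONS), key=lambda s: queues[s] + values[s])

def post_decision(queues, station):
    """Phase 2 (post-decision state): deterministic effect of the decision --
    the EV joins the chosen station's queue, before any exogenous events."""
    q = list(queues)
    q[station] += 1
    return q

def exogenous_arrivals(queues):
    """Phase 3: random arrivals/departures move the post-decision state
    to the next full state."""
    return [max(0, q + random.choice([-1, 0, 1])) for q in queues]

def step(queues):
    """One full multiple-phase transition with an online TD-style update,
    applied to the post-decision state rather than the full next state."""
    s = recommend(queues)
    pd = post_decision(queues, s)
    nxt = exogenous_arrivals(pd)
    cost = pd[s]               # waiting-cost proxy: queue length at the chosen station
    values[s] += ALPHA * (cost + min(nxt) - values[s])
    return nxt

queues = [2, 0, 5]
for _ in range(100):
    queues = step(queues)
```

Splitting the transition this way means the learner only needs value estimates over the (smaller) post-decision state, rather than over every combination of decision and random outcome, which is the complexity reduction the abstract refers to.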

Keywords: state; recommendation; mdp; mdp model; multiple phase

Journal Title: IEEE Transactions on Intelligent Transportation Systems
Year Published: 2022


