Intelligent Task Planning for Reconfigurable Mechanisms Using Reinforcement Learning Based Structural Switching Strategy

Authors

  • Daniel J. Carter, Department of Mechanical Engineering, University of Cambridge, United Kingdom
  • Xiaoling Wu, Department of Electrical and Electronic Engineering, Imperial College London, United Kingdom
  • Emily R. Davies, Department of Mechanical Engineering, University of Cambridge, United Kingdom
  • Thomas H. Bennett, Department of Electrical and Electronic Engineering, Imperial College London, United Kingdom
  • Olivia M. Clarke, Department of Mechanical Engineering, University of Cambridge, United Kingdom

DOI:

https://doi.org/10.71465/fair361

Keywords:

reconfigurable mechanism, reinforcement learning, task planning, topology switching, intelligent control

Abstract

Reconfigurable mechanisms can achieve multi-task execution through structural switching, but determining the optimal reconfiguration strategy in complex environments remains a major challenge. This study proposes an intelligent task planning method based on reinforcement learning. The structural switching process is modeled as a Markov decision process, where the action space corresponds to topology changes and the reward function jointly considers task completion rate and energy consumption. A deep Q-network is employed to train the optimal switching strategy. Experiments conducted in 15 task environments demonstrate that the proposed method achieves a 35% improvement in task completion rate and an 18% reduction in average energy consumption compared with baseline search algorithms. Moreover, after training, the decision-making speed on the simulation platform is approximately 10 times faster than that of traditional search methods. These results confirm that reinforcement learning can significantly enhance both efficiency and adaptability in reconfigurable mechanisms, providing an effective pathway for intelligent control of reconfigurable robots and adaptive mechanical systems.


Published

2025-09-19