Academy & Industry Research Collaboration Center (AIRCC)

Volume 11, Number 16, October 2021

A Q-Learning based Fault-Tolerant Controller with Application to CSTH System

  Authors

Seyed Ali Hosseini and Karim Salahshoor, Petroleum University of Technology Ahwaz, Iran

  Abstract

Systems are continually subject to faults or malfunctions caused by aging or sudden events, which can degrade operating performance and even lead to operational failure, a critical concern in safety-critical systems. This problem is the main motivation for adopting a Fault-Tolerant strategy to maintain system performance in the presence of faults. A desirable property of Fault-Tolerant Controllers (FTCs) is adaptability to system changes as they evolve during operation. In this paper, a Q-learning algorithm with a greedy policy is used to realize FTC adaptability. Several fault scenarios are then introduced in a Continuous Stirred Tank Heater (CSTH) to compare the closed-loop performance of the developed Q-learning-based FTC with that of a conventional PID controller and an RL-based FTC. The obtained results demonstrate the effectiveness of the Q-learning-based FTC across the different fault scenarios.
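As an illustration of the learning mechanism the abstract refers to, the Python sketch below shows a generic tabular Q-learning update with greedy action selection. The state/action discretization, the reward, and the plant interface env_step are hypothetical placeholders for illustration only; they are not the CSTH model or the controller structure used in the paper.

import numpy as np

# Hypothetical discretization: states are binned tracking errors and
# actions are discrete controller adjustments (illustrative only).
N_STATES, N_ACTIONS = 21, 5
ALPHA, GAMMA, EPISODES, STEPS = 0.1, 0.95, 200, 100

Q = np.zeros((N_STATES, N_ACTIONS))

def env_step(state, action):
    """Hypothetical plant interface returning (next_state, reward).
    In the paper this role is played by the faulty CSTH control loop."""
    next_state = int(np.clip(state + action - N_ACTIONS // 2, 0, N_STATES - 1))
    reward = -abs(next_state - N_STATES // 2)  # penalize tracking error
    return next_state, reward

for _ in range(EPISODES):
    state = np.random.randint(N_STATES)
    for _ in range(STEPS):
        # Greedy policy: pick the action with the highest current Q-value.
        # (With zero-initialized Q, ties resolve to the first action;
        # exploration schemes such as epsilon-greedy are common variants.)
        action = int(np.argmax(Q[state]))
        next_state, reward = env_step(state, action)
        # Standard Q-learning temporal-difference update
        Q[state, action] += ALPHA * (
            reward + GAMMA * np.max(Q[next_state]) - Q[state, action]
        )
        state = next_state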

  Keywords

Reinforcement Learning, Q-learning Algorithm, Fault-Tolerant Controller, Adaptive Controller.