Volume 17, Number 5
Adaptive Q-Learning-Based Routing with Context-Aware Metrics for Robust MANET Routing (AQLR)
Authors
P. Tamilselvi 1, S. Suguna Devi 1, M. Thangam 2 and P. Muthulakshmi 1, 1 Cauvery College for Women (Autonomous), India, 2 Mount Carmel College (Autonomous), India
Abstract
Mobile Ad Hoc Networks (MANETs) encounter persistent challenges due to dynamic topologies, limited resources, and high routing load, and these problems intensify as the network scales, degrading overall performance. To address these challenges, this paper proposes Adaptive Q-Learning-Based Routing with Context-Aware Metrics for Robust MANET Routing (AQLR), a routing protocol that uses context-aware data and reinforcement learning to choose the best route for connected mobile devices. AQLR considers four essential routing metrics: Coverage Factor, RSSI-Based Link Stability, Energy Weighting, and Broadcast Delay. AQLR uses a Q-learning agent at each node to adaptively learn optimal next-hop decisions from past routing experience, while a Composite Routing Metric (CRM) guides decisions in the absence of prior learning. Simulations performed with OMNeT++ across node densities varying from 50 to 500 show that AQLR outperforms recent machine learning-based routing protocols, including QLAR, RL-DWA, and DRL-MANET. Specifically, AQLR achieves up to 95.8% packet delivery ratio, reduces average end-to-end delay by 25–35%, lowers routing overhead by 20–30%, and improves network lifetime by over 15% in dense scenarios. These results affirm the effectiveness of combining reinforcement learning with context-aware metric computation for scalable and energy-efficient MANET routing.
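To illustrate the decision rule summarized above, the following is a minimal sketch, not the authors' implementation, of per-node next-hop selection that falls back to a composite metric score when no Q-values have been learned yet. The metric fields, CRM weights, reward handling, and all identifiers are assumptions chosen only for illustration; the paper's exact formulations are given in the protocol description.

```python
# Illustrative AQLR-style next-hop selection (assumed structure, not the paper's code).
import random
from dataclasses import dataclass, field

@dataclass
class NeighbourMetrics:
    coverage_factor: float   # 0..1, assumed normalised coverage toward the destination
    link_stability: float    # 0..1, assumed derived from RSSI history
    residual_energy: float   # 0..1, assumed normalised remaining battery
    broadcast_delay: float   # 0..1, assumed normalised one-hop broadcast delay

def crm(m: NeighbourMetrics, w=(0.3, 0.3, 0.25, 0.15)) -> float:
    """Composite Routing Metric: higher is better; delay is penalised.
    Weights are placeholders, not the paper's values."""
    return (w[0] * m.coverage_factor
            + w[1] * m.link_stability
            + w[2] * m.residual_energy
            - w[3] * m.broadcast_delay)

@dataclass
class AQLRAgent:
    alpha: float = 0.5    # learning rate (assumed)
    gamma: float = 0.8    # discount factor (assumed)
    epsilon: float = 0.1  # exploration probability (assumed)
    q: dict = field(default_factory=dict)  # (destination, next_hop) -> Q-value

    def select_next_hop(self, destination, neighbours: dict) -> str:
        """Pick a next hop for `destination` from `neighbours`
        (neighbour id -> NeighbourMetrics)."""
        if random.random() < self.epsilon:
            return random.choice(list(neighbours))
        # Unlearned state-action pairs fall back to the CRM score, so the node
        # can still make an informed choice before any feedback has arrived.
        return max(neighbours,
                   key=lambda n: self.q.get((destination, n), crm(neighbours[n])))

    def update(self, destination, next_hop, reward, best_next_q: float) -> None:
        """Standard Q-learning update applied after feedback for the chosen hop."""
        key = (destination, next_hop)
        old = self.q.get(key, 0.0)
        self.q[key] = old + self.alpha * (reward + self.gamma * best_next_q - old)
```

In this sketch the reward passed to `update` could itself be derived from the CRM of the chosen link or from delivery feedback; the protocol's actual reward function is defined in the body of the paper.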
Keywords
Q-Learning, Context-Aware Metrics, Coverage Factor, Link Stability, Energy Aware Routing, Broadcast Delay, Reinforcement Learning, Adaptive Protocols