Robust Hierarchical RL for Fault-Tolerant Ad Traffic Management under Uncertain Demand

Authors

  • Allison Becker, University of Wisconsin–Madison, USA
  • Daniel Kim, University of Wisconsin–Madison, USA
  • Stephanie Wright, University of Wisconsin–Madison, USA

DOI:

https://doi.org/10.71465/mrcis99

Keywords:

hierarchical reinforcement learning, fault tolerance, ad traffic management, uncertain demand, deep Q-networks

Abstract

In modern digital advertising ecosystems, managing ad traffic under uncertain demand while maintaining system fault tolerance presents significant challenges. This paper proposes a novel Robust Hierarchical Reinforcement Learning (RHRL) framework for fault-tolerant ad traffic management in environments with uncertain demand patterns. The approach integrates Deep Q-Network architectures with hierarchical decision-making structures and robust optimization techniques to ensure system resilience against failures while adapting to dynamic traffic patterns. Our framework employs a two-tier architecture in which high-level controllers manage strategic resource allocation decisions and low-level agents handle real-time traffic routing and fault recovery. The system uses convolutional neural networks for high-dimensional state processing and implements dual-objective optimization strategies similar to those used in traffic signal control. Experimental results demonstrate convergent learning behavior with sustained performance improvements: compared with conventional approaches, the framework achieves 34% faster fault recovery, 28% higher resource utilization efficiency, and a 42% reduction in service degradation during peak uncertainty periods.
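The two-tier structure described above, where a high-level controller sets a resource allocation strategy and a low-level agent routes individual requests and recovers from faults, can be illustrated with a minimal sketch. This is not the paper's implementation: tabular Q-learning stands in for the Deep Q-Networks, and the state and action names (demand levels, routing choices, fault flags) are illustrative assumptions, not terms from the paper.

```python
import random

class QAgent:
    """Tabular Q-learning agent; a stand-in for one DQN tier."""
    def __init__(self, actions, alpha=0.1, eps=0.2):
        self.q = {}              # (state, action) -> estimated value
        self.actions = actions
        self.alpha, self.eps = alpha, eps

    def act(self, state):
        # epsilon-greedy action selection
        if random.random() < self.eps:
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.q.get((state, a), 0.0))

    def update(self, state, action, reward):
        # one-step (bandit-style) update toward the observed reward
        old = self.q.get((state, action), 0.0)
        self.q[(state, action)] = old + self.alpha * (reward - old)

def simulate(episodes=500, seed=0):
    random.seed(seed)
    high = QAgent(actions=["conservative", "aggressive"])  # strategic allocation
    low = QAgent(actions=["primary", "backup"])            # real-time routing
    total = 0.0
    for _ in range(episodes):
        demand = random.choice(["low", "high"])   # uncertain demand signal
        goal = high.act(demand)                   # high-level allocation decision
        fault = random.random() < 0.2             # random infrastructure fault
        state = (goal, "fault" if fault else "ok")
        route = low.act(state)                    # low-level routing under the goal
        # Reward shape: the backup path tolerates faults,
        # while the primary path is cheaper when healthy.
        if fault:
            r = 1.0 if route == "backup" else -1.0
        else:
            r = 1.0 if route == "primary" else 0.3
        if goal == "aggressive" and demand == "high":
            r += 0.5                              # aggressive allocation pays off under high demand
        low.update(state, route, r)
        high.update(demand, goal, r)
        total += r
    return high, low, total

high, low, total = simulate()
```

After training, the low-level agent learns to reroute to the backup path whenever a fault is observed, while the high-level agent learns which allocation strategy suits each demand regime, mirroring the division of labor between the framework's two tiers.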

Published

2025-09-19