Hand gesture recognition (HGR) has gained significant attention in human-computer interaction, enabling touchless control in domains such as virtual reality, automotive systems, and healthcare. While deep learning approaches achieve high accuracy in gesture classification, their lack of interpretability hinders transparency and user trust in critical applications. To address this, we extend MIRA, an interpretable rule-based HGR system, with a novel gesture onset detection method that autonomously identifies the start of a gesture before classification. Our onset detection approach achieves an average accuracy of 90.13%, demonstrating robustness across users. By integrating signal processing techniques, MIRA enhances interpretability while maintaining real-time adaptability to dynamic environments. Additionally, we introduce a background class that enables the system to differentiate between gesture and non-gesture frames, and we expand the dataset with new users and recordings to improve generalization. We further analyze how feature diversity affects performance, showing that low diversity can suppress personalization due to early misclassifications. Using a framework of foundational and personalized rules, our approach correctly classifies up to 94.9% of gestures, underscoring the impact of personalization in rule-based systems. These findings establish MIRA as a robust and interpretable alternative to deep learning models, ensuring transparent decision-making in real-world radar-based gesture recognition.
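The abstract does not specify how the onset detector works internally. Purely as a hedged illustration of the kind of signal-processing step involved, the sketch below shows a generic energy-thresholding onset detector for radar frame sequences; the function name, array layout, and parameters (`threshold`, `window`) are assumptions for this example and do not describe MIRA's actual method.

```python
import numpy as np

def detect_gesture_onset(frames, threshold=3.0, window=5):
    """Return the index of the first frame whose energy exceeds a
    noise-calibrated threshold, or None if no onset is found.

    Assumes `frames` is an array of per-frame radar magnitude maps
    with shape (n_frames, range_bins, doppler_bins).
    """
    # Per-frame energy: mean squared magnitude over all range-Doppler bins.
    energy = np.mean(np.abs(frames) ** 2, axis=(1, 2))

    # Estimate the noise floor from the first `window` frames,
    # assumed here to be gesture-free background.
    noise_mean = energy[:window].mean()
    noise_std = energy[:window].std() + 1e-12  # avoid division-free zero std

    # Onset = first frame deviating from the noise floor by more
    # than `threshold` standard deviations.
    above = np.flatnonzero(energy > noise_mean + threshold * noise_std)
    return int(above[0]) if above.size else None
```

A thresholded-energy detector like this is transparent by construction: the decision reduces to a single interpretable comparison against a calibrated noise floor, which is in the spirit of the rule-based, interpretable design the abstract describes, though the paper's own detector may differ.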