DriftMoE: A Mixture of Experts Approach to Handle Concept Drifts 

Publication: arXiv:2507.18464 

Authors: Miguel Aspis, Sebastián A. Cajas Ordónez, Andrés L. Suárez-Cetrulo, Ricardo Simón Carbajo

 

Learning from non-stationary data streams subject to concept drift requires models that can adapt on the fly while remaining resource-efficient. Existing adaptive ensemble methods often rely on coarse-grained adaptation mechanisms or simple voting schemes that fail to optimally leverage specialized knowledge. This paper introduces DriftMoE, an online Mixture-of-Experts (MoE) architecture that addresses these limitations through a novel co-training framework. DriftMoE features a compact neural router that is co-trained alongside a pool of incremental Hoeffding tree experts. The key innovation lies in a symbiotic learning loop that enables expert specialization: the router selects the most suitable expert for prediction, the relevant experts update incrementally with the true label, and the router refines its parameters using a multi-hot correctness mask that reinforces every accurate expert. This feedback loop provides the router with a clear training signal while accelerating expert specialization. We evaluate DriftMoE’s performance across nine state-of-the-art data stream learning benchmarks spanning abrupt, gradual, and real-world drifts, testing two distinct configurations: one where experts specialize on data regimes (multi-class variant), and another where they focus on single-class specialization (task-based variant). Our results demonstrate that DriftMoE achieves competitive results with state-of-the-art stream learning adaptive ensembles, offering a principled and efficient approach to concept drift adaptation. All code, data pipelines, and reproducibility scripts are available in our public GitHub repository: https://github.com/miguel-ceadar/drift-moe.
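
Since the abstract describes the symbiotic prediction/update loop step by step, a minimal sketch may help make the mechanics concrete. This is not the authors’ implementation: it assumes river’s HoeffdingTreeClassifier as the incremental experts, a single linear layer with sigmoid outputs trained by SGD as the router, fixed-length numeric feature dictionaries, and illustrative hyperparameters (N_EXPERTS, N_FEATURES, LR are placeholders).

import numpy as np
from river import tree

N_EXPERTS, N_FEATURES, LR = 4, 10, 0.01

experts = [tree.HoeffdingTreeClassifier() for _ in range(N_EXPERTS)]
W = np.zeros((N_EXPERTS, N_FEATURES))  # router weights
b = np.zeros(N_EXPERTS)                # router bias

def router_scores(x_vec):
    """Per-expert suitability scores in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-(W @ x_vec + b)))

def step(x, y):
    """One prequential step: predict with the routed expert, then adapt."""
    global W, b
    x_vec = np.array([x[k] for k in sorted(x)], dtype=float)
    scores = router_scores(x_vec)

    # 1. The router selects the most suitable expert for the prediction.
    chosen = int(np.argmax(scores))
    y_pred = experts[chosen].predict_one(x)

    # 2. Multi-hot correctness mask: which experts would have been correct.
    mask = np.array([1.0 if e.predict_one(x) == y else 0.0 for e in experts])

    # 3. The relevant experts update incrementally with the true label.
    #    Which experts count as "relevant" differs between the paper's
    #    variants; updating the routed expert plus every already-correct
    #    expert is an assumption made for this sketch.
    for i, e in enumerate(experts):
        if i == chosen or mask[i]:
            e.learn_one(x, y)

    # 4. The router refines its parameters against the correctness mask
    #    (binary cross-entropy gradient for sigmoid outputs).
    grad = scores - mask
    W -= LR * np.outer(grad, x_vec)
    b -= LR * grad
    return y_pred

Streaming labelled examples through step(x, y) returns each routed prediction before the models adapt, mirroring the test-then-train (prequential) protocol commonly used on data stream benchmarks.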

Relevance of the Paper to the O-CEI Project:

The methodology outlined in DriftMoE holds strong potential for adaptation within O-CEI Task 3.5 (Implementation of intra- and cross-domain data management, observability, and AI orchestration mechanisms), where CeADAR plays a leading role. The intent is to use router networks to recommend relevant models from O-CEI’s marketplace. Furthermore, CeADAR is investigating ways to compress an MoE, which would make it even more efficient at recommending resource-intensive models such as LLMs; this compression work is a key part of our future research efforts. The paper acknowledges O-CEI.

Intelligent Edge Computing and Machine Learning: A Survey of Optimization and Applications

Publication: Future Internet, 17(9), 417.

Authors: Sebastián A. Cajas, Jaydeep Samanta, Andrés L. Suárez-Cetrulo, Ricardo Simón Carbajo

Intelligent edge machine learning has emerged as a paradigm for deploying smart applications across resource-constrained devices in next-generation network infrastructures. This survey addresses the critical challenges of implementing machine learning models on edge devices within distributed network environments, including computational limitations, memory constraints, and energy-efficiency requirements for real-time intelligent inference. We provide a comprehensive analysis of soft computing optimization strategies essential for intelligent edge deployment, systematically examining model compression techniques, including pruning, quantization methods, knowledge distillation, and low-rank decomposition approaches. The survey explores intelligent MLOps frameworks tailored for network edge environments, addressing continuous model adaptation, monitoring under data drift, and federated learning for distributed intelligence while preserving privacy in next-generation networks. Our work covers practical applications across intelligent smart agriculture, energy management, healthcare, and industrial monitoring within network infrastructures, highlighting domain-specific challenges and emerging solutions. We analyse specialized hardware architectures, cloud offloading strategies, and distributed learning approaches that enable intelligent edge computing in heterogeneous network environments. The survey identifies critical research gaps in multimodal model deployment, streaming learning under concept drift, and integration of soft computing techniques with intelligent edge orchestration frameworks for network applications. These gaps directly manifest as open challenges in balancing computational efficiency with model robustness due to limited multimodal optimization techniques, developing sustainable intelligent edge AI systems arising from inadequate streaming learning adaptation, and creating adaptive network applications for dynamic environments resulting from insufficient soft computing integration. This comprehensive roadmap synthesizes current intelligent edge machine learning solutions with emerging soft computing approaches, providing researchers and practitioners with insights for developing next-generation intelligent edge computing systems that leverage machine learning capabilities in distributed network infrastructures.
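
As a concrete illustration of one of the compression techniques the survey examines, the sketch below applies post-training dynamic quantization to a small PyTorch model; the architecture and layer sizes are placeholders chosen for illustration, not taken from the survey.

import torch
import torch.nn as nn

# Stand-in for a model destined for a resource-constrained edge device.
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))

# Post-training dynamic quantization: Linear weights are stored as int8 and
# dequantized on the fly, shrinking the model and typically speeding up CPU
# inference without retraining.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 128)
print(quantized(x).shape)  # torch.Size([1, 10])

Pruning, knowledge distillation, and low-rank decomposition offer analogous trade-offs between model footprint and accuracy, which is why the survey examines them alongside quantization as levers for edge deployment.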

Relevance of the Paper to the O-CEI Project:

To achieve the O-CEI project’s goal of accelerating the uptake of innovative Cloud-Edge-IoT solutions and strengthening Europe’s strategic autonomy, a thorough understanding of the current AI landscape is paramount. This review paper provides an up-to-date survey of the state of the art in AI for edge and cloud applications. The knowledge it consolidates is foundational for developing several of the project’s core components, including its AIOps solutions and AI-driven marketplace, thereby ensuring that O-CEI remains at the forefront of technological advancement.