
IWDA: Importance Weighting for Drift Adaptation in Streaming Supervised Learning Problems

Abstract

Distribution drift is an important issue for practical applications of machine learning (ML). In particular, in streaming ML, the data distribution may change over time, yielding the problem of concept drift, which affects the performance of learners trained with outdated data. In this article, we focus on supervised problems in an online nonstationary setting, introducing a novel learner-agnostic algorithm for drift adaptation, namely importance weighting for drift adaptation (IWDA), with the goal of performing efficient retraining of the learner when drift is detected. IWDA incrementally estimates the joint probability density of input and target for the incoming data and, as soon as drift is detected, retrains the learner using importance-weighted empirical risk minimization. The importance weights are computed for all the samples observed so far, employing the estimated densities, thus using all available information efficiently. After presenting our approach, we provide a theoretical analysis in the abrupt drift setting. Finally, we present numerical simulations that illustrate how IWDA competes with, and often outperforms, state-of-the-art stream learning techniques, including adaptive ensemble methods, on both synthetic and real-world data benchmarks.
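The retraining step described in the abstract can be sketched as importance-weighted empirical risk minimization: every past sample (x, y) is reweighted by the density ratio p_new(x, y) / p_old(x, y) before refitting the learner. The sketch below is illustrative only, not the authors' implementation: it replaces IWDA's incremental joint-density estimation with fixed, unnormalized Gaussian log-densities (`log_p_old`, `log_p_new` are assumptions made for the toy example) and uses weighted least squares as the learner.

```python
import numpy as np

def importance_weights(log_p_new, log_p_old, X, y, clip=50.0):
    """Weight each past sample by p_new(x, y) / p_old(x, y),
    computed in log space and clipped for numerical stability."""
    log_w = log_p_new(X, y) - log_p_old(X, y)
    return np.exp(np.clip(log_w, -clip, clip))

def weighted_least_squares(X, y, w, reg=1e-8):
    """Importance-weighted ERM for a linear model with intercept:
    minimize sum_i w_i * (y_i - [x_i, 1] @ beta)^2."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    A = Xb.T @ (w[:, None] * Xb) + reg * np.eye(Xb.shape[1])
    b = Xb.T @ (w * y)
    return np.linalg.solve(A, b)

# Toy stream with an abrupt drift: the first 200 samples follow the
# old concept y = x, the next 200 follow the new concept y = 2x.
rng = np.random.default_rng(0)
x_old = rng.uniform(-1, 1, 200)
x_new = rng.uniform(-1, 1, 200)
X = np.concatenate([x_old, x_new])[:, None]
y = np.concatenate([x_old + 0.05 * rng.standard_normal(200),
                    2 * x_new + 0.05 * rng.standard_normal(200)])

# Stand-in (unnormalized) log-densities for the two concepts; IWDA
# would instead estimate these incrementally from the stream itself.
log_p_old = lambda X, y: -0.5 * ((y - X[:, 0]) / 0.3) ** 2
log_p_new = lambda X, y: -0.5 * ((y - 2 * X[:, 0]) / 0.3) ** 2

w = importance_weights(log_p_new, log_p_old, X, y)
beta_weighted = weighted_least_squares(X, y, w)           # ~ slope 2
beta_plain = weighted_least_squares(X, y, np.ones(len(y)))  # blends concepts
```

Old samples that happen to be consistent with the new concept (here, those near x = 0) keep non-negligible weight, so the retrained model can reuse pre-drift data instead of discarding it, which is the point of weighting all observed samples.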

Article
2023
English
Featured Keywords

Adaptation models
Random forests
Estimation
Detectors
Training
Monitoring
Ensemble learning
Concept drift
data drift
drift adaptation
importance weighting (IW)
nonstationarity
stream learning