Please use this identifier to cite or link to this item: http://scholars.ntou.edu.tw/handle/123456789/25515
Title: Progressive Ensemble Learning for in-Sample Data Cleaning
Authors: Wang, Jung-Hua 
Lee, Shih-Kai
Wang, Ting-Yuan
Chen, Ming-Jer
Hsu, Shu-Wei
Keywords: Training;Data models;Cleaning;Noise measurement;Image classification;Complexity theory;Training data;Ensemble learning;Data integrity;Transfer learning;Convolutional neural networks;Noisy data;ensemble learning;data cleaning
Issue Date: 2024
Publisher: Institute of Electrical and Electronics Engineers Inc. (IEEE)
Journal Volume: 12
Start page/Pages: 140643-140659
Source: IEEE Access
Abstract: 
We present an ensemble learning-based data cleaning approach, termed ELDC, capable of identifying and pruning anomalous data. ELDC is distinguished by the fact that an ensemble of base models can be trained directly on noisy in-sample data while dynamically providing clean data during iterative training. Each base model uses a random subset of the target dataset, which may initially contain up to 40% label errors. After each training iteration, anomalous data are discriminated from clean data by a majority voting scheme, and three types of anomalies (mislabeled, confusing, and outlier data) can be identified using a statistical pattern jointly determined by the prediction outputs of the base models. By iterating this train-vote-remove cycle, noisy in-sample data are progressively removed until a prespecified stopping condition is reached. Comprehensive experiments, including out-of-sample data tests, verify the effectiveness of ELDC in simultaneously suppressing the bias and variance of the prediction output. The ELDC framework is highly flexible: it is not bound to a specific model and allows different transfer-learning configurations. AlexNet, ResNet50, and GoogLeNet are used as base models and trained on various benchmark datasets; the results show that ELDC outperforms state-of-the-art cleaning methods.
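
The train-vote-remove cycle described in the abstract can be sketched in a few lines of code. The Python snippet below is a minimal illustration under assumed details, not the paper's implementation: scikit-learn decision trees stand in for the CNN base models (AlexNet, ResNet50, GoogLeNet), and the names train_vote_remove, n_models, subset_frac, and reject_below are hypothetical.

    import numpy as np
    from sklearn.base import clone
    from sklearn.tree import DecisionTreeClassifier

    def train_vote_remove(X, y, base_model=None, n_models=7,
                          subset_frac=0.6, reject_below=0.2, max_iters=5):
        # Iteratively prune samples whose given labels the ensemble rejects.
        if base_model is None:
            base_model = DecisionTreeClassifier(max_depth=8)
        keep = np.ones(len(y), dtype=bool)   # samples currently trusted
        rng = np.random.default_rng(0)
        for _ in range(max_iters):
            idx = np.flatnonzero(keep)
            preds = []
            for _ in range(n_models):
                # Each base model trains on a random subset of surviving data.
                sub = rng.choice(idx, size=max(1, int(subset_frac * len(idx))),
                                 replace=False)
                model = clone(base_model).fit(X[sub], y[sub])
                preds.append(model.predict(X))
            preds = np.stack(preds)           # shape: (n_models, n_samples)
            # Fraction of base models agreeing with each sample's given label.
            agree = (preds == y).mean(axis=0)
            # Flag surviving samples the ensemble consistently votes against.
            flagged = keep & (agree < reject_below)
            if not flagged.any():
                break                         # prespecified stopping condition
            keep &= ~flagged
        return keep

    # Example: inject 30% label noise into a toy dataset, then clean it.
    from sklearn.datasets import make_classification
    X, y = make_classification(n_samples=1000, n_informative=8, random_state=0)
    noisy = np.random.default_rng(1).random(len(y)) < 0.3
    y_noisy = np.where(noisy, 1 - y, y)       # flip 30% of binary labels
    mask = train_vote_remove(X, y_noisy)
    print(f"kept {mask.sum()} of {len(y)} samples")

Calling train_vote_remove(X, y_noisy) returns a boolean mask over the samples the ensemble still trusts; the statistical pattern over base-model outputs described in the paper goes further, separating mislabeled, confusing, and outlier samples.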
URI: http://scholars.ntou.edu.tw/handle/123456789/25515
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3468035
Appears in Collections: Department of Electrical Engineering

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
