Volume 44, Issue 1, Jan. 2022
Citation: LIU Xing, ZHAO Jian-yin, ZHU Min, ZHANG Wei. Research on an improved lp-RWMKE-ELM fault diagnosis model[J]. Chinese Journal of Engineering, 2022, 44(1): 82-94. doi: 10.13374/j.issn2095-9389.2020.07.09.001

Research on an improved lp-RWMKE-ELM fault diagnosis model

doi: 10.13374/j.issn2095-9389.2020.07.09.001
More Information
  • Corresponding author: E-mail: xinghandeqipan@sina.com
  • Received Date: 2020-07-09
  • Available Online: 2020-09-16
  • Publish Date: 2022-01-01
  • Abstract: As the service time of military equipment increases, equipment failure data accumulate continuously during routine maintenance, training, and combat readiness exercises, and these data are typically small in sample size and imbalanced to varying degrees. In addition, because of the tolerances of the electrical component parameters in the equipment and the widespread nonlinearity and feedback loops of its circuits, the fault mechanism is often difficult to express accurately with mathematical models. This poses new challenges for equipment fault diagnosis. To address these problems, machine learning methods are widely used for fault diagnosis. The essence of such methods is to transform the fault diagnosis problem into a pattern recognition problem: by learning the characteristic data of the normal mode and the various failure modes, a diagnosis model is constructed and, ultimately, a diagnosis strategy is formed. Aiming at the imbalanced distribution of equipment fault samples and the low diagnostic accuracy of existing algorithms, this paper defines a regularized weighted multiple kernel ensemble extreme learning machine under a p-norm constraint (lp-RWMKE-ELM) by introducing a p-norm-constrained weighted multiple kernel extreme learning machine and an AdaBoost-based ensemble learning strategy into the extreme-learning-machine fault diagnosis model. Under the p-norm constraint, the model performs two types of adaptive sample weighting based on the sizes of the various fault classes; at the same time, it combines the multisource data fusion ability of multiple kernel learning with the high learning efficiency of the extreme learning machine. The sample weight matrix $\boldsymbol{W}$ is integrated into the optimization objective function of the multiple kernel extreme learning machine, and the AdaBoost ensemble strategy adaptively increases the weights of information-rich samples, which significantly improves fault diagnosis accuracy. Fault diagnosis experiments were conducted on six UCI public data sets and one actual equipment case. The results show that, compared with other models such as the kernel extreme learning machine, the weighted kernel extreme learning machine (with the $\boldsymbol{W}^{(1)}$ and $\boldsymbol{W}^{(2)}$ weighting schemes), and the weighted multiple kernel extreme learning machine under a 1-norm constraint, the proposed model achieves significantly higher diagnostic accuracy, while its diagnostic performance is only weakly affected by the imbalance of the sample distribution.
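
To make the class-weighting and kernel-combination ideas summarized in the abstract concrete, the following Python/NumPy sketch implements a weighted kernel extreme learning machine over a fixed, uniform combination of several RBF base kernels, together with a SAMME-style AdaBoost training loop. This is a minimal illustration under stated assumptions, not the authors' exact lp-RWMKE-ELM: the kernel widths (gammas), the regularization constant C, the 0.618 majority-class factor, and the uniform kernel weights are illustrative choices, and the full model would additionally learn the kernel weights under the p-norm constraint.

import numpy as np

def rbf_kernel(X, Y, gamma):
    # RBF kernel matrix: K[i, j] = exp(-gamma * ||x_i - y_j||^2)
    d2 = (np.sum(X ** 2, axis=1)[:, None]
          + np.sum(Y ** 2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

class WeightedMultiKernelELM:
    # Weighted kernel ELM over a uniform combination of RBF kernels; a simplified
    # stand-in for the lp-RWMKE-ELM base learner (kernel weights are fixed here
    # rather than learned under a p-norm constraint).
    def __init__(self, gammas=(0.1, 1.0, 10.0), C=10.0, weighting="W1"):
        self.gammas = gammas        # assumed RBF widths, one per base kernel
        self.C = C                  # assumed regularization constant
        self.weighting = weighting  # "W1": 1/n_k; "W2": 0.618/n_k for majority classes

    def _combined_kernel(self, X, Y):
        # Uniform convex combination of the base kernels.
        return sum(rbf_kernel(X, Y, g) for g in self.gammas) / len(self.gammas)

    def fit(self, X, y):
        self.X_ = X
        self.classes_, counts = np.unique(y, return_counts=True)
        count_of = dict(zip(self.classes_, counts))
        # Per-sample weights derived from class sizes (W(1)/W(2)-style schemes).
        w = np.array([1.0 / count_of[yi] for yi in y])
        if self.weighting == "W2":
            for k in self.classes_:
                if count_of[k] > counts.mean():   # shrink weights of majority classes
                    w[y == k] *= 0.618
        W = np.diag(w)
        T = np.where(y[:, None] == self.classes_[None, :], 1.0, -1.0)  # one-vs-all targets
        n = X.shape[0]
        K = self._combined_kernel(X, X)
        # Weighted kernel ELM output weights: alpha = (I/C + W K)^(-1) W T
        self.alpha_ = np.linalg.solve(np.eye(n) / self.C + W @ K, W @ T)
        return self

    def predict(self, X):
        scores = self._combined_kernel(X, self.X_) @ self.alpha_
        return self.classes_[np.argmax(scores, axis=1)]

def adaboost_train(X, y, n_rounds=10, rng=np.random.default_rng(0)):
    # SAMME-style AdaBoost loop around the base learner: samples misclassified in
    # one round receive larger weights (via resampling) in the next round.
    n, K = len(y), len(np.unique(y))
    d = np.full(n, 1.0 / n)                       # sample weight distribution
    learners, alphas = [], []
    for _ in range(n_rounds):
        idx = rng.choice(n, size=n, p=d)          # resample according to d
        clf = WeightedMultiKernelELM().fit(X[idx], y[idx])
        miss = clf.predict(X) != y
        err = float(np.sum(d * miss))
        if err == 0.0:                            # perfect learner: keep it and stop
            learners.append(clf); alphas.append(1.0)
            break
        if err >= 1.0 - 1.0 / K:                  # no better than random guessing: stop
            break
        a = np.log((1.0 - err) / err) + np.log(K - 1.0)
        learners.append(clf); alphas.append(a)
        d = d * np.exp(a * miss)                  # boost weights of misclassified samples
        d = d / d.sum()
    return learners, alphas

At prediction time the ensemble members would vote with weights given by alphas; the hypothetical names (WeightedMultiKernelELM, adaboost_train) and all hyperparameter values are ours, chosen only to illustrate the weighting and boosting mechanisms described in the abstract.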

     

