Title: Evaluation of Machine Learning Methods for Ground Vibration Prediction Model Induced by High-Speed Railway
Authors: Chen, C. J.; Liu, C. H.; Chen, Y. J.; Shen, Y. J.
Keywords: TRAINS
Issue Date: Jun-2016
Publisher: SPRINGER HEIDELBERG
Journal Volume: 4
Journal Issue: 3
Start page/Pages: 283-290
Source: J VIB ENG TECHNOL
Abstract: This study evaluates several machine learning methods as automatic prediction models for the ground vibration induced by Taiwan high-speed trains running on bridge structures with shallow foundations. A wide variety of field-measured ground vibration data are used to develop the prediction model. The main characteristics affecting the overall vibration level are established from the measurement results; these influence factors are train speed, geological condition, measurement distance, and supporting structure. Support vector machine (SVM), artificial neural network (ANN), and random forest algorithms are then adopted to predict the vibration level induced by high-speed trains, and a voting-based ensemble learning algorithm is applied to improve prediction performance. Measured and predicted vibration levels are compared to verify the reliability of the prediction model. The results show that the ANN achieves the highest accuracy rates and that the random forest model generally outperforms the SVM model; voting-based ensemble learning further increases the accuracy rates. Overall, the machine learning models reasonably predict the ground vibration level across frequency ranges. The best prediction accuracy rates of the voting-based ensemble learning method are 64%, 80%, 80%, and 78% in the low, middle, high, and overall frequency ranges, respectively. Prediction results are discussed in detail.
URI: http://scholars.ntou.edu.tw/handle/123456789/20786
ISSN: 2523-3920
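The combination described in the abstract (SVM, ANN, and random forest base learners merged by a voting-based ensemble) can be sketched with scikit-learn stand-ins. This is a minimal illustration, not the authors' implementation: the feature encoding (train speed, geological condition, measurement distance, supported structure), the synthetic data, the three-class vibration-level target, and the choice of soft voting are all assumptions, since the paper does not specify these details here.

```python
# Hedged sketch of a voting-based ensemble over SVM, ANN, and random forest,
# mirroring the method families named in the abstract. SVC stands in for the
# SVM, MLPClassifier for the ANN. All data below are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 400
# Hypothetical feature columns, chosen to echo the influence factors listed
# in the abstract: train speed (km/h), geological condition (coded class),
# measurement distance (m), supported structure type (coded class).
X = np.column_stack([
    rng.uniform(200, 300, n),   # train speed
    rng.integers(0, 3, n),      # geological condition
    rng.uniform(10, 100, n),    # measurement distance
    rng.integers(0, 2, n),      # supported structure
])
# Hypothetical target: a discretized vibration level (3 classes), here made
# to depend on speed and distance so the models have signal to learn.
y = np.digitize(X[:, 0] / X[:, 2], bins=[3.0, 6.0])

ensemble = VotingClassifier(
    estimators=[
        ("svm", make_pipeline(StandardScaler(), SVC(probability=True, random_state=0))),
        ("ann", make_pipeline(StandardScaler(), MLPClassifier(max_iter=1000, random_state=0))),
        ("rf", RandomForestClassifier(random_state=0)),
    ],
    voting="soft",  # average predicted class probabilities (an assumption;
                    # the abstract does not say whether voting is hard or soft)
)
ensemble.fit(X[:300], y[:300])
accuracy = ensemble.score(X[300:], y[300:])
print(f"hold-out accuracy: {accuracy:.2f}")
```

Soft voting averages the three models' class-probability estimates, which is one common way a voting ensemble can beat its individual members, consistent with the abstract's finding that the ensemble raises the accuracy rates.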
Appears in Collections: 14 LIFE BELOW WATER
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.