National Taiwan Ocean University Research Hub

Please use this identifier to cite or link to this item: http://scholars.ntou.edu.tw/handle/123456789/23877
DC Field | Value | Language
dc.contributor.author | Lin, Chih-Wei | en_US
dc.contributor.author | Lin, Mengxiang | en_US
dc.contributor.author | Yang, Suhui | en_US
dc.date.accessioned | 2023-06-20T05:58:24Z | -
dc.date.available | 2023-06-20T05:58:24Z | -
dc.date.issued | 2020 | -
dc.identifier.issn | 2169-3536 | -
dc.identifier.uri | http://scholars.ntou.edu.tw/handle/123456789/23877 | -
dc.description.abstract | Surveillance cameras have been widely used in urban environments and are increasingly used in rural ones. Such cameras have mostly been used for security, but they can also be applied to the problem of furnishing fine-grained measurements and predictions of precipitation intensity. In this study, we formulated a stacked order-preserving (OP) learning framework to train a network using time-series data. We constructed an OP module, which uses a three-dimensional (3D) convolution operation to extract features with spatial and temporal information and associates them with a ConvLSTM; this combination learns the short-term, order-preserving time-series relationships between features. Furthermore, the OP modules are stacked to form a stacked OP network (SOPNet), which strengthens the relationships between features in long-term time-series image sequences. SOPNet can be used to obtain fine-grained measurements and predictions of precipitation intensity from images captured by outdoor surveillance cameras. Our main contributions are threefold. First, SOPNet strengthens the short-term and long-term time-series relationships between features. Second, SOPNet simultaneously examines spatial and temporal information to measure and predict precipitation intensity. Third, we constructed a precipitation intensity database based on optical images captured by outdoor surveillance cameras. We experimentally evaluated the proposed architecture on our self-collected data set and found that SOPNet yields better performance and greater accuracy than well-known state-of-the-art counterparts across various metrics. | en_US
dc.language.iso | en_US | en_US
dc.relation.ispartof | IEEE Access | en_US
dc.subject | Precipitation intensity | en_US
dc.subject | 3D convolution | en_US
dc.subject | ConvLSTM | en_US
dc.subject | order-preserving | en_US
dc.subject | forecasting | en_US
dc.subject | ACTION RECOGNITION | en_US
dc.subject | NEURAL-NETWORK | en_US
dc.subject | VIDEO | en_US
dc.subject | LSTM | en_US
dc.title | SOPNet Method for the Fine-Grained Measurement and Prediction of Precipitation Intensity Using Outdoor Surveillance Cameras | en_US
dc.type | journal article | en_US
dc.identifier.doi | 10.1109/ACCESS.2020.3032430 | -
dc.identifier.isi | WOS:000584736100001 | -
dc.relation.journalvolume | 8 | en_US
dc.relation.pages | 188813-188824 | en_US
item.openairetype | journal article | -
item.fulltext | no fulltext | -
item.openairecristype | http://purl.org/coar/resource_type/c_6501 | -
item.grantfulltext | none | -
item.cerifentitytype | Publications | -
item.languageiso639-1 | en_US | -
crisitem.author.dept | National Taiwan Ocean University, NTOU | -
crisitem.author.dept | College of Electrical Engineering and Computer Science | -
crisitem.author.dept | Department of Electrical Engineering | -
crisitem.author.parentorg | National Taiwan Ocean University, NTOU | -
crisitem.author.parentorg | College of Electrical Engineering and Computer Science | -
Appears in Collections: Department of Electrical Engineering (電機工程學系)
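
The abstract above describes an order-preserving (OP) module that pairs a 3D convolution with a ConvLSTM and stacks several such modules into SOPNet. The sketch below is a minimal, hypothetical PyTorch rendering of that idea, not the authors' published code: the ConvLSTMCell implementation, layer widths, stack depth, and the pooled regression head are all illustrative assumptions.

```python
# Minimal sketch, assuming PyTorch: Conv3d extracts spatio-temporal features,
# a ConvLSTM consumes them in frame order (order-preserving), and stacked OP
# modules feed a simple intensity regressor. Shapes and sizes are assumptions.
import torch
import torch.nn as nn


class ConvLSTMCell(nn.Module):
    """Single ConvLSTM cell: convolutional input/forget/cell/output gates."""

    def __init__(self, in_ch, hid_ch, kernel=3):
        super().__init__()
        self.gates = nn.Conv2d(in_ch + hid_ch, 4 * hid_ch, kernel, padding=kernel // 2)

    def forward(self, x, state):
        h, c = state
        i, f, g, o = torch.chunk(self.gates(torch.cat([x, h], dim=1)), 4, dim=1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        c = f * c + i * torch.tanh(g)
        h = o * torch.tanh(c)
        return h, c


class OPModule(nn.Module):
    """One OP module: Conv3d features fed frame-by-frame to a ConvLSTM."""

    def __init__(self, in_ch, hid_ch):
        super().__init__()
        self.conv3d = nn.Conv3d(in_ch, hid_ch, kernel_size=3, padding=1)
        self.cell = ConvLSTMCell(hid_ch, hid_ch)

    def forward(self, x):                       # x: (B, C, T, H, W)
        feats = torch.relu(self.conv3d(x))      # spatio-temporal features
        B, C, T, H, W = feats.shape
        h = feats.new_zeros(B, C, H, W)
        c = feats.new_zeros(B, C, H, W)
        outs = []
        for t in range(T):                      # preserve temporal order
            h, c = self.cell(feats[:, :, t], (h, c))
            outs.append(h)
        return torch.stack(outs, dim=2)         # (B, hid_ch, T, H, W)


class SOPNet(nn.Module):
    """Stack of OP modules plus a pooled regression head (assumed) for intensity."""

    def __init__(self, in_ch=3, hid_ch=32, depth=3):
        super().__init__()
        chans = [in_ch] + [hid_ch] * depth
        self.blocks = nn.ModuleList(OPModule(chans[i], chans[i + 1]) for i in range(depth))
        self.head = nn.Linear(hid_ch, 1)        # hypothetical intensity regressor

    def forward(self, x):                       # x: (B, 3, T, H, W) camera frames
        for block in self.blocks:
            x = block(x)
        pooled = x.mean(dim=(2, 3, 4))          # global spatio-temporal pooling
        return self.head(pooled)                # (B, 1) precipitation intensity


if __name__ == "__main__":
    frames = torch.randn(2, 3, 8, 64, 64)       # 2 clips, 8 frames, 64x64 RGB
    print(SOPNet()(frames).shape)               # torch.Size([2, 1])
```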
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
