National Taiwan Ocean University Research Hub › 電機資訊學院 (College of Electrical Engineering and Computer Science) › 通訊與導航工程學系 (Department of Communications and Navigation Engineering)
Please use this identifier to cite or link to this item: http://scholars.ntou.edu.tw/handle/123456789/17114
Title: YOLOv3-Based Matching Approach for Roof Region Detection from Drone Images
Authors: Yeh, Chia-Cheng
Chang, Yang-Lang
Alkhaleefah, Mohammad
Hsu, Pai-Hui
Eng, Weiyong
Koo, Voon-Chet
Huang, Bormin
Chang, Lena 
Keywords: DEEP
Issue Date: Jan-2021
Publisher: MDPI
Journal Volume: 13
Journal Issue: 1
Source: REMOTE SENS-BASEL
Abstract: 
Because of the large data volume, UAV image stitching and matching incur a high computational cost. Traditional feature extraction algorithms, such as the Scale-Invariant Feature Transform (SIFT), Speeded-Up Robust Features (SURF), and Oriented FAST and Rotated BRIEF (ORB), require heavy computation to extract and describe features in high-resolution UAV images. To overcome this issue, You Only Look Once version 3 (YOLOv3) is combined with traditional feature-point matching algorithms to extract descriptive features from a drone dataset of residential areas for roof detection. Unlike the traditional feature extraction algorithms, YOLOv3 extracts features only from the proposed candidate regions rather than the entire image, which significantly reduces the complexity of image matching. All extracted features are then fed into the Structural Similarity Index Measure (SSIM) to identify corresponding roof-region pairs between consecutive images in a sequence. In addition, each candidate roof pair produced by the architecture serves as a coarse matching region and limits the search range of feature matching to the detected roof regions only. This further improves feature-matching consistency and reduces the chance of incorrect matches. Analytical results show that the proposed method is 13x faster than traditional image matching methods, with comparable performance.
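The coarse-matching stage the abstract describes (pairing detected roof crops between consecutive frames by SSIM before fine feature matching) can be sketched in plain NumPy. This is a minimal illustration, not the paper's implementation: the function names are invented, SSIM is computed globally over each crop rather than with a sliding window, and all detector crops are assumed to have already been resized to a common shape.

```python
import numpy as np

def ssim_score(a, b, data_range=255.0):
    """Global (single-window) SSIM between two equal-sized grayscale patches.
    A simplification of the standard windowed SSIM, for illustration only."""
    a = a.astype(np.float64)
    b = b.astype(np.float64)
    c1 = (0.01 * data_range) ** 2  # stabilizing constants from the SSIM definition
    c2 = (0.03 * data_range) ** 2
    mu_a, mu_b = a.mean(), b.mean()
    var_a, var_b = a.var(), b.var()
    cov = ((a - mu_a) * (b - mu_b)).mean()
    return ((2 * mu_a * mu_b + c1) * (2 * cov + c2)) / (
        (mu_a ** 2 + mu_b ** 2 + c1) * (var_a + var_b + c2)
    )

def pair_roofs(rois_a, rois_b):
    """Greedy coarse matching: each roof crop from frame A is paired with the
    most SSIM-similar crop from frame B. Returns (index_a, index_b, score)."""
    pairs = []
    for i, a in enumerate(rois_a):
        scores = [ssim_score(a, b) for b in rois_b]
        j = int(np.argmax(scores))
        pairs.append((i, j, scores[j]))
    return pairs
```

Fine matching (e.g., ORB descriptors) would then be run only inside each paired roof region, which is what limits the search range and yields the reported speedup.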
URI: http://scholars.ntou.edu.tw/handle/123456789/17114
ISSN: 2072-4292
DOI: 10.3390/rs13010127
Appears in Collections: 通訊與導航工程學系
SDG: 11 Sustainable Cities & Communities

Web of Science™ citations: 4 (last week: 0; last month: 0; checked on Jun 27, 2023)
Page view(s): 194 (last week: 0; last month: 1; checked on Jun 30, 2025)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
