Faculty Supervisor

Ghassem Tofighi, Sheridan College

Date of Defense

Fall 12-2-2022

Program Name

Honours Bachelor of Computer Science (Mobile Computing)


Applied Computing


Keywords

neural networks, deep learning, machine learning, unet, object detection, semantic segmentation, disaster relief, natural disasters, search and rescue


Faculty of Applied Science & Technology (FAST)


The purpose of this research is to create a deep learning tent detection system using U-Net that can guide disaster relief efforts from satellite imagery. If the tent density in a given location can be detected following a natural disaster, it may indicate displaced people in need of aid and can direct search and rescue teams. In this paper we produce a tent detection system built on U-Net, which achieved an overall accuracy of 80% against the ground truth, with accuracies of 86% and 67% on the training and validation subsets respectively. We also find that the Jaccard index may not be the best similarity metric when no tents are present in the ground truth: if U-Net generates even a single pixel while the ground truth is empty, the score is reported as 0%, which is misleading. Overall, U-Net performed very well; while there were some false positives, most occurred in scenarios where it was visually difficult to tell whether we were looking at a tent or something else entirely. Furthermore, we found that U-Net can be trained on consumer hardware and does not require a GPU, provided one is willing to wait roughly thirty minutes per epoch. We therefore conclude that U-Net is well suited to the task of tent detection in satellite imagery, particularly when only low-resolution images are available, such as when viewing an expansive region.
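The Jaccard edge case described in the abstract can be sketched in a few lines. This is an illustrative example only (not the thesis code); the function name `jaccard` and the toy masks are assumptions for demonstration.

```python
import numpy as np

def jaccard(pred, truth):
    """Jaccard index (IoU): |intersection| / |union| of two binary masks."""
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    if union == 0:
        # Both masks empty: treat as perfect agreement by convention
        return 1.0
    return inter / union

# Empty ground truth, one false-positive pixel: the score collapses to 0,
# even though the prediction is wrong by only a single pixel.
truth = np.zeros((4, 4), dtype=bool)
pred = truth.copy()
pred[0, 0] = True
print(jaccard(pred, truth))  # → 0.0
```

This shows why a single spurious pixel on a tent-free tile drives the reported accuracy to 0%, regardless of how close the prediction is to the empty ground truth.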

Creative Commons License

Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License
This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License.