

Tourism is one of the sectors hit hardest by the Covid-19 pandemic. In 2021, however, the sector began to recover, marked by tourist attractions gradually reopening to the public under strict health protocols. In practice, violations of these protocols remain common at many attractions, ranging from undisciplined mask use and failure to maintain social distancing to the absence of any limit on the number of visitors appropriate during a pandemic. Managers of tourist attractions tend to ignore visitor-number restrictions because no management system is available to enforce them. An application is therefore needed that reports visitor conditions at tourist attractions in real time, so that people become more aware of their personal health and can plan ahead before traveling. From a good-governance perspective, such an application helps control the risk of Covid-19 transmission originating from crowded places such as tourist spots: the government can easily monitor the distribution of visitor density across its area and use that information as a basis for policy and field response.

This study proposes the development of an application platform (software and hardware) for monitoring the density of visitors at tourist attractions, using object detection based on image processing and deep learning. Integrating the hardware with web-based software provides density information that is easily accessible to the public and is thus expected to contribute to the handling of the current Covid-19 pandemic.
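The counting-and-reporting step such a platform needs can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the object detector emits `(label, confidence)` pairs per camera frame, and the function names, confidence threshold, and density cut-offs are all hypothetical.

```python
# Hypothetical sketch of the visitor-counting and density-reporting step.
# Assumes an upstream deep-learning detector yields (label, confidence)
# pairs per frame; all names and thresholds are illustrative.

def count_visitors(detections, conf_threshold=0.5):
    """Count detections labelled 'person' above a confidence threshold."""
    return sum(1 for label, conf in detections
               if label == "person" and conf >= conf_threshold)

def density_status(visitor_count, capacity):
    """Map a visitor count to a density level relative to the
    pandemic-era capacity limit set for the attraction."""
    ratio = visitor_count / capacity
    if ratio < 0.5:
        return "low"
    if ratio < 1.0:
        return "moderate"
    return "full"  # signal that admission should pause

# Example frame: three confident 'person' detections, one weak one,
# and one non-person object.
frame = [("person", 0.91), ("person", 0.76), ("person", 0.88),
         ("person", 0.32), ("bench", 0.95)]
count = count_visitors(frame)
print(count, density_status(count, capacity=10))  # → 3 low
```

In the proposed platform, a status like this would be pushed from the hardware at each attraction to the web application, where the public and the government can read it before deciding to travel or intervene.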

Article Details

How to Cite
Rosyida, N., Atmoko, R. A., Handoko, G. P., Widia, I. D. M., & Asriningtyas, S. R. (2021). Tourist Attractions Visitor Density Monitoring Platform. Indonesian Journal of Engineering Research, 2(2), 68-74.


"The next normal arrives: Trends that will define 2021—and beyond". 4 Januari 2021. 13 April 2021.
Benli, Y. Motai and J. Rogers, "Human Behavior-Based Target Tracking With an Omni-Directional Ther-mal Camera," in IEEE Transactions on Cognitive and Developmental Systems, vol. 11, no. 1, pp. 36-50, March 2019, doi: 10.1109/TCDS.2017.2726356.
Booranawong, N. Jindapetch and H. Saito, "A System for Detection and Tracking of Human Movements Us-ing RSSI Signals," in IEEE Sensors Journal, vol. 18, no. 6, pp. 2531-2544, 15 March15, 2018, doi: 10.1109/JSEN.2018.2795747.
D. K. Shin, M. U. Ahmed and P. K. Rhee, "Incremental Deep Learning for Robust Object Detection in Un-known Cluttered Environments," in IEEE Access, vol. 6, pp. 61748-61760, 2018, doi: 10.1109/ACCESS.2018.2875720.
H. Li, G. Cui, L. Kong, G. Chen, M. Wang and S. Guo, "Ro-bust Human Targets Tracking for MIMO Through-Wall Radar via Multi-Algorithm Fusion," in IEEE Journal of Selected Topics in Applied Earth Obser-vations and Remote Sensing, vol. 12, no. 4, pp. 1154-1164, April 2019, doi: 10.1109/JSTARS.2019.2901262.
Li, H. Xie, W. Yan, Y. Chang and X. Qu, "Detection of Road Objects With Small Appearance in Images for Au-tonomous Driving in Various Traffic Situations Us-ing a Deep Learning Based Approach," in IEEE Ac-cess, vol. 8, pp. 211164-211172, 2020, doi: 10.1109/ACCESS.2020.3036620.
Lin and K. Huang, "Collaborative Pedestrian Tracking and Data Fusion With Multiple Cameras," in IEEE Transactions on Information Forensics and Securi-ty, vol. 6, no. 4, pp. 1432-1444, Dec. 2011, doi: 10.1109/TIFS.2011.2159972.
M. Gong and Y. Shu, "Real-Time Detection and Motion Recognition of Human Moving Objects Based on Deep Learning and Multi-Scale Feature Fusion in Video," in IEEE Access, vol. 8, pp. 25811-25822, 2020, doi: 10.1109/ACCESS.2020.2971283.
P. Paral, A. Chatterjee and A. Rakshit, "Vision Sensor-Based Shoe Detection for Human Tracking in a Human–Robot Coexisting Environment: A Photo-metric Invariant Approach Using DBSCAN Algo-rithm," in IEEE Sensors Journal, vol. 19, no. 12, pp. 4549-4559, 15 June15, 2019, doi: 10.1109/JSEN.2019.2897989.
Q. Xu, R. Lin, H. Yue, H. Huang, Y. Yang and Z. Yao, "Re-search on Small Target Detection in Driving Sce-narios Based on Improved Yolo Network," in IEEE Access, vol. 8, pp. 27574-27583, 2020, doi: 10.1109/ACCESS.2020.2966328.
T. Nguyen, T. N. Nguyen, H. Kim and H. Lee, "A High-Throughput and Power-Efficient FPGA Implemen-tation of YOLO CNN for Object Detection," in IEEE Transactions on Very Large Scale Integration (VLSI) Systems, vol. 27, no. 8, pp. 1861-1873, Aug. 2019, doi: 10.1109/TVLSI.2019.2905242.
W. Fang, L. Wang and P. Ren, "Tinier-YOLO: A Real-Time Object Detection Method for Constrained Envi-ronments," in IEEE Access, vol. 8, pp. 1935-1944, 2020, doi: 10.1109/ACCESS.2019.2961959.
Will, P. Vaishnav, A. Chakraborty and A. Santra, "Human Target Detection, Tracking, and Classification Us-ing 24-GHz FMCW Radar," in IEEE Sensors Journal, vol. 19, no. 17, pp. 7283-7299, 1 Sept.1, 2019, doi: 10.1109/JSEN.2019.2914365.
Wu, Z. Yang, Z. Zhou, X. Liu, Y. Liu and J. Cao, "Non-Invasive Detection of Moving and Stationary Hu-man With WiFi," in IEEE Journal on Selected Areas in Communications, vol. 33, no. 11, pp. 2329-2342, Nov. 2015, doi: 10.1109/JSAC.2015.2430294.
Y. Lee, Z. Tang and J. Hwang, "Online-Learning-Based Human Tracking Across Non-Overlapping Camer-as," in IEEE Transactions on Circuits and Systems for Video Technology, vol. 28, no. 10, pp. 2870-2883, Oct. 2018, doi: 10.1109/TCSVT.2017.2707399.
Zhu, X. Yan, H. Tang, Y. Chang, B. Li and X. Yuan, "Moving Object Detection With Deep CNNs," in IEEE Access, vol. 8, pp. 29729-29741, 2020, doi: 10.1109/ACCESS.2020.2972562.