Mobile Robot Localization Based on OMNI Camera Images Using SURF Features

Authors

  • Susijanto Tri Rasmana Universitas Dinamika
  • Harianto Harianto Universitas Dinamika
  • Pauladie Susanto Universitas Dinamika
  • Anan Pepe Abseno Universitas Dinamika
  • Zendi Zakaria Raga Permana Universitas Dinamika

DOI:

https://doi.org/10.25126/jtiik.2020712539


Abstract

Self-location detection, or self-localization, is one of the capabilities a mobile robot must possess. Self-localization is used to determine the robot's position in an area and serves as a reference for deciding the direction of the next leg of its journey. In this research, robot localization is based on image data captured by a catadioptric-type omnidirectional camera. The number of closest matching features between the 360° image captured by the Omni camera and each reference image is the basis for predicting the location. Image features are extracted with the Speeded-Up Robust Features (SURF) method. The first contribution of this research is optimizing detection accuracy by selecting appropriate values for the Hessian threshold and the maximum feature distance. The second contribution is optimizing detection time using the proposed method, which matches against the features of only 3 reference images, chosen from the previous detection result. For a trajectory with 28 reference images, this optimization shortens detection time by a factor of 8.72. The proposed method was tested on an omnidirectional mobile robot traveling through an area, using recall, precision, accuracy, F-measure, G-measure, and detection time as metrics. Location detection based on the SIFT method was also tested for comparison with the proposed method. In these tests, the proposed method outperformed SIFT, with recall of 89.67%, accuracy of 99.59%, F-measure of 93.58%, G-measure of 93.87%, and detection time of 0.365 seconds. SIFT was better only in precision, at 98.74%.
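The matching step described in the abstract — counting, for each reference image, how many descriptors in the current omni image have a nearest neighbour within a maximum distance, and restricting the search to the 3 reference images around the previous detection — can be sketched as follows. This is a minimal illustration, not the authors' code: the function names, the brute-force Euclidean matcher, and the default `max_distance` value are all assumptions.

```python
import numpy as np

def count_matches(desc_query, desc_ref, max_distance):
    """Count query descriptors whose nearest reference descriptor
    lies closer than max_distance (brute-force Euclidean matching)."""
    # Pairwise distances, shape (n_query, n_ref)
    d = np.linalg.norm(desc_query[:, None, :] - desc_ref[None, :, :], axis=2)
    # A query descriptor "matches" if its nearest neighbour is close enough
    return int(np.sum(d.min(axis=1) < max_distance))

def localize(desc_query, ref_descs, prev_idx=None, max_distance=0.3):
    """Predict the location as the index of the reference image with the
    most matches. If prev_idx is given, only the previous, current, and
    next reference images are searched (the 3-image speedup)."""
    n = len(ref_descs)
    if prev_idx is None:
        candidates = range(n)          # full search on the first frame
    else:
        candidates = [(prev_idx + k) % n for k in (-1, 0, 1)]
    scores = {i: count_matches(desc_query, ref_descs[i], max_distance)
              for i in candidates}
    return max(scores, key=scores.get)
```

In a real pipeline the descriptors would come from a SURF extractor (e.g. OpenCV's contrib module) applied to the omni image and to each stored reference image; here they are plain numpy arrays so the windowed voting logic can be shown on its own.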



References

BAY, H., ESS, A., TUYTELAARS, T., and VAN GOOL, L., 2008. Speeded-Up Robust Features (SURF). Computer Vision and Image Understanding, 110(3), 346–359.

BROWN, M., and LOWE, D., 2002. Invariant Features from Interest Point Groups. Proceedings of the British Machine Vision Conference 2002, 23.1–23.10.

DAYOUB, F., CIELNIAK, G., and DUCKETT, T., 2011. A Sparse Hybrid Map for Vision-Guided Mobile Robots. In 5th European Conference on Mobile Robots (ECMR 2011).

FENG, H. M., CHEN, C. Y., and HORNG, J. H., 2010. Intelligent Omni-Directional Vision-Based Mobile Robot Fuzzy Systems Design and Implementation. Expert Systems with Applications, 37(5), 4009–4019.

HARIANTO, H., RIVAI, M., and PURWANTO, D., 2013. Implementation of Electronic Nose in Omni-Directional Robot. International Journal of Electrical and Computer Engineering (IJECE), 3(3).

KUSWADI, S., O, A. N. G., TAMARA, M. N., and S, I. A., 2018. Optimasi Sistem Navigasi Robot Bencana dengan Algoritma Bug dan Jaringan Syaraf Tiruan (Optimisation of Disaster Robot Navigation Systems Using Bug). Jurnal Teknologi Informasi dan Ilmu Komputer (JTIIK), 5(5), 635–642.

QU, X., SOHEILIAN, B., HABETS, E., and PAPARODITIS, N., 2016. Evaluation of SIFT and SURF for Vision-Based Localization. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 685–692.

RÖHRIG, C., HESS, D., KIRSCH, C., and KÜNEMUND, F., 2010. Localization of an Omnidirectional Transport Robot Using IEEE 802.15.4a Ranging and Laser Range Finder. 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, 3798–3803.

SU, Z., ZHOU, X., CHENG, T., ZHANG, H., XU, B., and CHEN, W., 2017. Global Localization of a Mobile Robot Using Lidar and Visual Features. 2017 IEEE International Conference on Robotics and Biomimetics (ROBIO), 2377–2383.

THEODORIDIS, T., and HU, H., 2012. Toward Intelligent Security Robots: A Survey. IEEE Transactions on Systems, Man and Cybernetics Part C: Applications and Reviews, 42(6), 1219–1230.

VALGREN, C., and LILIENTHAL, A. J., 2010. SIFT, SURF & Seasons: Appearance-Based Long-Term Localization in Outdoor Environments. Robotics and Autonomous Systems, 58(2), 149–156.

WAJIANSYAH, A., SUPRIADI, S., NUR, S., and A. BRAMANTO W.P., 2018. Implementasi Fuzzy Logic pada Robot Line Follower (Implementation of Fuzzy Logic on a Line Follower Robot). Jurnal Teknologi Informasi dan Ilmu Komputer (JTIIK), 5(4), 395. <https://doi.org/10.25126/jtiik.201854747>

YANG, S., SCHERER, S. A., YI, X., and ZELL, A., 2017. Multi-Camera Visual SLAM for Autonomous Navigation of Micro Aerial Vehicles. Robotics and Autonomous Systems, 93, 116–134.

Published

08-10-2020

Issue

Section

Computer Science

How to Cite

Lokalisasi Mobile Robot berdasarkan Citra Kamera OMNI menggunakan Fitur Surf. (2020). Jurnal Teknologi Informasi Dan Ilmu Komputer, 7(5), 1079-1088. https://doi.org/10.25126/jtiik.2020712539