Volume 12, Issue 4 (June 2023)
JGST 2023, 12(4): 1-19
Development of a method for investigating flying robot positioning algorithms using an inertial navigation system and point and linear features of images
Mohammad Mahdi Abbaspour*, Ali Hosseini Naveh
Abstract:
With the advancement of technology, flying robots are used in many applications such as mapping, inspection and safety, transportation of goods, and military operations. Navigating these drones requires real-time positioning. In outdoor environments this is usually provided by global navigation satellite systems (GNSS), but GNSS is not available indoors. Therefore, navigating flying robots inside built environments calls for other positioning methods, such as the use of video cameras; inertial sensor data can also be used to complement the image data. In indoor environments, where texture is scarce and lighting is problematic, line features of images can be exploited in addition to point features. Accordingly, researchers have proposed many algorithms for determining the position of flying robots. To compare these algorithms, however, it is not enough to examine only the overall positioning error and the runtime of each algorithm; the reasons for increases or decreases in geometric accuracy, and for the failure of algorithms in challenging environments, must also be analyzed. In this research, positioning is performed on the EuRoC micro aerial vehicle dataset using three selected algorithms, namely PL-SVO, ORB-SLAM2, and VINS-Fusion, and their results are analyzed and compared. The reasons for the success or failure of each algorithm under different conditions are explained with reference to the robot's physical state and the radiometric properties of the images. Among these algorithms, VINS-Fusion achieves the highest accuracy, with an RMSE below one meter on two of the test sequences and about 1.2 meters on the third. ORB-SLAM2 likewise achieves an RMSE below one meter over the portions of the flights in which it succeeds in estimating the flying robot's position.
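To make the two ingredients above concrete, two minimal sketches follow; they are illustrative only, not code from the paper. The first extracts point and line features from a single image using the ORB detector (ref. 31) and the LSD line segment detector (ref. 30), assuming OpenCV with the LSD implementation available (version 4.5.1 or later); the file name and feature count are placeholders.

    import cv2

    # Hypothetical input frame; EuRoC imagery is grayscale.
    gray = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)

    # Point features: ORB keypoints and binary descriptors.
    orb = cv2.ORB_create(nfeatures=1000)
    keypoints, descriptors = orb.detectAndCompute(gray, None)

    # Line features: LSD segments; each entry holds (x1, y1, x2, y2).
    lsd = cv2.createLineSegmentDetector()
    lines, _, _, _ = lsd.detect(gray)

    print(len(keypoints), "ORB keypoints,",
          0 if lines is None else len(lines), "LSD line segments")

The second sketch shows how an RMSE figure of the kind reported in the abstract above can be computed from an estimated trajectory and the ground truth, assuming the two pose sequences are already time-synchronized and expressed in the same coordinate frame (in practice a similarity alignment, e.g. Umeyama's method, is applied first):

    import numpy as np

    def ate_rmse(est_xyz, gt_xyz):
        """RMSE of the absolute trajectory error between two (N, 3)
        position arrays, assumed synchronized and aligned."""
        errors = np.linalg.norm(est_xyz - gt_xyz, axis=1)  # per-pose error in meters
        return float(np.sqrt(np.mean(errors ** 2)))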
Article number: 1
Keywords: Visual SLAM, Visual odometry, Visual localization, Flying robot, Inertial sensor
Full-Text [PDF 1173 kb]
Type of Study: Research | Subject: Photo&RS
Received: 2020/10/13
References
1. H. C. Longuet-Higgins, "A computer algorithm for reconstructing a scene from two projections," Nature, vol. 293, pp. 133-135, 1981. [DOI:10.1038/293133a0]
2. C. G. Harris and J. Pike, "3D positional integration from image sequences," Image and Vision Computing, vol. 6, pp. 87-90, 1988. [DOI:10.1016/0262-8856(88)90003-0]
3. J.-M. Frahm, P. Fite-Georgel, D. Gallup, T. Johnson, R. Raguram, C. Wu, et al., "Building Rome on a cloudless day," in European Conference on Computer Vision, 2010, pp. 368-381. [DOI:10.1007/978-3-642-15561-1_27]
4. D. Scaramuzza and F. Fraundorfer, "Visual odometry: Part I: The first 30 years and fundamentals," IEEE Robotics & Automation Magazine, vol. 18, pp. 80-92, 2011. [DOI:10.1109/MRA.2011.943233]
5. D. Nistér, O. Naroditsky, and J. Bergen, "Visual odometry," in Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), 2004, pp. I-I.
6. H. Durrant-Whyte and T. Bailey, "Simultaneous localization and mapping: Part I," IEEE Robotics & Automation Magazine, vol. 13, pp. 99-110, 2006. [DOI:10.1109/MRA.2006.1638022]
7. B. Gao, H. Lang, and J. Ren, "Stereo Visual SLAM for Autonomous Vehicles: A Review," in 2020 IEEE International Conference on Systems, Man, and Cybernetics (SMC), 2020, pp. 1316-1322. [DOI:10.1109/SMC42975.2020.9283161]
8. C. Debeunne and D. Vivet, "A review of visual-LiDAR fusion based simultaneous localization and mapping," Sensors, vol. 20, p. 2068, 2020. [DOI:10.3390/s20072068]
9. A. Pfrunder, P. V. Borges, A. R. Romero, G. Catt, and A. Elfes, "Real-time autonomous ground vehicle navigation in heterogeneous environments using a 3D LiDAR," in 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2017, pp. 2601-2608. [DOI:10.1109/IROS.2017.8206083]
10. H. P. Moravec, "Obstacle avoidance and navigation in the real world by a seeing robot rover," Stanford Univ., CA, Dept. of Computer Science, 1980.
11. K. Yousif, A. Bab-Hadiashar, and R. Hoseinnezhad, "An overview to visual odometry and visual SLAM: Applications to mobile robotics," Intelligent Industrial Systems, vol. 1, pp. 289-311, 2015. [DOI:10.1007/s40903-015-0032-7]
12. J. Zhang and S. Singh, "LOAM: Lidar Odometry and Mapping in Real-time," in Robotics: Science and Systems, 2014. [DOI:10.15607/RSS.2014.X.007]
13. M. O. Aqel, M. H. Marhaban, M. I. Saripan, and N. B. Ismail, "Review of visual odometry: types, approaches, challenges, and applications," Springerplus, vol. 5, p. 1897, 2016. [DOI:10.1186/s40064-016-3573-7]
14. T. Taketomi, H. Uchiyama, and S. Ikeda, "Visual SLAM algorithms: a survey from 2010 to 2016," IPSJ Transactions on Computer Vision and Applications, vol. 9, pp. 1-11, 2017. [DOI:10.1186/s41074-017-0027-2]
15. R. A. Newcombe, S. J. Lovegrove, and A. J. Davison, "DTAM: Dense tracking and mapping in real-time," in 2011 IEEE International Conference on Computer Vision (ICCV), 2011, pp. 2320-2327. [DOI:10.1109/ICCV.2011.6126513]
16. J. Engel, T. Schöps, and D. Cremers, "LSD-SLAM: Large-scale direct monocular SLAM," in European Conference on Computer Vision, 2014, pp. 834-849. [DOI:10.1007/978-3-319-10605-2_54]
17. A. Concha Belenguer and J. Civera Sancho, "DPPTAM: Dense piecewise planar tracking and mapping from a monocular sequence," in 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2015. [DOI:10.1109/IROS.2015.7354184]
18. G. Klein and D. Murray, "Parallel tracking and mapping for small AR workspaces," in 2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR), 2007, pp. 225-234. [DOI:10.1109/ISMAR.2007.4538852]
19. C. D. Herrera, K. Kim, J. Kannala, K. Pulli, and J. Heikkilä, "DT-SLAM: Deferred triangulation for robust SLAM," in 2014 2nd International Conference on 3D Vision (3DV), 2014, pp. 609-616. [DOI:10.1109/3DV.2014.49]
20. R. Mur-Artal, J. M. M. Montiel, and J. D. Tardos, "ORB-SLAM: A versatile and accurate monocular SLAM system," IEEE Transactions on Robotics, vol. 31, pp. 1147-1163, 2015. [DOI:10.1109/TRO.2015.2463671]
21. R. Mur-Artal and J. D. Tardos, "ORB-SLAM2: An Open-Source SLAM System for Monocular, Stereo, and RGB-D Cameras," IEEE Transactions on Robotics, vol. 33, pp. 1255-1262, 2017. [DOI:10.1109/TRO.2017.2705103]
22. C. Forster, M. Pizzoli, and D. Scaramuzza, "SVO: Fast semi-direct monocular visual odometry," in 2014 IEEE International Conference on Robotics and Automation (ICRA), 2014, pp. 15-22. [DOI:10.1109/ICRA.2014.6906584]
23. C. Forster, Z. Zhang, M. Gassner, M. Werlberger, and D. Scaramuzza, "SVO: Semidirect visual odometry for monocular and multicamera systems," IEEE Transactions on Robotics, vol. 33, pp. 249-265, 2016. [DOI:10.1109/TRO.2016.2623335]
24. L. Heng and B. Choi, "Semi-direct visual odometry for a fisheye-stereo camera," in 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2016, pp. 4077-4084. [DOI:10.1109/IROS.2016.7759600]
25. R. Gomez-Ojeda, J. Briales, and J. Gonzalez-Jimenez, "PL-SVO: Semi-direct Monocular Visual Odometry by combining points and line segments," in 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2016, pp. 4211-4216. [DOI:10.1109/IROS.2016.7759620]
26. R. Gomez-Ojeda and J. Gonzalez-Jimenez, "Robust stereo visual odometry through a probabilistic combination of points and line segments," in 2016 IEEE International Conference on Robotics and Automation (ICRA), 2016, pp. 2521-2526. [DOI:10.1109/ICRA.2016.7487406]
27. A. Pumarola, A. Vakhitov, A. Agudo, A. Sanfeliu, and F. Moreno-Noguer, "PL-SLAM: Real-time monocular visual SLAM with points and lines," in 2017 IEEE international conference on robotics and automation (ICRA), 2017, pp. 4503-4508. [DOI:10.1109/ICRA.2017.7989522]
28. R. Gomez-Ojeda, F.-A. Moreno, D. Zuñiga-Noël, D. Scaramuzza, and J. Gonzalez-Jimenez, "PL-SLAM: A stereo SLAM system through the combination of points and line segments," IEEE Transactions on Robotics, 2019. [DOI:10.1109/TRO.2019.2899783]
29. X. Feng, W. Hao, and B.-F. Fang, "Research on Unmanned Vehicle Positioning Technology Based on Multi-sensor Fusion," DEStech Transactions on Computer Science and Engineering, 2018.
30. R. G. von Gioi, J. Jakubowicz, J.-M. Morel, and G. Randall, "LSD: A fast line segment detector with a false detection control," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, pp. 722-732, 2010. [DOI:10.1109/TPAMI.2008.300]
31. E. Rublee, V. Rabaud, K. Konolige, and G. Bradski, "ORB: An efficient alternative to SIFT or SURF," in 2011 International Conference on Computer Vision (ICCV), 2011, pp. 2564-2571. [DOI:10.1109/ICCV.2011.6126544]
32. M. Burri, J. Nikolic, P. Gohl, T. Schneider, J. Rehder, S. Omari, et al., "The EuRoC micro aerial vehicle datasets," The International Journal of Robotics Research, vol. 35, pp. 1157-1163, 2016. [DOI:10.1177/0278364915620033]
Abbaspour M M, Hosseini Naveh A. Development of a method for investigating flying robot positioning algorithms using an inertial navigation system and point and linear features of images. JGST 2023; 12(4): 1
URL: http://jgst.issgeac.ir/article-1-986-en.html


Rights and permissions
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
Journal of Geomatics Science and Technology