The System class in System.h and System.cc
Please credit the source when reposting: http://blog.csdn.net/c602273091/article/details/54933760
This file pulls together everything the SLAM system needs, so reading it gives a good overview of how the ORB-SLAM system is organized. Before diving in, we need some basics of multithreaded programming: mutexes [1] and threads [2].
The annotated header is as follows:
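As a quick refresher, here is a minimal, self-contained sketch of the std::thread / std::mutex pattern (the worker and flag names are made up for illustration). System.h uses the same ingredients: it launches worker threads with std::thread and protects shared request flags with std::mutex.

#include <iostream>
#include <mutex>
#include <thread>

// Hypothetical example: a background "mapping" loop that polls a stop flag,
// similar in spirit to the request/flag pattern System uses for mbReset
// and the localization-mode flags.
std::mutex gMutexStop;        // protects gStopRequested
bool gStopRequested = false;

void MappingLoop()
{
    while(true)
    {
        {
            std::unique_lock<std::mutex> lock(gMutexStop);
            if(gStopRequested)
                break;
        }
        // ... one iteration of work would go here ...
    }
    std::cout << "mapping thread finished" << std::endl;
}

int main()
{
    std::thread mappingThread(&MappingLoop);   // launch the worker, like mptLocalMapping

    {
        std::unique_lock<std::mutex> lock(gMutexStop);
        gStopRequested = true;                 // request shutdown, like System::Shutdown()
    }

    mappingThread.join();                      // wait for the worker to finish
    return 0;
}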
/**
* This file is part of ORB-SLAM2.
*
* Copyright (C) 2014-2016 Raúl Mur-Artal <raulmur at unizar dot es> (University of Zaragoza)
* For more information see <https://github.com/raulmur/ORB_SLAM2>
*
* ORB-SLAM2 is free software: you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* ORB-SLAM2 is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with ORB-SLAM2. If not, see <http://www.gnu.org/licenses/>.
*/

#ifndef SYSTEM_H
#define SYSTEM_H

#include <string>
#include <thread>
#include <opencv2/core/core.hpp>

#include "Tracking.h"
#include "FrameDrawer.h"
#include "MapDrawer.h"
#include "Map.h"
#include "LocalMapping.h"
#include "LoopClosing.h"
#include "KeyFrameDatabase.h"
#include "ORBVocabulary.h"
#include "Viewer.h"

namespace ORB_SLAM2
{

class Viewer;        // Map and frame visualization
class FrameDrawer;   // Draws each frame
class Map;           // Map data and operations
class Tracking;      // Tracking process
class LocalMapping;  // Local mapping
class LoopClosing;   // Loop closure detection

class System
{
public:
    // Input sensor: monocular, stereo, or RGB-D
    enum eSensor{
        MONOCULAR=0,
        STEREO=1,
        RGBD=2
    };

public:

    // Initialize the SLAM system. It launches the Local Mapping, Loop Closing and Viewer threads.
    System(const string &strVocFile, const string &strSettingsFile, const eSensor sensor, const bool bUseViewer = true);

    // Process the given stereo frame. Images must be synchronized and rectified.
    // Input images: RGB (CV_8UC3) or grayscale (CV_8U). RGB is converted to grayscale.
    // Returns the camera pose (empty if tracking fails).
    cv::Mat TrackStereo(const cv::Mat &imLeft, const cv::Mat &imRight, const double &timestamp);

    // Process the given rgbd frame. Depthmap must be registered to the RGB frame.
    // Input image: RGB (CV_8UC3) or grayscale (CV_8U). RGB is converted to grayscale.
    // Input depthmap: Float (CV_32F).
    // Returns the camera pose (empty if tracking fails).
    cv::Mat TrackRGBD(const cv::Mat &im, const cv::Mat &depthmap, const double &timestamp);

    // Process the given monocular frame
    // Input images: RGB (CV_8UC3) or grayscale (CV_8U). RGB is converted to grayscale.
    // Returns the camera pose (empty if tracking fails).
    cv::Mat TrackMonocular(const cv::Mat &im, const double &timestamp);

    // This stops local mapping thread (map building) and performs only camera tracking.
    // Pauses local map building and keeps only tracking running; the name is a bit misleading.
    void ActivateLocalizationMode();

    // This resumes local mapping thread and performs SLAM again.
    // The mode switch is coordinated with a std::mutex (see the flags below).
    void DeactivateLocalizationMode();

    // Reset the system (clear map)
    void Reset();

    // All threads will be requested to finish.
    // It waits until all threads have finished.
    // This function must be called before saving the trajectory.
    void Shutdown();

    // Save camera trajectory in the TUM RGB-D dataset format.
    // Call first Shutdown()
    // See format details at: http://vision.in.tum.de/data/datasets/rgbd-dataset
    void SaveTrajectoryTUM(const string &filename);

    // Save keyframe poses in the TUM RGB-D dataset format.
    // Use this function in the monocular case.
    // Call first Shutdown()
    // See format details at: http://vision.in.tum.de/data/datasets/rgbd-dataset
    void SaveKeyFrameTrajectoryTUM(const string &filename);

    // Save camera trajectory in the KITTI dataset format.
    // Call first Shutdown()
    // See format details at: http://www.cvlibs.net/datasets/kitti/eval_odometry.php
    void SaveTrajectoryKITTI(const string &filename);

    // TODO: Save/Load functions
    // SaveMap(const string &filename);
    // LoadMap(const string &filename);

private:

    // Input sensor type
    eSensor mSensor;

    // ORB vocabulary used for place recognition and feature matching.
    ORBVocabulary* mpVocabulary;

    // KeyFrame database for place recognition (relocalization and loop detection).
    KeyFrameDatabase* mpKeyFrameDatabase;

    // Map structure that stores the pointers to all KeyFrames and MapPoints.
    Map* mpMap;

    // Tracker. It receives a frame and computes the associated camera pose.
    // It also decides when to insert a new keyframe, create some new MapPoints and
    // performs relocalization if tracking fails.
    Tracking* mpTracker;

    // Local Mapper. It manages the local map and performs local bundle adjustment.
    LocalMapping* mpLocalMapper;

    // Loop Closer. It searches loops with every new keyframe. If there is a loop it performs
    // a pose graph optimization and full bundle adjustment (in a new thread) afterwards.
    LoopClosing* mpLoopCloser;

    // The viewer draws the map and the current camera pose. It uses Pangolin.
    Viewer* mpViewer;

    FrameDrawer* mpFrameDrawer;
    MapDrawer* mpMapDrawer;

    // System threads: Local Mapping, Loop Closing, Viewer.
    // The Tracking thread "lives" in the main execution thread that creates the System object.
    // Tracking runs in main(); the three threads below are created by the constructor.
    std::thread* mptLocalMapping;
    std::thread* mptLoopClosing;
    std::thread* mptViewer;

    // Reset flag
    std::mutex mMutexReset;
    bool mbReset;

    // Change mode flags
    std::mutex mMutexMode;
    bool mbActivateLocalizationMode;
    bool mbDeactivateLocalizationMode;
};

}// namespace ORB_SLAM

#endif // SYSTEM_H

With this we now have a rough picture of ORB-SLAM2.
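From the public interface alone we can already sketch how a client drives the system. Below is a minimal, hypothetical monocular main loop; the vocabulary path, settings file, frame file names and the 30 Hz timestamps are placeholders, and the real example programs live in the Examples/ folder of the repository.

#include <opencv2/opencv.hpp>
#include "System.h"

int main()
{
    // Placeholder paths: a real run needs an ORB vocabulary and a camera YAML settings file.
    ORB_SLAM2::System SLAM("ORBvoc.txt", "camera.yaml",
                           ORB_SLAM2::System::MONOCULAR, true);

    for(int i = 0; i < 1000; i++)
    {
        cv::Mat im = cv::imread("frame_" + std::to_string(i) + ".png");  // hypothetical frames
        double timestamp = i / 30.0;                                     // hypothetical 30 Hz clock

        // Returns the camera pose for this frame, or an empty matrix if tracking failed.
        cv::Mat Tcw = SLAM.TrackMonocular(im, timestamp);
    }

    // Stop all threads before saving the trajectory.
    SLAM.Shutdown();
    SLAM.SaveKeyFrameTrajectoryTUM("KeyFrameTrajectory.txt");
    return 0;
}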
Next we look at System::System(const string &strVocFile, const string &strSettingsFile, const eSensor sensor, const bool bUseViewer).
Using Understand's control-flow view cuts straight to the point: you can see at a glance how the parts are connected, and the generated flow chart makes the structure very pleasant to follow.
Building on that, I annotate the whole constructor below.
System::System(const string &strVocFile, const string &strSettingsFile, const eSensor sensor,
               const bool bUseViewer):mSensor(sensor),mbReset(false),mbActivateLocalizationMode(false),
               mbDeactivateLocalizationMode(false)
{
    // Output welcome message
    cout << endl <<
    "ORB-SLAM2 Copyright (C) 2014-2016 Raul Mur-Artal, University of Zaragoza." << endl <<
    "This program comes with ABSOLUTELY NO WARRANTY;" << endl <<
    "This is free software, and you are welcome to redistribute it" << endl <<
    "under certain conditions. See LICENSE.txt." << endl << endl;

    cout << "Input sensor was set to: ";

    // Report which type of sensor is being used
    if(mSensor==MONOCULAR)
        cout << "Monocular" << endl;
    else if(mSensor==STEREO)
        cout << "Stereo" << endl;
    else if(mSensor==RGBD)
        cout << "RGB-D" << endl;

    //Check settings file
    // cv::FileStorage reads XML/YAML configuration files;
    // the YAML settings are loaded into fsSettings here.
    cv::FileStorage fsSettings(strSettingsFile.c_str(), cv::FileStorage::READ);
    if(!fsSettings.isOpened())
    {
        cerr << "Failed to open settings file at: " << strSettingsFile << endl;
        exit(-1);
    }

    //Load ORB Vocabulary
    // Load the ORB vocabulary into mpVocabulary
    cout << endl << "Loading ORB Vocabulary. This could take a while..." << endl;

    mpVocabulary = new ORBVocabulary();
    bool bVocLoad = mpVocabulary->loadFromTextFile(strVocFile);
    if(!bVocLoad)
    {
        cerr << "Wrong path to vocabulary. " << endl;
        cerr << "Failed to open at: " << strVocFile << endl;
        exit(-1);
    }
    cout << "Vocabulary loaded!" << endl << endl;

    //Create KeyFrame Database
    mpKeyFrameDatabase = new KeyFrameDatabase(*mpVocabulary);

    //Create the Map
    mpMap = new Map();

    //Create Drawers. These are used by the Viewer
    mpFrameDrawer = new FrameDrawer(mpMap);
    mpMapDrawer = new MapDrawer(mpMap, strSettingsFile);

    //Initialize the Tracking thread
    //(it will live in the main thread of execution, the one that called this constructor)
    mpTracker = new Tracking(this, mpVocabulary, mpFrameDrawer, mpMapDrawer,
                             mpMap, mpKeyFrameDatabase, strSettingsFile, mSensor);

    //Initialize the Local Mapping thread and launch
    mpLocalMapper = new LocalMapping(mpMap, mSensor==MONOCULAR);
    mptLocalMapping = new thread(&ORB_SLAM2::LocalMapping::Run, mpLocalMapper);

    //Initialize the Loop Closing thread and launch
    mpLoopCloser = new LoopClosing(mpMap, mpKeyFrameDatabase, mpVocabulary, mSensor!=MONOCULAR);
    mptLoopClosing = new thread(&ORB_SLAM2::LoopClosing::Run, mpLoopCloser);

    //Initialize the Viewer thread and launch
    mpViewer = new Viewer(this, mpFrameDrawer, mpMapDrawer, mpTracker, strSettingsFile);
    if(bUseViewer)
        mptViewer = new thread(&Viewer::Run, mpViewer);

    mpTracker->SetViewer(mpViewer);

    //Set pointers between threads
    mpTracker->SetLocalMapper(mpLocalMapper);
    mpTracker->SetLoopClosing(mpLoopCloser);

    mpLocalMapper->SetTracker(mpTracker);
    mpLocalMapper->SetLoopCloser(mpLoopCloser);

    mpLoopCloser->SetTracker(mpTracker);
    mpLoopCloser->SetLocalMapper(mpLocalMapper);
}

Next we will look at what initialization actually does in each of these classes: the vocabulary, the keyframe database, the map, local mapping, and so on.
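Before moving on, it is worth noting how the mode flags declared in System.h are likely consumed. The sketch below is a simplified reconstruction, not the verbatim ORB-SLAM2 source: ActivateLocalizationMode() only records a request under mMutexMode, and the next call to one of the Track*() entry points applies it by pausing the local mapper.

// Simplified sketch of the request/apply pattern (not the verbatim ORB-SLAM2 source).
void System::ActivateLocalizationMode()
{
    // Only record the request; the tracking thread applies it later.
    unique_lock<mutex> lock(mMutexMode);
    mbActivateLocalizationMode = true;
}

cv::Mat System::TrackMonocular(const cv::Mat &im, const double &timestamp)
{
    {
        // Apply a pending mode change before processing the new frame.
        unique_lock<mutex> lock(mMutexMode);
        if(mbActivateLocalizationMode)
        {
            mpLocalMapper->RequestStop();          // pause map building
            mpTracker->InformOnlyTracking(true);   // track against the existing map only
            mbActivateLocalizationMode = false;
        }
    }
    return mpTracker->GrabImageMonocular(im, timestamp);
}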
References:
[1] mutex: http://www.cplusplus.com/reference/mutex/
[2] thread: http://www.cplusplus.com/reference/thread/thread/