1. I synchronize the RGB and depth images.
2. For the calibration file, I update it with the parameters from this page. But I don't know where to insert d0, d1, d2, d3, d4 in the calibration file.
3. Which RTAB-Map parameters can be tuned to improve RGBD-SLAM?
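For step 1, the synchronization is usually done by nearest-timestamp matching. A rough sketch in the spirit of the TUM benchmark's associate.py (not the original script; `max_difference` is the benchmark's default tolerance in seconds):

```python
def associate(rgb_stamps, depth_stamps, max_difference=0.02):
    """Greedily match RGB and depth timestamps by smallest time difference.

    Returns a sorted list of (rgb_stamp, depth_stamp) pairs; each stamp
    is used at most once, and pairs farther apart than max_difference
    seconds are rejected.
    """
    # All candidate pairs within tolerance, sorted by time difference.
    candidates = sorted(
        (abs(r - d), r, d)
        for r in rgb_stamps
        for d in depth_stamps
        if abs(r - d) < max_difference
    )
    matches, used_r, used_d = [], set(), set()
    for diff, r, d in candidates:
        if r not in used_r and d not in used_d:
            used_r.add(r)
            used_d.add(d)
            matches.append((r, d))
    return sorted(matches)
```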
Re: How to process RGBD-SLAM datasets with RTAB-Map?
Just tried the dataset. With default parameters, the odometry gets lost exactly where there is a huge delay (~2 sec, normally 30 msec) between two images in the dataset, so odometry cannot keep up:
Another observation is that there are not a lot of discriminative visual features, and they are often more than 4 meters away. I therefore used 3D-to-2D instead of 3D-to-3D motion estimation (Vis/EstimationType=1) to use far features. I also set the GFTT quality to 0.001 instead of 0.01 (GFTT/QualityLevel=0.001) to get more features. Here is the rgbdslam.yaml I used too (don't forget to set the depth scale factor to 5):
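The yaml attachment itself isn't reproduced here, but regarding where d0..d4 go: in an OpenCV FileStorage-style calibration file (the format the RTAB-Map standalone app uses, as far as I know), the five coefficients fill the distortion_coefficients row, in k1, k2, p1, p2, k3 order. A rough sketch with the ROS default Kinect intrinsics as placeholders (not the actual fr3 calibration):

```yaml
%YAML:1.0
camera_name: rgbdslam
image_width: 640
image_height: 480
camera_matrix: !!opencv-matrix   # fx, 0, cx / 0, fy, cy / 0, 0, 1
   rows: 3
   cols: 3
   data: [ 525.0, 0.0, 319.5, 0.0, 525.0, 239.5, 0.0, 0.0, 1.0 ]
distortion_coefficients: !!opencv-matrix   # d0..d4 = k1, k2, p1, p2, k3
   rows: 1
   cols: 5
   data: [ 0.0, 0.0, 0.0, 0.0, 0.0 ]
```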
Results: The odometry gets lost at times but recovers on the next frame. As the camera is quite shaky and the visual features are far away, odometry can be bad in some places. Overall, here is the full run (the gray line is the ground truth):
I tried to reproduce your results following the instructions, but whether I use your intrinsics (ROS defaults) or the ones from the page, I can't get any poses remotely close. I also tried switching between different odometry strategies, with little change.
(This is with TORO graph optimization, and I couldn't find the "ignore covariance" button. I also tried g2o and Vertigo with similar results.)
What could I look at to tune it?
$ python evaluate_ate.py /home/antonioguerrero/ORB_SLAM2/datasets/rgbd_dataset_freiburg3_long_office_household/groundtruth.txt /home/antonioguerrero/Documents/RTAB-Map/poses_camera_f2f_0hz.txt --plot atertabmap0f2f.png --offset 0 --scale 1 --verbose
(frame-to-frame at 0 Hz; also tried 30 and 20 Hz, with the best result from frame-to-map at 0 Hz)
compared_pose_pairs 83 pairs (none of the results reach 90 pairs)
absolute_translational_error.rmse 408041.042321 m
absolute_translational_error.mean 379314.518259 m
absolute_translational_error.median 375067.726482 m
absolute_translational_error.std 150392.780599 m
absolute_translational_error.min 53123.714529 m
absolute_translational_error.max 770123.529968 m
Edit: I think the problem could be in the image association, because I have been having trouble running associate.py.
(I got it to work once this morning, but I don't know how: even copying the exact command from the shell history, it doesn't work again.)
"ce0xx:~$ python associatedir.py /home/antonioguerrero/ORB_SLAM2/datasets/rgbd_dataset_freiburg3_long_office_household/rgb.txt /home/antonioguerrero/ORB_SLAM2/datasets/rgbd_dataset_freiburg3_long_office_household/depth.txt
1341847980.722988 rgb/1341847980.722988.png 1341847980.723020 depth/1341847980.723020.png
Traceback (most recent call last):
File "associatedir.py", line 135, in <module> shutil.move(" ".join(first_list[a]), "rgb_sync/" + " ".join(first_list[a]).split("/"))
File "/usr/lib/python2.7/shutil.py", line 302, in move
File "/usr/lib/python2.7/shutil.py", line 130, in copy2
File "/usr/lib/python2.7/shutil.py", line 82, in copyfile
with open(src, 'rb') as fsrc:
IOError: [Errno 2] No such file or directory: 'rgb/1341847980.722988.png'
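The IOError points at two separate problems: the script is run from the home directory while the paths in rgb.txt are relative to the dataset folder (so 'rgb/1341847980.722988.png' is not found), and the destination expression passes a list (the result of split("/")) to shutil.move instead of a filename. A hedged sketch of a fixed version of that move step, assuming first_list maps timestamps to path-token lists as in the TUM benchmark's read_file_list() helper:

```python
import os
import shutil

def move_synced(first_list, dataset_dir, out_dir="rgb_sync"):
    """Move the matched RGB frames into out_dir, keeping only the filename.

    first_list is assumed to map each timestamp to a list of path tokens,
    as produced by the TUM benchmark's read_file_list() helper.
    """
    os.makedirs(os.path.join(dataset_dir, out_dir), exist_ok=True)
    for stamp in first_list:
        # Resolve the frame path relative to the dataset folder, so the
        # script also works when launched from elsewhere (e.g. ~).
        src = os.path.join(dataset_dir, " ".join(first_list[stamp]))
        # basename() yields a single filename string, not the list that
        # split("/") returns in the original call.
        dst = os.path.join(dataset_dir, out_dir, os.path.basename(src))
        shutil.move(src, dst)
```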
I followed the same steps as in the post, and they should still work with the latest versions. When the mapping is completed (here I enabled intermediate node creation as mentioned in the post), press Stop, then do Edit->"Download graph only" with the global map optimized option, and you should see something like this:
To export poses in RGBD-SLAM format, do File->Export poses... and select RGB-D SLAM format. Save the file to poses.txt for example. With the rgbd-slam evaluation tool:
$ python evaluate_ate.py groundtruth.txt poses.txt --plot figure.png --offset 0 --scale 1 --verbose
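For reference, evaluate_ate.py first associates pose pairs by timestamp and aligns the two trajectories with a least-squares rigid transform before computing these statistics. A minimal sketch of just the error statistics it reports, assuming already-associated and already-aligned (x, y, z) position pairs in meters:

```python
import math
import statistics

def ate_stats(gt, est):
    """Translational error statistics between paired ground-truth and
    estimated positions, mirroring the fields evaluate_ate.py prints.

    gt and est are equal-length sequences of (x, y, z) tuples that are
    assumed to be already associated and aligned.
    """
    # Per-pair Euclidean translational error.
    errs = [math.dist(g, e) for g, e in zip(gt, est)]
    return {
        "rmse": math.sqrt(sum(e * e for e in errs) / len(errs)),
        "mean": statistics.mean(errs),
        "median": statistics.median(errs),
        "std": statistics.pstdev(errs),  # population std, as in the tool
        "min": min(errs),
        "max": max(errs),
    }
```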
compared_pose_pairs 1199 pairs
absolute_translational_error.rmse 0.035687 m
absolute_translational_error.mean 0.034737 m
absolute_translational_error.median 0.033668 m
absolute_translational_error.std 0.008178 m
absolute_translational_error.min 0.013810 m
absolute_translational_error.max 0.084219 m
Note that based on the RGB calibration for fr3, you may use these calibration parameters instead (which fix the scale of the trajectory):
compared_pose_pairs 1252 pairs
absolute_translational_error.rmse 0.022903 m
absolute_translational_error.mean 0.018278 m
absolute_translational_error.median 0.014916 m
absolute_translational_error.std 0.013801 m
absolute_translational_error.min 0.001312 m
absolute_translational_error.max 0.079146 m