Detect more loop closures... stuck at local maximum?
I have noticed that automatic loop closure detection is sensitive to the constraints of existing loops. In some cases, a "bad" loop in the database can consistently cause a number of good loop closures to be rejected because the error after optimization is too large. The net effect is that the process converges to a local maximum: fewer loop closures than would be found if the one bad loop were rejected. Perhaps it would be a good idea to include some way to handle these bad loop closures based on the number of times they cause new loops to be rejected?
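To make the suggestion concrete, here is a minimal sketch of the kind of bookkeeping I have in mind. This is purely illustrative (none of these names exist in RTAB-Map): each time a candidate loop is rejected and an existing constraint can be implicated in that rejection, blame is counted, and constraints blamed too often get flagged for disabling.

```python
from collections import Counter

def update_blame(blame, implicated_loop_ids, disable_after=3):
    """Count blame for existing loops implicated in a rejection; return the
    set of loop ids that have been implicated often enough to disable.
    (Hypothetical helper; the threshold of 3 is arbitrary.)"""
    blame.update(implicated_loop_ids)
    return {loop_id for loop_id, n in blame.items() if n >= disable_after}

blame = Counter()
# Three candidate loops in a row are rejected, each implicating existing loop 42.
for _ in range(3):
    disabled = update_blame(blame, ["loop42"])
print(disabled)  # {'loop42'}: flagged after three implicated rejections
```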
I haven't tested this as much as I would like, as I am having trouble getting the detection process to consistently run to completion without crashing. (Unfortunately I can't open databases when my logger level is set to "debug", and no WARN or ERROR level events occur.) I have avoided crashes by disabling SBA from the main GUI before starting loop closure detection, but that pre-run settings dialog is missing from the db editor, so I haven't been able to update any of my databases with new loops using the auto-detect option.
Re: Detect more loop closures... stuck at local maximum?
If you have a sample database to share that you know crashes every time you run "Detect more loop closures", I could try to reproduce the problem here. The Global SBA option in the RTAB-Map GUI is not available in DatabaseViewer. You can detect more loop closures in DatabaseViewer, save the database, then re-open it with RTAB-Map to do Global SBA before exporting.
With the default settings, the detector assumes that all accepted loop closures are good ones. It would be better to increase the acceptance thresholds so that all bad ones are rejected. In particular, if you open the "Core Parameters" view, you can increase Vis/MinInliers. RGBD/OptimizeMaxError could also be lowered so that loop closures causing very large deformations are rejected.
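The two gates work together like this (a didactic sketch only; the default values below are illustrative, not RTAB-Map's actual defaults, and the real checks live inside RTAB-Map):

```python
def accept_loop_closure(visual_inliers, post_opt_error,
                        min_inliers=20,      # cf. Vis/MinInliers
                        max_opt_error=1.0):  # cf. RGBD/OptimizeMaxError
    if visual_inliers < min_inliers:
        return False  # too few feature correspondences: weak visual match
    if post_opt_error > max_opt_error:
        return False  # graph deforms too much after optimization
    return True

print(accept_loop_closure(35, 0.4))  # True: strong match, small deformation
print(accept_loop_closure(12, 0.4))  # False: not enough inliers
print(accept_loop_closure(35, 5.0))  # False: large deformation after optimization
```

Raising `min_inliers` and lowering `max_opt_error` both shrink the set of accepted loops, trading recall for precision.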
Another approach (if Optimizer/Strategy is 1 or 2) is to enable the Optimizer/Robust parameter (the Vertigo approach). If you do, set RGBD/OptimizeMaxError to 0 (the two parameters are not compatible). With this approach, both bad and good loop closures are accepted; in the final optimization, the bad ones are disabled by the robust optimization, as in this tutorial.
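To show what "disabled by the robust optimization" means, here is a toy 1D pose graph with switchable constraints, the idea behind Vertigo. This is a self-contained didactic sketch (plain gradient descent, made-up numbers), not RTAB-Map or Vertigo code: each loop closure gets a switch variable s in [0, 1] that scales its residual, with a prior pulling s toward 1, so grossly inconsistent loops get switched off.

```python
def optimize(odom, loops, n_poses, lam=1.0, lr=0.005, iters=20000):
    x = [0.0] * n_poses          # 1D poses, x[0] anchored at 0
    s = [1.0] * len(loops)       # one switch per loop closure, starts "on"
    for _ in range(iters):
        gx = [0.0] * n_poses
        gs = [0.0] * len(loops)
        # odometry residuals: x[i+1] - x[i] - d
        for i, d in enumerate(odom):
            r = x[i + 1] - x[i] - d
            gx[i + 1] += 2 * r
            gx[i] -= 2 * r
        # switched loop residuals: s_j * (x[b] - x[a] - m)
        for j, (a, b, m) in enumerate(loops):
            e = x[b] - x[a] - m
            gx[b] += 2 * s[j] ** 2 * e
            gx[a] -= 2 * s[j] ** 2 * e
            gs[j] += 2 * s[j] * e ** 2
            gs[j] += -2 * lam * (1 - s[j])  # prior pulls s_j back toward 1
        for i in range(1, n_poses):         # x[0] stays fixed
            x[i] -= lr * gx[i]
        for j in range(len(s)):
            s[j] = min(1.0, max(0.0, s[j] - lr * gs[j]))
    return x, s

odom = [1.0, 1.0, 1.0]                 # odometry says x_{i+1} - x_i = 1
loops = [(0, 3, 3.0), (0, 2, 10.0)]    # second loop is grossly wrong
x, s = optimize(odom, loops, n_poses=4)
print(s)  # the bad loop's switch is driven toward 0, the good one stays near 1
```

Note how the decision depends on the relative weighting of odometry and loop residuals, which is why a badly scaled odometry covariance can fool this approach.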
Personally, I still prefer the first approach, as the Vertigo approach doesn't always reject the bad loop closures when the odometry covariance is not set correctly.