I've used RTABmap quite a few times now on my ZenFone AR, and it works great. There's one minor thing, however, that I'm hoping to overcome. When doing larger scans, the phone's memory fills up relatively quickly, so I have to save the database and start a new map. Afterwards I recreate the full map on the desktop with the multi-mapping approach, and that works great too. It does, however, take a few minutes for the phone to save each scan, and with 6 or more scans that adds up to quite a lot of idle time for me.
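In case it helps anyone, the desktop merge step can also be scripted with the `rtabmap-reprocess` tool instead of doing it manually in the GUI. This is just a sketch: the file names are examples, and you should check that your rtabmap version supports the semicolon-separated database list.

```shell
# Merge the per-session databases saved on the phone into one map.
# "scan1.db" etc. are example file names; replace with your own.
rtabmap-reprocess "scan1.db;scan2.db;scan3.db" merged.db

# Inspect the merged result with the database viewer.
rtabmap-databaseViewer merged.db
```

The merged database can then be opened in the desktop application to regenerate the mesh and texture in one go.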
So I bought a Kinect v2 so that I could scan directly on my laptop instead, but then I started wondering whether it's possible to simply stream the input from the ZenFone AR to RTABmap on my laptop over WiFi (or alternatively USB)?
That way I wouldn't need an external power supply like with the Kinect, and I could use all the sensors on the phone while taking advantage of the laptop's greater processing power and memory.
I've looked a little into Tango ROS Streamer, but I was hoping for something that could work directly with the RTABmap desktop application?
Any suggestions that could point me in the right direction are welcome :-)
Indeed, the current example uses Tango ROS Streamer. There is no built-in way to do this with the rtabmap standalone version. In a way, ROS already does the job of transferring/converting the data from Tango to rtabmap.
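Roughly, the ROS route would look like this on the laptop side. Note the topic names below are what I recall Tango ROS Streamer publishing, so double-check them with `rostopic list` while the app is streaming; the launch arguments are from `rtabmap_ros`'s `rtabmap.launch`.

```shell
# Start a ROS master on the laptop (phone on the same WiFi network,
# with Tango ROS Streamer pointed at this master's URI).
roscore &

# Launch rtabmap subscribed to the phone's streams. Since Tango already
# provides odometry, visual odometry on the laptop can be disabled.
# Topic names are assumptions; verify with "rostopic list".
roslaunch rtabmap_ros rtabmap.launch \
    rgb_topic:=/tango/camera/color_1/image_raw \
    camera_info_topic:=/tango/camera/color_1/camera_info \
    subscribe_scan_cloud:=true \
    scan_cloud_topic:=/tango/point_cloud \
    visual_odometry:=false \
    frame_id:=tango
```

The databases produced this way live on the laptop, so there is no per-scan saving delay on the phone.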
A tip for the phone: you can show only the point cloud to save some memory (the mesh and texture won't be created). In the settings, you can also decrease the density of the point clouds to save even more memory on visualization.