A Wi-Fi robot has an RGB-D camera attached (Intel RealSense D435), which publishes RGB-D data on a ROS topic.
A computer (the ROS master) runs RTAB-Map SLAM using the robot's camera data.
I want to view the resulting point cloud on various non-ROS mobile devices over Wi-Fi. What's the best way to do this?
My current idea is to run a Node.js server on the master:
- Publish the point cloud over a throttled WebSocket
- On connect, a device immediately downloads all existing point cloud data (throttled as necessary)
- Each device receives new point cloud data as it is published over the WebSocket
- Each device renders the downloaded point cloud using WebGL
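The throttling step above can be sketched as a small rate gate that forwards at most one frame per interval and always keeps the newest frame when frames arrive faster than that. This is a minimal sketch; the function and parameter names (`makeThrottle`, `intervalMs`) are mine, not from any library:

```javascript
// Sketch: drop intermediate point-cloud frames so each client receives at
// most one update per interval, and always the most recent one.
function makeThrottle(intervalMs, now = Date.now) {
  let lastSent = -Infinity; // timestamp of the last frame we let through
  let pending = null;       // newest frame buffered during the quiet window

  return {
    // Called for every incoming frame. Returns the frame to send now,
    // or null if we are still inside the throttle window.
    offer(frame) {
      const t = now();
      if (t - lastSent >= intervalMs) {
        lastSent = t;
        pending = null;
        return frame;     // window open: send immediately
      }
      pending = frame;    // window closed: remember only the latest frame
      return null;
    },
    // Called when the window elapses (e.g. from setInterval) to flush the
    // newest buffered frame, if any.
    flush() {
      const f = pending;
      pending = null;
      if (f !== null) lastSent = now();
      return f;
    },
  };
}
```

In a real server you would call `offer()` from the ROS subscription callback and send the result over each client's WebSocket, with a periodic timer calling `flush()`.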
How can I make this simpler and less network-intensive?
Note: I've looked at rosbridge as the point cloud transfer gateway, but it seems too unreliable.
From the RTAB-Map viewpoint, you have two choices. You can publish /rtabmap/cloud_map, a sensor_msgs/PointCloud2 topic containing the whole map; this can be heavy to transfer as the map grows. Alternatively, you can use the /rtabmap/mapData topic, which contains only the compressed depth/RGB images of the latest node added to the graph plus the optimized graph poses. This topic is already used by rviz's rtabmap_ros::MapCloud display and by rtabmapviz to reconstruct and update the point cloud on the client side, limiting the bandwidth used.
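The client-side reconstruction idea behind the mapData approach can be sketched roughly as follows: cache one small cloud per graph node, and when new optimized poses arrive, re-transform the cached clouds instead of re-downloading the whole map. This is an illustrative sketch only, with poses simplified to 2-D {x, y, yaw}; the real rtabmap_ros/MapData message carries full 3-D poses and compressed images, and the names here (`addNode`, `assembleMap`) are mine:

```javascript
// Per-node clouds in each node's local frame; each is sent over the
// network once, when the node is added to the graph.
const nodeClouds = new Map();

function addNode(id, localPoints) {
  nodeClouds.set(id, localPoints);
}

// Rebuild the world-frame cloud from the latest optimized poses.
// `poses` maps node id -> {x, y, yaw}; when the graph is re-optimized,
// only this small pose set needs to be re-sent, not the point data.
function assembleMap(poses) {
  const world = [];
  for (const [id, pose] of Object.entries(poses)) {
    const cloud = nodeClouds.get(Number(id));
    if (!cloud) continue;
    const c = Math.cos(pose.yaw), s = Math.sin(pose.yaw);
    for (const p of cloud) {
      // 2-D rigid transform: rotate by yaw, then translate
      world.push({
        x: pose.x + c * p.x - s * p.y,
        y: pose.y + s * p.x + c * p.y,
      });
    }
  }
  return world;
}
```

This is the bandwidth win: point data crosses the network once per node, while graph optimizations only re-send poses, which are tiny by comparison.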