How does the memory behave for very long sessions (e.g., 24 hours or more)? Is there a limit on the long-term memory besides the size of the hard disk? And if so, is it possible to discard parts of the long-term memory to prevent it from overflowing? Thanks for your great work!
Figure 19 illustrates hard drive usage with and
without graph reduction. Extrapolating memory usage
linearly with a 100 GB hard drive, the robot could
navigate online for approximately 110 hours without graph
reduction before filling up the hard drive. When debugging
data (not used for navigation) are not recorded in
the database, this estimate increases to approximately
33 days (800 hours). This means that if the
robot is always visiting new locations at a mean velocity
of 1.4 km/h (as in this experiment), it could travel up
to 1120 km while mapping environments online. When graph
reduction is used, debugging data are not saved, and
with the robot always revisiting the same areas as
in this experiment, it could do SPLAM continuously for
about 130 days before reaching the hard drive capacity.
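As a quick sanity check, the figures above follow from a simple linear extrapolation. The sketch below rederives them, assuming the growth rates implied by Figure 19 (the rates themselves are derived from the quoted numbers, not measured here):

```python
# Back-of-envelope extrapolation of the quoted numbers, assuming
# database growth is linear in time (as the paper's extrapolation does).
DISK_GB = 100.0

def hours_until_full(growth_gb_per_hour, disk_gb=DISK_GB):
    """Hours of online mapping before the hard drive fills up."""
    return disk_gb / growth_gb_per_hour

# Implied growth rates (derived from the quoted 110 h / 800 h figures):
rate_with_debug = DISK_GB / 110.0   # ~0.91 GB/h, no graph reduction
rate_no_debug = DISK_GB / 800.0     # 0.125 GB/h, debug data disabled

hours = hours_until_full(rate_no_debug)
km = 1.4 * hours  # mean velocity from the experiment
print(f"{hours:.0f} h of new terrain -> ~{km:.0f} km mapped online")
```

With the 0.125 GB/h rate this reproduces the 800 hours and 1120 km figures; changing `DISK_GB` shows how the bound scales with disk size.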
Completely deleting data from the long-term memory is still on the to-do list:
However, even by limiting the rate at which the LTM grows, a
continuous SLAM approach in unbounded dynamic environments
will always add new data over time. A complementary
strategy would be to definitively forget some
parts of the global map, at the cost of not being able
to return to some locations.
That process could be done offline, though I am not sure what the criterion would be (only time?) to decide which locations can be deleted and which cannot.
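To make the trade-off concrete, here is a minimal sketch of the simplest such criterion, pruning by age of last relocalization. Everything here is hypothetical (`MapNode`, `prune_ltm`, and the field names are illustrative, not RTAB-Map's actual schema or API):

```python
from dataclasses import dataclass

@dataclass
class MapNode:
    """Hypothetical LTM entry: an id plus the last time the robot
    relocalized against this location (seconds, arbitrary epoch)."""
    node_id: int
    last_seen: float

def prune_ltm(nodes, max_age_s, now):
    """Offline pruning by time alone: drop nodes not seen for longer
    than max_age_s. The cost is exactly the one mentioned above: the
    robot can no longer return to the forgotten locations."""
    return [n for n in nodes if now - n.last_seen <= max_age_s]

# Example: node 1 last seen 100 s ago, node 2 seen 10 s ago.
nodes = [MapNode(1, last_seen=0.0), MapNode(2, last_seen=90.0)]
kept = prune_ltm(nodes, max_age_s=50.0, now=100.0)
print([n.node_id for n in kept])  # only node 2 survives
```

A time-only criterion is clearly crude; in practice one would probably want to combine age with something like visit frequency or graph connectivity so that rarely seen but structurally important locations (e.g., corridor junctions) are kept.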