Having a Pi running locally to monitor internet connectivity is good, but it also comes with the drawback of limited storage, especially when other things are running on it, too.

I am looking for a way to store metrics not only locally but remotely, too. Since the internet monitoring keeps running while the internet connection is broken, the transmission of metrics needs to cope with the connection being down: the metrics should simply be sent later, once the connection is back online.

Since the remote_write functionality discards data after roughly two hours of the remote endpoint being unreachable (it only buffers what is still in the WAL), this is not ideal.
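
For context, the relevant part of the configuration looks roughly like the sketch below. The queue settings only control batching and retry behaviour on top of the WAL, so they cannot stretch buffering past the WAL retention; the endpoint URL and the values are placeholders, not recommendations.

    # prometheus.yml (sketch): remote_write reads from the WAL, so an outage
    # longer than the WAL retention (roughly two hours) still loses samples,
    # no matter how the queue below is tuned.
    remote_write:
      - url: "https://remote-storage.example.com/api/v1/write"   # placeholder endpoint
        queue_config:
          capacity: 10000              # samples buffered per shard, in memory
          max_shards: 10
          max_samples_per_send: 500
          batch_send_deadline: 5s
          min_backoff: 30ms            # retry backoff while the endpoint is unreachable
          max_backoff: 5s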

I was thinking about running a cron job or something similar that syncs the local storage to the remote and then deletes the local copy of the metrics, since I do not need them there.
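
To make that idea concrete, a sketch of such a cron-driven sync is below. The paths, remote host and schedule are assumptions, and Prometheus manages its data directory itself (it does not expect blocks to be removed underneath it), so this is an illustration of the approach rather than a recommended setup.

    #!/usr/bin/env bash
    # Sketch only: sync finished TSDB blocks (the ULID-named directories) to a
    # remote host and delete the local copies once they have arrived. The WAL,
    # head chunks and lock file stay local because Prometheus is still writing to them.
    set -euo pipefail

    DATA_DIR=/var/lib/prometheus                               # assumed local TSDB path
    REMOTE=backup@remote.example.com:/srv/prometheus-blocks    # hypothetical target

    # rsync simply fails and exits while the internet is down; the next cron run
    # picks the blocks up again, so nothing is discarded in the meantime.
    rsync -av --remove-source-files \
          --exclude 'wal/' --exclude 'chunks_head/' \
          --exclude 'lock' --exclude 'queries.active' \
          "$DATA_DIR"/ "$REMOTE"/

    # Example crontab entry (hypothetical path): 0 * * * * /usr/local/bin/sync-prom-blocks.sh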

Maybe Prometheus cannot do that, and I am just missing something?

1 Answer

This is described in the documentation.

I would set up a NAS with some disks and try to configure that as an additional path to store the metrics. Based on your requirements, I feel that remote storage is not an option, because your internet doesn't have 24/7 uptime.
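
Prometheus has a single data directory rather than an extra path, so in practice this would mean pointing --storage.tsdb.path at the NAS-backed mount. A sketch (paths and retention values are placeholders) is below; note that the Prometheus documentation warns that its local storage needs a POSIX-compliant filesystem and that NFS is not supported, so a block-level mount such as iSCSI would be safer than a plain NFS share.

    # Sketch, not a tested setup: keep the TSDB on storage mounted from the NAS.
    # The filesystem behind /mnt/nas must be POSIX-compliant (NFS is not supported),
    # so this assumes a block-level mount such as iSCSI.
    prometheus \
      --config.file=/etc/prometheus/prometheus.yml \
      --storage.tsdb.path=/mnt/nas/prometheus \
      --storage.tsdb.retention.time=90d   # longer local retention once space is less tight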

  • Appreciate the answer, but it does not answer my question. I do not want to buy a whole new NAS and leave it running 24/7 just to collect some data when I already have remote storage available. :)
    – func0der
    Nov 14 at 12:06
  • What maybe could work is some caching or queuing while the internet is not available. Or make sure you have a backup internet connection; in a corporate environment a fallback uplink is quite common.
    – Turdie
    Nov 14 at 13:40
  • That is what the question is about: what would be a strategy to cache this for some time and send it when the internet is available? Sorry if that was not clear.
    – func0der
    Nov 15 at 15:35
