
loading data from an old instance into a new golink db running on fly.io #63

Open
vielmetti opened this issue Feb 17, 2023 · 2 comments

Comments

@vielmetti
Contributor

Situation: I have an existing golink installation, running on a Pi
locally. There are about 200 links there. I also have a brand new
fly.io installation of golink, following the instructions in the README.

I'd like to migrate the data from my old install to the new one.

One approach is simply to recreate the links: take the
go/.export file as input and loop over a curl command that
does a PUT as described in go/.help. I think I can do that
with jq, awk, and sh.

A second would be to log into the fly.io console, upload some
files, then modify the Dockerfile to use the -snapshot approach
to get things started. I'm less clear on all of the details there.
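For the snapshot route, a rough sketch of the Dockerfile change might look like the following. This is a guess at the details, not tested: the binary location, file paths, and the -sqlitedb flag are assumptions; only the -snapshot flag is the one referred to above.

```dockerfile
# Hedged sketch only. First take an export from the old instance
# (curl go/.export > golink.snapshot.json), then bake it into the image
# and point golink at it on first start. Paths are assumptions.
COPY golink.snapshot.json /root/golink.snapshot.json
CMD ["/golink", "-snapshot", "/root/golink.snapshot.json"]
```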

Any other options I'm missing? I'm pretty sure I'll lose the use-count
information in either case. I have only a single user in this
tailnet, so I won't lose track of who created each link.

@vielmetti
Contributor Author

I used the first option here, with a script that looked like this.
In this case go is the old machine and go-1 is the new
system; I'll resolve the name collision next.

curl go/.export > go.export.json
jq -c '[.Short,.Long]' go.export.json | sed -e 's/^\[/curl -d short=/' -e 's/,/ -d long=/' -e 's/\]$/ go-1/' | sh

You'd be well advised to examine the output of the jq|sed pipeline -
or rewrite the whole thing in jq, if you're more fluent in that tool than I am -
before feeding it into sh.
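An all-jq version of the generator might look like this (a sketch only; the go-1 destination and the .Short/.Long field names are taken from the script above, and jq's @sh filter shell-quotes each value, which also avoids the sed pipeline's trouble with commas inside URLs):

```shell
curl go/.export > go.export.json
jq -r '"curl -d short=\(.Short|@sh) -d long=\(.Long|@sh) go-1"' go.export.json
# inspect the generated commands, then re-run with "| sh" appended to execute them
```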

@willnorris
Member

Thanks for sharing that script. The intent is certainly for the -snapshot flag to allow restoring snapshots like this. Though if you're just migrating between instances, copying the sqlite file directly is probably simplest, and it retains stats as well. The snapshot file is more of a disaster-recovery option for when you lose the sqlite file entirely or it somehow gets corrupted. Actually, I think /.export might even predate our use of sqlite, so it made more sense back when we had a directory of json files.

We've never had to restore our database like this, so I've just never gotten around to writing up instructions for it.
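In the absence of written instructions, the sqlite-copy route might go roughly like this. Everything here is an assumption to be adapted: the DB filename, the destination path on the fly.io volume, and the app name are placeholders, not values from this thread.

```shell
# On the Pi: take a consistent copy of the DB, safe even while golink runs.
# "golink.db" is a placeholder; use whatever path you pass to -sqlitedb.
sqlite3 golink.db ".backup /tmp/golink.db"

# Upload it to the fly.io machine via the interactive sftp shell:
#   fly ssh sftp shell -a your-golink-app
#   >> put /tmp/golink.db /root/golink.db
# then restart the app so golink reopens the copied database:
#   fly apps restart your-golink-app
```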
