Full backup
-
@danal said in Full backup:
Until this is implemented, going to the three places for gcode, system, and macros, checking all files, and right-clicking to download a zip will yield fairly complete protection of your invested work.
And Display as well, if you are using a Duet Maestro.
-
Recursive wget (and probably curl, automated browsers, and many download managers) would also work well over HTTP, and almost certainly better, since the protocol is far saner than FTP. All that would be required is some URL serving an HTML page with one link per file, or one page per directory for a whole tree.
Something like the directory-listing view a web server gives you, ideally without sort columns or parent-directory links, so they don't need to be filtered out by the spider.
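To sketch what that would enable: if the firmware served a plain HTML index (the `/files/` path below is hypothetical, not an existing endpoint), a single wget invocation could mirror the whole tree:

```shell
# Hypothetical: assumes an HTML directory index is served at /files/
# -r   follow links recursively
# -np  never ascend to the parent directory (keeps the spider contained)
# -nH  don't create a hostname-named directory locally
wget -r -np -nH http://myprinter.local/files/
```

Leaving sort links and parent-directory links out of the page is exactly what makes `-np` and a bare `-r` sufficient here.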
-
@xtl FYI, I'm using this to backup my printer:
wget -r -nH ftp://username@myprinter.local/{macros,sys,www}
-
A recursive GET followed by a git add/commit would give you a pretty nice rolling history along with backups.
-
@resam Many thanks for that tip. It has helped me a lot.
-
And to toss in one more solution: I wrote a small command line tool in Go to do exactly that. You can find it at https://github.com/wilriker/duetbackup. It works over the existing HTTP interface that is enabled by default (in contrast to FTP).
Right now there are no binary releases, i.e. you'd have to compile it yourself, but on request I can easily provide binaries for Windows, Linux (including Raspberry Pi), and Mac.
-
@wilriker your program works fine. Thank you
-
@wilriker A binary release would be great. I think many would use it.
-
@phaedrux Will do as soon as I get to my PC.
-
Here you go: I started a new thread for my little tool that also has the link to binary releases.
-
@resam A capital -X works better on the wget command line. But it didn't download all the subfolders until I added -l 0.
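Putting the thread's suggestions together, the full command might look like this (hostname and username are placeholders):

```shell
# -r    recursive download
# -l 0  unlimited recursion depth (wget's default limit is 5 levels)
# -nH   don't prefix local paths with the hostname
# note: {macros,sys,www} is expanded by the shell (bash/zsh), not by wget
wget -r -l 0 -nH ftp://username@myprinter.local/{macros,sys,www}
```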