Full backup
-
@dc42 said in Full backup:
@phaedrux said in Full backup:
@doctrucker Right, but packing the files into a zip file and sending it to you has to be done by the Duet.
What happens with the existing backup of /sys and /macros is that DWC downloads each file individually and then zips them up.
Ah I see. In that case, would it be possible to download the entire contents of the card and zip it up in the same way?
-
Possible, it's just a feature request for DWC as far as I can see.
I agree it would be useful!
-
Until this is implemented, going to the three places for gcode, system, and macros, then checking all files and right-clicking to download a zip, will yield fairly complete protection of invested work.
-
And /menu if using a 12864 display.
I'll stick with wget; if it gets an HTTP URL to hit that returns all the files, great, but FTP also works as-is.
-
Also, I just verified on GitHub that DWC is pulling individual files from the Duet and zipping them on the browser (DWC) side, as everyone was saying above. It definitely works this way.
DWC uses a dedicated service call, namely "rr_download?name=", so that it can fetch text or binary files alike and add them to the zip.
Therefore, implementing this is a matter of adding a button somewhere plus code to walk the entire directory tree of the SD card; the body of that code would be identical to what zips gcode or macros today. Absolutely no changes to Duet/RepRap firmware are needed.
This should be easy for Chris, and/or if there is enough interest, I could give it a shot and do it so that Chris can later pull it on Git.
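For reference, that same interface can be exercised by hand. A minimal sketch of fetching a single file from the command line, assuming the printer answers at myprinter.local (hostname and file path are placeholders, and a password-protected printer would need an rr_connect call first):
wget "http://myprinter.local/rr_download?name=/sys/config.g" -O config.g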
-
@danal said in Full backup:
Until this is implemented, going to the three places for gcode, system, and macros, then checking all files and right-clicking to download a zip, will yield fairly complete protection of invested work.
And Display if you are using a Duet Maestro.
-
Recursive wget (and probably curl, automated browsers, and many download managers) would also work great over HTTP, and almost certainly better, since the protocol is far saner than FTP. All that would be required is some URL serving an HTML page with one link per file, or one page per directory for a whole tree.
Something that looks like the listing you get when a web server serves a directory index, ideally without sort columns or parent-directory links so they don't need to be filtered out by the spider.
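A minimal sketch of what that would enable, assuming a hypothetical index page at /files/ (no such page exists in DWC today, and the hostname is a placeholder):
wget -r -np -nH -l 0 http://myprinter.local/files/
Here -np stops wget from following links to parent directories, and -l 0 lifts the default recursion depth of five levels.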
-
@xtl FYI, I'm using this to back up my printer:
wget -r -nH ftp://username@myprinter.local/{macros,sys,www}
(-r recurses into subdirectories, -nH skips creating a hostname directory, and the braces are shell brace expansion covering the three folders.)
-
A recursive GET followed by a git add/commit would give you a pretty nice rolling history along with backups.
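A minimal sketch of that idea, reusing the wget command from above (hostname and directory are placeholders; run git init in the backup directory once beforehand):
cd ~/printer-backup
wget -r -nH ftp://username@myprinter.local/{macros,sys,www}
git add -A
git commit -m "printer backup $(date +%F)"
Run it from cron on a schedule and git log doubles as a change history for /sys and /macros.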
-
@resam Many thanks for that tip. It has helped me a lot.
-
And to toss in one more solution: I wrote a small command-line tool in Go to do exactly that. You can find it at https://github.com/wilriker/duetbackup. It works over the existing HTTP interface, which is enabled by default (in contrast to FTP).
Right now there are no binary releases, i.e. you'd have to compile it yourself, but on request I can very easily provide binaries for Windows, Linux (including Raspberry Pi), and Mac.
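For anyone who wants to try it before binaries appear, a sketch of building from source, assuming a working Go toolchain is installed (check the repository's README for the authoritative steps):
git clone https://github.com/wilriker/duetbackup
cd duetbackup
go build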
-
@wilriker your program works fine. Thank you
-
@wilriker A binary release would be great. I think many would use it.
-
@phaedrux Will do as soon as I get to my PC.
-
Here you go: I started a new thread for my little tool that also has the link to binary releases.
-
@resam A capital -X works better on the wget command line, but it didn't download all the subfolders until I added -l 0.
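For reference, -l 0 removes wget's default recursion depth limit of five levels, so a fuller version of @resam's command would look something like this (I haven't verified the -X change, so this sketch only adds the depth flag):
wget -r -l 0 -nH ftp://username@myprinter.local/{macros,sys,www}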