Full backup
-
@bearer Never used or heard of wget before so thanks for the pointer
Would still be awesome with a backup utility though. Don't know how into networking HW you guys here are, but take the Ubiquiti UniFi/CloudKey setup: the CloudKey backs up the entire SD card contents at a set interval. That way one at least has something to revert to if something goes horribly wrong. Doesn't help much if the SD card itself fails, but it's something, and the backup could be stored in /sys/, which is easily accessible if one wants to fetch it.
-
Add cron (or a scheduled task on Windows) and you have regular backups. Toss git in there and you have revision control, potentially offsite with Bitbucket/GitHub/whatnot.
Might not be a one-click solution, but you only need to set it up once.
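As a sketch of how the scheduling side could look (the hostname and destination directory are placeholders, and this assumes the printer's FTP interface is enabled):

```shell
#!/bin/sh
# duet-backup.sh: mirror the printer's SD folders over FTP.
# "myprinter.local" and the destination directory are placeholders.
DEST="$HOME/duet-backup"
mkdir -p "$DEST" && cd "$DEST" || exit 1
# -q quiet, -r recurse, -nH drop the hostname from the local layout
wget -q -r -nH "ftp://anonymous@myprinter.local/sys/" \
               "ftp://anonymous@myprinter.local/macros/" \
  || echo "mirror failed: printer not reachable"
```

A crontab entry such as `0 3 * * * /path/to/duet-backup.sh` then takes a snapshot every night at 03:00.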
-
Well, you can download the whole /sys folder and each macro folder from the DWC, so that's pretty much all you need for a recovery.
-
@phaedrux said in Full backup:
I'm not sure, but I think part of the limitation with getting the DWC to do heavy file processing is the relatively weak CPU which is not suited to the task, plus the waiting for SD card reads and folder recursion.
I didn't think the Duet was doing much of the legwork with regards to the website. I thought the web page files were served to the browsing computer and then ran on that computer, communicating with the Duet using G-code, FTP, and maybe Telnet?
-
@doctrucker Right, but packing the files into a zip file and sending it to you has to be done by the Duet.
-
@phaedrux said in Full backup:
@doctrucker Right, but packing the files into a zip file and sending it to you has to be done by the Duet.
What happens with the existing backup of /sys and /macros is that DWC downloads each file individually and then zips them up.
-
@dc42 ...and the processing is done on the remote computer rather than the Duet, right?
That's how I understood it anyway. All the Duet is doing is serving individual files to the remote computer.
-
@dc42 said in Full backup:
@phaedrux said in Full backup:
@doctrucker Right, but packing the files into a zip file and sending it to you has to be done by the Duet.
What happens with the existing backup of /sys and /macros is that DWC downloads each file individually and then zips them up.
Ah I see. In that case, would it be possible to download the entire contents of the card and zip it up in the same way?
-
Possible, it's just a feature request for DWC as far as I can see.
I agree it would be useful!
-
Until this is implemented, going to the three places for gcode, system, and macros, then checking all files and right-clicking to download a zip, will yield fairly complete protection of invested work.
-
...and /menu if using a 12864 display.
I'll stick with wget; if it gets an HTTP URL to hit instead that fetches all the files, great, but FTP also works as is.
-
Also, just verified in GitHub that DWC is pulling individual files from the Duet and zipping them on the browser (DWC) side, as everyone was saying above. Definitely works this way.
DWC uses a dedicated service call, namely "rr_download?name=", so that it can fetch text or binary files and add them to the zip.
Therefore, implementing this is a matter of adding a button somewhere and adding code to 'walk' the entire directory tree of the SD card; the body of that code would be identical to what zips gcode or macros today. Absolutely no changes to Duet/RepRap firmware needed.
Easy for Chris, and/or if there is enough commentary, I could give it a shot and do it so that Chris can later pull it on git.
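For illustration, the same rr_download call can be exercised from the command line; "myprinter.local" is a placeholder hostname, and the file path is passed in the name parameter:

```shell
# Fetch one file through the Duet's HTTP interface, using the same
# rr_download call that DWC issues internally (hostname is a placeholder).
name="/sys/config.g"
url="http://myprinter.local/rr_download?name=$name"
curl -s -o config.g "$url" || echo "printer not reachable"
```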
-
@danal said in Full backup:
Until this is implemented, going to the three places for gcode, system, and macros, then checking all files and right-clicking to download a zip, will yield fairly complete protection of invested work.
And Display if you are using a Duet Maestro.
-
Recursive wget (and probably curl, automated browsers, and many download managers) would also work great over HTTP. Almost certainly better, since the protocol is far saner than FTP. All that would be required is some URL serving an HTML page with one link per file, or one page per directory for a whole tree.
Something that looks like the listing you get when opening a directory on a web server, especially if you leave out sort columns and parent-directory links so they don't need to be filtered out by the spider.
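As a sketch of what that would enable, assuming the firmware one day served such an index page (it does not today; the /files/ path and hostname are invented for illustration):

```shell
# Hypothetical mirror over HTTP; assumes an HTML index page per directory.
# -r recurse, -np never ascend past the start URL, -nH drop the hostname
# from the local path, -R discard the saved index pages themselves.
wget -r -np -nH -R "index.html*" "http://myprinter.local/files/" \
  || echo "no such index exists yet; this is what the spider run would look like"
```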
-
@xtl FYI, I'm using this to backup my printer:
wget -r -nH ftp://username@myprinter.local/{macros,sys,www}
-
A recursive GET followed by a git add/commit would give you a pretty nice rolling history along with backups.
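A minimal sketch of that loop, assuming the files have already been mirrored into a local directory (the directory name is a placeholder):

```shell
# Turn the mirrored SD contents into a rolling history with git.
DEST="$HOME/duet-backup"
mkdir -p "$DEST" && cd "$DEST" || exit 1
[ -d .git ] || git init -q          # initialise the repository on first run
git add -A
# Commit only when something actually changed; the message carries a timestamp.
git diff --cached --quiet || git commit -q -m "Duet snapshot $(date +%F_%H%M)"
```

Any earlier revision of a file is then one `git checkout <commit> -- sys/config.g` away, and `git log --oneline -- sys/config.g` shows when it changed.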
-
@resam Many thanks for that tip. It has helped me a lot.
-
And to toss in one more solution: I wrote a small command line tool in Go to do exactly that. You can find it at https://github.com/wilriker/duetbackup. It works over the existing HTTP interface, which is enabled by default (in contrast to FTP).
Right now there are no binary releases, i.e. you'd have to compile it yourself, but on request I can easily provide binaries for Windows, Linux (including Raspberry Pi), and Mac.
-
@wilriker your program works fine. Thank you
-
@wilriker A binary release would be great. I think many would use it.