Greetings!
I'd like to share the approach I use:
- I have a laptop with Linux, another Linux machine in the office, a Windows XP machine, and a server
- I have several folders to sync: some are mine that I want to share (i.e. I'm the master), others are managed by employees, so I need them always fresh on my PC (i.e. I'm a slave)
- the main point is that I want the scheme to keep working even over a very slow or temporarily missing internet connection
- so I developed a bash script that performs what I call a multi-node sync between the different places in the office. The script can be configured as to which node is the master, and it even copies a file under another name when a conflict is detected
- the script uses several runs of rsync and keeps a history to resolve conflicts. Windows can also run rsyncd (tested on XP; judging by what I found on Google, it should work on Windows 10 too)
- crontab is used to run the script from time to time
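To give an idea of the core step, here is a minimal sketch of the master/slave direction logic, assuming one remote per run. The variable names, host, and paths are my illustration, not the actual script; conflict handling here leans on rsync's --backup/--suffix, which keeps a renamed copy of a file instead of silently overwriting it. For clarity the sketch only prints the rsync command it would run:

```shell
#!/usr/bin/env bash
# Sketch of one master/slave sync step (illustrative names, not the real script).

ROLE="${1:-master}"          # "master" pushes, "slave" pulls
REMOTE="office:/srv/shared"  # hypothetical remote as host:path
LOCAL_DIR="$HOME/shared/"

build_cmd() {
  if [ "$ROLE" = master ]; then
    # Master: push local changes out; --delete mirrors removals.
    echo "rsync -az --delete $LOCAL_DIR $REMOTE/"
  else
    # Slave: pull, keeping a renamed copy of anything that would be
    # overwritten, so a conflict is never lost.
    echo "rsync -az --backup --suffix=.conflict $REMOTE/ $LOCAL_DIR"
  fi
}

# The real script executes the command and consults a history file to decide
# which side wins; here we only print it.
build_cmd
```

A crontab entry like `*/15 * * * * /path/to/multisync.sh slave` would then run the sync every 15 minutes.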
So what I get:
- 2GB in total are synced within seconds even on the slowest connection, thanks to rsync's ability to copy only the parts of files that differ
- due to the master-slave configuration I'm even able to sync MS Access databases (I run them on a virtual PC)
- combined with encrypted folders, everything synced stays encrypted, so I even keep a copy of the data in Dropbox without worries
- my Android phone also has sync and encryption tools, so I have access to my files there too
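For the Windows side mentioned above, running rsync as a daemon only needs a small rsyncd.conf (Windows ports such as cwRsync ship one). This is a minimal sketch under my own assumptions; the module name, path, and user are illustrative, not what the office setup actually uses:

```ini
# Minimal rsyncd.conf sketch (names and paths are illustrative)
use chroot = false

[docs]
    path = /cygdrive/c/shared
    read only = false
    auth users = sync
    secrets file = /etc/rsyncd.secrets
```

With a module like this, the Linux side can address the share as `rsync://sync@winbox/docs`.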
Feel free to contact me, and we'll see whether some of these ideas can be adapted for you.
Regards
Sergey