I want to back up my public folder and its subdirectories and files.
I tried * */1 * * * tar -cvzf filename.tar.gz /public
and it creates the file, but it only ever reaches 45 bytes.
Something seems to be stopping the process. How can I do this?
I don't want to download the whole website through FTP, which would take a long time.
I'd rather compress it and download it to a dedicated server I have.
Trying to compress my public folder to a file
Re: Trying to compress my public folder to a file
The process is likely being stopped when it hits the maximum execution time.
The best way to back up files remotely is to use rsync. It supports compression and you can restart it where you left off if it dies.
This would sync a remote folder to a local one for you, and you could re-run it to re-sync.
Code:
rsync -avHz --delete remote-machine:/usr/www/your-username/public/ local-folder/