
Help Split Backup into Parts


16 comments

  • cPRex Jurassic Moderator

Hey there!  The main question is whether the 900G backup is all for one large account, or if it's spread across many accounts.  Once I know that, I may be able to recommend something.

  • Icaro Nadson

It's a big account:

889G Apr 27 19:16 xxxx.tar.gz  <- this is the big account
    552M Apr 26 02:08 xxxx.tar.gz
    432K Apr 26 02:09 xxxx.tar.gz
    320K Apr 26 02:08 xxxx.tar.gz
    323M Apr 26 02:05 xxxx.tar.gz
    1.9M Apr 26 02:02 xxxx.tar.gz
     63M Apr 26 02:06 xxxx.tar.gz
    9.5G Apr 28 01:54 xxxx.tar.gz
    550M Apr 26 02:02 xxxx.tar.gz
     75M Apr 26 02:09 xxxx.tar.gz

I can handle the other accounts more easily.

  • cPRex Jurassic Moderator

    Thanks for the additional details.

    I don't have any tools in cPanel that will allow you to split an individual account, although that sounds like an excellent feature request if you'd like to submit one over at features.cpanel.net.  If not, I'm happy to submit one for you.

    Would it help to exclude that account from the automated backups and then use /scripts/pkgacct to allow that to run separately? 

  • Icaro Nadson

    I thought of it this way:

cd /home/user
tar -cf - mail/domain.example.com/ | split -b 450G - backup_part_

This way I avoid doubling the disk space needed for the split: storing the entire .tar.gz first and then dividing it would require almost twice the space.

The system's automatic backups would still handle the accounts, databases, etc., and I would compress the large files separately.

    That's the idea, but in practice I don't know what I should do.
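For the record, restoring a split tar stream is just `cat` over the parts in name order, piped back into `tar`. A minimal roundtrip sketch with throwaway data (the 450G part size and the mail path come from the command above; every path below is made up for illustration):

```shell
# Dummy stand-ins for /home/user/mail/domain.example.com and the backup target.
SRC=$(mktemp -d); DEST=$(mktemp -d); RESTORE=$(mktemp -d)
echo "message body" > "$SRC/inbox"

# Stream tar straight into split: no full-size intermediate archive on disk.
# (Use -b 450G for the real data; 1M keeps this demo small.)
tar -cf - -C "$SRC" . | split -b 1M - "$DEST/backup_part_"

# Restore: concatenate the parts in lexical order and extract.
cat "$DEST"/backup_part_* | tar -xf - -C "$RESTORE"
```

Because `split` names the parts with lexically increasing suffixes (`aa`, `ab`, ...), the shell glob already yields them in the correct order for `cat`.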

  • cPRex Jurassic Moderator

    That would work if you still used the automated system for the account and email data like you said, especially if mail is the biggest disk usage on the account.

  • Icaro Nadson

I understand.

Analyzing further, there would be another problem: tar with splitting would work well, but it would lose the file compression. From what I've researched, I can't use gz under these conditions, and my total data is 1.4TB :(

  • cPRex Jurassic Moderator

    What if you used an incremental backup instead?  The first backup would take a considerable amount of time to run, but subsequent backups would be much faster as it only deals with files that are new or changed.
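(For what it's worth, GNU tar supports incremental dumps natively via `--listed-incremental`. A toy sketch; all paths here are invented, and the real data would live under /home/user:)

```shell
# Toy sketch of GNU tar's incremental mode (--listed-incremental).
SRC=$(mktemp -d); BK=$(mktemp -d)
echo "old mail" > "$SRC/a.txt"

# Level 0: full backup; tar records file state in the snapshot file.
tar --listed-incremental="$BK/state.snar" -czf "$BK/full.tar.gz" -C "$SRC" .

# Later runs reuse the snapshot, so only new or changed files are archived.
echo "new mail" > "$SRC/b.txt"
tar --listed-incremental="$BK/state.snar" -czf "$BK/incr.tar.gz" -C "$SRC" .
```

The second archive stays small because unchanged files are skipped; restoring means extracting the full backup first, then each incremental in order.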

  • Icaro Nadson

There's actually another way. I haven't tested it yet, but it would be like this:

    tar -cf - /home/user/mail/domain.example.com | gzip | split -b 450G - backup_tar_gz_part_

I would disable WHM's automatic backup for this account and use a script to call /scripts/pkgacct to back up just the account without the files, then run the command above to back up the files themselves. I think this approach solves it.
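Assuming GNU tools, that pipeline should work, and the restore is simply the reverse pipeline (cat, then gunzip, then tar). A small roundtrip with dummy data standing in for the real mail directory:

```shell
# Dummy stand-ins for the real mail directory and backup target.
SRC=$(mktemp -d); DEST=$(mktemp -d); RESTORE=$(mktemp -d)
echo "message body" > "$SRC/inbox"

# Compress inside the pipeline so the split parts are gzip data; nothing
# uncompressed or unsplit ever hits the disk. (Use -b 450G for the real run.)
tar -cf - -C "$SRC" . | gzip | split -b 1M - "$DEST/backup_tar_gz_part_"

# Restore is the reverse pipeline: reassemble, decompress, extract.
cat "$DEST"/backup_tar_gz_part_* | gunzip | tar -xf - -C "$RESTORE"
```

One caveat: gzip compresses the whole stream, so restoring even a single file means reading the parts from the beginning up to that file.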

  • Icaro Nadson

    I only need one full backup per week.

  • cPRex Jurassic Moderator

    I think your last post is likely the best solution that also maintains the compression.

  • Icaro Nadson

How do I back up the account without the files?

  • cPRex Jurassic Moderator

You can run /scripts/pkgacct with the --skiphomedir option - that will create the necessary archive to restore the cPanel data (userdata, databases, etc.) but will skip everything in the user's home directory, including email.

    Details on everything you can do with pkgacct can be found here: https://docs.cpanel.net/whm/scripts/the-pkgacct-script/

  • Icaro Nadson

    Do I back up the entire /home/user folder?

  • cPRex Jurassic Moderator

Yes - if you use --skiphomedir, you would then do a manual backup of all of /home/username.

  • Icaro Nadson

    Thanks again

  • cPRex Jurassic Moderator

    You're welcome!

