Detailed method for automatic backup and deletion of website data under Linux

To keep your data safe, you should back up your website regularly and remove old backups, so that an intrusion or accident does not lead to permanent data loss. This article explains how to schedule automatic backup and cleanup of website data under Linux.

The requirement is as follows: the website files and database should be backed up automatically every day, and backups older than a certain age should be removed from the backup directory, for example keeping only the last 14 days of backups.

My VPS happens to run RedHat, which does not have the crond service installed by default. Install it with the following command:

yum install cronie
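
If crond is not already running after installation, you will also want to start it and enable it at boot. The two commands below are the usual way to do this on RedHat/CentOS systems with SysV init; on systemd-based releases you would use systemctl instead, so treat this as a sketch for that setup:

service crond start # start the cron daemon now

chkconfig crond on # have it start automatically at boot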

Create backup script

vi /root/bakweb.sh

Edit and enter the following

#!/bin/bash

find /home/bak/ -name '*' -type f -mtime +14 -exec rm {} \;

tar zcvf /home/bak/www.penglei.name_$(date +%F).tar.gz /var/www/html

mysqldump -u root --password=PASSWORD DBNAME > /home/bak/sql.penglei.name_$(date +%F).sql

exit 0

Type :wq to save and exit. This script uses /home/bak as the website backup directory and works in three steps.

The first step deletes files under /home/bak that are older than 14 days.

The second step backs up the website directory /var/www/html into an archive named www.penglei.name_ plus the date.

The third step exports the database into a file named sql.penglei.name_ plus the date; replace PASSWORD and DBNAME with your own MySQL root password and database name.
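
Before handing the script over to cron, it is worth testing it once by hand. The commands below are a minimal check under the same assumptions as the script above (backup directory /home/bak, script at /root/bakweb.sh): they preview which files the cleanup would delete, run the script, and list the resulting backups:

chmod +x /root/bakweb.sh

find /home/bak/ -type f -mtime +14 -print # dry run: print what the cleanup would remove, without deleting anything

/root/bakweb.sh

ls -lh /home/bak/ # the new .tar.gz and .sql files should appear here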

Finally, create a cron entry so that the script runs automatically at 5 a.m. every day. Create the cron file:

vi /etc/cron.d/bakweb

Edit and enter the following:

0 5 * * * root /root/bakweb.sh
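
After the first scheduled run, you can check that the job actually fired. The commands below assume a RedHat-style system where cron logs to /var/log/cron; adjust the log path to your distribution:

grep bakweb /var/log/cron # each invocation of the job is logged here

ls -lt /home/bak/ | head # the newest backup files should be listed first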

That concludes the introduction to automatically backing up and deleting website data on a Linux system. For security reasons, website backups should not be kept indefinitely; of course, you can also back up and delete website data manually.
