MySQL Enterprise offers extra features such as incremental backups, but at a cost. However, there is a way to achieve incremental backups in the free MySQL Community edition: binary logs. MySQL binary log files store executed SQL statements, transaction details, and timestamps, enabling tasks such as replication and point-in-time recovery.
Binary logs are disabled by default, so let's enable them first. Open the MySQL config file with this command.
sudo nano /etc/mysql/mysql.conf.d/mysqld.cnf
Then uncomment the log_bin line.
log_bin = /var/log/mysql/mysql-bin.log
And restart MySQL.
sudo systemctl restart mysql
Now you can find the binary log files in the /var/log/mysql directory. From this point on, all MySQL operations will be recorded in the binary log files.
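To confirm that binary logging is actually on, you can ask MySQL directly; this is just a quick sanity check, using the same placeholder root credentials as the rest of this post:

```shell
# should print: log_bin  ON
mysql --user=root --password='123456' -e "SHOW VARIABLES LIKE 'log_bin';"
```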
You also need two packages: zip and s3cmd. zip is used to compress the MySQL binary log files before sending them to S3, and s3cmd is a command-line tool for creating S3 buckets and uploading, retrieving, and managing data. Let's install these two packages.
sudo apt install zip
sudo apt install s3cmd
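Before s3cmd can talk to your bucket it needs your credentials; its built-in interactive wizard asks for them and writes the result to ~/.s3cfg:

```shell
# prompts for access key, secret key, region, and encryption settings
s3cmd --configure
```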
In summary, here is the plan: first flush the binary logs, which makes MySQL start writing to a new binary log file. Then collect all the binary log files except the last one (the one that was just created), zip them, and send the archive to S3.
The final shell script looks like this.
#path to the directory with the binary log files
binlogs_path=/var/log/mysql/
#path to the backup storage directory
backup_folder=/var/backups/mysql
#start writing to a new binary log file
mysql --user=root --password='123456' -e 'FLUSH BINARY LOGS;'
#get the list of binary log files
binlogs=$(mysql --user=root --password='123456' -e 'SHOW BINARY LOGS;' | grep 'mysql-bin.' | sed 's/\s.*$//')
#files to back up: all but the last (active) one
binlogs_without_Last=$(echo "${binlogs}" | head -n -1)
#the last (active) binary log file
binlog_Last=$(echo "${binlogs}" | tail -n 1)
#build the full paths to the binary log files
binlogs_fullPath=$(echo "${binlogs_without_Last}" | xargs -I % echo "${binlogs_path}%")
#compress the binary logs into a password-protected, dated archive
zip -P "hello" "$backup_folder/$(date +%Y-%m-%d).zip" $binlogs_fullPath
#upload the archive to S3
s3cmd put "$backup_folder/$(date +%Y-%m-%d).zip" "s3://test/incremental-backup/$(date '+%Y/%m/%d/')" --no-encrypt
#purge the backed-up binary logs (everything before the active one)
echo $binlog_Last | xargs -I % mysql --user=root --password='123456' -e 'PURGE BINARY LOGS TO "%"'
Then schedule this script with crontab. Open your crontab with this command.
crontab -e
#run the backup daily at 2 AM
0 2 * * * /bin/bash /root/incremental-backup.sh
This entry runs the script every day at 2 AM.
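Cron discards the script's output by default; if you want a log of each run (the log file path here is just an example), redirect stdout and stderr in the crontab entry:

```shell
0 2 * * * /bin/bash /root/incremental-backup.sh >> /var/log/incremental-backup.log 2>&1
```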
Now let's dive into the details and break down the code from the script above. First, create a new .sh file. .sh files, also known as shell scripts, are used to compose and execute Linux commands.
nano incremental-backup.sh
And define the paths of the binary logs and of the storage directory where the zipped binary log files will go.
binlogs_path=/var/log/mysql/
backup_folder=/var/backups/mysql
Next, flush the logs, which makes MySQL create a new log file; all MySQL operations after the flush will go into that new file.
mysql --user=root --password='123456' -e 'FLUSH BINARY LOGS;'
And get the list of all binary log files.
binlogs=$(mysql --user=root --password='123456' -e 'SHOW BINARY LOGS;' | grep 'mysql-bin.' | sed 's/\s.*$//')
The grep command filters for lines that contain "mysql-bin." in the name, and the sed command strips everything from the first whitespace character to the end of the line, leaving only the file names. The resulting list looks like this:
mysql-bin.000001
mysql-bin.000002
mysql-bin.000003
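You can try the filtering pipeline on its own with simulated SHOW BINARY LOGS output (the file names and sizes below are made up):

```shell
# simulated tab-separated `SHOW BINARY LOGS` output
# prints mysql-bin.000001 and mysql-bin.000002, one per line
printf 'Log_name\tFile_size\nmysql-bin.000001\t154\nmysql-bin.000002\t201\n' \
  | grep 'mysql-bin.' \
  | sed 's/\s.*$//'
```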
Next, split the list: all binary log files except the last one are what we back up, while the last one is the file MySQL is currently writing to.
binlogs_without_Last=$(echo "${binlogs}" | head -n -1)
binlog_Last=$(echo "${binlogs}" | tail -n 1)
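The head/tail split is easy to check on a hypothetical three-file list; note that head -n -1 (all lines but the last) is a GNU coreutils feature:

```shell
# a hypothetical list of binary log files, one per line
binlogs='mysql-bin.000001
mysql-bin.000002
mysql-bin.000003'
# all but the last line: the files that are safe to back up
echo "${binlogs}" | head -n -1
# the last line: the file MySQL is still writing to
echo "${binlogs}" | tail -n 1
```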
Then build the complete path for each binary log file, like /var/log/mysql/mysql-bin.000001.
binlogs_fullPath=$(echo "${binlogs_without_Last}" | xargs -I % echo "${binlogs_path}%")
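The xargs trick simply prefixes each file name with the directory path, as this small stand-alone demo shows:

```shell
binlogs_path=/var/log/mysql/
# prefix each file name with the directory path, one per line
printf 'mysql-bin.000001\nmysql-bin.000002\n' | xargs -I % echo "${binlogs_path}%"
```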
Next, compress the selected binary log files into a dated archive in the backup folder, protected with the password "hello".
zip -P "hello" "$backup_folder/$(date +%Y-%m-%d).zip" $binlogs_fullPath
Then send that compressed file to S3.
s3cmd put "$backup_folder/$(date +%Y-%m-%d).zip" "s3://test/incremental-backup/$(date '+%Y/%m/%d/')" --no-encrypt
s3cmd put: uploads a file to an S3 bucket. Before using this command, you need to configure s3cmd for your S3 server. Follow the instructions in this link to configure it. Don't worry, it is pretty easy!
"s3://test/incremental-backup/$(date '+%Y/%m/%d/')": the destination path within the S3 bucket where the file is uploaded. Here's how this part works:
- s3://test/incremental-backup/ is the base path in the S3 bucket where the backup goes.
- date '+%Y/%m/%d/' inserts the current date in the format YYYY/MM/DD/. For example, if the current date is August 12, 2023, this part expands to 2023/08/12/.
- The resulting destination path will be something like s3://test/incremental-backup/2023/08/12/, a date-based directory in the S3 bucket where the backup is stored.
--no-encrypt: specifies that the uploaded file should not be encrypted by s3cmd.
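You can preview the date-based path fragment on its own:

```shell
# prints something like 2023/08/12/ depending on the current date
date '+%Y/%m/%d/'
```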
Finally, purge the binary log files that have just been backed up. PURGE BINARY LOGS TO deletes every log file older than the named one, so the active file itself is kept.
echo $binlog_Last | xargs -I % mysql --user=root --password='123456' -e 'PURGE BINARY LOGS TO "%"'
You can restore the data with the command below; you can also explore further details in this blog.
mysqlbinlog /var/log/mysql/mysql-bin.000001 | mysql --user=root --password='123456'
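A fuller restore, starting from the S3 archive, might look like the sketch below; the bucket, date, and password match the examples above, and note that zip stores absolute paths without the leading slash, so unzip recreates them relative to the current directory:

```shell
# download the dated archive from S3 (path matches the upload step above)
s3cmd get s3://test/incremental-backup/2023/08/12/2023-08-12.zip
# extract with the password that was passed to zip -P
unzip -P hello 2023-08-12.zip
# replay the binary logs in order, in a single mysql session
mysqlbinlog var/log/mysql/mysql-bin.000001 var/log/mysql/mysql-bin.000002 \
  | mysql --user=root --password='123456'
```

Passing all the log files to one mysqlbinlog invocation matters when a transaction spans two log files.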
That is all…