Splitting large MySQL dump files


One of the frustrating things about working with MySQL is importing giant SQL dump files.
You either get a “max execution time exceeded” error from PHP or a “max_allowed_packet” error from MySQL.

The goal is to split the large file of “INSERT” statements into smaller files.
Each statement has to be on a single line, as shown below:

INSERT INTO `dt_codes` VALUES ...
INSERT INTO `dt_codes` VALUES ...
INSERT INTO `dt_codes` VALUES ...
...

To create the dump file, we can use the following command (or phpMyAdmin):

mysqldump -uUSER -pPASS --databases DATABASE_NAME --tables TABLE_NAME \
--extended-insert=FALSE > dump.sql

Once you have the big dump file, you can use the following PHP script to split it into smaller chunks.

<?php
 
set_time_limit(600);
/* Handle old Mac-style line endings (note: deprecated as of PHP 8.1) */
ini_set("auto_detect_line_endings", true);
 
/* Number of 'insert' statements per file */
$max_lines_per_split = 50000;
 
$dump_file = "dump.sql";
$split_file = "dump-split-%d.sql";
$dump_directory = "./sql-dump/";
 
/* Make sure the output directory exists */
if (!is_dir($dump_directory)) {
    mkdir($dump_directory, 0777, true);
}
 
$line_count = 0;
$file_count = 1;
$total_lines = 0;
 
$handle = fopen($dump_file, "r");
$buffer = "";
 
if ($handle) {
    while(($line = fgets($handle)) !== false) {
        /* Only read 'insert' statements */
        if(!preg_match("/^insert/i", $line)) continue;
        $buffer .= $line;
        $line_count++;
 
        /* Copy buffer to the split file */
        if($line_count >= $max_lines_per_split) {
            $file_name = $dump_directory . sprintf($split_file, $file_count);
            $out_write = fopen($file_name, "w");
            fwrite($out_write, $buffer);
            fclose($out_write);
            $buffer = '';
            $line_count = 0;
            $file_count++;
        }
    }
 
    if($buffer) {
        /* Write out the remaining buffer */
        $file_name = $dump_directory . sprintf($split_file, $file_count);
        $out_write = fopen($file_name, "w");
        fwrite($out_write, $buffer);
        fclose($out_write);
    }
 
    fclose($handle);
    echo "done.";
} else {
    echo "Could not open $dump_file.";
}
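If you do have shell access on the server, the same line-based split can be done with GNU coreutils' split instead of PHP. A minimal sketch, assuming GNU coreutils (the three-line demo file below is just a stand-in for the real dump):

```shell
# Sketch, assuming GNU coreutils: -l sets lines per chunk, -d gives
# numeric suffixes, --additional-suffix keeps the .sql extension.
printf 'INSERT 1;\nINSERT 2;\nINSERT 3;\n' > dump-demo.sql
split -l 2 -d --additional-suffix=.sql dump-demo.sql dump-split-demo-
ls dump-split-demo-*.sql
```

For the real dump, `split -l 50000 -d --additional-suffix=.sql dump.sql dump-split-` would match the 50000-line setting used in the PHP script.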

Then gzip the files to reduce their size:

gzip dump-split-*
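When importing the chunks back, note that a plain shell glob sorts lexicographically, so once you pass ten files, dump-split-10.sql would load before dump-split-2.sql. One way around that is GNU sort's version sort (`sort -V`); a sketch, where USER, PASS, and DATABASE_NAME are placeholders:

```shell
# Sketch: stream each compressed chunk into mysql in numeric order.
# sort -V keeps dump-split-10 after dump-split-2. The "echo" only
# prints the commands; remove it to actually run the imports.
for f in $(ls dump-split-*.sql.gz 2>/dev/null | sort -V); do
    echo "zcat $f | mysql -uUSER -pPASS DATABASE_NAME"
done
```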
