
Efficiently Inserting 10 Million Records into MySQL with PHP

This tutorial explains how to use PHP and PDO to quickly insert ten million rows into a remote MySQL database by preparing the connection, employing batch inserts, managing transactions, tuning MySQL settings, and optionally using the LOAD DATA statement for optimal performance.


In development, inserting massive amounts of data into a database can cause performance issues, especially when the database is remote. This article explores how to use PHP to quickly insert 10 million records into a MySQL database and shares optimization methods for best performance.

1. Database connection and preparation

Before starting, ensure your system meets the following requirements:

PHP and MySQL are installed

Database connection is configured

Using the PDO extension to connect to the database is a good choice because it provides a high‑performance and secure database access interface.

Below is an example of using PDO to connect to a MySQL database:

<code>// Database connection configuration
$host = 'localhost';
$dbname = 'your_database';
$user = 'your_username';
$password = 'your_password';

// Connect to the database
try {
    $pdo = new PDO("mysql:host=$host;dbname=$dbname", $user, $password);
} catch (PDOException $e) {
    die("Error: " . $e->getMessage());
}
</code>
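Two connection-time settings are worth considering before a long bulk load. This is a sketch (the function name connectForBulkLoad is an illustrative assumption): exception error mode ensures a failed insert cannot pass silently, and a charset in the DSN avoids a separate SET NAMES round trip.

```php
<?php
// Illustrative helper (the name is an assumption, not from the original):
// returns a PDO connection configured for a long bulk load.
function connectForBulkLoad(string $host, string $dbname, string $user, string $password): PDO
{
    return new PDO(
        // charset in the DSN avoids a separate SET NAMES round trip
        "mysql:host=$host;dbname=$dbname;charset=utf8mb4",
        $user,
        $password,
        // surface SQL errors as exceptions so a failed insert is never missed
        [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
    );
}
```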

2. Batch insert

Inserting records one at a time incurs per-query overhead, and for a remote database a network round trip per row. Batch insertion sends many rows in a single INSERT statement, which significantly improves throughput, while prepared statements with bound parameters keep the values safely escaped.

Below is an example of batch insertion with PDO:

<code>// Data array
$data = [
    ['column1' => 'value1', 'column2' => 'value2'],
    ['column1' => 'value3', 'column2' => 'value4'],
];

// Build one multi-row statement: INSERT ... VALUES (?, ?), (?, ?), ...
$placeholders = implode(', ', array_fill(0, count($data), '(?, ?)'));
$sql = "INSERT INTO your_table (column1, column2) VALUES $placeholders";

// Flatten the rows into a single flat parameter list
$params = [];
foreach ($data as $row) {
    $params[] = $row['column1'];
    $params[] = $row['column2'];
}

// Prepare and execute once for all rows
$stmt = $pdo->prepare($sql);
$stmt->execute($params);
</code>
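A two-row example does not stretch to 10 million rows: one flat parameter list that large would blow past max_allowed_packet and practical placeholder limits. The usual pattern is to insert in chunks; the helper names and the 1,000-row default below are illustrative assumptions, not fixed values.

```php
<?php
// Build "INSERT INTO t (a, b) VALUES (?, ?), (?, ?), ..." for $rowCount rows.
function buildBatchInsertSql(string $table, array $columns, int $rowCount): string
{
    $rowPlaceholder = '(' . implode(', ', array_fill(0, count($columns), '?')) . ')';
    $placeholders = implode(', ', array_fill(0, $rowCount, $rowPlaceholder));
    return "INSERT INTO $table (" . implode(', ', $columns) . ") VALUES $placeholders";
}

// Insert $rows (positional value arrays) in chunks; needs a live PDO connection.
function insertInChunks(PDO $pdo, string $table, array $columns, array $rows, int $chunkSize = 1000): void
{
    foreach (array_chunk($rows, $chunkSize) as $chunk) {
        $stmt = $pdo->prepare(buildBatchInsertSql($table, $columns, count($chunk)));
        // Flatten the chunk into one flat parameter list for execute().
        $stmt->execute(array_merge(...array_map('array_values', $chunk)));
    }
}
```

The final chunk is usually smaller than $chunkSize, which is why the SQL is rebuilt per chunk; caching one prepared statement per distinct chunk size would avoid the repeated prepare.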

3. Transaction management

A transaction groups a series of SQL statements so that they either all take effect or none do, so a failure mid-load never leaves a partial insert behind. For large-scale inserts, wrapping many rows in one transaction also markedly improves performance: InnoDB flushes to disk at commit time rather than after every row, reducing I/O overhead and lock contention.

Below is an example of transaction management with PDO:

<code>try {
    // Start the transaction
    $pdo->beginTransaction();

    // Run the inserts (one per-row prepared statement, reused for every row)
    $stmt = $pdo->prepare("INSERT INTO your_table (column1, column2) VALUES (?, ?)");
    foreach ($data as $row) {
        $stmt->execute([$row['column1'], $row['column2']]);
    }

    // Commit the transaction
    $pdo->commit();
} catch (PDOException $e) {
    // Roll back the transaction on any failure
    $pdo->rollBack();
    die("Error: " . $e->getMessage());
}
</code>
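Committing once per row is slow, while a single transaction around all 10 million rows bloats the undo log and makes a rollback very expensive. A common middle ground is to commit every N rows; the helper name and the 10,000-row default here are illustrative assumptions.

```php
<?php
// Sketch: commit in batches of $perCommit rows. Needs a live PDO connection
// to actually run; the table and column names match the examples above.
function insertWithBatchedCommits(PDO $pdo, array $rows, int $perCommit = 10000): void
{
    $stmt = $pdo->prepare("INSERT INTO your_table (column1, column2) VALUES (?, ?)");
    foreach (array_chunk($rows, $perCommit) as $batch) {
        $pdo->beginTransaction();
        foreach ($batch as $row) {
            $stmt->execute([$row['column1'], $row['column2']]);
        }
        // One disk flush per batch instead of one per row.
        $pdo->commit();
    }
}
```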

4. Adjust database configuration

Adjusting MySQL's configuration to match the workload can further boost insert performance. For example, increase max_allowed_packet so large multi-row statements fit in a single packet, or raise innodb_buffer_pool_size (within the memory you can spare) to improve InnoDB performance.

<code># In the MySQL configuration file (my.cnf / my.ini)
max_allowed_packet = 256M
innodb_buffer_pool_size = 512M
</code>
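Beyond the server configuration file, a few session-level switches (standard MySQL system variables) can be toggled for the duration of a trusted bulk load; only do this when the incoming data is already known to be valid, since it skips uniqueness and foreign-key verification. The helper names are illustrative assumptions.

```php
<?php
// Skip redundant checks during a trusted bulk load (sketch; needs a live PDO).
function disableBulkChecks(PDO $pdo): void
{
    $pdo->exec("SET unique_checks = 0");
    $pdo->exec("SET foreign_key_checks = 0");
}

// Re-enable the checks once the load has finished.
function enableBulkChecks(PDO $pdo): void
{
    $pdo->exec("SET foreign_key_checks = 1");
    $pdo->exec("SET unique_checks = 1");
}
```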

5. Use LOAD DATA statement for fast data import

If the data comes from a file, you can use MySQL's LOAD DATA statement for rapid import: it loads an entire file into a table in one operation, which is usually much faster than inserting rows one by one. Note that the LOCAL variant must be allowed on both ends: the server's local_infile setting, and the PDO::MYSQL_ATTR_LOCAL_INFILE option when the PDO connection is created.

Below is an example of using LOAD DATA for data import:

<code>// Requires local_infile enabled on the server and
// PDO::MYSQL_ATTR_LOCAL_INFILE => true in the PDO constructor options
$sql = "LOAD DATA LOCAL INFILE 'path/to/your/file.csv' INTO TABLE your_table FIELDS TERMINATED BY ','";

$pdo->exec($sql);
</code>
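When the 10 million rows originate in PHP rather than in an existing file, it can still pay to write them out as CSV first and then load the file in one shot. A minimal sketch; rowsToCsv is an assumed helper name.

```php
<?php
// Write rows (positional value arrays) to a temporary CSV file and return
// its path, ready to be passed to LOAD DATA LOCAL INFILE.
function rowsToCsv(array $rows): string
{
    $path = tempnam(sys_get_temp_dir(), 'bulk_');
    $fh = fopen($path, 'w');
    foreach ($rows as $row) {
        fputcsv($fh, $row);
    }
    fclose($fh);
    return $path;
}
```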

With proper connection preparation, batched inserts, transaction management, and tuned MySQL configuration, you can insert massive amounts of data into MySQL efficiently from PHP, improving performance and reducing resource consumption so your application handles large data volumes gracefully.

Tags: Performance, Transaction, MySQL, PHP, Bulk Insert, PDO
Written by php中文网 Courses

php中文网's platform for the latest courses and technical articles, helping PHP learners advance quickly.
