Supercharge your Laravel apps with Cloudflare's edge database
- Deploy your database to Cloudflare's global edge network
- 20x faster bulk inserts with automatic raw SQL optimization
- One-command MySQL to D1 migration
- Zero-config Eloquent ORM support: it just works™
- Drop-in Replacement: works with existing Laravel database code
- Full Eloquent ORM: models, relationships, migrations, seeds... everything!
- Foreign Key Constraints: automatically enabled (unlike standard SQLite)
- Schema Builder Support: create and modify tables with familiar Laravel syntax
- 20x Faster Bulk Inserts: automatic raw SQL conversion (47s → 2.3s on 250 rows)
- Smart Chunking: works within D1's 100KB SQL limit instead of its 100-parameter limit
- Zero Configuration: works transparently with `insert()`, `insertOrIgnore()`, and `upsert()`
- Zero Overhead: direct REST API communication with D1
- Production Tested: validated with real-world workloads
- Edge Database: data stored on Cloudflare's global network
- Low Latency: 50-150ms reads from anywhere in the world
- Scales to Zero: pay only for what you use
- Free Tier Friendly: 500MB storage and 5M reads/day included
- One-Command Migration: migrate an entire MySQL database with a single command
- Smart Schema Conversion: automatic MySQL → SQLite type mapping
- Data Integrity: automatic validation and verification
- High Performance: 33x faster with intelligent batching
- Laravel 11 & 12 Compatible: tested with modern Laravel versions
- Full Test Coverage: 57 automated tests, 100% passing
- Comprehensive Docs: every feature explained with examples
- Easy Setup: 5-minute configuration, no complex setup
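The chunking numbers above can be sanity-checked with a little arithmetic. A parameterized D1 query is capped at 100 bound parameters, while a raw SQL statement is capped by size (~100KB), so converting bulk inserts to raw SQL lets far more rows travel per statement. The 5-column table and ~60 bytes per row below are hypothetical figures for illustration only:

```shell
# Rows per statement under the 100-parameter cap (parameterized inserts):
COLUMNS=5            # hypothetical table width
PARAM_LIMIT=100      # D1 bound-parameter cap
echo "parameterized: $((PARAM_LIMIT / COLUMNS)) rows per statement"

# Rows per statement under the ~100KB SQL-size cap (raw SQL inserts):
BYTES_PER_ROW=60     # hypothetical average size of one VALUES(...) tuple
SQL_LIMIT=$((100 * 1024))
echo "raw SQL:       $((SQL_LIMIT / BYTES_PER_ROW)) rows per statement"
```

Fewer statements means fewer REST round-trips, which is where the bulk-insert speedup comes from.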
```shell
composer require erimeilis/laravel-cloudflare-d1
```

The package registers automatically via Laravel's package discovery.
Requirements:
- PHP 8.2 or higher
- Laravel 11.x or 12.x
- Cloudflare account (free tier works!)
Before using this package, you need to set up a D1 database in your Cloudflare account and get your credentials.
1. Sign up / log in to Cloudflare
   - Go to dash.cloudflare.com
   - Sign up for a free account or log in
2. Create a D1 database
   - In the Cloudflare dashboard, navigate to Workers & Pages → D1 SQL Database
   - Click "Create database"
   - Enter a database name (e.g., `my-laravel-db`)
   - Click "Create"
3. Note your database ID
   - After creation, you'll see your database listed
   - Click on your database name
   - Copy the Database ID (it looks like `a1b2c3d4-e5f6-7890-abcd-ef1234567890`)
You can find your Account ID using any of these methods:
Method 1: From the URL (easiest)
- Go to your Cloudflare dashboard: dash.cloudflare.com
- Look at the URL in your browser's address bar
- The Account ID is the string of characters immediately after `dash.cloudflare.com/`
- Example: `dash.cloudflare.com/1234567890abcdef1234567890abcdef/workers-and-pages`
- Your Account ID: `1234567890abcdef1234567890abcdef`
Method 2: Workers & Pages Section
- Go to dash.cloudflare.com
- Navigate to Workers & Pages in the left sidebar
- Look for the Account details section on the right
- Click "Click to copy" next to your Account ID
Method 3: Account Overview API Section
- Go to your Account Home in the dashboard
- Scroll down to the API section at the bottom of the page
- You'll see your Account ID displayed there
- Go to dash.cloudflare.com/profile/api-tokens
- Click "Create Token"
- Use the "Edit Cloudflare Workers" template OR create a custom token with these permissions:
- Account β D1 β Edit
- Click "Continue to summary"
- Click "Create Token"
⚠️ IMPORTANT: Copy your token immediately; you won't see it again!
You need three values:
- `CLOUDFLARE_ACCOUNT_ID`: from the dashboard URL or the Workers & Pages section (see Method 1 above)
- `CLOUDFLARE_D1_DATABASE_ID`: from the D1 database details page
- `CLOUDFLARE_D1_API_TOKEN`: generated via the API Tokens page
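Before pasting these into `.env`, you can sanity-check their shapes with a small shell helper. This is an illustrative sketch (not part of the package); the regexes encode the formats described in this README, and the sample values are its placeholder IDs:

```shell
# Check that a credential matches its documented format.
check_format() {
  local name="$1" value="$2" pattern="$3"
  if printf '%s' "$value" | grep -Eq "$pattern"; then
    echo "OK   $name"
  else
    echo "FAIL $name does not match expected format"
  fi
}

# Substitute your real values here.
check_format CLOUDFLARE_ACCOUNT_ID \
  "1234567890abcdef1234567890abcdef" '^[0-9a-f]{32}$'
check_format CLOUDFLARE_D1_DATABASE_ID \
  "a1b2c3d4-e5f6-7890-abcd-ef1234567890" \
  '^[0-9a-f]{8}(-[0-9a-f]{4}){3}-[0-9a-f]{12}$'
```

A `FAIL` line usually means a copy-paste slip (truncated ID, stray quote, or whitespace) rather than a Cloudflare-side problem.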
Add these to your .env file:
```ini
# Get from: Dashboard sidebar
CLOUDFLARE_ACCOUNT_ID=1234567890abcdef1234567890abcdef

# Get from: D1 database details page
CLOUDFLARE_D1_DATABASE_ID=a1b2c3d4-e5f6-7890-abcd-ef1234567890

# Get from: API Tokens page (create new token)
CLOUDFLARE_D1_API_TOKEN=your_secret_token_here
```

Add the connection to `config/database.php`:
```php
'connections' => [
    // ... existing connections

    'd1' => [
        'driver' => 'd1',
        'account_id' => env('CLOUDFLARE_ACCOUNT_ID'),
        'database_id' => env('CLOUDFLARE_D1_DATABASE_ID'),
        'api_token' => env('CLOUDFLARE_D1_API_TOKEN'),
        'prefix' => '',
        'prefix_indexes' => true,
    ],
],
```

Optionally, publish the package config:

```shell
php artisan vendor:publish --provider="EriMeilis\CloudflareD1\D1ServiceProvider" --tag="config"
```

This creates `config/cloudflare-d1.php` for advanced configuration.
Want to verify everything works? Here's a 2-minute test:
```shell
php artisan tinker
```

```php
// Test the connection
DB::connection('d1')->select('SELECT 1 as test');
// Should return: [{"test": 1}]
```

Create a simple migration:
```shell
php artisan make:migration create_test_users_table
```

Edit the migration:
```php
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration
{
    protected $connection = 'd1';

    public function up(): void
    {
        Schema::connection('d1')->create('test_users', function (Blueprint $table) {
            $table->id();
            $table->string('name');
            $table->string('email')->unique();
            $table->timestamps();
        });
    }

    public function down(): void
    {
        Schema::connection('d1')->dropIfExists('test_users');
    }
};
```

Run the migration:
```shell
php artisan migrate --database=d1
php artisan tinker
```

```php
// Insert
DB::connection('d1')->table('test_users')->insert([
    'name' => 'Alice',
    'email' => 'alice@example.com',
    'created_at' => now(),
    'updated_at' => now(),
]);

// Query
$users = DB::connection('d1')->table('test_users')->get();
// Should return your inserted record!

// Test batching (10x faster!)
DB::connection('d1')->transaction(function () {
    for ($i = 1; $i <= 10; $i++) {
        DB::connection('d1')->table('test_users')->insert([
            'name' => "User {$i}",
            'email' => "user{$i}@example.com",
            'created_at' => now(),
            'updated_at' => now(),
        ]);
    }
});
// All 10 INSERTs executed in ONE batch API call!
```

Create a model:
```shell
php artisan make:model TestUser
```

```php
namespace App\Models;

use Illuminate\Database\Eloquent\Model;

class TestUser extends Model
{
    protected $connection = 'd1';
    protected $table = 'test_users';
    protected $fillable = ['name', 'email'];
}
```

Use it:
```shell
php artisan tinker
```

```php
// Create
$user = App\Models\TestUser::create([
    'name' => 'Bob',
    'email' => 'bob@example.com',
]);

// Find
$user = App\Models\TestUser::find(1);

// Update
$user->update(['name' => 'Bob Updated']);

// All
$users = App\Models\TestUser::all();
```

✅ If all tests pass, you're ready to use D1 in your Laravel app!
Use D1 exactly like any other Laravel database:
```php
namespace App\Models;

use Illuminate\Database\Eloquent\Model;

class User extends Model
{
    protected $connection = 'd1';
    protected $fillable = ['name', 'email'];
}

// Usage
User::create(['name' => 'Alice', 'email' => 'alice@example.com']);
$users = User::where('active', true)->get();
```

```php
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration
{
    protected $connection = 'd1';

    public function up(): void
    {
        Schema::connection('d1')->create('users', function (Blueprint $table) {
            $table->id();
            $table->string('name');
            $table->string('email')->unique();
            $table->timestamps();
        });
    }

    public function down(): void
    {
        Schema::connection('d1')->dropIfExists('users');
    }
};
```

Run migrations:
```shell
php artisan migrate --database=d1
```

```php
use Illuminate\Support\Facades\DB;

// Select
$users = DB::connection('d1')
    ->table('users')
    ->where('active', true)
    ->get();

// Insert
DB::connection('d1')
    ->table('users')
    ->insert([
        'name' => 'Bob',
        'email' => 'bob@example.com',
    ]);

// Update
DB::connection('d1')
    ->table('users')
    ->where('id', 1)
    ->update(['name' => 'Bob Updated']);

// Delete
DB::connection('d1')
    ->table('users')
    ->where('id', 1)
    ->delete();
```

```php
// $users is an array of attribute arrays
DB::connection('d1')->transaction(function () use ($users) {
    foreach ($users as $userData) {
        User::create($userData);
    }
});
// All INSERTs executed in a single batch API call!
```

Performance: 10 INSERTs go from ~1000ms to ~150ms.
```php
// ❌ Bad: N+1 queries
$users = User::all();
foreach ($users as $user) {
    echo $user->posts->count();
}

// ✅ Good: 2 queries total
$users = User::with('posts')->get();
```

```php
User::chunk(1000, function ($users) {
    foreach ($users as $user) {
        // Process
    }
});
```

```php
// config/database.php
'connections' => [
    'd1_primary' => [
        'driver' => 'd1',
        'database_id' => env('D1_PRIMARY_DATABASE_ID'),
        // ...
    ],
    'd1_analytics' => [
        'driver' => 'd1',
        'database_id' => env('D1_ANALYTICS_DATABASE_ID'),
        // ...
    ],
],
```

```php
// config/cloudflare-d1.php
'batch' => [
    'enabled' => true,
    'size' => 100, // Max queries per batch (1-100)
],
```

```php
// config/cloudflare-d1.php
'cache' => [
    'enabled' => true,
    'driver' => 'redis',
    'ttl' => 300, // 5 minutes
],
```

- Custom PDO Driver: translates PDO calls into D1 REST API requests
- Query Batching: accumulates queries in transactions into a single batch API call
- SQLite Grammar: D1 uses SQLite syntax, so the package extends Laravel's SQLite grammar
- Foreign Keys: automatically enabled (disabled by default in SQLite)
```
Laravel Eloquent/Query Builder
        ↓
D1 Connection
        ↓
D1 PDO
        ↓
Query Batcher (batching enabled in transactions)
        ↓
D1 API Client
        ↓
Cloudflare D1 REST API
```
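At the bottom of that stack, the API client issues plain HTTPS requests. As a rough sketch, a query call looks something like the following (the endpoint path follows Cloudflare's public v4 API; the IDs are this README's placeholder values, and the exact payload fields the package sends are an assumption here, not confirmed from its source):

```shell
ACCOUNT_ID="1234567890abcdef1234567890abcdef"        # placeholder
DATABASE_ID="a1b2c3d4-e5f6-7890-abcd-ef1234567890"   # placeholder
URL="https://api.cloudflare.com/client/v4/accounts/${ACCOUNT_ID}/d1/database/${DATABASE_ID}/query"

# Request body: one SQL statement plus its bound parameters.
BODY='{"sql": "SELECT * FROM users WHERE id = ?", "params": [1]}'

echo "POST $URL"
echo "$BODY"

# With a real token in CLOUDFLARE_D1_API_TOKEN, the request would be:
# curl -s -X POST "$URL" \
#   -H "Authorization: Bearer $CLOUDFLARE_D1_API_TOKEN" \
#   -H "Content-Type: application/json" \
#   -d "$BODY"
```

Every Eloquent call ultimately becomes one of these round-trips, which is why batching several statements into a single request pays off.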
- ❌ No FULLTEXT indexes: use Laravel Scout for full-text search
- ❌ No stored procedures: move logic to the application layer
- ⚠️ Limited ALTER TABLE: some schema changes require a table rebuild
- ✅ 100-parameter limit per query: handled automatically by this package
- Database size: 10 GB max (Paid plan), 500 MB (Free plan)
- Best for: read-heavy workloads and globally distributed apps
- Write latency: ~50-200ms per query (50-150ms with batching)
- Read latency: ~50-150ms per query
- Batch operations: 10-11x faster for multiple operations
D1/SQLite has foreign keys disabled by default. This package automatically enables them, but if you encounter issues:
```php
// Manually enable
DB::connection('d1')->statement('PRAGMA foreign_keys = ON');

// Or disable for specific operations
DB::connection('d1')->disableForeignKeyConstraints();
// ... operations ...
DB::connection('d1')->enableForeignKeyConstraints();
```

Enable query logging to identify slow queries:

```php
// config/cloudflare-d1.php
'monitoring' => [
    'slow_query_threshold' => 1000, // Log queries > 1000ms
    'log_api_requests' => true,
],
```

Error: "D1 API request failed: Unauthorized" or "Invalid credentials"
This means your Cloudflare credentials are incorrect or missing. Verify them:
```shell
# Check environment variables are loaded
php artisan tinker
>>> env('CLOUDFLARE_ACCOUNT_ID')
>>> env('CLOUDFLARE_D1_DATABASE_ID')
>>> env('CLOUDFLARE_D1_API_TOKEN')
```

If any return null, check:
1. Environment file: ensure `.env` has the correct values (no quotes needed)
2. Config cache: clear Laravel's config cache with `php artisan config:clear`
3. Credential format:
   - Account ID: 32-character hexadecimal (e.g., `1234567890abcdef1234567890abcdef`)
   - Database ID: UUID format (e.g., `a1b2c3d4-e5f6-7890-abcd-ef1234567890`)
   - API token: a long alphanumeric string
4. API token permissions: ensure your token has D1 edit permissions
   - Go to API Tokens
   - Click on your token
   - Verify it has "Account - D1 - Edit" permission
Error: "Database not found" or "Database ID invalid"
1. Verify the database ID is correct:
   - Go to the Cloudflare dashboard
   - Navigate to Workers & Pages → D1 SQL Database
   - Click on your database
   - Copy the Database ID from the details page
2. Ensure the database exists and is associated with the correct account
Common Mistakes:
- ❌ Quoting values in `.env`: `CLOUDFLARE_ACCOUNT_ID="abc123"` (wrong)
- ✅ No quotes: `CLOUDFLARE_ACCOUNT_ID=abc123` (correct)
- ❌ Missing `.env` entry after adding the connection to `config/database.php`
- ❌ Using stale cached config after changing `.env` (run `php artisan config:clear`)
- ❌ API token without sufficient permissions
Migrate your existing MySQL database to Cloudflare D1 with one command!
```shell
# Migrate all tables
php artisan d1:migrate-from-mysql

# Preview first (dry run)
php artisan d1:migrate-from-mysql --dry-run

# Migrate specific tables only
php artisan d1:migrate-from-mysql --tables=users,posts,comments

# Exclude specific tables
php artisan d1:migrate-from-mysql --exclude=logs,cache
```

✅ Schema Conversion

- All MySQL data types → SQLite equivalents
- AUTO_INCREMENT → AUTOINCREMENT
- ENUM → TEXT with CHECK constraints
- Foreign keys with CASCADE actions
- Indexes (regular and UNIQUE)
✅ Data Migration
- Memory-efficient chunked export
- Batch INSERT operations (33x faster)
- Progress tracking
- Automatic validation
- One-Command Migration: complete database migration in a single Artisan command
- Selective Migration: choose which tables to migrate with `--tables` or `--exclude`
- Structure Only: migrate schema without data using `--structure-only`
- Data Only: migrate data into existing tables using `--data-only`
- Dry Run: preview the migration plan without executing it
- Validation: automatic row count and data integrity verification
For detailed migration instructions, schema conversion details, troubleshooting, and best practices, see MIGRATION_GUIDE.md.
```shell
composer test
```

Contributions are welcome! Please:
- Fork the repository
- Create a feature branch
- Add tests for new functionality
- Submit a pull request
MIT License - see LICENSE file
Made with ❤️ using Laravel and Cloudflare D1