Optimized Large File Transfer in Windows Networks: Reliable Methods for SQL Server Backup Migration



For years I struggled with transferring SQL Server backup files between Windows 2000 Server machines. The 2GB transfer limit kept haunting me across protocols:

// Failed attempts log:
1. Windows Explorer copy: Fails at 1.8-2.1GB
2. Apache HTTP download: Crashes at ~2GB 
3. FTP transfer: Connection drops at 2GB mark
4. Robocopy: Same 2GB barrier

The root cause is signed 32-bit arithmetic in Windows 2000-era protocols and tools, not the files themselves (the quick check after this list shows exactly where the limit sits). Modern solutions must account for:

  • Signed 32-bit integer limits (2,147,483,647 bytes)
  • SMB 1.0 protocol restrictions
  • TCP window scaling issues
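
A one-line PowerShell check makes the boundary concrete; the classic failure point is exactly [int]::MaxValue bytes:

# The 2GB wall is [int]::MaxValue - 2,147,483,647 bytes
[int]::MaxValue / 1GB               # ~1.9999999991, i.e. just under 2GB
[int64]20GB -gt [int]::MaxValue     # True: a 20GB backup overflows any 32-bit size field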

Here's the PowerShell script I now use for daily 20GB+ transfers:

# SQL Backup Transfer Script - splits the backup into 1GB chunks
$source = "D:\SQLBackups\daily.bak"
$dest = "\\testserver\backup"
$chunkSize = 1GB

$fileStream = [System.IO.File]::OpenRead($source)
$buffer = New-Object byte[] $chunkSize
$totalRead = 0

try {
    while (($bytesRead = $fileStream.Read($buffer, 0, $buffer.Length)) -gt 0) {
        # Name each chunk by its starting byte offset so reassembly order is unambiguous
        $chunkFile = Join-Path $dest "chunk_$totalRead.bin"
        $chunkStream = [System.IO.File]::Create($chunkFile)
        $chunkStream.Write($buffer, 0, $bytesRead)   # write only the bytes actually read
        $chunkStream.Close()
        $totalRead += $bytesRead
    }
} finally {
    $fileStream.Close()
}
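
The script above only splits the file; on the receiving side you still need to stitch the chunks back together. A minimal reassembly sketch, assuming the chunk_<offset>.bin naming and share paths from the script above:

# Reassemble chunks into the original file, ordered numerically by byte offset
$chunkDir = "\\testserver\backup"
$outFile  = "\\testserver\backup\daily.bak"

$outStream = [System.IO.File]::Create($outFile)
try {
    Get-ChildItem $chunkDir -Filter "chunk_*.bin" |
        Sort-Object { [int64]($_.BaseName -replace '^chunk_', '') } |  # numeric, not lexicographic
        ForEach-Object {
            $bytes = [System.IO.File]::ReadAllBytes($_.FullName)
            $outStream.Write($bytes, 0, $bytes.Length)
        }
} finally {
    $outStream.Close()
}

Note that ReadAllBytes pulls each 1GB chunk fully into memory; on a constrained host, stream each chunk in smaller blocks instead.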

1. BITS (Background Intelligent Transfer Service)

bitsadmin /transfer myjob /download /priority normal ^
    http://server/daily.bak C:\backups\daily.bak
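
On current Windows releases bitsadmin is deprecated in favor of the BitsTransfer PowerShell module; an equivalent call, using the same example URL and paths as above:

Import-Module BitsTransfer
# Resumable background transfer that survives network drops and reboots
Start-BitsTransfer -Source "http://server/daily.bak" -Destination "C:\backups\daily.bak" -Priority Normal

BITS downloads need the HTTP server to support range requests for resume to work.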

2. rsync for Windows (Delta copying)

The cwRsync package ships a standard rsync binary; --partial lets an interrupted transfer resume instead of restarting from zero:

rsync -avz --partial --progress /cygdrive/d/SQLBackups/ username@testserver:/backups/

3. Database Native Tools

SQL Server's own backup to network location:

-- COMPRESSION requires SQL Server 2008 or later; CHECKSUM validates pages as they are written
BACKUP DATABASE MyDB
TO DISK = '\\testserver\backup\MyDB.bak'
WITH COMPRESSION, CHECKSUM
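
Since the backup was written WITH CHECKSUM, a verification pass against the copy on the share is cheap insurance before deleting the local file (server and path names follow the examples in this post):

sqlcmd -S SQLServer -Q "RESTORE VERIFYONLY FROM DISK = '\\testserver\backup\MyDB.bak' WITH CHECKSUM"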

Create a scheduled task that runs this batch script nightly:

@echo off
rem %date% parsing is locale-dependent - verify the offsets on your system
set timestamp=%date:~-4%%date:~3,2%%date:~0,2%
robocopy D:\SQLBackups \\testserver\backup *.bak /MT:16 /Z /R:3 /W:30 ^
    /LOG:C:\transfer_logs\backup_%timestamp%.txt
rem Robocopy exit codes 0-3 mean success; 4 and above means mismatches or failures
if %ERRORLEVEL% gtr 3 (
    powershell -ExecutionPolicy Bypass -File C:\scripts\alert_admin.ps1
)
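
Registering it with Task Scheduler is a single command; the task name, script path, and run time here are placeholders:

schtasks /create /tn "SQLBackupTransfer" /tr "C:\scripts\transfer.bat" /sc daily /st 02:00 /ru SYSTEM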

Many Windows administrators hit this frustrating wall: multi-gigabyte network transfers fail mysteriously around the 2GB mark. This stems from several legacy limitations:

  • FAT32 file system limits (max file size just under 4GB)
  • 32-bit application memory constraints
  • Signed 32-bit file offsets in older transfer tools and protocol stacks

Here are the most robust methods I've tested for transferring multi-gigabyte SQL backup files:

1. Robocopy with Restartable Mode

robocopy \\source\backups \\destination\backups *.bak /MT:16 /Z /R:1 /W:30 /NP /LOG:C:\transfer.log

Key parameters:

  • /MT - Multithreading for performance
  • /Z - Restartable mode (crucial for large files)
  • /R:1 - Minimal retries to prevent hangs

2. SFTP with WinSCP Scripting

Save the following as a WinSCP script file (e.g. transfer.txt):

option batch abort
option confirm off
open sftp://user:password@backupserver -hostkey="ssh-rsa 2048 xx:xx:xx..."
put "D:\SQLBackups\*.bak" "/remote/backups/"
close
exit
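
Then drive it from a batch file or scheduled task with the winscp.com console client (transfer.txt is the file name assumed above):

winscp.com /script=transfer.txt /log=C:\transfer_logs\winscp.log

Avoid embedding the password in the script on shared machines; WinSCP can read credentials from a stored site profile instead.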

3. PowerShell Stream Transfer

For programmatic control, use PowerShell's streaming capabilities:

$source = "\\sqlserver\backups\daily.bak"
$dest = "\\testserver\restore\daily.bak"
$bufferSize = 1MB

$sourceStream = [System.IO.File]::OpenRead($source)
$destStream = [System.IO.File]::Create($dest)

try {
    $buffer = New-Object byte[] $bufferSize

    # Copy in 1MB blocks; Read returns 0 at end of file
    while (($bytesRead = $sourceStream.Read($buffer, 0, $bufferSize)) -gt 0) {
        $destStream.Write($buffer, 0, $bytesRead)
        $percent = [math]::Round(($sourceStream.Position / $sourceStream.Length) * 100, 1)
        Write-Progress -Activity "Transferring" -Status "$percent% Complete" -PercentComplete $percent
    }
} finally {
    $sourceStream.Close()
    $destStream.Close()
}
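
Whatever transport you pick, it pays to prove the bytes arrived intact. A minimal check, assuming PowerShell 4.0+ for Get-FileHash and the paths from the example above:

# Compare source and destination hashes; a mismatch means a corrupt transfer
$srcHash = (Get-FileHash "\\sqlserver\backups\daily.bak" -Algorithm SHA256).Hash
$dstHash = (Get-FileHash "\\testserver\restore\daily.bak" -Algorithm SHA256).Hash
if ($srcHash -ne $dstHash) { Write-Error "Transfer corrupted: hashes do not match" }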

Create a scheduled task that combines compression and transfer:

@echo off
rem %DATE% parsing is locale-dependent - verify the offsets on your system
set ARCHIVE=SQL_%DATE:~10,4%%DATE:~4,2%%DATE:~7,2%

rem Split into 2GB volumes (-v2g) with fast compression (-mx1)
7z a -v2g -mx1 "D:\temp\%ARCHIVE%.7z" "E:\SQLBackups\daily.bak"
robocopy D:\temp \\testserver\backups %ARCHIVE%.* /Z /MT:8

rem Robocopy exit codes below 8 indicate success
if %ERRORLEVEL% LSS 8 (
    del D:\temp\%ARCHIVE%.*
    sqlcmd -S SQLServer -Q "EXEC sp_send_dbmail @recipients='admin@domain.com', @subject='Backup Transfer Complete'"
)
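
On the receiving side, pointing 7-Zip at the first volume reassembles the whole set (the date-stamped name is whatever the script produced):

7z x "\\testserver\backups\SQL_YYYYMMDD.7z.001" -oD:\restore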

When transfers fail, check these common culprits:

Issue            Diagnostic Command                Solution
Packet loss      ping -n 50 server | find "Lost"   Enable QoS on routers
Slow transfers   netsh int tcp show global         Disable TCP autotuning
Authentication   klist purge                       Use fresh credentials
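
For a quicker first read on whether the file-sharing port is even reachable, PowerShell 4.0+ hosts can use Test-NetConnection (testserver is the example host used throughout):

# Check SMB (port 445) reachability and latency to the destination
Test-NetConnection -ComputerName testserver -Port 445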