How to Find and List the Largest Files on a Windows System Using PowerShell and Command Line


As a Windows user or system administrator, you often need to identify the large files consuming disk space. Whether it's for cleanup, optimization, or simple curiosity, finding these files efficiently can save significant time.

PowerShell provides powerful cmdlets for scanning and sorting files by size. Here's a simple pipeline that recursively searches from a given directory and sorts the results by size:


Get-ChildItem -Path "C:\" -Recurse -File -ErrorAction SilentlyContinue | 
Sort-Object -Property Length -Descending | 
Select-Object -First 20 -Property Name, Directory, @{Name="SizeGB";Expression={[math]::Round($_.Length / 1GB, 2)}}

This command will:

  • Scan all files starting from C:\
  • Suppress errors from inaccessible folders (-ErrorAction SilentlyContinue)
  • Sort by file size in descending order
  • Display top 20 largest files with their paths and sizes in GB
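
The same pipeline works from any starting point. For example, to scan just the current user's profile (often where most reclaimable space lives), swap in a different -Path:


# Variation: scan only the current user's profile instead of the whole drive
Get-ChildItem -Path $env:USERPROFILE -Recurse -File -ErrorAction SilentlyContinue | 
Sort-Object -Property Length -Descending | 
Select-Object -First 20 -Property Name, Directory, @{Name="SizeGB";Expression={[math]::Round($_.Length / 1GB, 2)}}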

For a more comprehensive report that performs better on large drives (filtering before sorting keeps memory use down) and saves the results to CSV:


# Report all files larger than $minSizeGB, exported to CSV for later review
$outputFile = "C:\temp\LargeFilesReport.csv"
$minSizeGB = 1

# Make sure the output folder exists
New-Item -ItemType Directory -Path (Split-Path $outputFile) -Force | Out-Null

# Filter before sorting so only qualifying files are held in memory
Get-ChildItem -Path "C:\" -Recurse -File -ErrorAction SilentlyContinue | 
Where-Object { $_.Length -gt ($minSizeGB * 1GB) } | 
Sort-Object -Property Length -Descending | 
Select-Object Name, Directory, @{Name="SizeGB";Expression={[math]::Round($_.Length / 1GB, 2)}}, LastWriteTime | 
Export-Csv -Path $outputFile -NoTypeInformation
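
Once the report exists, it can be reviewed interactively; a small follow-up, assuming a desktop session where Out-GridView is available:


# Open the exported report in a sortable, filterable grid
Import-Csv -Path "C:\temp\LargeFilesReport.csv" | Out-GridView -Title "Large Files Report"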

For those preferring the traditional command prompt, here's a solution using built-in tools:


@echo off
setlocal enabledelayedexpansion
set "output=C:\temp\large_files.txt"
set "drive=C:"
set "minSizeMB=500"

rem Make sure the output folder exists
if not exist "C:\temp" mkdir "C:\temp"

echo Scanning %drive% for files larger than %minSizeMB% MB...
echo File Path,Size (MB),Modified Date > "%output%"

rem Caveats: cmd arithmetic is 32-bit signed, so files over 2 GB overflow,
rem and paths containing ! are mangled by delayed expansion.
for /f "delims=" %%F in ('dir /s /b /a-d "%drive%\*" 2^>nul') do (
  set "size=%%~zF"
  set /a "sizeMB=!size!/1048576"
  if !sizeMB! gtr %minSizeMB% (
    echo "%%F",!sizeMB!,%%~tF >> "%output%"
  )
)

Several excellent GUI tools can visualize disk usage:

  • WinDirStat: Creates a treemap visualization
  • TreeSize Free: Shows folder sizes in a hierarchical view
  • WizTree: Extremely fast scanning by reading the NTFS Master File Table (MFT) directly

When scanning large drives:

  • Exclude system folders like C:\Windows\WinSxS unless specifically needed (much of its reported size is hard links)
  • Run scans during off-peak hours on production systems
  • Skip reparse points (junctions and mount points) so the same data isn't counted twice; Get-ChildItem supports -Attributes !ReparsePoint for this, as shown in the sketch below
  • Limit recursion with Get-ChildItem's -Depth parameter when a full-depth scan isn't needed
  • Run as Administrator for complete access to protected folders
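
A minimal sketch combining those options (assumes PowerShell 5 or later for -Depth; !Directory+!ReparsePoint means "not a directory and not a reparse point"):


# Skip reparse points and cap recursion at four levels
Get-ChildItem -Path "C:\" -Recurse -Depth 4 -Attributes !Directory+!ReparsePoint -ErrorAction SilentlyContinue | 
Sort-Object -Property Length -Descending | 
Select-Object -First 20 FullName, @{Name="SizeGB";Expression={[math]::Round($_.Length / 1GB, 2)}}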

For servers or workstations that need regular maintenance, a script like this can run on a schedule:


# Scheduled task script to find (and optionally delete) temp files older than 30 days
$cutoffDate = (Get-Date).AddDays(-30)
$largeTempFiles = Get-ChildItem -Path "C:\temp","C:\Windows\Temp" -Recurse -File -ErrorAction SilentlyContinue | 
                  Where-Object { $_.Length -gt 100MB -and $_.LastWriteTime -lt $cutoffDate }

$largeTempFiles | ForEach-Object {
    Write-Host "Found old large file: $($_.FullName) ($([math]::Round($_.Length / 1MB)) MB)"
    # Uncomment the next line for a dry run; remove -WhatIf to actually delete
    # $_ | Remove-Item -Force -WhatIf
}
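
To run this automatically, register it with Task Scheduler. A sketch, assuming the script above is saved to the hypothetical path C:\Scripts\Find-LargeTempFiles.ps1:


# Register a weekly cleanup task; run from an elevated session
# (the script path is a placeholder, adjust to your environment)
$action  = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Find-LargeTempFiles.ps1"
$trigger = New-ScheduledTaskTrigger -Weekly -DaysOfWeek Sunday -At 3am
Register-ScheduledTask -TaskName "LargeTempFileCleanup" -Action $action -Trigger $trigger -RunLevel Highest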

Parallel Processing with PowerShell

For faster scanning on multi-core systems, PowerShell 7's ForEach-Object -Parallel can walk each top-level folder in its own runspace:

# Requires PowerShell 7+. Each top-level folder is scanned in a separate runspace;
# note that files sitting directly in the root are not covered by this sketch.
$fileList = Get-ChildItem -Path "C:\" -Directory -ErrorAction SilentlyContinue | 
ForEach-Object -Parallel {
    Get-ChildItem -Path $_.FullName -Recurse -File -ErrorAction SilentlyContinue | 
    ForEach-Object {
        [PSCustomObject]@{
            Path = $_.FullName
            Size = $_.Length
        }
    }
} -ThrottleLimit 8

$fileList | Sort-Object -Property Size -Descending | Select-Object -First 50
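
Whether the parallel version actually helps depends on the disk and core count; Measure-Command gives a quick comparison on a smaller test path before committing to a full-drive scan:

# Time the sequential scan; repeat with the parallel version to compare
Measure-Command {
    Get-ChildItem -Path "C:\Users" -Recurse -File -ErrorAction SilentlyContinue | Out-Null
} | Select-Object TotalSeconds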

Filtering by File Extension

To focus on specific file types (note that -Include takes wildcard patterns, so each extension needs a leading *):

$extensions = @("*.mp4","*.avi","*.iso","*.vhd")
Get-ChildItem -Path "C:\" -Include $extensions -Recurse -File -ErrorAction SilentlyContinue | 
Sort-Object -Property Length -Descending | 
Select-Object -First 10 FullName, @{Name="SizeMB";Expression={[math]::Round($_.Length/1MB,2)}}
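
-Include applies wildcard matching to every file it encounters, which can be slow on a full drive. Filtering on the Extension property is an equivalent and often faster alternative:

# Equivalent filter on the Extension property (comparison is case-insensitive)
$extensions = @(".mp4",".avi",".iso",".vhd")
Get-ChildItem -Path "C:\" -Recurse -File -ErrorAction SilentlyContinue | 
Where-Object { $extensions -contains $_.Extension } | 
Sort-Object -Property Length -Descending | 
Select-Object -First 10 FullName, @{Name="SizeMB";Expression={[math]::Round($_.Length/1MB,2)}}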
