Managing storage capacity in enterprise file servers often becomes reactive rather than proactive. When you receive a "low disk space" alert, identifying the source of sudden growth becomes a forensic exercise. The ideal solution would track folder sizes at regular intervals while being:
- Lightweight enough to run on production servers
- Capable of historical comparison
- Flexible for specific folder monitoring
- Visualization-ready for trend analysis
For Windows environments, PowerShell provides excellent capabilities without requiring additional software. Here's a script that logs folder sizes to CSV; that log becomes the basis for the growth detection shown later:
# FolderSizeTracker.ps1
# Logs the size of each monitored share to a dated CSV for later trend analysis.
$targetFolders = @(
    "D:\Shares\Finance",
    "D:\Shares\Engineering",
    "D:\Shares\Marketing",
    "D:\Shares\HR",
    "D:\Shares\Operations"
)

$outputFile = "D:\Admin\FolderSizes_$(Get-Date -Format 'yyyyMMdd').csv"

$results = @()
foreach ($folder in $targetFolders) {
    # -File excludes directory entries (which have no Length); access errors on individual files are suppressed
    $size = (Get-ChildItem $folder -Recurse -File -ErrorAction SilentlyContinue |
        Measure-Object -Property Length -Sum).Sum / 1GB
    $results += [PSCustomObject]@{
        Date   = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
        Folder = $folder
        SizeGB = [math]::Round($size, 2)
    }
}

$results | Export-Csv -Path $outputFile -NoTypeInformation -Append
Combine the PowerShell script with Task Scheduler (running weekly) and Power BI for visualization. Create a Power Query that consolidates all historical CSV files:
let
    Source = Folder.Files("D:\Admin"),
    Filtered = Table.SelectRows(Source, each Text.StartsWith([Name], "FolderSizes")),
    // Promote each file's header row before combining, so the typed columns can be resolved by name
    Parsed = List.Transform(Filtered[Content], each Table.PromoteHeaders(Csv.Document(_, [Delimiter=",", Encoding=1252]))),
    Combined = Table.Combine(Parsed),
    Typed = Table.TransformColumnTypes(Combined, {{"Date", type datetime}, {"Folder", type text}, {"SizeGB", type number}})
in
    Typed
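Scheduling the collection is a one-time setup. Here is a minimal sketch using the built-in ScheduledTasks cmdlets, assuming the script lives at D:\Admin\FolderSizeTracker.ps1 and a weekly Sunday 2 AM run; the task name and schedule are illustrative:
# Register a weekly run of the tracker script (run once from an elevated session on the file server)
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument "-NoProfile -ExecutionPolicy Bypass -File D:\Admin\FolderSizeTracker.ps1"
$trigger = New-ScheduledTaskTrigger -Weekly -DaysOfWeek Sunday -At 2am
Register-ScheduledTask -TaskName "FolderSizeTracker" -Action $action -Trigger $trigger `
    -User "SYSTEM" -RunLevel Highest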
For environments where PowerShell isn't available or practical:
TreeSize Free
JAM Software's TreeSize Free can be run from the command line and scheduled like any other executable; the exact switch syntax varies between versions, so verify it against the vendor documentation. The general shape of the call:
TreeSizeFree.exe /scan "D:\Shares" /s /export=csv /output=D:\Admin\TreeSizeReport.csv
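If you go the TreeSize route, one way to keep the historical-comparison workflow intact is to wrap the call so each run writes a dated report. A sketch, assuming a default install path and reusing the switches shown above:
# Wrap the TreeSize export so each run produces a dated report (install path is an assumption)
$stamp = Get-Date -Format 'yyyyMMdd'
& "C:\Program Files\JAM Software\TreeSize Free\TreeSizeFree.exe" `
    /scan "D:\Shares" /s /export=csv /output="D:\Admin\TreeSizeReport_$stamp.csv"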
WinDirStat with Automation
WinDirStat is primarily GUI-based and offers little built-in export automation, so automating it usually means a custom wrapper script. The example below assumes such a wrapper (wds_stat.py is a hypothetical helper, not something WinDirStat ships with):
wds_stat.py --path "D:\Shares" --output json --file report.json
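Alternatively, the same kind of JSON report can be produced with plain PowerShell, removing the dependency on a wrapper script altogether (a sketch; the share root and output path are illustrative):
# Stand-in for the wrapper above: per-share sizes written as JSON
$report = foreach ($share in Get-ChildItem "D:\Shares" -Directory) {
    [PSCustomObject]@{
        Folder = $share.FullName
        SizeGB = [math]::Round((Get-ChildItem $share.FullName -Recurse -File -ErrorAction SilentlyContinue |
            Measure-Object -Property Length -Sum).Sum / 1GB, 2)
    }
}
$report | ConvertTo-Json | Set-Content "D:\Admin\report.json"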
For enterprise deployments, consider storing the data in SQL Server:
CREATE TABLE folder_sizes (
    id          INT IDENTITY(1,1) PRIMARY KEY,
    scan_date   DATETIME NOT NULL,
    folder_path NVARCHAR(512) NOT NULL,
    size_gb     DECIMAL(10,2) NOT NULL,
    file_count  INT NOT NULL
);
# The PowerShell side of the adaptation builds an INSERT per record
$query = @"
INSERT INTO folder_sizes (scan_date, folder_path, size_gb, file_count)
VALUES ('{0}', '{1}', {2}, {3})
"@ -f (Get-Date -Format 'yyyy-MM-dd HH:mm:ss'), $folder, $sizeGB, $fileCount
Add threshold-based alerting to your scripts:
# Compare with last week's size for the same folder
$lastWeek = Import-Csv $lastWeekFile | Where-Object { $_.Folder -eq $folder }
if ($lastWeek -and $sizeGB -gt ([double]$lastWeek.SizeGB * 1.5)) {
    # Adjust sender, recipient, and SMTP server for your environment
    Send-MailMessage -To "storage-admins@example.com" -From "monitor@example.com" -SmtpServer "smtp.example.com" `
        -Subject "Rapid growth detected in $folder" `
        -Body "More than 50% week-over-week growth: $($lastWeek.SizeGB) GB -> $sizeGB GB"
}
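The snippet references a $lastWeekFile that isn't defined above; with the dated FolderSizes_yyyyMMdd naming used earlier, one way to derive it is the line below (this assumes a log file was actually written seven days ago):
# Path of the CSV written seven days ago, following the FolderSizes_yyyyMMdd convention above
$lastWeekFile = "D:\Admin\FolderSizes_$((Get-Date).AddDays(-7).ToString('yyyyMMdd')).csv"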
As infrastructure scales, tracking storage consumption becomes critical yet tedious. Traditional methods like manual directory size checks or built-in OS tools provide only point-in-time snapshots without historical context. Here are robust scripting solutions I've implemented across enterprise environments.
This PowerShell script logs folder sizes with timestamps to CSV for trend analysis:
# FolderSizeTracker.ps1
$logPath = "C:\StorageLogs\size_log.csv"

$targetFolders = @(
    "\\fileserver\shared\finance",
    "\\fileserver\shared\engineering",
    "\\fileserver\shared\marketing"
)

$results = foreach ($folder in $targetFolders) {
    # -File excludes directory entries; access errors on individual files are suppressed
    $size = (Get-ChildItem $folder -Recurse -File -ErrorAction SilentlyContinue |
        Measure-Object -Property Length -Sum).Sum / 1GB
    [PSCustomObject]@{
        Timestamp = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
        Folder    = $folder
        SizeGB    = [math]::Round($size, 2)
    }
}

$results | Export-Csv -Path $logPath -Append -NoTypeInformation
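The script assumes C:\StorageLogs already exists; a small guard at the top of the script avoids a failure on the first run:
# Create the log directory if it doesn't exist yet
if (-not (Test-Path "C:\StorageLogs")) {
    New-Item -ItemType Directory -Path "C:\StorageLogs" | Out-Null
}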
For mixed environments, this Python script offers greater flexibility:
# folder_monitor.py
import os
import csv
from datetime import datetime


def get_folder_size(start_path):
    """Walk the tree and return the total size in GB."""
    total_size = 0
    for dirpath, dirnames, filenames in os.walk(start_path):
        for f in filenames:
            fp = os.path.join(dirpath, f)
            if os.path.exists(fp):  # skip broken symlinks
                total_size += os.path.getsize(fp)
    return total_size / (1024 ** 3)  # convert bytes to GB


def log_sizes(folders, output_file):
    """Append one timestamped row per folder to the CSV log."""
    with open(output_file, 'a', newline='') as csvfile:
        writer = csv.writer(csvfile)
        for folder in folders:
            size_gb = round(get_folder_size(folder), 2)
            writer.writerow([
                datetime.now().strftime("%Y-%m-%d %H:%M:%S"),
                folder,
                size_gb
            ])


if __name__ == "__main__":
    folders_to_monitor = [
        "/mnt/shared/finance",
        "/mnt/shared/engineering",
        "/mnt/shared/marketing",
    ]
    log_sizes(folders_to_monitor, "storage_growth_log.csv")
For automated monitoring:
- Windows: Create a scheduled task that runs the PowerShell script daily
- Linux: Add a cron job, e.g. 0 2 * * * python3 /path/to/folder_monitor.py
The CSV output can be imported into:
- Excel/Power BI for trend charts
- Grafana with CSV data source plugin
- Custom matplotlib/Pandas visualizations
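Before wiring up a full dashboard, a quick growth check can be run directly against the consolidated log in PowerShell; this is a sketch assuming the Timestamp/Folder/SizeGB schema produced by the script above:
# Latest size and total growth per folder, largest growth first
Import-Csv "C:\StorageLogs\size_log.csv" |
    Group-Object Folder |
    ForEach-Object {
        $rows = @($_.Group | Sort-Object Timestamp)
        [PSCustomObject]@{
            Folder   = $_.Name
            LatestGB = [double]$rows[-1].SizeGB
            GrowthGB = [math]::Round([double]$rows[-1].SizeGB - [double]$rows[0].SizeGB, 2)
        }
    } | Sort-Object GrowthGB -Descending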
For large environments, consider:
- SolarWinds Storage Resource Monitor
- Paessler PRTG with file sensor
- NetApp OCUM (OnCommand Unified Manager) for NAS systems