Many developers are surprised to see the find
command return exit code 0 even when no matching files are found:
$ find /tmp -name "nonexistent_file"
$ echo $?
0
This behavior is by design: per POSIX, the exit status indicates whether find executed without errors, not whether it found any matches.
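The distinction is between "ran successfully but matched nothing" and "could not run the search at all". A quick demonstration (using a scratch directory so directory permissions don't interfere with the result):

```shell
scratch=$(mktemp -d)

# Exit 0: the search itself succeeded; it just matched nothing.
find "$scratch" -name "definitely_not_here"
echo "no match: $?"     # no match: 0

# Non-zero exit: find could not carry out the search at all.
find "$scratch/does-not-exist" -name "anything" 2>/dev/null
echo "bad path: $?"     # prints a non-zero status
```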
In shell scripting, we often need to check if files exist before proceeding with operations. The default behavior can lead to silent failures:
#!/bin/bash
find . -name "*.log" -exec rm {} \;
# Script continues even if no logs were found
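Even set -e does not catch this, because find's exit status is 0. A minimal sketch (the pattern below is deliberately chosen to match nothing):

```shell
#!/bin/bash
set -e  # abort on any command that exits non-zero

# Matches nothing, yet find still exits 0 -- so set -e never triggers.
find . -name "no_such_logs_abc123_*.log" -exec rm {} \;

echo "still running: find reported success despite zero matches"
```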
Method 1: Using grep with find
Pipe find's output to grep, which returns non-zero when there is no output; the -q flag suppresses the echoed filenames and stops at the first line:
if ! find /tmp -name "something" | grep -q .; then
    echo "No files found" >&2
    exit 1
fi
Method 2: Counting Results with wc
count=$(find /tmp -name "something" | wc -l)
if [[ $count -eq 0 ]]; then
    echo "Error: No matches found" >&2
    exit 1
fi
Method 3: Using find's -quit Option
Modern (GNU) versions of find support -quit, which stops the search at the first match. The exit status is still 0 either way, so test whether -print -quit produced any output:
if [ -z "$(find /tmp -name "something" -print -quit)" ]; then
    echo "No files found" >&2
    exit 1
fi
For repeated use, wrap the check in a bash function:
find_or_fail() {
    local path="$1"
    local pattern="$2"
    local matches
    matches=$(find "$path" -name "$pattern" | wc -l)
    if [[ $matches -eq 0 ]]; then
        echo "No files matching '$pattern' found in $path" >&2
        return 1
    fi
    return 0
}
# Usage:
find_or_fail "/tmp" "*.backup" || exit 1
The methods have different performance impacts:
- grep -q stops reading at the first line of output (fast)
- wc -l reads the entire result list
- -quit is both fast and clean, but requires GNU find
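Combining the fast pieces, a tiny helper (found_any is our own name for this sketch, not a standard command) gives a clean boolean check that stops at the first hit:

```shell
# Exits 0 if any file under $1 matches pattern $2, non-zero otherwise.
# -print -quit stops find at the first match; grep -q tests for output.
found_any() {
    find "$1" -name "$2" -print -quit 2>/dev/null | grep -q .
}

if found_any /etc "*.conf"; then
    echo "matches exist"
fi
```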
For file searching with better exit-code behavior, consider alternatives such as fd:
# Using fd (modern find alternative)
fd "pattern" /search/path
echo $? # Returns 1 if no matches
To recap the core problem: even when no matching files are found, find returns exit status 0 (success). This causes trouble in scripts that must handle the "not found" case differently from successful execution.
Consider these common scenarios:
- Build systems checking for prerequisite files
- Deployment scripts verifying expected artifacts
- Cron jobs that need to alert when expected files are missing
The default behavior means you can't reliably use the exit status alone to detect whether files were found.
Method 4: Counting with -print -quit and wc
Count the matches and check the count; -print -quit stops after the first match, so wc sees at most one line:
if [ "$(find /tmp -name something -print -quit | wc -l)" -eq 0 ]; then
    echo "No files found" >&2
    exit 1
fi
Method 5: Store Results in a Variable
Capture the output once, then test whether it is empty:
files=$(find /tmp -name something)
if [ -z "$files" ]; then
    echo "Error: No matching files" >&2
    exit 1
fi
Method 6: Using find with -exec
With -exec ... {} +, find's exit status reflects the executed command's status, and the command is skipped entirely when nothing matches:
# 'false' runs only when there is at least one match, making find exit non-zero.
# An exit status of 0 therefore means nothing matched.
find /tmp -name something -exec false {} +
if [ $? -eq 0 ]; then
    echo "No files found" >&2
    exit 1
fi
For production-grade scripts, consider this robust pattern:
#!/bin/bash
search_root="/tmp"
pattern="something"
# Find files and store results
found_files=$(find "$search_root" -name "$pattern" -print)
if [ -z "$found_files" ]; then
    echo "ERROR: No files matching '$pattern' found under $search_root" >&2
    exit 1
elif [ "$(echo "$found_files" | wc -l)" -gt 10 ]; then
    echo "WARNING: Found more than 10 matching files" >&2
fi
# Process found files
echo "$found_files" | while read -r file; do
    # Processing logic here
    echo "Processing: $file"
done
When dealing with large directories:
- Storing results in a variable avoids re-running find, but holds the full result list in memory
- Add -maxdepth to limit search depth when possible
- Use -xdev to prevent crossing filesystem boundaries
Example with performance optimizations:
if ! find /tmp -maxdepth 3 -xdev -name "*.log" -print -quit | grep -q .; then
    exit 1
fi