When working with JSON data files that consist of a single, extremely long line (common with minified JSON or log files), standard grep output becomes problematic: the matched line often wraps across multiple terminal screens, making it hard to quickly identify which files actually contain your search pattern.
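For illustration, a plain search against a hypothetical minified file dumps the entire matching line to the terminal:
# data.json here stands in for any single-line, minified file
grep "keyword" data.json
# ...prints the whole multi-kilobyte line, wrapping across the screen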
The -l (lowercase L) flag tells grep to output only the names of files that contain matches:
grep -l "keyword" *.json
For case-insensitive searches combined with filename-only output:
grep -il "keyword" *.json
When you need both filenames and some context, consider these variations:
# Show filename and line number, but not the matched content
grep -n "keyword" *.json | cut -d: -f1,2
# Show filename plus the match with up to 5 characters of context on each side
grep -l "keyword" *.json | xargs -I{} bash -c 'echo -n "{}: "; grep -o ".\{0,5\}keyword.\{0,5\}" "{}" | head -n1'
Imagine searching for API keys in minified configuration files:
grep -l "api_key" config/*.json
# Outputs:
# config/prod.json
# config/staging.json
For directory trees containing thousands of files, combine grep with find, which recurses into subdirectories and avoids the shell's argument-length limits:
find . -name "*.json" -exec grep -l "keyword" {} +
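If your grep supports recursive search (GNU grep does, as does the BSD grep shipped with macOS), the -r and --include options give a similar filename-only result without invoking find:
# Recursive, filename-only search restricted to .json files
grep -rl --include="*.json" "keyword" .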
Make the output more readable with color highlighting:
# Highlight each filename by re-matching it in a second grep
grep -l "keyword" *.json | grep --color=always ".*"
When you just need to know how many files contain the pattern:
grep -l "keyword" *.json | wc -l
The -l flag is your best friend when grep has to search files with extremely long lines, and it combines cleanly with other grep options to build pipelines that return exactly the information you need.
For more control, or when you are working with a tool that has no equivalent of -l, consider these alternatives for extracting filenames:
# Using awk
grep "example_keyword" *.json | awk -F: '{print $1}' | uniq
# Using cut
grep "example_keyword" *.json | cut -d: -f1 | uniq
# Using find + grep
find . -name "*.json" -exec grep -l "example_keyword" {} +
When dealing with thousands of files, these methods have different performance characteristics (a rough timing comparison follows the list):
- grep -l stops reading a file as soon as it finds the first match
- The find approach copes better with very large file sets, since it never hits the shell's glob and argument limits
- Piped solutions (awk/cut) still read every matching line in full, which is exactly the cost you are trying to avoid with huge lines
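To get a feel for the difference, you can time the approaches yourself; the data/ directory below is hypothetical, and real numbers depend on file sizes and disk speed:
# Illustrative timing comparison (bash's time keyword covers the whole pipeline)
time grep -l "keyword" data/*.json > /dev/null
time grep "keyword" data/*.json | cut -d: -f1 | uniq > /dev/null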
Returning to the API key example, a more precise pattern anchors on the quoted JSON key rather than any stray occurrence of the text:
grep -l "\"api_key\":" config/*.json
This quickly identifies all JSON files containing API key definitions without dumping their entire contents.