SC2012: Don't Parse ls Output in Shell Scripts

Why parsing ls output is unreliable. Use find, glob patterns, or stat for safe file enumeration in bash scripts.

Why You Should Never Parse ls

The output of ls is designed for human consumption, not for scripts. It is unreliable with filenames containing spaces, newlines, or special characters, and its format varies between systems.

The Problem

# BAD: Breaks on filenames with spaces
for file in $(ls /path/to/dir); do
  process "$file"
done

# Filename "my file.txt" becomes two iterations: "my" and "file.txt"

Even worse with newlines in filenames:

# A file named "line1\nline2.txt" becomes two iterations
touch $'line1\nline2.txt'
for file in $(ls); do
  echo "File: $file"  # Two lines of output for one file
done
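The glob-based loop below handles both problem filenames correctly. This is a runnable sketch that creates the troublesome files in a throwaway `mktemp` directory (the directory is only for illustration):

```shell
#!/bin/bash
# Sketch: a glob expands each filename as one word, no matter what
# characters it contains, so every file gets exactly one iteration.
dir=$(mktemp -d)   # scratch directory for the demo
touch "$dir/my file.txt" "$dir"/$'line1\nline2.txt'

count=0
for file in "$dir"/*; do
  count=$((count + 1))
  printf 'File: %q\n' "$file"   # %q prints the name unambiguously
done
echo "count=$count"             # count=2: two files, two iterations

rm -rf "$dir"
```

Compare with the `$(ls)` loop above, which would iterate four times over these two files.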

Safe Alternatives

Glob patterns (simplest):

for file in /path/to/dir/*; do
  [ -e "$file" ] || continue  # Handle empty directory
  process "$file"
done

# Specific extensions
for file in /path/to/dir/*.log; do
  [ -e "$file" ] || continue
  process "$file"
done
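In bash (not POSIX sh), `shopt -s nullglob` is an alternative to the `[ -e ]` guard: an unmatched glob expands to nothing instead of passing through literally. A minimal sketch, again using a scratch `mktemp` directory:

```shell
#!/bin/bash
# Sketch: with nullglob set, a glob that matches nothing disappears,
# so the loop body simply never runs on an empty directory.
shopt -s nullglob
dir=$(mktemp -d)   # empty scratch directory

count=0
for file in "$dir"/*.log; do
  count=$((count + 1))   # never reached: no .log files exist
done
echo "count=$count"      # count=0, not 1 iteration over a literal "*.log"

rm -rf "$dir"
```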

find command (recursive, with predicates):

# Null-terminated for safety
find /path -name "*.log" -print0 | while IFS= read -r -d '' file; do
  process "$file"
done

# Or use -exec
find /path -name "*.log" -exec process {} \;
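One caveat with piping `find` into `while`: the loop runs in a subshell, so variables set inside it are lost. If you need the results afterwards, `mapfile` with a null delimiter (bash 4.4+) collects them into an array. A sketch with hypothetical sample files:

```shell
#!/bin/bash
# Sketch: read find's null-terminated output into an array with
# mapfile -t -d '' (bash 4.4+); -t strips the trailing delimiter.
dir=$(mktemp -d)   # scratch directory for the demo
touch "$dir/a.log" "$dir/b.log" "$dir/notes.txt"

mapfile -t -d '' files < <(find "$dir" -name '*.log' -print0)
echo "found ${#files[@]} log files"   # the array is usable after this line

rm -rf "$dir"
```

Unlike the pipe form, the process substitution `< <(...)` keeps `files` in the current shell.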

stat for file information:

# Instead of parsing ls -l for file size (tries BSD/macOS stat, falls back to GNU stat):
size=$(stat -f%z "$file" 2>/dev/null || stat -c%s "$file" 2>/dev/null)

# Instead of ls -t for newest file:
newest=$(find /path -maxdepth 1 -type f -printf '%T@ %p\n' | sort -rn | head -1 | cut -d' ' -f2-)
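The newest-file pipeline above relies on GNU find's `-printf`. Here is a runnable sketch of it; `touch -t` fabricates the timestamps, and the `mktemp` directory exists only for the demo:

```shell
#!/bin/bash
# Sketch: find the newest file via GNU find's -printf (%T@ = mtime
# in epoch seconds), sorted newest-first.
dir=$(mktemp -d)
touch -t 202001010000 "$dir/old.txt"   # backdated file
touch -t 202101010000 "$dir/new.txt"   # more recent file

newest=$(find "$dir" -maxdepth 1 -type f -printf '%T@ %p\n' | sort -rn | head -1 | cut -d' ' -f2-)
echo "${newest##*/}"   # prints new.txt

rm -rf "$dir"
```

`cut -d' ' -f2-` keeps everything after the first space, so paths containing spaces survive intact.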

Common ls Parsing Mistakes

Bad Pattern              Safe Alternative
`ls *.txt | wc -l`       `files=(*.txt); echo "${#files[@]}"`
`ls -t | head -1`        `find . -maxdepth 1 -type f -printf '%T@ %p\n' | sort -rn | head -1 | cut -d' ' -f2-`
`ls -la | grep "^d"`     `for d in */` (or `find . -mindepth 1 -maxdepth 1 -type d`)
`for f in $(ls)`         `for f in *`
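To see why `ls | wc -l` miscounts, consider a filename containing a newline: `wc -l` counts lines, not files. A glob array counts correctly. A sketch using a scratch `mktemp` directory:

```shell
#!/bin/bash
# Sketch: count files with a glob array instead of `ls | wc -l`,
# which over-counts names that contain newlines.
shopt -s nullglob        # unmatched glob yields an empty array, count 0
dir=$(mktemp -d)
touch "$dir/a.txt" "$dir/b.txt" "$dir"/$'evil\nname.txt'

files=("$dir"/*.txt)
echo "${#files[@]}"      # 3: each file counted exactly once
# By contrast, `ls "$dir" | wc -l` counts the newline-bearing name twice.

rm -rf "$dir"
```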

Use Case

Any script that needs to list, iterate, or count files. Backup scripts, log rotation, file management automation, and deployment scripts that process files in directories.
