I once wrote a script that processed thousands of log files but ended up with dozens of unnecessary temporary files. It worked, but it was slow and messy.
Then I realized I could eliminate every temp file just by using pipes (`|`) and redirection (`>`, `>>`, `<`) properly. The result? The same task ran 50% faster, with zero clutter.
If you are still writing to temp files for processing, it's time to unlock the full power of Bash pipes and redirection.
Need a structured Bash reference for automation?
👉 Get the Bash Cheat Sheet for $3.99
1. Stop Using Temporary Files: Process Data on the Fly
Many scripts use unnecessary temp files, slowing down execution and cluttering directories.
❌ The Wrong Way: Using Temp Files

```shell
cat access.log | grep "404" > errors.txt
sort errors.txt | uniq -c > count.txt
cat count.txt
```
✅ The Right Way: Using Pipes

```shell
grep "404" access.log | sort | uniq -c
```
🔹 Why it's better:
- Eliminates unnecessary files (`errors.txt`, `count.txt`)
- Faster execution (data flows between commands without writing to disk)
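As a quick self-contained check, here is the same pipeline run against a few invented log lines (the entries below are made up for illustration; in practice the input would be your real `access.log`):

```shell
# Invented sample entries: two identical "404" lines and one "200" line
printf '%s\n' 'GET /a 404' 'GET /b 200' 'GET /a 404' |
  grep "404" | sort | uniq -c
# prints one line: the count 2 followed by the matched entry
```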
2. Redirecting Output: Save, Append, and Merge Streams
Save Output to a File (`>` and `>>`)

```shell
ls -lh > file_list.txt    # Overwrites file
ls -lh >> file_list.txt   # Appends to file
```
Capture Both `stdout` and `stderr`

```shell
command > output.log 2>&1
```

Or, the shorter Bash-specific form:

```shell
command &> output.log
```
🔹 Why it's useful: Logs both normal output and errors into a single file.
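You can also keep the two streams apart. A minimal sketch, where the `{ ...; }` group stands in for any real command (the log file names are illustrative):

```shell
# Route stdout and stderr to separate files
{ echo "normal output"; echo "an error" >&2; } > out.log 2> err.log
cat out.log   # normal output
cat err.log   # an error
rm out.log err.log
```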
3. Filter and Transform Data in Real-Time
Pipes let you chain commands together, making complex data processing effortless.
Find Top 10 Most Visited Pages

```shell
awk '{print $7}' access.log | sort | uniq -c | sort -nr | head -10
```
🔹 What’s happening here?
- `awk '{print $7}'` extracts the URL path from each log entry
- `sort | uniq -c` counts occurrences of each unique path
- `sort -nr | head -10` displays the most visited pages
4. Redirecting Input: Feed Data from a File or Another Command
Use a File as Input (`<`)

```shell
sort < unsorted_list.txt
```
Use a Command’s Output as Input (`|`)

```shell
ps aux | grep nginx
```
🔹 Why it's useful: Avoids needing intermediate files when processing data.
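Bash also offers here-strings (`<<<`), a third way to feed stdin without any file at all:

```shell
# <<< feeds a literal string to a command's stdin (Bash-specific)
sort <<< $'banana\napple\ncherry'
# apple
# banana
# cherry
```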
5. Merging and Splitting Streams: `/dev/null`, `tee`, and More
Suppress Output (`/dev/null`)

```shell
command > /dev/null 2>&1
```
🔹 Why? Runs the command silently, discarding both `stdout` and `stderr`.
Save Output While Still Showing It (`tee`)

```shell
command | tee output.log
```
🔹 Why? Useful for logging while monitoring.
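Like `>` vs `>>`, `tee` overwrites by default and appends with `-a`. A small self-contained sketch (the log file name is illustrative):

```shell
# tee overwrites; tee -a appends, mirroring > and >>
echo "first run"  | tee run.log    > /dev/null
echo "second run" | tee -a run.log > /dev/null
cat run.log
# first run
# second run
rm run.log
```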
6. Combining Multiple Commands with `xargs`
Find and Delete Large Files

```shell
find /home -type f -size +100M | xargs rm -v
```
🔹 Why it's better: `xargs` passes many file names to a single `rm` invocation, unlike `find -exec rm {} \;`, which spawns a new `rm` for every file.
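One caveat: plain `xargs` splits on whitespace, so filenames containing spaces or newlines break it. A safer sketch uses NUL delimiters (the `demo` directory and file name here are invented for the example):

```shell
# -print0 / -0 pass NUL-separated names, so "big file.bin" survives intact
mkdir -p demo && : > "demo/big file.bin"   # invented stand-in for a large file
find demo -type f -name '*.bin' -print0 | xargs -0 rm -v
rmdir demo
```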
7. Using `yes` to Automate Interactive Commands
Force Overwrite Without Prompting

```shell
yes | cp -i file.txt /backup/
```
🔹 Why? Avoids manually confirming every file overwrite.
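`yes` simply repeats a string (by default `y`) until the reader stops consuming it, so you can feed any fixed answer to a prompt; for instance, `yes n | rm -i *.txt` would decline every deletion. A quick look at what it actually emits:

```shell
# yes repeats its argument forever; head takes just the first three lines
yes n | head -3
# n
# n
# n
```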
8. Process Substitution: Pass Command Output as a File
Compare Two Command Outputs

```shell
diff <(ls /dir1) <(ls /dir2)
```
🔹 Why it's useful: No need to create temporary files just to compare results.
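Process substitution pairs well with any two-input tool, not just `diff`. For example, `comm` can report the lines two sorted streams have in common (the sample lists here are invented):

```shell
# -12 suppresses columns 1 and 2, leaving only lines present in both inputs
comm -12 <(printf 'a\nb\nc\n') <(printf 'b\nc\nd\n')
# b
# c
```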
9. Real-Time Log Monitoring with `tail -f` and `grep`
Monitor Logs for Errors in Real-Time

```shell
tail -f /var/log/syslog | grep "ERROR"
```
🔹 Why? Instantly filters live logs instead of manually scrolling.
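One subtlety: if grep's output feeds yet another command (say `tee` or a second `grep`), grep buffers its output and matches can appear late. `--line-buffered` (supported by both GNU and BSD grep) flushes each match immediately. A finite stand-in for the live stream:

```shell
# --line-buffered flushes every matching line as soon as it is seen
printf 'INFO started\nERROR disk full\nINFO done\n' |
  grep --line-buffered "ERROR"
# ERROR disk full
```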
10. Nested Pipes: When One Isn’t Enough
Find the Most Common IPs Making Requests

```shell
awk '{print $1}' access.log | sort | uniq -c | sort -nr | head -10
```
🔹 Why? Finds high-traffic IPs without needing a database.
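As an aside, the same tally can be done in a single `awk` pass with an associative array (the sample IPs below are invented):

```shell
# c[ip]++ tallies each source IP; END prints "count ip"; sort -nr ranks them
printf '%s\n' '1.2.3.4 - GET /' '5.6.7.8 - GET /' '1.2.3.4 - GET /' |
  awk '{c[$1]++} END {for (ip in c) print c[ip], ip}' | sort -nr
# 2 1.2.3.4
# 1 5.6.7.8
```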
Final Thoughts: Level Up Your Bash Game
Mastering pipes and redirection will make every Linux task faster and more efficient.
Quick Recap:
✅ Use pipes (`|`) to eliminate temporary files
✅ Redirect output (`>`, `>>`) for logging
✅ Merge and split streams (`/dev/null`, `tee`)
✅ Use `xargs` for batch processing
✅ Leverage process substitution (`<()`) for comparisons
Want a Structured Bash Reference?
If you need a beginner-friendly Bash guide with easy-to-follow tips and explanations, check out my Bash Cheat Sheet:
👉 Download the Bash Cheat Sheet for just $3.99
Discussion: What’s Your Favorite Bash Piping Trick?
Drop a comment below and share how you use pipes and redirection to automate tasks!