I have been using the terminal daily for over a decade. I thought I knew my way around. Then a coworker shared a one-liner that did in 3 seconds what my janky shell script did in 40 lines. I mass-deleted half my aliases that week.
Here are the tricks that did it. Some of these are genuinely obscure. Others are hiding in plain sight in tools you already use every day.
1. Process Substitution (The One That Changes Everything)
You probably know pipes. You might know redirects. But process substitution is the thing most developers never learn, and it is absurdly powerful.
diff <(sort file1.txt) <(sort file2.txt)
The <(...) syntax runs a command and presents its output as if it were a file. This means you can use it anywhere a filename is expected. Want to diff two remote files without saving them to disk first?
diff <(curl -s https://api.example.com/v1/config) <(curl -s https://api.example.com/v2/config)
Or compare two branches of a JSON config:
diff <(git show main:config.json | jq -S .) <(git show dev:config.json | jq -S .)
Once you internalize this, you stop creating temp files forever.
2. xargs -P: Instant Parallelism
Most people know xargs. Almost nobody uses the -P flag.
find . -name '*.png' | xargs -P 8 -I {} optipng {}
That -P 8 keeps up to 8 processes running in parallel. Compressing 200 images? This turns a 10-minute job into roughly a 1-minute job. It works with anything:
cat urls.txt | xargs -P 10 -I {} curl -sO {}
Ten parallel downloads, no scripting, no background jobs to manage.
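One caveat: filenames with spaces or newlines will break the naive pipe. Both GNU and BSD xargs support NUL delimiters, so a safer variant of the image example looks like this:
# -print0 and -0 use NUL separators, so odd filenames survive intact
find . -name '*.png' -print0 | xargs -0 -P 8 -I {} optipng {}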
3. The Curly Brace Expansion Nobody Uses Fully
You have probably done mkdir -p src/{components,utils,hooks}. But brace expansion goes deeper than most realize.
Rename a file without typing the path twice:
mv /long/annoying/path/config.json{,.bak}
That expands to mv /long/annoying/path/config.json /long/annoying/path/config.json.bak.
Sequence expressions:
echo {01..12} # 01 02 03 04 05 06 07 08 09 10 11 12
echo {a..z..3} # a d g j m p s v y
mkdir day-{01..31} # 31 directories in one shot
Nested expansion for combinatorial generation:
echo {api,web}-{dev,staging,prod}
# api-dev api-staging api-prod web-dev web-staging web-prod
4. Column Selection With cut and awk That Actually Makes Sense
Stop writing janky loops to parse columnar output.
# Get just PIDs of processes matching 'node'
# (the [n]ode pattern keeps the awk process itself out of the results)
ps aux | awk '/[n]ode/ {print $2}'
# Grab the 3rd field from a CSV
cut -d',' -f3 data.csv
# Get last field regardless of how many fields exist
awk '{print $NF}' file.txt
The $NF trick is the one that saves you the most time. NF means "number of fields", so $NF is always the last column. No more counting fields.
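$NF pairs nicely with a custom field separator. A quick illustration -- peeling the filename off the end of a path:
# -F/ splits on slashes, so $NF is the last path component
echo /var/log/nginx/access.log | awk -F/ '{print $NF}'
# access.log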
5. The !! and !$ History Shortcuts
You forgot sudo:
apt install something
# Permission denied
sudo !!
!! repeats the entire last command. But !$ is even more useful -- it grabs the last argument of the previous command:
mkdir -p /some/deeply/nested/new/directory
cd !$
No retyping, no reaching for the mouse to copy-paste a path.
6. The column Command: Instant Pretty Tables
You have data that looks like garbage:
echo -e "name,age,city\nalice,30,nyc\nbob,25,sf" | column -t -s','
Output:
name   age  city
alice  30   nyc
bob    25   sf
Pipe any CSV, TSV, or delimited output through column -t and it becomes readable. I use this daily for quick data inspection.
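It works on command output too, not just text you pipe by hand. The classic example:
# Align mount's cramped output into readable columns
mount | column -t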
7. comm: The Set Operations Tool You Never Knew Existed
Want lines that are only in file A, only in file B, or in both? comm does set operations on sorted files.
comm -23 <(sort file1.txt) <(sort file2.txt) # Only in file1
comm -13 <(sort file1.txt) <(sort file2.txt) # Only in file2
comm -12 <(sort file1.txt) <(sort file2.txt) # In both files
Notice the process substitution from trick #1. These compose beautifully together. The flags suppress columns: -2 suppresses "only in file2", -3 suppresses "in both". So -23 gives you "only in file1."
This is how I diff environment variable lists between servers, compare package versions, and find missing translations.
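As a sketch of the server comparison (web1 and web2 are placeholder hostnames), the whole thing fits on one line:
# Env vars that differ between two hosts: column 1 is only-web1, column 2 is only-web2
comm -3 <(ssh web1 'env | sort') <(ssh web2 'env | sort')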
8. tar Can Stream Over SSH
Forget scp for directories. This is faster and does not require scratch disk space on either side for intermediate files:
tar czf - /local/directory | ssh user@remote 'tar xzf - -C /remote/path'
The - means stdin/stdout. You are creating a tar stream, piping it over SSH, and extracting on the other side in real time. No temp files, no zipping then transferring then unzipping.
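The same pattern works in reverse to pull a directory down instead of pushing it up:
# Pull: the remote side creates the stream, the local side extracts it
ssh user@remote 'tar czf - /remote/path' | tar xzf - -C /local/directory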
9. tee For When You Need Both
Want to see output AND save it to a file?
long-running-build 2>&1 | tee build.log
But the real power move is using tee mid-pipeline to debug what is flowing through:
cat access.log | tee /dev/stderr | grep 'ERROR' | wc -l
This prints the full stream to stderr (so you see it) while the pipeline continues to count errors. Invaluable when a pipeline is giving unexpected results and you need to see the intermediate data.
10. sort | uniq -c | sort -rn: The Analysis Pipeline
This three-command pipeline is the most underrated data analysis tool on any system:
awk '{print $1}' access.log | sort | uniq -c | sort -rn | head -20
This gives you the top 20 most frequent values in column 1. Change the awk field number and you can instantly answer questions like:
- What IPs hit us most? ($1)
- What endpoints are hottest? ($7)
- What status codes dominate? ($9)
With this pattern, you can answer in under 5 seconds questions that would otherwise take an Elasticsearch query or a Python script.
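You can also filter before counting. For instance, assuming the standard combined log format where $9 is the status code and $7 is the path:
# Top 10 endpoints returning 500s
awk '$9 == 500 {print $7}' access.log | sort | uniq -c | sort -rn | head -10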
11. watch: Auto-Refresh Any Command
watch -n 2 'kubectl get pods | grep -v Running'
This re-runs the command every 2 seconds and shows you the output. Add -d to highlight what changed between refreshes. I use this constantly for monitoring deployments, watching disk space during big operations, and tracking build progress.
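The disk-space case, with change highlighting:
# Re-run df every second and highlight what changed since the last refresh
watch -d -n 1 df -h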
12. Redirect stderr and stdout Independently
Most devs know 2>&1. Fewer know you can send them to completely different places:
command > output.log 2> errors.log
Or discard errors while keeping output:
find / -name 'config.yml' 2>/dev/null
Or swap them entirely (stdout becomes stderr and vice versa):
command 3>&1 1>&2 2>&3
That last one uses file descriptor 3 as a temp swap variable. It is the kind of thing that looks unhinged until you need it, and then you really need it.
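When do you actually need the swap? The classic case is filtering stderr through a pipe while stdout passes through untouched:
# After the swap, grep sees the build's stderr instead of its stdout
make 3>&1 1>&2 2>&3 | grep -i error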
13. rsync --dry-run: Preview Before You Wreck
rsync -avhn source/ destination/
The -n flag is --dry-run. It shows you exactly what rsync would do without doing it. I run this before every significant file sync. The one time it saves you from overwriting production assets pays for the habit permanently.
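While you are previewing, check your trailing slash too, because it changes what rsync copies -- a detail the dry run will surface before it bites:
# source/ (as above) copies the directory's contents into destination/
# source (no slash) copies the directory itself, creating destination/source
rsync -avhn source destination/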
14. Trap: Clean Up After Yourself
tmpfile=$(mktemp)
trap 'rm -f "$tmpfile"' EXIT
# Use $tmpfile however you want
# It gets cleaned up automatically when the script exits
The trap builtin runs a command when the shell catches a signal or a special event like EXIT. EXIT fires when the script ends for any reason -- normal exit, error, even Ctrl+C. No more orphaned temp files cluttering /tmp.
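The same idea scales up. A minimal sketch using a function, so one trap can release several resources (bg_pid here is a placeholder for a background process your script started):
cleanup() {
  rm -f "$tmpfile"
  kill "$bg_pid" 2>/dev/null   # placeholder: only if you launched a background job
}
trap cleanup EXIT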
The Pattern
Notice that none of these tricks require installing anything. They are all built into standard Linux distributions and have been for decades. The command line is not a relic. It is a composable toolkit where every piece connects to every other piece through text streams.
The developers who move fastest in the terminal are not typing faster. They are combining these primitives in ways that eliminate entire categories of manual work.
I keep a set of cheatsheets pinned next to my monitor for exactly these kinds of tricks -- organized by tool, with flags, examples, and real-world patterns I actually use. I put together a CLI Cheatsheet Bundle that covers the commands, pipelines, and workflows that took me years to collect. If you want to shortcut the process of building this muscle memory, it is all there.