DEV Community

The Linux commands that help me work

Sebastian Korotkiewicz · Updated · 1 min read

Here are some commands that help me with my work.
What useful commands do you use?

Searching recursively for a string inside files

grep -i -n -r 'Search string' /var/www/path/

Counts how many lines of the file contain the search string:

cat access.log | grep 'Search string' | wc -l

Remove all .gz files from /var/log/

find /var/log/ -name "*.gz" -type f -delete
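A cautious way to run this kind of delete (sketched here against a scratch directory, so the paths are stand-ins for /var/log/): preview the matches with -print first, then repeat the same expression with -delete.

```shell
# Set up a throwaway directory with one match and one non-match
tmp=$(mktemp -d)
touch "$tmp/old.gz" "$tmp/keep.log"

# -print previews exactly what -delete would remove
find "$tmp" -name "*.gz" -type f -print

# Same expression with -delete actually removes the matches
find "$tmp" -name "*.gz" -type f -delete

ls "$tmp"   # only keep.log remains
```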

Find and replace text in all files in a directory

find ./ -type f -exec sed -i 's/string1/string2/g' {} \;

find /var/log/ -type f -exec sed -i 's/string1/string2/g' {} \;
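If you want a safety net, sed's -i flag can take a backup suffix so each original file is kept; a minimal sketch on a throwaway file (GNU sed syntax; BSD/macOS sed expects the suffix as a separate argument):

```shell
# Throwaway file containing the text to replace
tmp=$(mktemp -d)
printf 'hello string1\n' > "$tmp/demo.txt"

# -i.bak edits in place and keeps the original as demo.txt.bak
sed -i.bak 's/string1/string2/g' "$tmp/demo.txt"

cat "$tmp/demo.txt"       # hello string2
cat "$tmp/demo.txt.bak"   # hello string1
```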

Unpack tar files

tar -xJvf file.tar.xz
tar -xvf file.tar.bz2

while

while true; do COMMAND; done
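For example, with a counter in place of COMMAND (the bounded variant below is just so the loop ends on its own; the infinite form is stopped with Ctrl+C):

```shell
# Infinite form from the post: stop with Ctrl+C
# while true; do some_command; done

# Bounded variant: repeat a command a fixed number of times
i=0
while [ "$i" -lt 3 ]; do
    echo "run $((i + 1))"
    i=$((i + 1))
done
# run 1
# run 2
# run 3
```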

Live viewing of web server logs

tail -f /var/log/nginx/korotkiewicz-access.log

Discussion (15)

Bobby Iliev • Edited

Great commands! I use them a lot too!

Here are 15 networking commands that might help you a lot as well:

Top 15 Linux Networking tools that you should know

Kianoosh Raika

A few improvements to those commands I've found over the last few years...

grep: ripgrep or silver searcher, which know to ignore source control files by default and are just faster
bat: a syntax highlighting cat

also awk is surprisingly useful

Oracle Sean ♠️

I use cut to split and parse content a lot. The syntax is short, sweet and easy to remember:

cat file.csv | cut -d, -f2

This prints the second field (-f2) delimited by a comma (-d,). Useful for parsing CSV files.
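The same cut invocation on some made-up CSV lines, piped in directly so there is no file (and no cat) involved:

```shell
# Print the second comma-delimited field of each line
printf 'alice,30,london\nbob,25,paris\n' | cut -d, -f2
# 30
# 25
```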

awk is wonderfully powerful, too. Use it for simple result printing and formatting:

awk '{print $2}' file.csv # Print the second column, using the default space delimiter
awk -F',' '{print $2}' file.csv # Print the second column, using the comma as the delimiter

But it does so much more. Since we're talking about searching text with find, sed and grep, awk lets you search for text based on its specific position in a file:

awk -F',' '$2 == "Jones" {print $0}' file.csv # Print lines where the second field = Jones
awk -F',' -v x="$VAR" '$2 == x {print $0}' file.csv # Print lines where the second field = a shell variable $VAR; variables are assigned/passed using the `-v` flag.
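The field-equality test run against inline sample data (made-up names; note the string literal needs double quotes inside the awk program):

```shell
# Keep only the lines whose second field is exactly "Jones"
printf 'Ann,Jones\nBob,Smith\nCid,Jones\n' | awk -F',' '$2 == "Jones" {print $0}'
# Ann,Jones
# Cid,Jones
```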

Tests can be combined:

ps -ef | awk -v program="$VAR" -v mysid="$$" '$2 != mysid && $3 != mysid && $0 ~ program {print $0}'

This shows all running processes, other than your own, that match the shell variable $VAR. $$ is a special shell parameter containing the current shell's process ID.

  • Show processes (ps -ef) with PID and Parent PID in columns 2 and 3 respectively;
  • Pipe (|) output to awk;
  • Assign $VAR and $$ to awk variables program and mysid;
  • Combine three tests with the "and" operator (&&; the "or" operator is ||):
      ◦ check that columns 2 and 3 do not match mysid (!= mysid);
      ◦ check the whole line ($0) for a match with program;
  • Print the entire matching line ('{print $0}') when all three conditions are met.

Use this to test whether another process is already running a script (without using a lock file): use basename to get the script name, look for other processes running it, and exit if one is found. As a bonus, this excludes processes editing the file rather than running it.
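To see the filter in isolation, here is the same awk expression run against a canned, hypothetical ps listing, pretending our own PID is 999:

```shell
# Hypothetical ps -ef output: columns are user, PID, PPID, command
ps_sample='user  100    1  /bin/sh backup.sh
user  200    1  vim backup.sh
user  999  100  grep backup.sh'

# Keep lines that mention the program but whose PID and PPID are not ours
echo "$ps_sample" | awk -v program="backup.sh" -v mysid="999" \
    '$2 != mysid && $3 != mysid && $0 ~ program {print $0}'
# user  100    1  /bin/sh backup.sh
# user  200    1  vim backup.sh
```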
Tomas Fernandez
grep 'Search string' access.log | wc -l

is the same as (cat is not needed here):

cat access.log | grep 'Search string' | wc -l

Also, I prefer not to use -v with tar. It produces a lot of noise and I might miss an important error.

Thanks for sharing your commands. 😀

Pol Monroig Company

In fact you can even remove the wc command with the -c flag of grep 😉
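That looks like this on some made-up log lines; worth noting that -c counts matching lines, not total occurrences:

```shell
# grep -c prints the number of matching lines directly, no wc needed
printf 'GET /a\nGET /b\nPOST /a\n' | grep -c 'GET'
# 2
```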

Tomas Fernandez

You're right! I'm always forgetting it can count.

Derek Hopper

One that comes in handy frequently for me is zgrep. It's the same as grep, but for files that are gzipped. It's helpful when logrotate is in use.
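A self-contained sketch of that, building a small gzipped file in a scratch directory first (the file name imitates logrotate's numbering):

```shell
# Write a gzipped log, then search it in place with zgrep
tmp=$(mktemp -d)
printf 'ok\nerror: disk full\nok\n' | gzip > "$tmp/app.log.1.gz"

zgrep 'error' "$tmp/app.log.1.gz"
# error: disk full
```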

David Aldridge

My fingers will do ls -altr in my sleep.

Oracle Sean ♠️

Mine is ps -ef

Sebastian Korotkiewicz Author

My fingers type ls -lashx.

Daan Wilmer
tail -n 20 error.log

to get the last 20 lines of the error log is perhaps the most used command on my machine.

Sebastian Korotkiewicz Author • Edited

You're right, I forgot tail! I use tail -f file.log to follow my logs live.

Ideal for live viewing of web server logs.

tail -f /var/log/nginx/korotkiewicz-access.log
Ashley Sheridan

history | grep "partial" to search your command history for some long command you've used before but don't remember exactly; partial is just whatever part you can remember.

Sebastian Korotkiewicz Author

You can also press CTRL + R and start typing the first letters of the command, then press CTRL + R again to cycle through earlier matches.

scorp13

I have a little terminal commands cheat sheet for myself, maybe it will be useful to someone else.